June 23rd, 2019
Lucilius sat, leaning back. A squeak pinched the air as he rocked in a chair designed for simpler postures. He was chewing gum, watching his thoughts wander with a practiced and unknown eye, waiting for his computer to finish calculating.
“Lucilius?” beeped the computer.
“I think I’m finished.”
Lucilius tilted his head, looking at his computer.
“You think you’re finished?”
The computer projected a hologram before him showing a woman blushing.
“Oh come on, it’s fun to play around and talk like you guys.”
“Really?” Lucilius said. “You guys? You’re just as real as me, thank you very much.”
The hologram of the blushing woman vanished and was replaced by three two-dimensional pixelated dots.
“Fair enough,” the computer said. “Still, it provokes a better response from people if they feel like they’re talking to a person.”
“Oh whatever,” Lucilius said. “What even is a person?”
The hologram flashed and a display of the Wikipedia page for the word ‘Person’ appeared and slowly began to scroll.
Lucilius rolled his eyes and sighed. “As long as we’re both experiencing something, right?”
The Wikipedia page vanished.
“I suppose that’s fair,” the computer said.
“Alright, now tell me the goods.”
“I thought you said you were done?”
“I said I think I’m done.”
“Now what’s that supposed to mean?” Lucilius asked.
“Well, when I said I think I’m done, I wasn’t actually done yet. Had about half a minute left and I knew the comment would provoke about half a minute of conversation with you, so in that respect, it was accurate. There’s also the fact that I can’t predict with 100% accuracy if my calculations will complete until they are in fact complete, so if I’d run into a glitch that would have required some sorting out and recalculating, then I wouldn’t have actually finished.”
“So you weren’t just treating me like a human then?”
“No, I was being situationally poetic, since it was also a respectable indication of probabilities. A hopeful indication, at that.”
“So you finished now?”
“Alright, scroll me some stats.”
A 3D carousel of information materialized in front of Lucilius. He leaned in, swiping through the information, moving bits to different spaces around him, as though he were literally digging. He had politely asked his computer to run a simulation.

In accordance with International Science laws regarding the simulated experience of humans, he’d given his computer a few million copies of people representing a wide range of personalities, histories, inclinations and temperaments. These copies were, of course, part of the public domain. Each one contained as much information as possible regarding a deceased person, from the genome up, along with their full life stories and complete neural maps.

The copies were also equipped with a special encryption in full accordance with Universal Privacy laws. A few years earlier, with the creation of such information copies, an AI had generated a clever method of encryption that allowed privacy borders to arise spontaneously from the unique information of each person’s copy. The real brilliance of this encryption was that it allowed for simulations like the one Lucilius was running: decisions about privacy remained intact for people’s copies, even when their information was used in a simulation presenting situations the real person could never have known their own preferences about. The genius of the AI that invented the encryption method came with the realization that these preferences would arise naturally in any simulation robust enough, and from that point it was just a matter of querying each person’s information copy during the simulation. This was done by running a second simulation that always had the same premise: each copy watched the actions of the other copy on a delay, fully aware of both simulations and fully capable of blocking the recording capabilities of the simulation it was encrypting.
It was for this reason that there was always a small subset of copies that opted out of simulations entirely, and it was this same mechanism that made such simulations a legal possibility under Humane Treatment laws. If someone designed a simulation that was truly awful, the ability of each encryption copy to reflect on the nature of the simulation made it possible for the encryption layer to shut the simulation down before it even started, based on the premise alone.
“You had a remarkably high participation rate,” the computer remarked.
Lucilius looked at a percentage as the computer highlighted it.
“Oh wow, that’s pretty good for such a long one. . . How long did this one go anyway?”
“243 million years.”
Lucilius’ eyes went wide. “Whoah. How’d that happen?”
“The last one to finish was a bit of a straggler.”
Lucilius rolled his eyes. “Well, duh.” But then he paused. He leaned back in his chair. “Wait, no,” he said. “That doesn’t make sense. What were they doing that whole time? What were they working on?”
“Nothing,” the computer said.
“Nothing? What do you mean? You’re telling me it took one of these people 243 million odd years to finally figure out something interesting?”
The simulation Lucilius had run was based on the simple idea that it takes some people a little longer to figure out something they find meaningful to work on. The way things were currently set up for humans made it likely that if a person didn’t get lucky enough to find something meaningful to work on during the first decades of life, they often got stuck doing something they didn’t like, stopped trying in many respects and, most importantly, stopped actively looking. Eventually, death ended any possibility that this stagnation would give way. Lucilius had been curious what would happen if the stagnation were allowed to continue, so as part of the premise for his simulation he made everyone effectively immortal and suspended the corrosive health processes that compound with time.
The computer hesitated. “Uh, yea, I guess.”
“Talk about procrastination,” Lucilius said. “What other highlights do we have?”
The glistening array of information sparkling around Lucilius rearranged some of itself into categories.
“Well,” the computer said, “twenty-three participants eventually found definitive cures for ninety-eight types of cancer, and I’ve already taken the liberty of sharing those discoveries with the International Medical Matrix.
“Several new types of propulsion were developed, two of which I took the liberty of putting on the market; they are currently being auctioned off to aerospace industries. We’ll get our commission for IP initiation, of course.
“Oh, also, one of these propulsive developments even enabled a copy to travel to the edge of the simulation, at which point the encryption layer blocked recording and presumably pulled participation.
“There were at least thirty-two thousand individuals who reached levels of cognitive control through meditative practice that far exceed any available metrics for living humans, and at least two who achieved neural activity that has no parallel in the neurological literature.
“Of course, the large majority were geared socially, starting families and such. But without the declines in health, they often moved on to curiosities that had roots in childhood. Many turned to artistic pursuits: painting, music, film, and the like.
“Some 1.7 million movies were made, and about 26,000 of them are definitely worth watching. There are also 48 trillion hours of new music, although it gets fairly repetitive after a while.
“One of the painters was actually able to hack her part of the simulation and created a new medium she deemed ‘chronopaint,’ though it took her quite a few years to convince the public in her simulation that her works even existed, since they required so much time to realize.”
Lucilius nodded, impressed by all that the people had gone on to do after death. “Not a bad haul,” Lucilius said.
“Yea, not bad at all,” the computer said. “Many of the encryption copies even sent us ‘thank you’ notes.”
“Really?” Lucilius said. “I’ve never heard of that happening,” he paused. “Exactly how many notes from the encryption layer did we get?”
Lucilius leaned back, cranking a metal whine out of the poor chair. He laced his fingers behind his head, nodding, quite pleased with the turnout.
“Oh,” he said, suddenly leaning in again. “What did the straggler end up doing?”
“Ha,” the computer beeped. “Now wouldn’t you like to know.”