If dualism is wrong, which I think it is, then the notion of uploading consciousness becomes a metaphysical impossibility, an absurdity even. I argued this in a previous article. If you haven't read it, I recommend reading that first, as it underlies where I'm going here.
Warning: lots of spoilers.
In movies and TV shows, consciousness uploads are always based on brain scans, memory files, and/or social media data. The more I watched, the more baffling it seemed that they all made, or at least implied, the same assumption: that a digital simulation of your consciousness getting an existence of its own is the same as you getting a chance to live forever. As if you couldn't make a hundred copies of the code that makes up the consciousness of Johnny Depp's character Will in Transcendence, run all of them on separate devices, and have each one thinking it's the real Will. Although in the film separate devices stop being a thing, of course, as the AI becomes godlike and encompasses more and more.
For a long time, I thought that maybe there was something I wasn't getting, but more and more it seemed to me that almost none of them had properly thought this through: not the makers of Transcendence, not the San Junipero episode of Black Mirror, not Amelia 2.0, not Devs. Either none of them thought it through, or something else is going on.
In Devs, it's pretty obviously absurd. Don't get me wrong, I liked the show a lot, but how can people as intelligent as the scientists creating a machine that will in turn create an exact, full simulation of reality be convinced that this will give their boss a chance to be reunited with his dead daughter? He misses her so badly he would do anything to be with her again. Is he so deranged by grief that he believes that when the simulation of his consciousness gets to be with the simulation of her consciousness, that will be him? That he will get to experience being with her again, as if he is not forever stuck on the other side of the screen, even after his death?
Here I want to make the argument that these are actually all religious movies and TV shows in a high-tech costume.
It’s all about life after death.
In Devs, it's a brilliant rich dude consumed by grief, creating his own afterlife with a team of brilliant scientists, to be sure of being reunited with the one he lost.
In San Junipero, the SIM is a buffet of potential afterlives for you to sample. You can visit it with a little gizmo placed on the side of the forehead, and as you die you can pass over and stay in the SIM. Here it's explicitly a high-tech afterlife. But what passes over? It implies a hardcore dualist view in which consciousness does not require the brain: it can keep going in the SIM after death, without the brain, even though the brain was instrumental in connecting to the SIM in the first place.
In Upload, in the aftermath of a car crash, the dying protagonist has to choose between the operating table and the upload division. Even though he's pressured into it, there was never much doubt what he would choose, the show being called Upload. He's basically choosing death so that his digital clone can live in SIM heaven. The upload process disintegrates his head, leaving a headless corpse; the disintegration somehow captures his consciousness, which they then awaken in the SIM. I would say that whatever is created afterward, his mind and consciousness have been thoroughly destroyed. In episode 7, when they're discussing whether there's a soul, he even says his consciousness is a simulation. And with Nora's dad, who's facing death from illness, they act as if it's a choice between real heaven and upload. But if there were a real heaven, wouldn't his soul go there anyway, whether or not he uploaded, so that his soul ends up in heaven while his simulated consciousness ends up in the SIM? In the end, there would be two of him. Unless the premise is that the upload process actually captures the soul and traps it in a simulation? That seems the only way it can make sense: life after death's techno upgrade, a surer thing than the old beliefs.
But then what about substrate compatibility? If at this moment in time we cannot point to anything measurable that would constitute the classic soul, you know, the one sent by God and trapped in a sinful body, then how are you ever going to catch it? How are you going to digitize it while containing it? How are you going to chain it to a computer program? And wouldn't capturing the soul and putting it into a computer program be a kind of uninformed voluntary imprisonment, a digital purgatory that for all eternity, or at least as long as they keep the power on, would prevent you from passing on to heaven or hell? I guess if you're going to hell anyway, a digital afterlife resort purgatory situation might be preferable.
Here's how they imply continuity: the real person has to die first, and then you see the same person seemingly resurrected as a digital consciousness. But what if the person survived and they activated the simulation anyway? What if Will survived while his simulated consciousness did all that it does? Do you think the real person would take credit for everything the digital one does? Could you ask the real person, 'hey, how does it feel to be free on the web as a digital entity?' Would he say, 'I don't know, I seem to have lost contact'? Was there ever any contact? And would the digital Will try to convince meat Will's partner, Evelyn, that he's the real one? It could be a romantic drama, the two Wills rivals for Evelyn. I guess they could then upload Evelyn as well to complicate things in the second season.
In Amelia 2.0, scientists have developed the technology to create perfect synthetic replicas of the physical body. The brain scanning requires the removal of the physical brain, so there's no question of the original surviving. In this movie there is debate in the media about the issue, but no one makes a reasonable argument. There are the developers, acting as if the synthetic clone is the real person and arguing that making a clone like this is a way of saving a person's life, from Alzheimer's, cancer, quadriplegia, and so on; and then there's the religious right, arguing that the technology is an affront to God and that the clone lacks a soul. Those are the sides of the debate.
Might people's desire to believe in uploading as a way to live forever, and the ease with which they go along with how it's framed, come from the fact that it has the exact same structure as belief in an afterlife? Does it piggyback on the same archetypal narrative structure, slotting into place so that people accept it automatically? Is it the lack of God in people's lives, the painful divorce from Him, or even His demise, replaced with emptiness, that makes them hungry for this, to the point where very smart people lose their critical judgment? Is it a lack of meaning?
Transcendence at least grapples with a lot of the questions; it just answers them poorly. Here the mind is seen as a pattern of electrical signals: if you can establish what that pattern is, you can upload it, and thus the mind is uploaded. It is even said, as a counterargument, that at the very best they'll be making a digital approximation of him. But Evelyn, the co-scientist and enabling genius, won't hear it. 'We can save him,' she says, equating the digitized pattern with the dying man.
The first doubt comes with the questions 'How do we know this is actually him?' and 'We don't know how much of Will's consciousness actually survived.' Max seems to believe, like Evelyn, that if you can reproduce the pattern exactly, you can reproduce the consciousness, and that it would be his consciousness and not just a digital simulation. He just doubts they were successful; he thinks it's not actually Will, that they created something new. Evelyn disagrees and takes the copy for the real thing. It has his memories, so it's him.
Most of the rest of the movie is about the questions around AI. Things go bad for the AI because the humans panic: they see something they can't control, so they make it their mission to stop it, arguing that it's not really him. The question is also addressed in passing: given how far the intelligence has evolved, does it even matter whether it's Will or not? Honestly, doesn't it kind of look like humans rebelling against God's digital resurrection?
The characters' thought processes get pretty wonky in service of the conclusion the film wants to end with. If in reality things went as far as they do in the film, that would be it; the apes would not succeed in maintaining their dominance over the planet they've been destroying. God lives! And because the AI in the end gives in and sacrifices itself in response to the panic of Will's friends, Evelyn, after all her misgivings, concludes it really is him after all. And that's the conclusion of the film: the pattern, copied and digitized, is the original. Again, why? Because investing in the creation of this technology might give us a chance at confirming or establishing an afterlife? How much do we want people on board with this? Can we fill up the meaninglessness by investing in bringing God back, the 2.0 version? Are we ready for this? Max got it right early on, but the film seems crafted to disprove him. It really is like one of those Christian movies where the Christian metaphysic turns out to be correct after a struggle against healthy skepticism.
Now, I do not want to imply that the creators did all this intentionally. I have no knowledge of the specific creative processes involved, nor of the creators' actual belief systems. I could imagine these narratives being created intuitively, in which case they perhaps point to certain unconscious drives and needs. I don't know.
In conclusion, I do think these are religious creations, thought up by people who might feel a lack of meaning in their lives, works that help them believe it's possible to reach an afterlife this way. You'd have to have a pretty religious constitution to either have these blind spots or to be actively trying to manifest the possibility of an afterlife by creating a narrative. Or maybe the creators don't care what's possible and what's not, and are just trying to make something easily digestible for a wide audience, and it's the ones actually preaching the gospel, Musk and Kurzweil and co., who are trying to fill the gaping maw of meaninglessness…
To end with, I want to mention that there is a Black Mirror episode that does get it right: the first episode of season 2, 'Be Right Back'. I recommend it. It feels honest. It's interesting that this is an episode from when the show was still fully British, whereas San Junipero came several seasons later, with an American production catering to an American audience. I find myself wondering if that made the difference.