I guess this is a philosophical question about what “you” is. Personally, I would say “you” are your consciousness, memories, and personality. The rest is a vessel that the consciousness is contained in. When you drive a car, you are not the car. Thus, we could plop that consciousness into anything.
Some sci-fi deals with this. In the TV show “Black Mirror,” the episode “San Junipero” features people who, when they are about to die, have their consciousness uploaded to a virtual world that is indistinguishable from the real one. Supposedly, those in that sim eventually get tired of it and then “turn off” (so to speak) of their own free will.
In the TV show “Altered Carbon,” everyone has a backup system implanted in their neck. If they die and they are poor (most people), they get put back in whatever body is available, which leads to some weird results (like a grandma being put in a male, tattooed gangbanger’s body, or a little kid being put in an adult, but they are still the original person…just in a different body, akin to you driving a different car). Poor people can do this once for free. Rich people can have custom clones made and can have as many as they want (but they are very expensive). That is also how most space travel is done: you do not physically go, you send your consciousness to a body on the other planet.
As you can see, there are lots of ways we could suppose this would work (more than those two).
To be frank: no, I don’t see how this could work. These are nice SF plots, interesting with enough suspension of disbelief - like Harry Potter or Superman, which are fun to read because you know they are not true, so you don’t mind the contradictions. But if you think about it seriously, it is bullshit (and you have spoilt the party, because that was not the point).
Yes you can, but if it’s not the consciousness existing in my bio-brain then it’s not me. You can put my head in a jar Futurama style and that might still be me, but a copy of my consciousness in some other construction of any kind is not me, it’s someone or something else. Even if I die in the copy process the copy is still not me, and you won’t find many volunteers for a system that works that way. If I don’t die then I keep living, experiencing and feeling new things, totally apart from that copy of me.
Without getting into the weeds of mind uploading and the challenges of whole-brain emulation, I think the above are misconceptions arising from a poorly worded question.
No one is proposing that we explicitly upload “consciousness” (or download, if you prefer that term). The subject of the speculation is, as I said earlier, the ability to scan and upload all the relevant information in the brain into a suitably functional artificial substrate. “Consciousness” is just a philosophical abstraction that arises as an emergent property of such a system, whether the system is natural (a human being) or artificial. We don’t need in any sense to understand or explain what it is in order to create and populate such an artificial substrate.
Well, no one knows how to upload all the relevant information from a human mind either. And there is no reason to believe that just uploading all the information would produce a living, sentient being rather than just a bunch of media files.
Though is that any more feasible? If you are able to upload all the relevant information, so that the entity uploaded to the cloud is indistinguishable from the fleshy, meat-based entity IRL, is that functionally any different from uploading consciousness? It’s certainly not clear how we’d achieve such a thing regardless.
It’s a great book, occupying that rare and hard to hit sweet spot right at the intersection of obsessively researched, brilliantly presented space travel facts and fart jokes.
As for the mind uploading question, fwiw, a majority of philosophers believe you wouldn’t survive, with only a little over a quarter thinking you would.
I’ve often thought that these sorts of missions should be crewed by experienced SSBN (nuclear ballistic missile submarine) personnel, though I don’t know how that would jibe with the skills and knowledge required of an astronaut.
IMO an SSBN crew is probably better qualified for the job than astronauts. Throw in a Seabee construction battalion for the building work and Bob’s your uncle.
No it isn’t, and that’s precisely the point. In order to replicate your mind, I only need to extract all the necessary information and download it into the appropriately functional device. The term “information” is doing some heavy lifting here, but it really is ultimately an issue of information processing unless one believes in mysticism, souls, and other forms of supernatural magic.
Hey, you and I have had some interesting and perhaps frustrating debates, but I’ll say two things here. First, I respect philosophers for their insights over the centuries into the human condition, but their track record on issues of technology is pretty poor. Second, thank you for the tip about the book. I love Zach’s SMBC comics but haven’t paid enough attention to realize that he and his wife Kelly had a book out.
Oh good god, you’re bringing back memories of one of the University of Chicago entrance essays c. 1992, and the reason I did not apply there.
These days, I really don’t know. I think colonizing Mars would be slightly more realistic, but I’m not convinced that the idea of downloading consciousness is that much farther off. Hell, for all I know we’ve already done that and that’s what we’re living in. There are days I swear that it feels like it’s true.
I don’t think we’ll download the contents of our brains into machines. I think we will end up adopting intelligent* machines instead of having biological children and those machine-children will eventually replace us entirely.
Then the machine-children will colonise Mars. The machine children might also be able to transfer their minds into other devices too.
*Or machines that appear in every way to be intelligent; it won’t matter if they actually are sentient in the same way that we believe we are, just as long as they behave in every way as if they were. They might be philosophical zombies, but carry on behaving as if they are a society of intelligent beings, long after the last human is gone.
We’ll probably colonise another planet in this system first - perhaps Mars, but possibly Venus, which has remarkably hospitable conditions at certain levels in its atmosphere. If we can count the Moon as a planet (which some do) then the first extra-terrestrial colony will no doubt be on the Moon, if anywhere.
But…
…interstellar colonisation is so difficult and so far in the future that we will probably develop some sort of consciousness uploading before we ever get to Alpha Centauri. Sending digitally- or analog-encoded consciousnesses to another star may be much easier than sending living humans, who may easily die en route.
I am familiar with the philosophical arguments about the concept of uploading, and in many ways, both sides are correct - yes, your biological body and consciousness will eventually die, even if you successfully upload yourself. On the other hand, something that thinks it is you might survive, and that would be enough for many people. Over time, the population would shift from being mostly biological disbelievers in artificial consciousness, to being mostly artificial emulations that can persist for an arbitrary period into the future, and the arguments would be largely forgotten. Even if the disbelievers are technically right.
Something that thinks it is me woke up this morning; the only evidence it has that it actually is me is a large coincidence of matter and location, plus memories of being me yesterday.
On the other hand, my poor old geriatric brain is constrained by existing inside a meat shell. If I were to upload myself, I expect I’d be taking advantage of the latest cybernetic upgrades within a week - and my humanity would eventually be lost to the digital winds of time.
Or perhaps, as in Greg Egan’s short story “Learning to Be Me,” you get fitted with a device that, over time, learns to duplicate the functions of your brain in real time, until it does so flawlessly; you now have two brains running in parallel, both connected to the same body. If the meat brain is destroyed, the redundant parallel system contains* your mind (it’s just running with no backup now - on one set of hardware instead of two in parallel).
*Or it just behaves exactly as if it did. There’s no way to know if it actually does, or if, internally, it’s having the same experience.
Unless you’re Riker. In which case they might make two copies, both of which go on to demonstrate that they are in fact not each other despite starting from the same point. It would be the same with any other attempt to download a consciousness. Which is why I think the odds of getting a living human to a planet in another solar system are higher, even if those odds are one divided by TREE(3) or some other absurdly small number.
I think that really just demonstrates that we don’t have adequate tools to think about what identity is - or to separate the concepts of identity and instance, because at present, mere circumstance dictates that one instance is one identity.