Consciousness can’t be downloaded into a computer, for the simple reason that computation is an act of interpretation, which itself depends on a mind doing the interpreting. Thinking one could download consciousness commits the same category error as thinking the sentence ‘it is raining’ is the same sort of thing as the rain itself. The former is merely a symbolic vessel, filled with content by an intentional mind; the latter is water falling from the sky.
It’s easy to see that a system only computes if it is properly interpreted. It’s only our use of ‘transparent’ symbols that makes us think that what a computer computes is an inherent feature of the computer, when in truth, it’s no more inherent to it than it’s inherent to the word ‘rain’ to mean ‘water falling from the sky’.
Consider a computer with less obviously transparent symbols. Say you find a device constructed as follows: it has four switches and three lights. Depending on how you set the switches, certain lights come on. Experimenting with it, you find that the lights light up according to definite rules. If you consider the switches in groups of two, take a switch being ‘up’ to mean ‘1’ and ‘down’ to mean ‘0’, and furthermore take each light to mean ‘1’ when it’s lit and ‘0’ when it isn’t, you can use the device to add binary numbers (each with a value of up to three).
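To make the toy device concrete, here is a minimal sketch in Python (the names and encoding are my own, purely hypothetical choices): the device is modelled as nothing more than a fixed table from switch positions to light states, and only the chosen reading ('up' means '1', 'lit' means '1', leftmost light most significant) turns it into an adder.
[code]
from itertools import product

UP, DOWN, LIT, OUT = 'up', 'down', 'lit', 'out'

# The device itself: nothing but a fixed mapping from the 16 possible switch
# settings to the states of the three lights. (The table is wired here so that
# the reading described in the text comes out as binary addition.)
DEVICE = {}
for s in product([UP, DOWN], repeat=4):
    a = 2 * (s[0] == UP) + (s[1] == UP)   # first switch pair, read as a 2-bit number
    b = 2 * (s[2] == UP) + (s[3] == UP)   # second switch pair
    DEVICE[s] = tuple(LIT if (a + b) & w else OUT for w in (4, 2, 1))

# One particular interpretation: up = 1, lit = 1, leftmost light most significant.
def add_with_device(a, b):
    enc = lambda n: (UP if n & 2 else DOWN, UP if n & 1 else DOWN)
    lights = DEVICE[enc(a) + enc(b)]
    return sum(w for w, l in zip((4, 2, 1), lights) if l == LIT)

print(add_with_device(2, 3))  # 5 -- under this reading, the device adds
[/code]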
Now, suppose somebody else examines that same system. They might well come up with an entirely different interpretation: they could, for example, take ‘switch down’ to mean ‘1’, and ‘light out’ likewise. Then, to them, the system would compute a different function of binary numbers.
Many more interpretations are possible. You could take ‘switch up’ and ‘light out’ to mean ‘1’. You could assign the bits different significance: say, you’re used to reading Hebrew, and thus take the rightmost light to stand for 2[sup]2[/sup], the middle one for 2[sup]1[/sup], and the leftmost one for 2[sup]0[/sup].
And so on. Each such change of interpretation will result in the device computing a perfectly sensible binary function; each person, with their different interpretation, could use it as a computer to compute that function.
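The same fixed table can be read under the other conventions just described. In the sketch below (again Python, hypothetical names, rebuilding the same table as above), the 'up = 1, lit = 1' reading gives a + b; the 'down = 1, out = 1' reading gives a + b + 1, since complementing each 2-bit input turns a into 3 - a, complementing the 3-bit output turns s into 7 - s, and 7 - (3 - a) - (3 - b) = a + b + 1; and reversing the lights' significance gives yet another function. Nothing about the table changes between the three readings.
[code]
from itertools import product

UP, DOWN, LIT, OUT = 'up', 'down', 'lit', 'out'

# The same fixed physical table as in the sketch above.
DEVICE = {}
for s in product([UP, DOWN], repeat=4):
    a = 2 * (s[0] == UP) + (s[1] == UP)
    b = 2 * (s[2] == UP) + (s[3] == UP)
    DEVICE[s] = tuple(LIT if (a + b) & w else OUT for w in (4, 2, 1))

def read(a, b, one_switch=UP, one_light=LIT, leftmost_msb=True):
    """Use the device under a chosen interpretation: encode a and b (0..3) as
    switch positions, look up the physical light state, and decode the lights."""
    other = DOWN if one_switch == UP else UP
    enc = lambda n: (one_switch if n & 2 else other, one_switch if n & 1 else other)
    lights = DEVICE[enc(a) + enc(b)]
    if not leftmost_msb:
        lights = lights[::-1]              # rightmost light now counts as 2^2
    return sum(w for w, l in zip((4, 2, 1), lights) if l == one_light)

print(read(2, 3))                                  # 5: a + b
print(read(2, 3, one_switch=DOWN, one_light=OUT))  # 6: a + b + 1
print(read(1, 2, leftmost_msb=False))              # 6: 1 + 2 = 3 = 011, read backwards as 110
[/code]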
Thus, what computation a system performs is not inherent to that system, but is, exactly like what message a text conveys, a matter of interpretation. But if that’s so, then computation can’t be what underlies consciousness: if there’s no fact of the matter regarding what mind a given system computes unless it is interpreted as implementing the right computation, then whatever does that interpreting can’t itself be computational, since otherwise we would face a vicious regress, needing ever higher-level interpretational agencies to fix the computation at the lower level. But minds do seem to have the capacity to interpret things; so they have a capacity that can’t be realized via computation, and are thus, on the whole, not computational entities.