05-19-2019, 11:29 AM
RaftPeople
Join Date: Jan 2003
Location: 7-Eleven
Posts: 6,746
Originally Posted by Half Man Half Wit
I still don't really get what your example has to do with mine. Do you want to say that which conscious state is created isn't relevant, as long as the behavior of the system fits? I.e., that no matter whether I see a tiger planning to jump or hallucinate a bowl of ice cream, I'll be fine as long as I duck?

If so, then again, what you're proposing isn't computationalism, but perhaps some variant of behaviorism or, again, identity physicalism; or maybe an epiphenomenalist notion, where consciousness isn't causally relevant to our behavior but is just 'along for the ride'. Neither sits well with computationalist ideas: either we again have a collapse of the notion of computation onto the mere behavior of a system, or what's being computed simply doesn't matter.
If we ignore consciousness for a minute and just think in terms of brain states, the point is that, just like with your box but on a larger scale, there could be multiple environments in which the brain states evolve in exactly the same manner and successfully produce the correct responses for survival.

Let's pretend there is an alternate world with two differences:
1 - Light from the sun has a different mix of intensities across wavelengths, so to us things would look different
2 - The rods, cones, and retinal ganglion cells (RGCs) in the alien all have shifted sensitivities, so that their activation under various conditions matches our cells' activation under comparable conditions (sunny day, cloudy day, etc.)

If we assume everything else about the environment and our alien is the same, then, despite differences in the external environment, the internal brain states would be the same.
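The shift-invariance the hypothetical relies on can be sketched numerically: a photoreceptor's activation is, to a first approximation, the integral of the incoming spectrum weighted by that receptor's sensitivity curve, and if the spectrum and every sensitivity curve are shifted by the same amount, every such integral is unchanged. Here is a minimal sketch with made-up Gaussian spectra and sensitivity curves (all peak positions and widths are illustrative, not physiological data):

```python
import numpy as np

# Wavelength grid in nm (uniform, so a shift by a grid multiple is exact).
wl = np.linspace(200, 1000, 8001)
dl = wl[1] - wl[0]  # 0.1 nm spacing

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Hypothetical cone sensitivity peaks, roughly S/M/L (illustrative values).
peaks = [440.0, 535.0, 565.0]
width = 40.0

def activations(spectrum, peak_list):
    """Receptor activation = integral of spectrum times sensitivity curve."""
    return np.array([(spectrum * gaussian(wl, p, width)).sum() * dl
                     for p in peak_list])

shift = 60.0  # the alien world's light AND receptors are both displaced 60 nm

human = activations(gaussian(wl, 550.0, 120.0), peaks)
alien = activations(gaussian(wl, 550.0 + shift, 120.0),
                    [p + shift for p in peaks])

print(np.allclose(human, alien))  # True: identical signals reach the brain
```

Because the shift appears in both the stimulus and the receptors, the signals sent downstream are numerically identical, so everything the brain computes from them evolves the same way; shifting the light alone, without shifting the receptors, would change the activations.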

Assuming you accept the hypothetical, would there be any difference in the conscious experience? It seems the answer must be no, because there is no other signal available from which a different conscious experience could be created.