I’ve given three main examples so far:
- The fact that physics as a whole, according to our best current theories, is non-computable, particularly as regards quantum phenomena. It is thus possible that such phenomena are pertinent to conscious brain activity (see the Posner molecule paper you’ve already commented upon), and if they are, no computation can give rise to conscious experience.
- That one well-known formalization of a general reasoning agent, Marcus Hutter’s AIXI, is uncomputable in its exact form (see the formula after this list), and thus, if general reasoning necessarily includes something comparable, no computation can exactly reproduce it either.
- That in my model, a mental state confronted with the challenge of adapting to environmental change faces a question that Löb’s theorem renders undecidable, namely, whether a (possibly modified) successor version will generally be right about what it believes, i.e., can prove, about the world (see the statement below).
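For reference, AIXI’s action selection can be written, roughly following Hutter’s standard presentation (notational details vary), as

$$
a_k \;:=\; \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big(r_k + \cdots + r_m\big) \sum_{q \,:\, U(q,\, a_1 \ldots a_m) \,=\, o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)},
$$

where $U$ is a universal monotone Turing machine, $q$ ranges over programs of length $\ell(q)$, and $m$ is the horizon. The innermost sum is a Solomonoff-style mixture over all computable environments consistent with the interaction history, and deciding which programs $q$ reproduce that history is what makes the exact agent uncomputable; only computable approximations such as Hutter’s AIXItl exist.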
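And for the last point, the standard statement of Löb’s theorem, with $\mathrm{Prov}_T$ the provability predicate of a sufficiently strong theory $T$, is

$$
\text{if } T \vdash \mathrm{Prov}_T(\ulcorner \varphi \urcorner) \rightarrow \varphi, \ \text{then } T \vdash \varphi.
$$

A sketch of how this blocks the self-trust in question, assuming for simplicity that the successor reasons in the same theory $T$: if a mental state reasoning in $T$ could prove the soundness schema “whatever my (possibly modified) successor proves is true,” it would in particular prove $\mathrm{Prov}_T(\ulcorner \bot \urcorner) \rightarrow \bot$, and Löb’s theorem would then force $T \vdash \bot$. So a consistent state can never establish, from within, that its successor will generally be right.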
Any of these would preclude the emergence of either conscious experience or general reasoning capabilities (actually, intentionality in the last case, with conscious experience brought in as the solution to that problem, as a non-theoretical access of a mental state to its own properties).
But of course, that computation doesn’t suffice for conscious experience already follows from the much more general fact that computation itself is mind-dependent; trying to explain the mind by computation thus puts the cart before the horse. But that’s another discussion.