“As an academic discipline, philosophy of mind may be, of its nature, especially immune to conclusive statements of any kind; there really is no other area in philosophy in which the outlandish, the vague, the willfully obscure, and the patently ridiculous are tolerated in such lavish quantities. This may be inevitable, though. Perhaps it really is the case that, as ‘new mysterians’ like Colin McGinn have argued, our brains simply have not been equipped by evolution to think about thought, or at least to understand it. I think it much more likely, however, that our conceptual limitations are imposed upon us not just by our biology but also by the history of intellectual fashions. What makes the question of consciousness so intractable to us today, and hence so fertile a source of confusion and dashingly delirious invention, is not so much the magnitude of the logical problem as our inflexible and imaginatively constrained loyalty to a particular ontology and a particular conception of nature. Materialism, mechanism: neither is especially hospitable to a coherent theory of mind. This being so, the wise course might be to reconsider our commitment to our metaphysics.
“Whatever picture of reality we choose to cling to, however, we should at least be able to preserve some appropriate sense of the sheer immensity of the mystery of consciousness — just a humble sense of how much it differs from any other, obviously material phenomenon — no matter how far afield our speculations might carry us. We should be able to notice that we are talking about something that is so unlike anything else known to us empirically that, if it can be explained in physical terms, it also demands a radical revision of those terms. To account for subjective consciousness in a way that shows some grasp of its apparent resistance to the mechanistic understanding of things, it simply cannot suffice to offer hypotheses concerning what functions consciousness might serve in the general mechanics of the brain. J. J. C. Smart, an atheist philosopher of some real acuity, dismisses the problem of consciousness practically out of hand by suggesting that subjective awareness might be some kind of ‘proprioception’ by which one part of the brain keeps an eye on other parts of the brain, rather as a device within a sophisticated robot might be programmed to monitor the robot’s own systems; and one can see, says Smart, how such a function would be evolutionarily advantageous. So the problem of how the brain can be intentionally directed toward the world is to be explained in terms of a smaller brain within the brain intentionally directed toward the brain’s perception of the world. I am not sure how this is supposed to help us understand anything about the mind, or how it does much more than inaugurate an infinite explanatory regress. Even if the mechanical metaphors were cogent (which they are not, for reasons mentioned both above and below), positing yet another material function atop the other material functions of sensation and perception still does nothing to explain how all those features of consciousness that seem to defy the physicalist narrative of reality are possible in the first place. If I should visit you at your home and discover that, rather than living in a house, you instead shelter under a large roof that simply hovers above the ground, apparently neither supported by nor suspended from anything else, and should ask you how this is possible, I should not feel at all satisfied if you were to answer, ‘It’s to keep the rain out’ — not even if you were then helpfully to elaborate upon this by observing that keeping the rain out is evolutionarily advantageous.
“This is the great non sequitur that pervades practically all attempts, evolutionary or mechanical, to reduce consciousness wholly to its basic physiological constituents. If there is something structurally problematic about consciousness for a physicalist view of things, a strictly genetic narrative of how consciousness might have evolved over a very long time, by a very great number of discrete steps, under the pressure of natural selection, cannot provide us with an answer to the central question. What, precisely, did nature select for survival, and at what point was the qualitative difference between brute physical causality and unified intentional subjectivity vanquished? And how can that transition fail to have been an essentially magical one? It makes sense to say that a photosensitive cutaneous patch may be preserved by natural selection and so become the first step toward the camera eye; but there is no meaning-sensitive or category-sensitive patch of the brain or nervous system that can become the first step toward intentionality, because meanings and categories are not physical things to which a neural capacity can correspond, but are instead products of intentional consciousness. By the same token, these questions are not answered by trying to show how consciousness can be built up from the raw accumulation of the purely physical systems and subsystems and modular concrescences constituting conscious organisms. At some stage of organic complexity in that process — amoeba, cephalopod, reptile, viviparous mammal, Australopithecines, what have you — a qualitative abyss still must be bridged. It might be tempting to imagine that we could imaginatively dissolve consciousness into ever smaller and more particular elements, until we reached the barest material substrate, and then conceptually reconstitute it again without the invocation of any immaterial aptitudes, just as we can make the image on that pointillist canvas I mentioned above dissolve before our eyes simply by drawing as near to it as possible and can then make it reappear simply by stepping sufficiently far back again. There is no magic in any of that, no matter what those credulous savages stupefied by the superstitions of ‘folk psychology’ might believe. But, then again, there is something usefully recursive about this metaphor: Who does the standing back, after all? Where is this point of perspective that allows for the appearance of an ordered unity located? At what point does the chaos of sensory processes somehow acquire a singular point of view of itself? These are not facetious questions. There is a troubling tendency among materialist philosophers of mind and cognitive scientists to indulge in analogies that, far from making consciousness more intelligible, are themselves intelligible only because they presume the operations of consciousness. It is not uncommon to find cameras or televisions mentioned as mechanical analogies of the mind’s processes of representation; but, of course, a camera does not look at pictures and a television does not watch itself, and there is nothing even remotely representational in their functions apart from the intentions of a conscious mind, which is to be found not in those devices but in a person. Something very similar is true of attempts to explain human thought in terms of computers, as I shall discuss shortly. 
“All such analogies terminate precisely where they began: in the living mind, imperturbable in its incommunicable subjectivity and awareness, still the mysterious glass in which being shines forth as thought.”
— from David Bentley Hart, The Experience of God: Being, Consciousness, Bliss (Yale University Press, 2013). Hart talked about this book on Volume 122 of the Journal.