I’ve never viewed the so-called “hard problem” as any problem at all. According to David Chalmers, who coined the term, the hard problem is supposed to be the problem of figuring out what our idea of consciousness refers to in the real world. The obvious answer is that it refers to brain processes that feel like something. What’s so hard about that?
The only reason that many people feel there’s a problem is that they can’t stop thinking in dualist terms. They have a strong intuition that the brain is one thing, and that the conscious feelings are something extra, some kind of spooky force field that floats above the physical matter of the brain. And then of course they do have a problem, indeed a slew of problems.
What special feature of the brain allows it to generate the extra feelings? Why does it generate those feelings rather than other ones? Why, for that matter, does it generate any feelings at all?
Ludwig Wittgenstein thought that all philosophical problems are due to nothing but conceptual confusion. As he saw it, the aim of philosophy is solely to “show the fly the way out of the fly-bottle”. I find this a deeply dispiriting view of philosophy in general. Still, when it comes to consciousness, I’d say Wittgenstein had things exactly right. If only we could stop ourselves seeing things through dualist spectacles, we’d no longer feel that there is anything puzzling about consciousness.
Certain brain states are like something for the subjects that have them. What’s so puzzling about that? How would you expect them to feel? Like nothing? Why? That’s how they feel when you have them.
As long as we remain within the grip of dualist intuitions, we can’t help but think that consciousness involves some mysterious light that attaches itself to certain physical processes. And then we wonder what turns this light on, and where in nature it is to be found. But in reality there’s no such light. There are just physical processes, all of which have the potential to become conscious by coming to play a role in intelligent reasoning systems.
Consciousness doesn’t depend on some extra shining light, but only on the emergence of subjects, complex organisms that distinguish themselves from the rest of the world and use internal neural processes to guide their behaviour. Once these neural processes are present to these subjects in this way, they are like something for them. Supposing that this involves some extra light is like thinking televised events attract the cameras by displaying a special lustre. In truth, of course, they are just ordinary events that happen to get pointed at by the cameras. So too with conscious states. They are just ordinary physical states that happen to have been co-opted by reasoning systems.
If I am right, and the so-called “hard problem” is nothing but a by-product of misbegotten dualist intuitions, then the next question is why all of us—and I don’t except dyed-in-the-wool materialists like myself—find it so difficult to shake off these intuitions. This is a very good question, and deserves more attention than it has received up till now, though a number of developmental psychologists and cognitive anthropologists have already made some interesting suggestions.