The Hard Problem of Consciousness: A Cognition Problem?

A couple months ago, I was at a conference where Anthony Jack proposed a very interesting theory: maybe we have two neural systems (the Task Positive Network [TPN] and the Default Mode Network [DMN]) that produce conflicting intuitions about some age-old philosophical puzzles. These conflicting intuitions lead us to get stuck when thinking about those puzzles, e.g. the hard problem of consciousness, the explanatory gap, and qualitative consciousness (Jack et al. 2013).

I was struck by Jack’s presentation for two reasons: (1) I was presenting a poster with a similar motivation at the same conference and (2) I have long been interested in a biological examination of (academic) philosophers.

The View

To get a bit clearer about what Jack has in mind, let me give a rough summary of his claims. The first system gives rise to our intuitions about mindedness. It’s this system that is working when we attribute invisible beliefs or thoughts to the people around us. The second system gives rise to a sense of brute physical cause and effect. This system is what helps us understand how to (among other things) effectively skip rocks at the local reservoir. Coordinated thoughts about mindedness seem to activate the first system and (more importantly) deactivate the second system, and vice versa. Each system can be primed and thus sway our intuitions in one direction or another. This could imply a couple of things: (a) we have trouble recruiting both systems, and both sets of intuitions, for a single puzzle and (b) our dominant intuition about a topic could be somewhat dependent upon which system is recruited when we are thinking about it.

The Implications

If Jack is correct about these two systems, then we might be able to appeal to these intuitions in explaining why the hard problem of consciousness seems so puzzling to us. For instance, we could hypothesize that one system fosters an intuition that we have non-material minds and the other system fosters an intuition that immaterial minds are crazy-talk—how could they have any causal relationship with bodies?! It’s these two conflicting intuitions that make the hard problem seem so puzzling. If we are to resolve this conflict, it seems that we will have to choose one of two winner-take-all conclusions: either our minds are immaterial and “purely mental,” or they are physical and purely determined by physical causation.

But if the intuitions in conflict were selected for in contexts other than philosophy (which seems right), then we might wonder whether these intuitions are reliable enough for use in philosophy. Perhaps ignoring these intuitions could actually be helpful in the philosophy of mind. I admit that I have no idea how to ignore these powerful intuitions or where we would begin once we successfully ignore them, but it’s not like we’re making much progress on the hard problem as it is. So I am open to new approaches.

Image Credit: “Figure 6: Prevalent Networks Found in the PMC” via Cauda, F., Geminiani, G., D’Agata, F., Sacco, K., Duca, S., Bagshaw, A. P., & Cavanna, A. E. (2010). Functional connectivity of the posteromedial cortex. PLoS One, 5(9), e13107. Licensed under CC BY.

Published by

Nick Byrd

Nick is a cognitive scientist at Florida State University studying reasoning, wellbeing, and willpower. Check out his blog at byrdnick.com/blog

5 thoughts on “The Hard Problem of Consciousness: A Cognition Problem?”

  1. 1. It should come as no surprise that the brain has organisational/evaluative mechanisms for determining action, and that having more than one must reflect the fact that they compete with one another (in some sense of the word) in a manner that depends on the kind of stimuli requiring action.
    Nevertheless, I do think that you have a point that is very interesting:
    2. In ‘The Republic’, Plato suggests that corresponding to the main social classes there are three competing “parts of the soul”: reason, spirit, and desire. Through hundreds of years of reflective analysis, philosophers have continued to reason that there are different components responsible for action – which to the cognitive scientist correspond to neural mechanisms.
    In my (reductive) Hierarchical Systems Theory of consciousness (mind-phronesis.co.uk), I classify these different components. They can be viewed in a series of articles and book chapters on my website, which currently has a new post twice weekly, if you are interested. Any cognitive correlation with my theory would be of interest to me…

    1. Mark:

      1. Thanks!
      2. I will look into your Hierarchical Systems Theory when I have a chance. If I notice anything from the cognitive science domain that speaks to the theory, I will be sure to send my thoughts to you.

      Thanks for sharing!

    1. Great question. Every Google image search that I have performed just pulls up wallpaper download websites (e.g., this one). If you ever find the creator, I would very much like to know so that I too can give credit.
