A while back I was having dinner with a friend of mine who holds a PhD in neuroscience. We were discussing, or rather criticizing, some aspects of behavioural science. One of those aspects is the devotion to dual system reasoning. I’m not saying it doesn’t exist; it’s a useful analogy for thinking about certain aspects of decision-making, but it finds little to no grounding in our hardware. You know, the brain.
My friend pointed me to this article by Cisek and Hayden, who argue a very similar point: “Likewise, cognitive science, a major influence on much of modern systems neuroscience, was originally conceived as a study of the ‘software’ of the mind as opposed to the ‘hardware’ of the brain. While modern cognitive and systems neuroscience seeks to connect across levels, the fundamental concepts (e.g. ‘attention’, ‘working memory’, etc.) are still those outlined by psychological traditions. Consequently, the mechanisms of animal behaviour are often interpreted in terms of theories designed to explain human cognition, as if evolutionary history and diversity are irrelevant.”
Issue is, evolutionary history (and diversity) really are quite important when it comes to understanding the brain (or understanding anything, really). We are still very much dependent on our hardware. And our hardware, I’m afraid to say, is very far from optimal.
We are not designed to be optimal. We have existed for a long time, our hardware changing to fit our environment and to become optimal (or at least decent enough) for survival. But that often doesn’t involve a “system reboot” (which is also mainly software based, but I’m trying here). We can’t get rid of most of our hardware. No, the required changes tend to be added on, to change existing interactions, or both. And that can get rather messy. Especially if the psychological or cognitive concepts don’t seem to match what’s going on.
Let me give an example of this, as it does all sound rather abstract. How many systems of memory do we have? Looking at the psychological literature, I’d say about two: short-term (or working) memory and long-term memory. There’s also the functional neuroanatomy of memory: the traditional view holds that there is a single episodic memory system in the brain, and evidence suggests it is situated in the medial temporal lobe. However, if we take the evolutionary history of primates into account, this account becomes a lot less plausible. Murray et al. argue that we may have as many as seven different memory systems, which evolved at different times and serve distinct roles:
(1) reinforcement memory,
(2) navigation memory,
(3) biased competition memory,
(4) manual foraging memory,
(5) feature memory,
(6) goal memory and
(7) social subjective memory.
This categorization of memory is different from the more traditional ones: it is driven by evolutionary understanding, as each form of memory is associated with a specific time in evolutionary history. Reinforcement memory evolved first, along with control of movement; navigation memory arose later, as early creatures could not just move locally but began to navigate; and social memory systems evolved in humans as a way to navigate increasingly complex social situations.
The real kicker is that these seven forms of memory correspond to distinct brain substrates; that is, they have a distinct place in our hardware: navigation memory is associated with the hippocampal complex, whereas biased competition memory is associated with the agranular prefrontal cortex. Of course, not all seven map onto just a single brain part; social memory, for example, is associated with granular prefrontal regions – which is not super surprising, as this is a highly complex form of memory informing a highly complex set of behaviours.
Now you might be wondering why this is of interest – not everyone has a keen interest in neuroscience or the hardware of the brain, I get it, I really do. But taking it back to behavioural science: don’t we think it might be a bit of a problem if we keep coming up with key concepts (e.g. dual system reasoning) that have no basis in our hardware?
Cisek and Hayden mentioned that a lot of the fundamental concepts are outlined by psychological traditions, rather than grounded in neuroscience. Is it then not possible that we are thinking about this the wrong way round? Trying to explain our behaviours and decision-making processes and then mapping those explanations onto the brain might not be the best way to go about this. Mapping the software onto the hardware seems like a rather foolish pursuit. What if we’re just a whole bunch of kludges, both hardware- and software-wise? Where do we stand then?
I’m not unaware of the issues, or rather limitations, in neuroscience, however. Neuroscience has a technological issue: you can’t send someone into a store, or track someone going about their day, with an fMRI machine on their head (if only). The technology simply doesn’t do that (yet). But how amazing would that be? As things stand, even with its limitations, I think we are doing neuroscience a disservice by forcing psychological concepts onto it and hoping that, somehow, they find a grounding in the brain. I know it’s currently one of the few things we can do, but even then, it’s important to see our brain, and really our entire nervous system, in the context in which it was shaped: through an evolutionary process filled with kludges, trying to adapt to the environment, ensuring our survival. Just not always optimally so.
I used to be much more critical than I am now, and would complain that System 1/2 are just overplayed metaphors that shouldn’t be taken as a real theory.
…But then I just continued to use the terms anyway, because they’re useful. I don’t think they are reified in the brain anywhere, or that there is a principled way to divide the “systems”. But nothing else seems to stick better.
Turns out cognitive ontology is hard. There are serious methodological and theoretical constraints that prevent us from mapping the brain to cognition. And likely, if we were ever able to create an ontology that accurately mapped onto the brain as it is, it would be way too complicated because…