Theory of Mind and Moral Judgments

This is part 5 of 6 in my series on morality. Links to previous entries are below the post.

Throughout this series on morality, I've been arguing that to understand the nature of our morality we must look to science. So far we've discussed moral decisions as they pertain to our own actions. But there is another side to moral reasoning: how we view the actions of others. This endeavor carries an inherent presupposition, namely that other people have their own thoughts, beliefs, and desires, and can be judged on the basis of our knowledge that they are separate and distinct beings from us. This understanding of the intentional states of others is called Theory of Mind. What mechanisms underlie Theory of Mind, and how important are they in our judgments of other people's actions?

Let me describe two experiments: the first finds an intriguing correlation between Theory of Mind and moral judgments, and the second goes a step further and finds a causal connection. In the first experiment, participants were given four distinct situations and asked to judge how much blame to assign an individual who committed an act. The situations were as follows:

1) Harm was intended, but none occurred.

2) Harm was intended, and harm occurred.

3) No harm was intended, but harm occurred.

4) No harm was intended, and no harm occurred.

Not surprisingly, participants generally attributed blame based on intention more than on whether harm actually occurred. If harm was intended but none occurred, participants still assigned a high degree of blame to the individual who attempted it. Importantly, though most participants assigned little blame to individuals who caused unintended harm, on average they still assigned more blame in that case than when no harm was intended or incurred.

Here is the most important finding from this experiment: the degree of blame attributed to the individual committing unintended harm was inversely correlated with activity in one particular brain region, the right temporoparietal junction (RTPJ). The more activity in the RTPJ, the less blame was attributed; the less activity, the more blame.

Their follow-up experiment is where things get really interesting. Using transcranial magnetic stimulation (TMS), the researchers temporarily disrupted neuronal firing in the RTPJ while participants made these same moral judgments. They found that they could actually change the participants' moral judgments: in cases where harm was intended but not committed, participants assigned less blame overall, and where harm was committed but not intended, they assigned more.

These findings are interesting not just for localizing moral judgment of other people's behavior to a particular brain region, but for demonstrating the causal role that region plays in shaping the judgment. When we make a moral judgment about someone else's actions, we automatically incorporate our understanding of their intentional states. We analyze not just what happened, but what the person intended to happen. We show compassion and empathy when appropriate, and we attribute blame if we feel the individual's intent deserves it. This is obvious in how parents interact with their children as well as in how our justice system functions: attempted murder carries a lesser sentence than murder, but it still carries a sentence. And yet it turns out that our ability to make these sorts of judgments depends on a very specific brain region, without which our judgments suddenly become more utilitarian in nature, weighing only the outcome of the action committed.

So much of the way we interact with each other depends on our ability to see other people as conscious and intelligent agents: people with beliefs and desires, fears and hopes. It's so integral to our nature that we set up our society with this idea as a presupposition. Any parent will tell you that this understanding is not present in young children and is something that develops over time. Psychology has been able to tell us the age at which this change occurs (around four years old), and now neuroscience can tell us which brain region is responsible. But only a more thorough philosophical exploration can help us think about how this affects the way we as humans interact with each other. Next time we wrap up the entire series with a quick recap of the ideas we've explored.

  1. Science and Morality
  2. Moral Intuitions vs. Moral Standards
  3. Philosophical Hypotheticals
  4. Emotion and Rationality
  5. Theory of Mind and Moral Judgments
  6. Morality Wrap-up