Moral Intuitions vs. Moral Standards

This is Part 2 of 6 of my series on science and morality.

In my first post on morality I discussed some general ways of analyzing our moral standards from a philosophical and scientific perspective. An important distinction was implicit in that discussion, and I’d like to focus on it today. When we talk about morality, we too often neglect the relationship between our moral decisions and our innate moral intuitions. While moral standards can be described as prescriptive statements about human behavior (how should I act?), moral intuitions are better described as descriptive statements about the causes of that behavior (why did I act this way?).

Classical models of moral reasoning describe it as a rational process of weighing pros and cons, judging evidence, and coming to a conclusion, a moral judgment, based on explicit moral values we are conscious of. Psychologist Jonathan Haidt argues that this isn’t the case at all, but rather, that moral intuitions are “automatic evaluative feelings, lacking in a conscious process of decision making.” Haidt believes that many of our current moral standards originate as innate moral intuitions, which are then expressed by particular behaviors, and which go on, through societal pruning, to become moral standards. He further argues that these moral intuitions themselves account for our moral judgments, and that the process of moral reasoning is simply a post hoc (after the fact) rationalization of this preconscious intuition.

That last point is certainly an interesting claim. Haidt is, in a sense, arguing that the process that leads to a moral decision is akin to the process involved in deciding whether vanilla or chocolate ice cream tastes better. That process wasn’t logical; one just DOES taste better, and your choice of ice cream reflects that. This is an extreme position in the literature, and one I won’t defend here today. What I do want to argue is something less controversial: that our moral intuitions, in some quantifiable way, affect our moral standards. This is obvious when considering standards like the incest taboo, which has strong evolutionary roots but much weaker logical ones (and I bet everyone reading this just had a moment of disgust).

Illogical intuitions play massive roles in our lives. People turn a thermostat higher than the temperature they actually want so the room will heat up quicker (this doesn’t work). A majority of people think it’s warmer in the summer because the earth is closer to the sun (it isn’t). People view the coin-flipping result HHHTTT as less likely than HTTHTH, even though the two sequences are exactly equally likely. The field of behavioral economics is overflowing with examples of intuitive decisions that are not only completely illogical, but often not in our best interests at all. The important point here is that we often think that when we make a moral decision, we come to a judgment based on the facts of a situation and how those facts relate to certain values we hold, values we imagine we attained through experience, culture, thought, and development. This is, to a large degree, a convenient fiction.
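The coin-flip equivalence is easy to verify: fair flips are independent, so any one specific sequence of six flips has probability (1/2)^6 = 1/64, regardless of how "random" it looks. A minimal sketch:

```python
from fractions import Fraction

def sequence_probability(seq):
    """Probability of observing one exact sequence of fair coin flips.

    Each flip is independent with probability 1/2 per outcome, so any
    specific sequence of n flips has probability (1/2)**n.
    """
    return Fraction(1, 2) ** len(seq)

print(sequence_probability("HHHTTT"))  # 1/64
print(sequence_probability("HTTHTH"))  # 1/64
print(sequence_probability("HHHTTT") == sequence_probability("HTTHTH"))  # True
```

The intuition that HTTHTH is "more likely" confuses the probability of one exact sequence with the probability of getting some mixed-looking sequence, which is indeed higher because there are more of them.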

In my previous post I made the argument for considering our evolutionary development when evaluating moral standards. The same mode of thinking is useful, even more so, when considering moral intuitions. Though evolution is concerned only with genetic self-interest, the process of evolution is not capable of creating perfectly selfish brains (since evolution works at the genetic level, not at the level of the organism, a point often misunderstood about the nature of evolution). The best our selfish genes can do is build brains with particular tendencies that benefit the survival of the organism, and thus make it more likely to pass on its genes (when large things with teeth and claws come toward you, run away). The important point here is that evolved tendencies that benefit survival don’t always translate into specific actions that serve the genetic self-interest of the organism the genes reside in. This is particularly salient given the social nature of much of our recent evolutionary history, and the tendencies that would have been rewarded during most of our primate evolution (it’s in your best interest to protect others in your tribe, since they will also protect you; though this strategy may occasionally lead to your harm, statistically you’re better off). Considerations of this sort account not only for altruism toward family and friends, but also for many of our innate senses of fairness and justice, punishment and reward. They explain why participants in economic games will often forgo their own reward to ensure that a cheater is punished.
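The "statistically you're better off" claim is an expected-value argument, and a toy calculation makes it concrete. The numbers below are entirely hypothetical, chosen only to illustrate the shape of the tradeoff, not taken from the post or any study:

```python
# Toy expected-value sketch of the reciprocity argument. All parameter
# values are hypothetical illustrations: helping tribe-mates carries an
# expected cost, but being helped in return can outweigh it on average.

def expected_payoff(help_others,
                    risk_of_harm=0.1,          # chance you're hurt while defending others
                    cost_of_harm=50,           # how bad that harm is
                    chance_you_need_help=0.3,  # chance you'll need defending yourself
                    value_of_being_helped=40): # benefit of being defended in return
    """Expected payoff for an individual who does or doesn't reciprocate."""
    payoff = 0.0
    if help_others:
        payoff -= risk_of_harm * cost_of_harm                   # expected cost of helping
        payoff += chance_you_need_help * value_of_being_helped  # reciprocated protection
    return payoff

print(expected_payoff(True))   # 7.0: better off helping, on average
print(expected_payoff(False))  # 0.0: no risk, but no protection either
```

With these illustrative numbers, reciprocating wins whenever the expected benefit of being protected exceeds the expected cost of protecting others, even though any single act of helping might end badly.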

In this light, moral systems can be seen to have evolved over time to solve particular problems in our complex social environments, and organisms with built-in tendencies (moral intuitions) that coincided with these moral systems would have had a huge survival advantage. Understanding this, and using the tools that neuroimaging has made available, we have been able to explore much of the neuronal and biological basis of morality. Some of the specific breakthroughs will be discussed in the coming days. My purpose here was to introduce the idea that to have any substantive conversation about prescriptive moral standards, we first need to come to an understanding of where our moral intuitions come from, and of the fact that these intuitions are natural phenomena. As I mentioned in the first post, that which is natural is not necessarily good. Our tendency toward violence and anger, and the proliferation of rape and murder throughout history, may be natural phenomena, ingrained in us for evolutionary reasons, but that doesn’t change the fact that these behaviors are despicable.

  1. Science and Morality
  2. Moral Intuitions vs. Moral Standards
  3. Philosophical Hypotheticals
  4. Emotion and Rationality
  5. Theory of Mind and Moral Judgments
  6. Morality Wrap-up