
Wednesday, November 24, 2010

Irony Alert: Marc Hauser on moral judgments

So, PNAS has just published a brief exchange on the nature of moral judgments, including a letter where one of the coauthors is the man who put the a** in a**ertainment bias.

Marc Hauser is a Professor in the Psychology Department at Harvard. He made a name for himself publishing a variety of behavioral and cognitive studies on both humans and non-human primates, with the goal of understanding the evolutionary origins of human cognition, including complex traits such as language, economic decision-making, and moral judgments. More recently, he has made a name for himself by allegedly falsifying data and allegedly bullying the people in his lab who naively thought that the data published by the group should be . . . I don't know . . . NOT falsified. I won't repeat what this more recent name that he's made for himself is, as it would violate the norms of internet civility. Over his career, Hauser has published something like 200 articles and 6 books, many of which probably contain certain things that are not entirely false. At the moment, he is on leave from Harvard, after an investigation found him solely responsible for 8 counts of scientific misconduct. Presumably, he is working on his next book, allegedly titled Evilicious: Explaining Our Evolved Taste for Being Bad.

Snarking aside, the two letters that were just published follow from an interesting article published in PNAS earlier this year, where Hauser is the third of four co-authors. For those not familiar with authorship conventions in biology and related fields, here is what is typically implied by the order of authorship on a four-author paper. The first author probably did all of the experiments. The second author helped with some of the experiments, and/or some of the data analysis. The third author probably didn't directly participate, but contributed ideas and/or reagents and/or equipment. The last author probably runs the lab where the experiments were done. In fact, the other three authors are all at the other Cambridge, in England, where the experiments were actually done. I point all this out just because I don't want to leave the impression that we should be suspicious of the results in the paper just because Hauser's name is on it.

The original paper, which can be found and freely downloaded here, tests the effect of enhancing serotonin activity on a variety of tasks or decisions, some of which had a moral flavor. Serotonin enhancement was achieved by giving some of the subjects the drug citalopram, which is a selective serotonin reuptake inhibitor (SSRI), like Prozac or Zoloft. The finding was that enhancing serotonin made subjects less willing to take an action that required them to inflict harm on another individual in an emotionally salient context.

This work fits in with a substantial literature on moral dilemmas. I'll just briefly outline the gist of that literature here in the context of one particular dilemma that often makes an appearance in these studies. The scenario is this: there are five people tied to a train track, and there is a train rushing towards them. You have the opportunity to save them, by stopping the train or switching it to a different track, but the only way to do it involves killing one person. What do you do?

Most people find that they have two conflicting impulses. On the one hand, killing one person to save five makes sense from a utilitarian perspective. That's four fewer dead people. On the other hand, you are the one who has to kill the one person, and most people feel a moral repulsion to killing someone, even if it is for the greater good.

In these studies, which of the two impulses seems to win depends on how personal the killing is. If all you have to do is pull a switch, and the train will go on another track, which, for unknown reasons, has one person tied to it, the killing is fairly impersonal, and many people will choose this utilitarian, four-fewer-dead-people option. On the other hand, if the only way to stop the train is to chop off someone's head and throw it through a magical basketball net woven of human entrails (I'm making this up), many people will find this too emotionally and morally problematic, and will let the train go on its merry five-corpse-making way. Researchers have mapped out a whole continuum between these two extremes: pushing someone off a bridge with your hands is more emotionally salient (and therefore less morally acceptable) than pushing someone off a bridge with a stick, and so forth.

What the original paper finds is that giving someone an SSRI does not have much effect on decisions that are morally neutral, or where the harm that must be inflicted is impersonal (like throwing a switch to divert the train). However, in cases where one decision would require the subject to harm someone in a personal and emotionally salient way (like pushing them off the bridge with their bare hands), the SSRI seems to enhance the emotional/moral aversion to taking that action.

So, in addition to nausea, insomnia, and diarrhea, add to the list of possible side effects of antidepressants: "may reduce willingness to harm others in emotionally charged situations." Maybe Charlie Sheen should be on one of these.

The letters commenting on the original paper can be found here and here, but require a subscription to PNAS to access. I wouldn't go to great lengths to get them, however. There is some quibbling about terminology – driven more by a commentary on the original article than by the article itself – and some tiresome academic "Get off my lawn!" moments, but probably nothing of interest to most of the reader(s) of this blog.

Crockett MJ, Clark L, Hauser MD, & Robbins TW (2010). Serotonin selectively influences moral judgment and behavior through effects on harm aversion. Proceedings of the National Academy of Sciences of the United States of America, 107(40), 17433-17438. PMID: 20876101

2 comments:

  1. He couldn't find his "asc" with both hands?

    BTW: I think Hauser was not so much unethical as, well, stuck in confirmation bias. Unless you know something about the man the rest of us don't, his commenting upon moral evolution is hardly ironic. In fact, his views may still be right, for all his lack of objectivity, although I do not think they are.

  2. The stories that appeared in the Boston Globe seemed to me to cross the line from confirmation bias into actual falsification. Those stories also portrayed him as getting angry with people in the lab whenever they questioned results. Given the power structures in an academic lab, and the suggestions that this is a long-term pattern of behavior, I don't think that "unethical" is completely off base.

    But, in general, I completely agree with you that an extremely elaborated confirmation bias is exactly what we are looking at in the Hauser case. I think it is a sort of perfect storm of extremes: he is extremely smart, extremely arrogant, extremely charismatic, and working on an extremely hard problem. I suspect that he is so deeply convinced of the correctness of his own ideas – and has received enough positive reinforcement – that I almost picture him discounting counterevidence as the subjects being wrong, rather than the theory.

    As Norma Desmond would say, "It's the monkeys that got small."
