Since antiquity, philosophers and political theorists have been obsessed with the idea of morality. Their treatises explore, among other ideas, general conceptions of morality and the process by which individuals adopt a sense of 'right' and 'wrong.'
But while some insist on abstract frameworks, such as the rational approach espoused by many social scientists today, as tools to describe human behavior, their hypotheses often fail to pan out empirically. In other words, some individuals' actions do not seem to adhere to any concise framework.
Intrigued by seeming inconsistencies in individuals' moral judgment, philosophy graduate student Joshua Greene and psychology professor Jonathan Cohen collaborated on a recent paper investigating the brain's role in the decision-making process.
With the help of members of the psychology department — Professor John Darley, research scientist Leigh Nystrom and former research assistant Brian Sommerville, who is now at Columbia Medical School — Cohen and Greene used imaging machines to investigate how the brain responds to different moral dilemmas.
In the study — which has received much attention in the psychology world — subjects were asked to respond to sample situations. As they responded, Cohen and Greene examined the subjects' brain activity.
The situations presented often involved extreme moral dilemmas. For example, one asked the following questions:
There is a runaway trolley heading for five people. All will die unless you flick a switch that will cause the trolley to veer onto a spur, killing one person. Should you flick the switch? Or, in another version of the situation, you are standing next to a stranger on a footbridge above the tracks. The only way to save the five people is to push the stranger off. He will fall to a certain death, but his heavy body will block the trolley, saving five lives. Should you push him?
In the study, many people responded that, in the first case, it would be okay to kill one person to save the other five. In the second, however, they said pushing the stranger would not be appropriate.
According to Cohen and Greene, the first case is classified as impersonal-moral. The second falls into another category called personal-moral because it involves real personal contact — an action of singling out a particular individual. A third class of situation considered, the non-moral, does not require moral decisions to be made.
The study's hypothesis was that there are two aspects to the moral reasoning process: the dry, analytical side and the side influenced by emotion.
"People may not be aware if they used emotions or not," Cohen said. He noted the benefits of being able to actually measure scientifically the involvement of areas of the brain responsible for emotions.

When activity increases in a given part of the brain, blood flow to that region also increases. Imaging machines that record blood flow therefore allow researchers to determine the level of activity in various parts of the brain. The study used this principle to find which areas of the brain were involved as two groups of nine subjects read 60 moral dilemmas.
The study produced some telling results. Cohen and Greene found that both the impersonal-moral and non-moral problems activated the parts of the brain responsible for reasoning and analytical thought: the prefrontal cortex and the posterior parietal cortex.
In contrast, the personal-moral situations registered responses from areas of the brain that handle emotions, particularly in social situations. Scientists currently know less about this part of the brain than they do of the rational side, although previous studies have observed and mapped its activity.
These findings raise some intriguing questions, Greene said. For example, are answers to moral dilemmas hard-wired into the brain? What role do cultural forces and education play in moral problem solving?
"You can't take an individual and tease apart which influences are cultural and which aren't," Greene explained.
Greene noted that responses did not vary much from person to person. "Overall they were more similar than different," he said.
Future studies may test for cultural factors by investigating whether responses are uniform across a given culture or differ from person to person. If responses do differ among individuals, that would raise the possibility that people make decisions based on emotion rather than on a rational framework.
The other factor Greene and Cohen considered was the amount of time people took to make their decisions. Did subjects who paused before answering use their emotions more than those who quickly replied? Was an emotional response considered and overridden?
Another avenue for further research may be to examine whether people with brain damage make moral decisions differently, Greene noted.
Understanding the basis of how people make judgments has useful applications in moral education. Greene and Cohen's study may answer many questions. For example, in teaching morals to children, should we appeal to emotions or to reason?
It may also put us in a better position to diagnose and treat psychopathic individuals. However, Cohen emphasized, "None of this has anything to do with what's right or wrong." The study is concerned solely with the inner workings of the brain. Whether what the brain concludes coincides with the "moral" answer to the problem is "up to philosophers and ethicists," Cohen said.
Both Cohen and Greene explained the positive role Princeton played in the study.
Whereas most functional imaging laboratories are located in medical centers, Princeton's is housed in the psychology department. This arrangement offers a valuable opportunity to bring neuroscientists into contact with specialists in the humanities and other fields.
Cohen declared, "We now have the tools available to study the neural underpinnings of the processes that characterize us as humans."