
Gut Check – How Do You Measure Morality?

Can moral sentiments be measured?
James Willamor, CC BY-SA

C. Daryl Cameron, Pennsylvania State University

Imagine picking up the morning newspaper and feeling moral outrage at the latest action taken by the opposing political party. Or turning the page and seeing people around the world suffering famine and heartbreak, and flinching with empathy at their pain.

One of the most fundamental tasks we have as social creatures is to figure out whom we can trust, whom we should help and who means us harm. These are questions that are central to morality in everyday life.

In our work, we use tools from psychology to better understand these gut-level moral reactions that matter for everyday life. My research focuses on two facets of morality: moral judgments and empathy for the pain of others. Below, I discuss two new behavioral measures I have developed with my colleagues to capture these moral sentiments.

Why not just ask people?

One way to get a sense for people’s moral beliefs is to simply ask them. A researcher could ask you to rate, on a one-to-five scale, how morally wrong a particular action is, such as assaulting someone. Or to report how frequently you tend to feel empathy for other people in everyday life.

Relying on self-reports for questions on morality may not be enough.
Dietmut Teijgeman-Hansen, CC BY-NC-ND

One potential problem with asking people to self-report their reactions is that these reports can be influenced by a lot of factors, especially when the topics are sensitive, such as morality and empathy. If people think their reputation is at stake, they may be very good at reporting what they think others want to hear.

So, sometimes self-reports will be useful, but sometimes people edit these reports to give a good impression to others. If you want to know who is likely to feel your pain, rather than cause it, then relying on self-report, although a good start, may not always be enough.

A new measure of moral judgment

Rather than asking people what they think is moral, or how much empathy they feel, our work attempts to assess people’s immediate, spontaneous reactions before they have had much time to think at all. In other words, we examine how people behave to get a sense for their moral reactions.

For example, consider the new task that my collaborators and I developed to measure people’s gut reactions that certain actions are morally wrong. Gut reactions have been thought by many psychologists to play a powerful role in moral decision-making and behavior.

In this task, people go through a series of trials. In each trial, they see two words flash, one after the other. These words name actions typically thought to be either morally wrong or morally neutral. People are asked to judge whether the second word describes a morally wrong action, while avoiding being influenced by the first word. So, for example, in a particular trial, people might see “murder” immediately followed by “baking.” Their task is to judge whether “baking” is wrong while ignoring any influence of “murder.”

Student actors in an Intro to Philosophy project.
Rafael Castillo, CC BY

People are also not given much time to respond. If they take longer than half a second to respond, they get an annoying warning to “Please respond faster.” This is meant to make sure people respond without thinking too much.
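
To make the procedure concrete, here is a minimal sketch of a single trial written in Python with the PsychoPy library. Only the half-second deadline and the “Please respond faster” warning come from the description above; the 100-millisecond prime duration, the key mapping and the on-screen text are illustrative assumptions, not the published task parameters.

```python
# Minimal sketch of one priming trial (assumed timings and keys, not the published task).
from psychopy import visual, core, event

win = visual.Window(size=(800, 600), color="grey", units="pix")
clock = core.Clock()

def run_trial(prime_word, target_word, deadline=0.5):
    """Flash a prime word, then a target word; collect a wrong/not-wrong
    judgment of the target within the response deadline (in seconds)."""
    # Flash the prime briefly (100 ms is an assumed duration).
    visual.TextStim(win, text=prime_word, height=40).draw()
    win.flip()
    core.wait(0.1)

    # Show the target and start timing the response.
    visual.TextStim(win, text=target_word, height=40).draw()
    win.flip()
    clock.reset()
    event.clearEvents()

    # 'f' = not morally wrong, 'j' = morally wrong (assumed key mapping).
    keys = event.waitKeys(maxWait=deadline, keyList=["f", "j"], timeStamped=clock)

    if keys is None:
        # Too slow: show the speed warning described in the article.
        visual.TextStim(win, text="Please respond faster", height=40).draw()
        win.flip()
        core.wait(1.0)
        return {"prime": prime_word, "target": target_word, "response": None, "rt": None}

    key, rt = keys[0]
    return {"prime": prime_word, "target": target_word,
            "response": "wrong" if key == "j" else "not wrong", "rt": rt}

# Example trial from the article: "murder" immediately followed by "baking".
result = run_trial("murder", "baking")
win.close()
core.quit()
```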

My collaborators and I find that people make a systematic pattern of mistakes. When they see morally wrong actions such as “murder” come first, they make mistaken moral judgments about the actions that come second: They are more likely to mistakenly judge neutral actions such as “baking” as morally wrong. The idea here is that people are having a gut moral reaction to the words that come first, which is shaping how they make moral judgments about the words that come second.
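
That systematic pattern can be boiled down to a single bias score per person: the error rate on neutral targets that followed morally wrong primes, minus the error rate on neutral targets that followed neutral primes. Here is a small sketch of that scoring step, using made-up trial records rather than real data; the field names and example pairings are assumptions.

```python
# Score the priming effect from a list of trial records (illustrative data only).
# Each record notes the prime type, the target type and the participant's judgment.
trials = [
    {"prime": "wrong",   "target": "neutral", "judged_wrong": True},   # e.g. murder -> baking
    {"prime": "wrong",   "target": "neutral", "judged_wrong": False},
    {"prime": "neutral", "target": "neutral", "judged_wrong": False},
    {"prime": "neutral", "target": "neutral", "judged_wrong": True},
    # ... many more trials per participant in the real task
]

def error_rate(trials, prime_type):
    """Proportion of neutral targets mistakenly judged 'morally wrong'
    after primes of the given type."""
    relevant = [t for t in trials
                if t["prime"] == prime_type and t["target"] == "neutral"]
    if not relevant:
        return 0.0
    return sum(t["judged_wrong"] for t in relevant) / len(relevant)

# A positive bias score means wrongdoing primes pushed neutral actions
# toward being judged as morally wrong.
bias = error_rate(trials, "wrong") - error_rate(trials, "neutral")
print(f"Priming bias: {bias:+.2f}")
```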

This effect happens even when people intend for it not to. So even if you are trying to stop that first word from influencing you, it still does.

You might wonder: Does this connect to real-world morality? After all, responding quickly to words on a screen may not track the moral values we care about.

We find that people who show a stronger response on our task have features of a “moral personality.” We correlated the effect on our morality task with people’s self-reported measures of morally relevant traits.

People who show a stronger response on our task are more likely to feel guilt when considering doing unethical actions. They are more likely to indicate caring about being a moral person. And they report fewer psychopathic tendencies such as callousness. These associations are modest, but suggest that we’re capturing something relevant to morality.
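
As a rough illustration of that analysis step, a person’s bias score from the task can be correlated with a self-report questionnaire score using an ordinary Pearson correlation. The sketch below uses SciPy and invented numbers; the published analyses are more involved.

```python
# Correlate the task-based bias score with a self-reported trait (invented numbers).
from scipy.stats import pearsonr

# One bias score and one guilt-proneness questionnaire score per participant.
bias_scores = [0.12, 0.05, 0.20, -0.02, 0.15, 0.08, 0.18, 0.01]
guilt_scores = [3.8, 3.1, 4.4, 2.6, 4.0, 3.3, 4.1, 2.9]

r, p = pearsonr(bias_scores, guilt_scores)
print(f"r = {r:.2f}, p = {p:.3f}")  # a modest positive association, as described above
```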

A new measure of empathy

My collaborators and I have taken a similar approach to understanding empathy, or the tendency to vicariously feel the pain of others. Empathy research has often gone beyond self-report to use brain imaging or physiology as measures. But these are often quite costly to implement and may not always provide a clear lens on social emotions.

Could seeing images of pain help measure empathy?
Army Medicine, CC BY

We created a new empathy task that’s very similar to the morality task, except that this time people see two images rather than two words. The images depict hands being pierced with needles or brushed with Q-tips, implements that most people consider painful and nonpainful, respectively.

People are asked to judge whether the experiences of the second images are painful or not, while avoiding being influenced by the first images.

As with the morality task, people show a systematic and robust pattern of mistakes; when they see painful experiences (i.e., needles) come first, they are more likely to mistakenly judge nonpainful experiences (i.e., Q-tips) as painful.

Importantly, we found that the empathy measured in our behavioral task connected to costly prosocial behavior: In one of our experiments, people who showed stronger empathetic reactions donated more of their own money to cancer charities when given the opportunity to do so.

Where do we go from here?

So, how can researchers use these tasks, and what can they imply for everyday moral interactions?

The tasks could help suggest who lacks the moral sentiments that support moral behavior. For example, criminal psychopaths can self-report normal feelings of empathy and morality, yet their behavior speaks otherwise. By assessing their gut-level behavioral responses, researchers may be better able to detect whether such offenders differ in morality and empathy.

In terms of everyday interactions, it might be good to understand people’s gut-level moral reactions: This may provide some indication of who shares your values and moral beliefs.

More research is needed to understand the nature of the moral sentiments captured by our tasks. These sentiments could change over time, and it is important to know whether they predict a broader range of behaviors relevant to ethics and morality.

In sum, if we want to know who shares our moral sentiments, maybe just asking others isn’t quite enough. Self-reports are useful, but may not provide a complete picture of human morality. By looking at how people behave when they don’t have much time to think, we can see whether their moral sentiments compel them even when they are intending otherwise.

C. Daryl Cameron, Assistant Professor of Psychology and Research Associate in the Rock Ethics Institute, Pennsylvania State University

This article was originally published on The Conversation. Read the original article.





CONVERSATIONS

  1. Can morality be measured? Yes it can, but we need a baseline to determine that measurement. Once we have that, building a yardstick is easy. To set that baseline we need to define what counts as a good moral decision, a bad moral decision, and an amoral decision. Then we compare the three, looking for the edges where a decision shifts from good, to bad, to amoral. But this is pretty abstract. I’m in an odd mood today, so here’s an “exaggerated” example.

    Though I was born a man, in my mid-twenties I realized that I am a reincarnated dragon, not the kind that eats people, but the kind that is more good spirit than monster. When I share this with other people, I face quite a bit of discrimination. For instance, although I was the best choice for the promotion, I was passed over because I am a dragon. Feeling I have been discriminated against, I go immediately to HR and share my grievances.

    What is the moral thing to do? Should I have been granted that promotion? Should HR get involved in this? Should policies regarding dragons and dragon like issues be reviewed and corporate policies be amended?

    Let’s substitute dragon with any of the following
    – man
    – woman
    – aboriginal
    – gay

    When I’m a dragon, people look at me and say I’m a nut bar. They don’t take me seriously. There is no moral decision to be made. But once I’m not a dragon, that’s when all our prejudices, fears, and anger come out. Though morality is built by the individual and the individual’s history, a person’s morality is reinforced by the group of people they are always interacting with. Morality can be measured, but only at the “group” level.
