March 12, 2020
How Do Your Emotions Affect Your Moral Compass?
Would you sacrifice one person to save five? What if you had to cause harm with your own hands? Your answer may depend on the emotions you’re feeling
By Jade Wu, Savvy Psychologist
We’d like to believe that we make moral judgments based on rational thought, but the truth is that our moral thinking cannot escape our emotions. Let’s take a look at how anxiety, empathy, anger, and disgust shape our moral thinking, and how we can harness these emotions to make better moral decisions.
The infamous Trolley Problem
Imagine this scenario: As you’re walking by a train station, you notice construction workers on the tracks. There’s a fork in the track, so a train could go either left or right. On the left track, there’s only one person working. On the right track, there are five people working. They all have noise-canceling headphones on and don’t seem to be aware of what’s going on around them.
Suddenly, you see an out-of-control train car coming down the tracks—it must have gotten loose from a train! The fork is set toward the right, so the out-of-control car is headed straight for the five workers, certain to kill them all. There’s no way to stop the train car. The only thing you can do is pull a switch to redirect the car toward the left track, which would kill the one worker there.
Do you pull the switch?
It’s a tough one, isn’t it? On the one hand, it seems like a no-brainer that killing one person to save five is better than killing five to save one. On the other hand, redirecting the track would require you to purposely cause someone’s death rather than letting the accident take its course. Most of us feel at least a little squeamish about the kill-one-to-save-five choice, but most of us, when pressed, agree with it. At least when asked about it hypothetically, that is.
This is the classic moral dilemma called the Trolley Problem. Many philosophers and psychologists have used it to study and ponder the way we think about morality. One big question they’ve asked is: “Do people make these decisions based on rational thinking, or are they influenced by other factors?”
Well, consider this twist on the Trolley Problem for your answer: What if there is no switch to redirect the train car, but there is a large stranger walking by whom you could push onto the track? This person would be killed, but their body would stop the train from killing the five construction workers. Would you push the stranger?
Here, the math is the same—sacrificing one to save five. But I bet you had a different gut reaction. If so, then it shows that something else is helping you make this decision. What is that something else?
It turns out that emotions play a big role in the way we judge morality and make moral decisions. What did you feel when considering the Trolley Problem and the Stranger variation of it? Fear? Empathy? Disgust? Let’s take a look at the science behind how these emotions—and your relationship to them—affect your moral compass.
Do Emotions and Morality Mix?
A philosopher explains how feelings influence right and wrong.
Daily life is peppered with moral decisions. Some are so automatic that they fail to register—like holding the door for a mother struggling with a stroller, or resisting a passing urge to elbow the guy who cut you in line at Starbucks. Others chafe a little more, like deciding whether or not to give money to a figure rattling a cup of coins on a darkening evening commute. A desire to help, a fear of danger, and a cost-benefit analysis of the contents of my wallet: these gut reactions and reasoned arguments all swirl beneath conscious awareness.
While society urges people towards morally commendable choices with laws and police, and religious traditions stipulate good and bad through divine commands, scriptures, and sermons, the final say lies within each of our heads. Rational thinking, of course, plays a role in how we make moral decisions. But our moral compasses are also powerfully influenced by the fleeting forces of disgust, fondness, or fear.
Should subjective feelings matter when deciding right and wrong? Philosophers have debated this question for thousands of years. Some say absolutely: Emotions, like our love for our friends and family, are a crucial part of what gives life meaning, and ought to play a guiding role in morality. Some say absolutely not: Cold, impartial, rational thinking is the only proper way to make a decision. Emotion versus reason—it’s one of the oldest and most epic standoffs we know.
Could using modern scientific tools to separate the soup of moral decision-making—peeking into the brain to see how emotion and reason really operate—shed light on these philosophical questions? The field of moral cognition, an interdisciplinary effort between researchers in social and cognitive psychology, behavioral economics, and neuroscience, has tried to do just that. Since the early 2000s, moral psychologists have been using experimental designs to assess people’s behavior and performance on certain tasks, along with fMRI scans to glimpse the brain’s hidden activity, to illuminate the structure of moral thinking.
One pioneer in this field, the philosopher and Harvard University psychology professor Joshua Greene, combined an iconic and thorny ethical thought experiment—the “trolley problem,” in which you must decide whether or not you’d flip a switch, or push a man off a footbridge, to cause one person to die instead of five—with brain imaging back in 2001. Those experiments, and subsequent ones, have helped to demystify the role that intuition plays in how we make ethical tradeoffs—and ultimately showed that moral decisions are subject to the same biases as any other type of decision.
I spoke with Greene about how moral-cognition research illuminates the role of emotion in morality—scientifically, but perhaps also philosophically. Below is a lightly edited and condensed transcript of our conversation.
Lauren Cassani Davis: Your research has revealed that people’s intuitions about right and wrong often influence their decisions in ways that seem irrational. If we know they have the potential to lead us astray, are our moral intuitions still useful?
Joshua Greene: Oh, absolutely. Our emotions, our gut reactions, evolved biologically, culturally, and through our own personal experiences because they have served us well in the past—at least, according to certain criteria, which we may or may not endorse. The idea is not that they’re all bad, but rather that they’re not necessarily up to the task of helping us work through modern moral problems, the kinds of problems that people disagree about, arising from cultural differences and new opportunities or problems created by technology, and so on.
Davis: You describe moral decision-making as a process that combines two types of thinking: “manual” thinking that is slow, consciously controlled, and rule-based, and “automatic” mental processes that are fast, emotional, and effortless. How widespread is this “dual-process” theory of the human mind?
Greene: I haven’t taken a poll but it’s certainly—not just for morality but for decision-making in general—very hard to find a paper that doesn’t support, criticize, or otherwise engage with the dual-process perspective. Thanks primarily to Daniel Kahneman [the author of Thinking, Fast and Slow] and Amos Tversky, and everything that follows them, it’s the dominant perspective in judgment and decision making. But it does have its critics. There are some people, coming from neuroscience especially, who think that it’s oversimplified. They are starting with the brain and are very much aware of its complexity, aware that these processes are dynamic and interacting, aware that there aren’t just two circuits there, and as a result they say that the dual-process framework is wrong. But to me, it’s just different levels of description, different levels of specificity. I haven’t encountered any evidence that has caused me to rethink the basic idea that automatic and controlled processing make distinct contributions to judgment and decision making.
Davis: These neural mechanisms you describe are involved in making any kind of decision, right? The brain weighs an emotional response against a more calculated cost-benefit analysis whether you’re deciding to push a guy off a bridge to save people from a runaway train, or trying not to impulse-buy a pair of shoes.
Greene: Right, it’s not specific to morality at all.
Davis: Does this have implications for whether we should think of morality as special or unique?
Greene: Oh, absolutely. I think that’s the clearest lesson of the last 10 to 15 years exploring morality from a neuroscientific perspective: There is, as far as we can tell, no distinctive moral faculty. Instead what we see are different parts of the brain doing all the same kinds of things that they do in other contexts. There’s no special moral circuitry, or moral part of the brain, or distinctive type of moral thinking. What makes moral thinking moral thinking is the function that it plays in society, not the mechanical processes that are taking place in the brain when people are doing it. I, among others, think that function is cooperation, allowing otherwise selfish individuals to reap the benefits of living and working together.
Davis: The idea that morality has no special place in the brain seems counterintuitive, especially when you think about the sacredness surrounding morality in religious contexts, and its association with the divine. Have you ever had pushback—people saying, this general-purpose mechanical explanation doesn’t feel right?
Greene: Yes, people often assume that morality has to be a special thing in the brain. And early on, there was—and to some extent there still is—a lot of research that compares thinking about a moral thing to thinking about a similar non-moral thing, and the researchers say, aha, here are the neural correlates of morality. But in retrospect it seems clear that when you compare a moral question to a non-moral question, if you see any differences there, it’s not because moral things engage a distinctive kind of cognition; instead, it’s something more basic about the content of what is being considered.
Davis: Professional ethicists often argue about whether we are more morally responsible for the harm caused by something we actively did than for something we passively let happen—like in the medical setting, where doctors are legally allowed to let someone die but not to actively end the life of a terminally ill patient, even if that’s their wish. You’ve argued that this “action-omission distinction” may draw a lot of its force from incidental features of our mental machinery. Have ideas like this trickled into the real world?
Greene: People have been making similar points for some time. Peter Singer, for example, says that we should be focused more on outcomes and less on what he views as incidental features of the action itself. He’s argued for a focus on quality of life over sanctity of life. Implicit in the sanctity-of-life idea is that it’s OK to allow someone to die, but it’s not OK to actively take someone’s life, even if it’s what they want, even if they have no quality of life. So certainly, the idea of being less mystical about these things and thinking more pragmatically about consequences, and letting people choose their own way—that, I think, has had a very big influence on bioethics. And I think I’m lending some additional support to those ideas.
Davis: Philosophers have long prided themselves on using reason—often worshipped as a glorious, infallible thing—not emotion, to solve moral problems. But at one point in your book, Moral Tribes, you effectively debunk the work of one of the most iconic proponents of reason, Immanuel Kant. You say that many of Kant’s arguments are just esoteric rationalizations of the emotions and intuitions he inherited from his culture. You’ve said that his most famous arguments are not fundamentally different from his other lesser-known arguments, whose conclusions we rarely take seriously today—like his argument that masturbation is morally wrong because it involves “using oneself as a means.” How have people reacted to that interpretation?
Greene: As you might guess, there are philosophers who really don’t like it. I like to think that I’ve changed some people’s minds. What seems to happen more often is that people who are just starting out and confronting this whole debate and set of ideas for the first time, but who don’t already have a stake in one side or the other and who understand the science, read that and say, oh, right, that makes sense.
Davis: How can we know when we’re engaged in genuine moral reasoning and not mere rationalization of our emotions?
Greene: I think one way to tell is, do you find yourself taking seriously conclusions that on a gut level you don’t like? Are you putting up any kind of fight with your gut reactions? I think that’s the clearest indication that you are actually thinking it through as opposed to just justifying your gut reactions.
Davis: In the context of everything you’ve studied, from philosophy to psychology, what do you think wisdom means?
Greene: I would say that a wise person is someone who can operate his or her own mind in the same way that a skilled photographer can operate a camera. You need not only to be good with the automatic settings and with the manual mode, but also to have a good sense of when to use one and when to use the other. And which automatic settings to rely on, specifically, in which kinds of circumstances.
Over the course of your life you build up intuitions about how to act, but then circumstances may change, and what worked at one point no longer works at another. And so you can build up these higher-order intuitions about when to let go and try something new. There really is no perfect algorithm, but I would say that a wise mind is one that has the right levels of rigidity and flexibility at multiple levels of abstraction.
Davis: What do you think about the potential for specific introspective techniques—I’m thinking about meditation or mindfulness techniques from the Buddhist tradition—to act as a means of improving our own moral self-awareness?
Greene: That’s an interesting connection—you’re exploring your own mental machinery in meditation. You’re learning to handle your own mind in the same way that an experienced photographer learns to handle her camera. And so you’re building these higher-order skills, where you’re not only thinking, but you’re thinking about how to think, and monitoring your own lower-level thinking from a higher level—you have this integrated hierarchical thinking.
And from what I hear from the people who study it, certain kinds of meditation really do encourage compassion and willingness to help others. It sounds very plausible to me. Tania Singer, for example, has been doing some work on this recently that has been interesting and very compelling. This isn’t something I can speak on as an expert, but based on what I’ve heard from scientists I respect, it sounds plausible to me that meditation of the right kind can change you in a way that most people would consider a moral improvement.