When a reward is tempting enough, people will break their own moral codes to gain the desired prize. Afterward, they’ll tell you exactly why they were justified: “It wasn’t as if anyone was harmed,” “I was only borrowing …,” “My boss told me to” or “It’s our customers’ responsibility to read the fine print.”
It’s a rationalizing process called “moral disengagement,” which Darden Professors Sean Martin and Jim Detert have studied. People are self-interested, but we don’t like to face that about ourselves because we also have a strong need to see ourselves as good people, they argue. So we unintentionally, and quite effortlessly, use a series of cognitive maneuvers to justify self-interested choices that don’t align with who we say we want to be or what we want others to think of us.
Moral disengagement can accompany small transgressions — pocketing extra change given in error at a coffee bar (“It was the cashier’s fault”) — and major scandals, such as those that have happened when employees at major U.S., European and Japanese automakers rationalized away behaviors that led to false advertising about fuel efficiency or potential risks to consumer safety.
In a series of experiments, Detert, Martin and their colleagues found that the more tempting the potential personal gain, the more likely people are to break their own internal standards (their conscience) by using morally disengaged thinking to distort the ethical consequences of a behavior.
Martin says companies should be aware of this universal human tendency and learn how to challenge moral disengagement before it leads to bad decisions or creates an unethical culture.
LISTEN FOR CUES
When people are about to do something wrong because they’re morally disengaging, their language often changes. Euphemisms replace plain-spoken language. For example, Wells Fargo employees opening fake bank accounts called it “gaming” rather than “fraud.” People who pirate music or violate software licenses may call it “file sharing” instead of “stealing.” When they’re about to distort accounting or sales numbers, employees may say “Everyone does it” (diffusion of responsibility) or “It’s no big deal” (distortion of consequences). In a letter to employees, United Airlines CEO Oscar Munoz reportedly victim-blamed passenger David Dao, who was dragged out of his paid-for seat on an overbooked United flight, by calling him “disruptive and belligerent.” That’s an example of attribution of blame.
Here are eight common moral disengagement strategies and what they sound like:
- Moral justification (“This is actually the morally right thing to do; we’re actually helping them by doing this.”)
- Euphemistic labeling (“I’m just ‘borrowing’ this.” “It’s ‘collateral damage.’”)
- Advantageous comparison (“Doing A is not as bad as doing B.” “It’s not like I’m doing B.”)
- Displacement of responsibility (“My boss told me to do it.” “I’m just following orders.”)
- Diffusion of responsibility (“Everyone’s doing it.” “It’s a group decision.” “This is just a small part of a bigger system.”)
- Distortion of consequences (“This is a victimless crime.” “No harm done.” “It’s no big deal.”)
- Attribution of blame (“They brought it on themselves.” “Buyer beware.”)
- Dehumanization (“They’re a bunch of dogs.” “They’re like robots.”)
A MORAL GUIDE
Martin has handed out a laminated list of these rationalizations to his leadership students in the past. They refer to it in class, using it to challenge each other and launch critical discussions. Former students have even taken it to work, hanging it on their office walls to remind themselves.
“These phrases are verbal indicators that something is triggering our moral circuitry, even if we’re not aware of it at the time,” Martin says. “When we learn to stop and identify the mental gymnastics, it gives us an opportunity to step back and ask ‘Why do we feel the need to use these phrases — what’s underlying this? Is this statement in line with what we believe to be right? Are these choices aligned with who we say we want to be and the values we claim to hold?’”
Using this guide is “not about determining who is good or bad, or condemning a person who uses this framing,” Martin emphasizes. Everyone falls prey to these rationalizations at one time or another. “It’s just part of being a human being,” Martin notes. But being aware of what they sound like — and when we might be particularly susceptible because we have a lot to gain (or lose) — helps “remove [self-interested] bias from decision-making so that individuals and companies make decisions that are tightly aligned with their values and with public trust.”
Says Detert, “Because it’s very hard to consistently stop ourselves from morally disengaging, people who really want to avoid these cognitive traps create systems that involve others — lists of common rationalizations on office walls, a trusted co-worker who confronts them when they hear a rationalization, a group decision-making process that includes at least one person who has no self-interest in the decision (and is thus less likely to morally disengage about it).”
Additionally, this work highlights other ways companies can reduce self-interest in decision-making. In experiments, Detert, Martin and their colleagues showed that when participants were aware of the people who would be harmed by their choices, they were less likely to make a self-interested choice. For organizations, this means regularly reminding workers of the connections between their actions and the impact on customers’ lives, the communities in which they operate, and the other countries where they source and sell. “Requiring employees to analyze potential harm to stakeholders in any new project, product or decision may keep the ‘do no harm’ standard in the forefront,” and thereby reduce moral disengagement, they wrote in a Journal of Business Ethics article.
They also found that conscientious workers — those who described themselves as always prepared, detail-oriented and diligent — were less likely to be influenced by personal gain and thus less likely to make self-interested choices. Conscientiousness isn’t a flashy trait, they note, but something companies should consider as they hire and promote.
Finally, Martin says companies need to scrutinize the goals they set. When unachievably high goals are combined with tempting rewards or threats (e.g., getting a massive bonus for making sales goals — or losing a job for missing them), it’s far more likely that unethical behavior will occur. For example, Wells Fargo employees were aggressively incentivized to sell eight products to each customer, an unrealistic goal that led to massive cheating.