The ethic of Wall Street is the ethic of celebrity. It is fused into one bizarre, perverted belief system and it has banished the possibility of the country returning to a reality-based world or avoiding internal collapse. A society that cannot distinguish reality from illusion dies.
This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in Section 107 of the US Copyright Law.
In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml
If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.
Cheating is astoundingly common. One survey revealed that about three
fourths of 1,800 students at nine universities admitted to cheating on
tests or assignments.
Fraudulent work squanders resources. It may also harm patients and misdirect scientists or public policy makers.
Researchers are trying to better understand cheating in hopes of
minimizing its impact on society. So far they have discovered that
creativity, fear of loss and observing dishonest behavior can motivate
cheating or make it more likely to happen.
Cyclist Lance Armstrong has apologized for using
performance-enhancing drugs to win seven Tour de France titles. He
attributed his cheating to a determination to “win at all costs.”
Psychologist Marc Hauser of Harvard University, who once wrote an
article entitled “Costs of Deception: Cheaters Are Punished…,” is now
out of a job after the U.S. Office of Research Integrity concluded that
he “fabricated data, manipulated results in multiple experiments, and
described how studies were conducted in factually incorrect ways.”
Sixteen banks have agreed to settlements or are under investigation for
manipulation of the Libor, the benchmark interest rate at which banks
lend to one another, in what has been called the largest financial
scam in the history of markets.
These cases are only part of a seemingly unending stream of cheating
scandals in the news, affecting sports, science, education, finance and
other realms. Although it is comforting to think that most people are
essentially honest, cheating—defined as acting dishonestly to gain an
advantage—is actually astoundingly common. In a 1997 survey, management
professor Donald McCabe of Rutgers University and Linda Klebe Treviño, a
professor of organizational behavior at the Pennsylvania State
University, revealed that about three fourths of 1,800 students at nine
state universities admitted to cheating on tests or written assignments.
In 2005 sociologist Brian Martinson of the HealthPartners Research
Foundation in Bloomington, Minn., and his colleagues reported that one
third of scientists confessed to engaging in questionable research
practices during the previous three years.
This article was originally published with the title "Why We Cheat."
A lack of morality can lead to bad behavior—but can behaving badly make
us lose our morals? Casey Schwartz on how lying, cheating, and stealing
warp our sense of right and wrong.
Before you
hack into your boyfriend’s email account, or sleep with the married guy,
or overstate your billable hours, take note: Telling yourself it is
“just this once” is a less convincing story than ever before.
Or at
least that is the conclusion of intriguing new research that examines
the way our actions influence our beliefs, reversing the traditional
direction of cause and effect. In their study, published in the current
issue of Personality and Social Psychology Bulletin, Lisa Shu and her
colleagues at Harvard University found that behaving badly actually
altered their subjects’ sense of right and wrong.
Humans
are invested in seeing themselves as ethical creatures. We want to
believe in the rightness of our own conduct, to see our lives as a
series of mostly well-intentioned decisions. And it appears that we'll
go to great lengths to feel that way, even if it means warping our own
sense of morality to suit our needs.
The famous psychologist
Albert Bandura coined the term “moral
disengagement” to capture the
process by which people pervert their own sense of right and wrong in
order to give in to a questionable temptation.
Yes I know he’s married, but it’s OK to sleep with him, the logic of moral disengagement goes, because, insert excuse here:
I can’t stand his wife. If not with me, it would be with somebody else. This is his moral dilemma, not mine. The institution of marriage is a meaningless concept.
The options are many.
Moral
disengagement essentially allows people to behave in ways that, at
another moment, in a different mood, that same person would never
consider. For years, research has shown again and again that moral
disengagement influences how people will behave in a given situation.
But now, in a chicken-and-egg twist, Shu and her team have shown that it
works both ways: How people behave influences the moral beliefs they
have about their behavior. Moral disengagement is the result of
unethical behavior, they have now shown, not just the cause.
Shu’s
research is based on a string of four related studies, each using a
different group of undergraduates as subjects. In one, 138 subjects were
asked to read an academic honor code that reinforced in their minds the
idea that cheating is wrong. Then they were given a set of math
problems to solve, and an envelope of cash from which they would be
paid according to how many problems they answered correctly. The
subjects were divided into two conditions: one where it was possible for
them to cheat by misreporting their own scores, and a control condition
where their scores were tallied by a proctor in the room. Perhaps not
surprisingly, some of the subjects in the first group, who were allowed
to report their own scores, inflated those scores in order to get more
cash.
Afterward,
they were given a questionnaire to fill out that they’d also been given
at the beginning of the study, consisting of questions designed to
measure moral disengagement, with a focus on cheating. Shu and her
colleagues developed this measure themselves, and tested its validity in
other circumstances before using it for the current research. The
results? Those subjects who had cheated on the math problems
demonstrated a greater degree of moral disengagement in their responses
the second time they filled out the questionnaire.
What's more,
Shu found that the students who had cheated also had a harder time
remembering the academic honor code that they’d been given to read
before the task, compared to those subjects who hadn’t cheated. Shu
calls this phenomenon “motivated forgetting,” citing it as yet another
strategy we deploy to avoid the disquieting recognition that we’ve done
something wrong.
In fact, what initially led Shu to this research
was her sense that beliefs and values are not fixed, stable traits that
we tote with us like a wheelie bag everywhere we go. On the contrary,
she believes we bend or break them according to circumstance.
“It
didn’t seem intuitive to me that our beliefs never change,” Shu said.
“But what really led me to the question was the debate, both in academia
and in the business world, about how much of people's dishonest
behaviors and bad actions is due to the situation, versus who that
person is and how their upbringing was.”
Shu
notes that given the “permissive environment” that she created in the
lab by allowing one group of subjects the opportunity to cheat, she
produced a greater likelihood of cheating—which in turn produced a shift
in the way the cheaters thought about cheating.
On
the bright side, Shu found that if participants did something as simple
as sign their names to the honor code, rather than just passively read
it, they were less likely to cheat on the math problems they were given
to solve.
As a whole, Shu and her
colleagues’ study is further reason to doubt that people have an
unbudging, ingrained ethical compass guiding their every action. Indeed,
ignoring that compass seems to make us forget we have it at all, at
least temporarily.
The implications of
Shu’s findings align with the existing research and paint a troubling
picture of how morality can easily spiral out of our grip without us
even noticing. If both things are true—that attitude influences action
and action influences attitude—it becomes easier to understand scenarios
of runaway transgressions. You do something you know isn't good, you
talk yourself out of feeling bad about it, you become more likely to do
it again—and, having done it again, you’re back to telling yourself it
doesn’t matter, it’s no big deal, it was just this once…
And just like that, you’ve done nothing wrong.
Casey
Schwartz is a graduate of Brown University and has a master's in
psychodynamic neuroscience from University College London. She has
previously written for The New York Sun and ABC News. Currently, she's
working on a book about the brain world.
We like to believe that a few bad apples spoil
the virtuous bunch. But research shows that everyone cheats a
little—right up to the point where they lose their sense of integrity.
Not too long
ago, one of my students, named Peter, told me a story that captures
rather nicely our society's misguided efforts to deal with dishonesty.
One day, Peter locked himself out of his house. After a spell, the
locksmith pulled up in his truck and picked the lock in about a minute.
"I was amazed at how quickly and easily
this guy was able to open the door," Peter said. The locksmith told him
that locks are on doors only to keep honest people honest. One percent
of people will always be honest and never steal.
Another one percent will always
be dishonest and always try to pick your lock and steal your television;
locks won't do much to protect you from the hardened thieves, who can
get into your house if they really want to. The purpose of locks, the
locksmith said, is to protect you from the 98% of mostly honest people
who might be tempted to try your door if it had no lock.
We tend to
think that people are either honest or dishonest. In the age of Bernie
Madoff and Mark McGwire, James Frey and John Edwards, we like to believe
that most people are virtuous, but a few bad apples spoil the bunch. If
this were true, society might easily remedy its problems with cheating
and dishonesty. Human-resources departments could screen for cheaters
when hiring. Dishonest financial advisers or building contractors could
be flagged quickly and shunned. Cheaters in sports and other arenas
would be easy to spot before they rose to the tops of their professions.
But that is not how dishonesty works.
Over the past decade or so, my colleagues and I have taken a close look
at why people cheat, using a variety of experiments and looking at a
panoply of unique data sets—from insurance claims to employment
histories to the treatment records of doctors and dentists. What we have
found, in a nutshell: Everybody has the capacity to be dishonest, and
almost everybody cheats—just by a little. Except for a few outliers at
the top and bottom, the behavior of almost everyone is driven by two
opposing motivations. On the one hand, we want to benefit from cheating
and get as much money and glory as possible; on the other hand, we want
to view ourselves as honest, honorable people. Sadly, it is this kind of
small-scale mass cheating, not the high-profile cases, that is most
corrosive to society.
[Figure: a sample matrix from the task. Caption: Which two numbers in
this matrix add up to 10? Asked to solve a batch of these problems, most
people cheated (claiming to have solved more of them than they had) when
given the chance.]
Much of
what we have learned about the causes of dishonesty comes from a simple
little experiment that we call the "matrix task," which we have been
using in many variations. It has shown rather conclusively that cheating
does not correspond to the traditional, rational model of human
behavior—that is, the idea that people simply weigh the benefits (say,
money) against the costs (the possibility of getting caught and
punished) and act accordingly.
The basic matrix task goes as follows:
Test subjects (usually college students) are given a sheet of paper
containing a series of 20 different matrices (structured like the
example you can see above) and are told to find in each of the matrices
two numbers that add up to 10. They have five minutes to solve as many
of the matrices as possible, and they get paid based on how many they
solve correctly. When we want to make it possible for subjects to cheat
on the matrix task, we introduce what we call the "shredder condition."
The subjects are told to count their correct answers on their own and
then put their work sheets through a paper shredder at the back of the
room. They then tell us how many matrices they solved correctly and get
paid accordingly.
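The puzzle itself is easy to reproduce. Below is a minimal Python sketch of a single matrix: it plants one pair of decimals summing to 10 among random distractors, then finds that pair the way a participant would. The 12-cell grid, the two-decimal numbers and the uniqueness of the answer are assumptions made for illustration; the article specifies only that each matrix contains two numbers adding up to 10.

```python
import itertools
import random

def make_matrix(n_cells=12, seed=None):
    """Build one puzzle: a grid of two-decimal numbers in which exactly
    one pair sums to 10.00. (Grid size and number format are assumed;
    the article says only that two numbers in each matrix add to 10.)"""
    rng = random.Random(seed)
    while True:
        a = round(rng.uniform(0.01, 9.99), 2)
        cells = [a, round(10 - a, 2)]  # the planted answer pair
        cells += [round(rng.uniform(0.01, 9.99), 2) for _ in range(n_cells - 2)]
        # keep only grids where the planted pair is the unique solution
        solutions = [p for p in itertools.combinations(cells, 2)
                     if round(p[0] + p[1], 2) == 10.0]
        if len(solutions) == 1:
            rng.shuffle(cells)
            return cells

def solve(cells):
    """Find the pair summing to 10, as a participant would."""
    for x, y in itertools.combinations(cells, 2):
        if round(x + y, 2) == 10.0:
            return x, y
    return None

matrix = make_matrix(seed=1)
print(matrix)
print("pair summing to 10:", solve(matrix))
```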
What happens
when we put people through the control condition and the shredder
condition and then compare their scores? In the control condition, it
turns out that most people can solve about four matrices in five
minutes. But in the shredder condition, something funny happens:
Everyone suddenly and miraculously gets a little smarter. Participants
in the shredder condition claim to solve an average of six matrices—two
more than in the control condition. This overall increase results not
from a few individuals who claim to solve a lot more matrices but from
lots of people who cheat just by a little.
Would putting more money on the line
make people cheat more? We tried varying the amount that we paid for a
solved matrix, from 50 cents to $10, but more money did not lead to more
cheating. In fact, the amount of cheating was slightly lower when we
promised our participants the highest amount for each correct answer.
(Why? I suspect that at $10 per solved matrix, it was harder for
participants to cheat and still feel good about their own sense of
integrity.)
Would a higher probability of getting
caught cause people to cheat less? We tried versions of the
experiment in which people shredded only half their answer sheet, in
which they paid themselves money from a bowl in the hallway, and even
one in which a noticeably blind research assistant administered the
experiment.
Once again, lots of people cheated, though just by a bit.
But the level of cheating was unaffected by the probability of getting
caught.
Knowing that most people cheat—but just by a little—the next logical question is what makes us cheat more or less.
One thing that
increased cheating in our experiments was making the prospect of a
monetary payoff more "distant," in psychological terms. In one variation
of the matrix task, we tempted students to cheat for tokens (which
would immediately be traded in for cash). Subjects in this token
condition cheated twice as much as those lying directly for money.
Another thing that boosted cheating:
Having another student in the room who was clearly cheating. In this
version of the matrix task, we had an acting student named David get up
about a minute into the experiment (the participants in the study didn't
know he was an actor) and implausibly claim that he had solved all the
matrices. Watching this mini-Madoff clearly cheat—and waltz away with a
wad of cash—the remaining students claimed they had solved double the
number of matrices as the control group. Cheating, it seems, is
infectious.
Other factors that increased the
dishonesty of our test subjects included knowingly wearing knockoff
fashions, being drained from the demands of a mentally difficult task
and thinking that "teammates" would benefit from one's cheating in a
group version of the matrix task. These factors have little to do with
cost-benefit analysis and everything to do with the balancing act that
we are constantly performing in our heads. If I am already wearing fake
Gucci sunglasses, then maybe I am more comfortable pushing some other
ethical limits (we call this the "What the hell" effect). If I am
mentally depleted from sticking to a tough diet, how can you expect me
to be scrupulously honest? (It's a lot of effort!) If it is my teammates
who benefit from my fudging the numbers, surely that makes me a
virtuous person!
The results of these experiments
should leave you wondering about the ways that we currently try to keep
people honest. Does the prospect of heavy fines or increased enforcement
really make someone less likely to cheat on their taxes, to fill out a
fraudulent insurance claim, to recommend a bum investment or to steal
from his or her company? It may have a small effect on our behavior, but
it is probably going to be of little consequence when it comes up
against the brute psychological force of "I'm only fudging a little" or
"Everyone does it" or "It's for a greater good."
What, then—if anything—pushes people toward greater honesty?
There's a joke
about a man who loses his bike outside his synagogue and goes to his
rabbi for advice. "Next week come to services, sit in the front row,"
the rabbi tells the man, "and when we recite the Ten Commandments, turn
around and look at the people behind you. When we get to 'Thou shalt not
steal,' see who can't look you in the eyes. That's your guy." After the
next service, the rabbi is curious to learn whether his advice panned
out. "So, did it work?" he asks the man. "Like a charm," the man
answers. "The moment we got to 'Thou shalt not commit adultery,' I
remembered where I left my bike."
What this little joke suggests is that
simply being reminded of moral codes has a significant effect on how we
view our own behavior.
Inspired by the thought, my colleagues
and I ran an experiment at the University of California, Los Angeles.
We took a group of 450 participants, split them into two groups and set
them loose on our usual matrix task. We asked half of them to recall the
Ten Commandments and the other half to recall 10 books that they had
read in high school. Among the group who recalled the 10 books, we saw
the typical widespread but moderate cheating. But in the group that was
asked to recall the Ten Commandments, we observed no cheating
whatsoever. We reran the experiment, reminding students of their
schools' honor codes instead of the Ten Commandments, and we got the
same result. We even reran the experiment on a group of self-declared
atheists, asking them to swear on a Bible, and got the same no-cheating
results yet again.
This experiment has obvious
implications for the real world. While ethics lectures and training seem
to have little to no effect on people, reminders of morality—right at
the point where people are making a decision—appear to have an outsize
effect on behavior.
Another set of our
experiments, conducted with mock tax forms, convinced us that it would
be better to have people put their signature at the top of the forms
(before they filled in false information) rather than at the bottom
(after the lying was done). Unable to get the IRS to give our theory a
go in the real world, we tested it out with automobile-insurance forms.
An insurance company gave us 20,000 forms with which to play. For half
of them, we kept the usual arrangement, with the signature line at the
bottom of the page along with the statement: "I promise that the
information I am providing is true." For the other half, we moved the
statement and signature line to the top. We mailed the forms to 20,000
customers, and when we got the forms back, we compared the amount of
driving reported on the two types of forms.
People filling out such forms have an
incentive to underreport how many miles they drive, so as to be charged a
lower premium. What did we find? Those who signed the form at the top
said, on average, that they had driven 26,100 miles, while those who
signed at the bottom said, on average, that they had driven 23,700
miles—a difference of about 2,400 miles. We don't know, of course, how
much those who signed at the top really drove, so we don't know if they
were perfectly honest—but we do know that they cheated a good deal less
than our control group.
Such tricks aren't going to save us
from the next big Ponzi scheme or doping athlete or thieving politician.
But they could rein in the vast majority of people who cheat "just by a
little." Across all of our experiments, we have tested thousands of
people, and from time to time, we did see aggressive cheaters who kept
as much money as possible. In the matrix experiments, for example, we
have never seen anyone claim to solve 18 or 19 out of the 20 matrices.
But once in a while, a participant claimed to have solved all 20.
Fortunately, we did not encounter many of these people, and because they
seemed to be the exception and not the rule, we lost only a few hundred
dollars to these big cheaters. At the same time, we had thousands and
thousands of participants who cheated by "just" a few matrices, but
because there were so many of them, we lost thousands and thousands of
dollars to them.
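The arithmetic behind that asymmetry is simple enough to simulate. The toy Python model below is only a sketch, and every number in it (the payout, the one-in-a-thousand rate of "big" cheaters, the two-matrix fudge) is an invented assumption rather than a figure from the experiments; it just shows how many small fudges can swamp a few maximal thefts.

```python
import random

def simulate_losses(n_participants=20_000, pay_per_matrix=0.50,
                    true_avg=4.0, small_inflate=2, p_big_cheat=0.001,
                    seed=0):
    # Toy model only: every rate and amount here is an assumption
    # chosen for illustration, not an estimate from the experiments.
    rng = random.Random(seed)
    lost_to_big = lost_to_small = 0.0
    for _ in range(n_participants):
        # honest performance: roughly four matrices solved in five minutes
        solved = max(0, min(20, round(rng.gauss(true_avg, 1.5))))
        if rng.random() < p_big_cheat:
            # the rare "mini-Madoff" claims a perfect score of 20
            lost_to_big += (20 - solved) * pay_per_matrix
        else:
            # nearly everyone else pads the count by a couple of matrices
            lost_to_small += small_inflate * pay_per_matrix
    print(f"lost to a handful of big cheaters:  ${lost_to_big:,.2f}")
    print(f"lost to everyone cheating a little: ${lost_to_small:,.2f}")

simulate_losses()
```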
In short, very few people steal to a
maximal degree, but many good people cheat just a little here and there.
We fib to round up our billable hours, claim higher losses on our
insurance claims, recommend unnecessary treatments and so on. Companies also find many ways to game
the system just a little. Think about credit-card companies that raise
interest rates ever so slightly for no apparent reason and invent all
kinds of hidden fees and penalties (which are often referred to, within
companies, as "revenue enhancements"). Think about banks that slow down
check processing so that they can hold on to our money for an extra day
or two or charge exorbitant fees for overdraft protection and for using
ATMs.
All of this means that, although it is
obviously important to pay attention to flagrant misbehaviors, it is
probably even more important to discourage the small and more ubiquitous
forms of dishonesty—the misbehavior that affects all of us, as both
perpetrators and victims. This is especially true given what we know
about the contagious nature of cheating and the way that small
transgressions can grease the psychological skids to larger ones.
We want to install locks to stop the
next Bernie Madoff, the next Enron, the next steroid-enhanced all-star,
the next serial plagiarist, the next self-dealing political miscreant.
But locking our doors against the dishonest monsters will not keep them
out; they will always cheat their way in. It is the woman down the
hallway—the sweet one who could not even carry away your flat-screen TV
if she wanted to—who needs to be reminded constantly that, even if the
door is open, she cannot just walk in and "borrow" a cup of sugar
without asking.
—Mr. Ariely is the James B. Duke Professor of
Behavioral Economics at Duke University. This piece is adapted from his
forthcoming book, "The (Honest) Truth About Dishonesty: How We Lie to
Everyone—Especially Ourselves," to be published by HarperCollins on June
5.