People’s natural social instincts carry cognitive flaws: a few main ones
Accepting More Responsibility for Success Than Failure, for Good Deeds Than Bad
Time and again, experimenters have found that people readily accept credit when told they have succeeded (attributing the success to their ability and effort), yet they attribute failure to external factors such as bad luck or the problem’s inherent “impossibility.” These self-serving attributions have been observed not only in laboratory situations, but also with athletes (after victory or defeat), students (after high or low exam grades), drivers (after accidents), and married people (among whom conflict often derives from perceiving oneself as contributing more and benefiting less than is fair). Self-concept researcher Anthony Greenwald summarizes, “People experience life through a self-centered filter.”
Favorably Biased Self-ratings: Can We All Be Better Than Average?
In virtually any area that is both subjective and socially desirable, most people see themselves as better than average. Most business people see themselves as more ethical than the average business person. Most community residents see themselves as less prejudiced than their neighbors. Most people see themselves as more intelligent and as healthier than most other people. When the College Board asked high school seniors to compare themselves with others their own ages, 60 percent reported themselves better than average in athletic ability, only 6 percent below average.
In leadership ability, 70 percent rated themselves above average, 2 percent below average. In ability to get along with others, zero percent of the 829,000 students who responded rated themselves below average, while 60 percent saw themselves in the top 10 percent and 25 percent put themselves in the top 1 percent. If Elizabeth Barrett Browning were still writing, she would perhaps rhapsodize, “How do I love me? Let me count the ways.”
The Totalitarian Ego
At the University of Waterloo, Michael Ross has repeatedly found that people will distort their past in ego-supportive ways. In one experiment he exposed some people to a message about the desirability of frequent tooth brushing. Shortly afterwards, in a supposedly different experiment, these students recalled brushing their teeth more often during the preceding two weeks than did an equivalent sample of people who had not heard the message. Noting the similarity of such findings to happenings in George Orwell’s Nineteen Eighty-Four, where it was “necessary to remember that events happened in the desired manner,” Anthony Greenwald surmised that human nature is governed by a totalitarian ego that continually revises the past in order to preserve a positive self-evaluation.
Because of our mind’s powers of reconstruction, we can be sure, argues Mike Yaconelli, that “Every moving illustration, every gripping story, every testimony, didn’t happen (at least, it didn’t happen like the storyteller said it happened).” Every anecdotal recollection told by a Christian superstar is a reconstruction. It’s a point worth remembering in times when we are feeling disenchanted by the comparative ordinariness of our everyday lives.
Self-Justification: If I Did It, It Must Be Good
If an undesirable action cannot be forgotten, misremembered, or undone, then often it is justified. Among psychology’s best-established principles is that our past actions influence our current attitudes. Every time we act, we amplify the idea lying behind what we have done, especially when we feel some responsibility for having committed the act. In experiments, people who oppress someone—by delivering electric shocks, for example—tend later to disparage their victim.
Cognitive Conceit: Belief in One’s Infallibility
Researchers who study human thinking have often observed that people overestimate the accuracy of their beliefs and judgments. As Baruch Fischhoff and others have demonstrated, we often do not expect something to happen until it does, at which point we overestimate our ability to have predicted it—the “I knew it all along” phenomenon. People also fail to recognize their vulnerability to error.
Unrealistic Optimism: The Pollyanna Syndrome
Margaret Matlin and David Stang have amassed evidence pointing to a powerful Pollyanna principle—that people more readily perceive, remember, and communicate pleasant than unpleasant information. Positive thinking predominates over negative thinking. At Rutgers University, Neil Weinstein also has discerned a consistent tendency toward unrealistic optimism about future life events. Most students perceive themselves as far more likely than their classmates to experience positive events such as getting a good job, drawing a good salary, and owning a home, and as far less likely to experience negative events such as getting divorced, having cancer, and being fired.
Overestimating How Desirably One Would Act
In various experiments, most people have been observed to act in rather inconsiderate, compliant, or even cruel ways. Yet when other people are told about these conditions and asked to predict how they would act, nearly all insist that their own behavior would be virtuous. Similarly, when researcher Steven Sherman called Bloomington, Indiana, residents and asked them to volunteer three hours to an American Cancer Society drive, only 4 percent agreed to do so. Meanwhile, a comparable group of residents was being called and asked to predict how they would react were they ever to receive such a request. Almost half claimed they would help.
A political, real-life, costly, and hurtful example of the “Totalitarian Ego,” “Belief in One’s Infallibility,” and “If I Did It, It Must Be Good” is this:
our refusal to accept, and repent of, the painful awareness that information is inconsistent with our actions. To reduce this unpleasantness, we’re predisposed to justify our behavior. Smokers persuade themselves that smoking is a relatively harmless pleasure. Aggressors blame their victims. Attitudes follow behavior.
“After the Iraq invasion, many Americans were awash in cognitive dissonance. The war’s main premise was that Hussein had potentially devastating weapons of mass destruction. As the war began, only 38% said in a Gallup Poll that the war was justified even if Iraq did not have weapons of mass destruction. Nearly 4 in 5 Americans believed their troops would find such weapons, and a similar percentage supported the just-launched war. Surely most Americans, and John Kerry and his Senate colleagues, would not have supported the war had they known then what they know now.
But when no WMD were found, Kerry and many others experienced dissonance, which was heightened by their awareness of the war’s financial and human costs, by scenes of Iraqi chaos, by surging anti-American attitudes in Europe and in Muslim countries and by inflamed pro-terrorist sentiments. Even Defense Secretary Donald H. Rumsfeld wondered whether we were creating terrorists faster than we were eliminating them.
To reduce dissonance, some people revised their memories of their government’s primary rationale for going to war. The reasons now became construed as liberating an oppressed people from tyrannical rule and laying the groundwork for a peaceful Middle East. So as time went on, the once-minority opinion became the majority view: 58% of Americans said in one poll that they supported the war even if there were no WMD, and today most of those still do.
“Whether or not they find weapons of mass destruction doesn’t matter,” suggested GOP pollster Frank Luntz, “because the rationale for the war changed.”
With national commissions having now declared that there were no WMD and that Hussein played no part in 9/11—nor was his dilapidated army much of a threat—do any politicians who supported the war live with regret?
Sen. John D. “Jay” Rockefeller IV (D-W.Va.), realizing the war rationale has lost its legs, openly regrets his vote. But he is among the few. Sens. Kerry and John Edwards have not been able to say their vote was wrong.
Bush has declared that “although we have not found stockpiles of weapons of mass destruction, we were right to go into Iraq,” and he offers a new justification: Hussein “had the capability of producing weapons of mass murder and could have passed that capability to terrorists bent on acquiring them…. The decision I made was the right decision.”
Such self-justification reminds me of what every social psychology text teaches: Once made, decisions grow their own self-justifying legs of support. Often, these new legs are strong enough that when one leg is pulled away—perhaps the original one—the decision does not collapse. Not only do we sometimes stand up for what we believe, we come to believe in what we’ve stood up for.” (David Myers, 2009)
Kerry cannot bring himself to say that, knowing what he knows today, “the Iraq war was a big screw-up” (as even Bill O’Reilly recently acknowledged to Tim Russert). No doubt, his mental machinery, like Bush’s and yours and mine, makes him believe in his own decisions.