Los Angeles Times blogger/former Bush aide Andrew Malcolm writes:
Back in 1994 when Bill Clinton received a lesser shellacking from voters angry over his liberal policies, he took three months to follow Dick Morris' advice, adopt some Republican goals like welfare reform as his own and declare out of the blue, "The era of big government is over." The result: An easy 1996 reelect for him.
Like Fred Barnes before him, Malcolm inexplicably ignores the effect an improving economy had on Clinton's re-election. As political scientist Brendan Nyhan has noted, "Clinton's move toward the center … may have helped somewhat to boost his margin above what we would have otherwise expected, but the driving force in 1996 (as in every election) was the state of the economy." (As Nyhan acknowledges later in the post, there is evidence that Clinton's move to the center is itself overstated.)
And as for Malcolm's suggestion that Clinton's adoption of "Republican goals like welfare reform as his own" resulted in an "easy 1996 reelect": Nonsense. First, Clinton had long made welfare reform a goal of his own. And by the time he signed legislation on August 22, 1996, Clinton had already built a comfortable lead over Bob Dole -- even after vetoing welfare legislation twice. When Clinton vetoed a welfare reform bill on January 9, 1996, he was trailing Dole in Gallup polling. By the time he signed a bill in August, Clinton had established a solid double-digit lead that reached 20 points on multiple occasions. So, basically, Malcolm is completely wrong.
Political scientist Brendan Nyhan points to a Boston Globe essay by Joe Keohane (based largely on research conducted by Nyhan) about the stickiness of misperceptions and the challenge this poses for those who think it is important for people to not be wrong.
Keohane notes that "Americans lack even a basic understanding of how their country works." Or, as Princeton's Larry Bartels put it in 1996: "the political ignorance of the American voter is one of the best documented data in political science." That, I think, is quite clearly true, and is an indictment of the news media as much as (or more than) it is a criticism of American voters.
Keohane also points out that studies have found that "misinformed people often have some of the strongest political opinions." Again, probably not surprising.
The really troubling part, though, is that several studies have concluded that presenting people with the facts may not do much to convince them. Keohane summarizes Nyhan's findings:
Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
What's going on? How can we have things so wrong, and be so sure that we're right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn't. This is known as "motivated reasoning." Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
New research, published in the journal Political Behavior last month, suggests that once those facts — or "facts" — are internalized, they are very difficult to budge.
Nyhan suggests one solution to this problem:
Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the "reputational costs" of peddling bad info, he suggests, you might discourage people from doing it so often. "So if you go on 'Meet the Press' and you get hammered for saying something misleading," he says, "you'd think twice before you go and do it again." Unfortunately, this shame-based solution may be as implausible as it is sensible.
I am a huge fan of increasing the "reputational costs" of peddling misinformation -- of not only shaming, but shunning, too.
But, as Keohane suggests, that isn't sufficient, both because there are plenty of people who are incapable of being shamed, and because neither journalists nor politicians demonstrate much interest in shaming their peers.
Keohane's essay, I think, reinforces something I've been arguing for years: the importance of repetition. It isn't enough for news organizations to occasionally correct false statements; they must do so every time they quote, paraphrase, or refer to a false statement. And it isn't enough to occasionally give readers and viewers basic information about public policy debates -- it must be done over and over again. Such an approach would, I think (hope?), have two benefits: it could make it more likely that voters internalize the truth before misinformation takes hold, and the repetition could break through the barriers presented by preconceived notions -- it seems likely that it's harder to dismiss something you hear a dozen times than something you hear once.
And that, by the way, is why I keep coming back to this point...
UPDATE: Also, the way in which false claims are debunked is important...
UPDATE 2: See also: Assessing the media's health care coverage