Depending on how you look at it, we're roughly six months or 60 years into the debate over whether and how the government should ensure health care coverage for all Americans. And yet if there's one thing polling on the public's opinions about health care makes clear, it's that people are confused, holding a disparate mix of often contradictory views and frequently clinging to incorrect beliefs.
For reporters, there is a clear lesson in this: Put the polls down. Just walk away. Pay them no attention. Pretend they don't exist.
For one thing, whatever you think they mean, there's plenty of evidence to support the opposite interpretation.
For another, there just isn't anything particularly noteworthy in the results. People favor significant reform, think all Americans should have coverage, are concerned about how much it will cost, worry that change could make their own situation worse -- is any of this really surprising? Does anyone actually need a poll to tell them these things?
Put another way: When was the last time you saw a truly surprising poll result? When was the last time you saw poll data showing that people don't care whether others have health insurance and don't think the government should have any role in health care whatsoever? Or the last time you saw a poll that found people were willing to pay higher taxes, lose the ability to choose their own doctors, and cede health care decisions to the government if that's what it takes to get coverage for their neighbors?
Not only is the overwhelming majority of health care polling unsurprising, much of it is essentially meaningless. Take the oft-asked question of whether people approve of President Obama's handling of health care. That's a question the media love to tout -- but what does it mean? Basically nothing. If 60 percent disapprove, what does that tell us? Without knowing how many disapprove because they don't think the government should be involved in health care and how many disapprove because they think Obama should have won passage of a public option by now, the result doesn't tell us anything. Likewise, if 60 percent approve, we don't know why. Is it his emphasis on bipartisanship? His deference to Congress? His advocacy for universal coverage and a public option?
Then there's the truly meaningless. Take a look at this question from a new CNN poll:
If your member of Congress came to your community and held a town hall meeting or some other public forum where voters got a chance to speak, how likely is it that you would attend that event to tell your member of Congress what you think about health care? Would you be very likely, somewhat likely, not very likely, or not likely at all to do that?
Forty-one percent said "very likely" and 30 percent said "somewhat likely," and that doesn't tell us anything. Why not? Because we have nothing to compare it to. CNN has apparently never asked a question like this before -- about health care or any other issue. So we don't know whether those numbers are high or low; we don't know what the baseline is.
My suspicion is that if you asked people five questions in a row about, say, education -- an issue that hasn't gotten much attention in quite a while -- and then asked them if they would take the opportunity to tell their member of Congress what they think about education, a large number of respondents would answer affirmatively.
Here's an illustration of the importance of having points of comparison for poll data like this: In February, a CNN poll asked respondents how important it was for the president and Congress to deal with several issues. Eighty-one percent said it was "extremely" or "very" important that they deal with education. Wow, 81 percent! That's huge, right? Well, no. The economy came in at 95 percent, terrorism at 82 percent, health care at 77 percent, Social Security and Medicare at 83 percent, taxes at 76 percent, Iraq at 75 percent, Afghanistan at 76 percent, energy policy at 73 percent ... you get the point.
So when a CNN poll finds that 71 percent of Americans say they're likely to attend a town hall meeting to tell their members of Congress what they think about health care but provides absolutely no other data to measure that result against, it doesn't really have much value at all. It tells us next to nothing.
Now add in the fact that it doesn't tell us how many of those 71 percent want to tell their member of Congress to stop screwing around and pass a public plan, and how many want to tell their member of Congress to keep the government's hands off their health care. It's pretty clear now that that 71 percent figure means much less than it seems, isn't it?
In fact, it means so little that I have a hard time believing it was actually intended to measure anything important. I suspect the sole reason it was included in the poll was so that CNN could cite the result in its news reports about angry town hall attendees -- not because anyone thought it would actually be illuminating. It isn't compelling information; it's a prop.
Speaking of angry town hall attendees: Ignore them, too. A dozen people shouting at a town hall meeting -- even a dozen people shouting at each of a hundred town hall meetings -- just doesn't tell us anything meaningful about public opinion. It tells us that there are at least a few thousand angry people, and that they're organized. We already know that.
Look: Sarah Palin drew big crowds last year -- and a lot of those people were angry. They yelled, they held up nasty signs, and they convinced a lot of the media there was some huge groundswell of opposition to Barack Obama. Then he went out and won North Carolina and Indiana.
Video of people yelling about health care may make for good television, but it makes for lousy journalism. It exaggerates the numbers and significance of the people who yell the loudest, whichever side they're on. (And this should go without saying, but a shaky cell-phone video that shows a half-dozen of the hundred people at a meeting, and that was provided by people who are trying to "artificially inflate" their numbers, is not a particularly reliable indication of what happened at that meeting.)
So if there's no real value in reporting on polls or protests, how should news organizations cover health care reform?
Simple: Cover health care reform.
All those polls showing that people hold contradictory views and false -- or at least highly questionable -- beliefs about health care and efforts to reform it are a pretty good indication of what reporters should be doing: Reporting the truth, and doing it often. Giving people the facts about health care and about proposals to reform it.
When you see people yelling, "Keep your government hands off my Medicare," that's a pretty good indication that the public could use some solid facts. How many people do you think know that health care reform with a strong public option would cost taxpayers less than a plan without such an option? I would bet that a distressingly large number of members of Congress don't know that -- and that very, very few voters do.
People are understandably confused and unfamiliar with the facts -- there are an awful lot of people spending an awful lot of money to confuse them and keep them in the dark. And they don't have the time or the resources to sort through it all and find out whether reform would mean that a government bureaucrat is really going to show up at their door and tell them it's time to die in order to save taxpayers money. (No: That would not happen.)
As Brendan Nyhan notes, the media bear significant responsibility for this confusion:
Who's to blame for this problem? I largely fault the media. ... [I]t's extremely difficult to myth-proof a bill or to effectively counter these claims once they are made. Until the media stops giving airtime and column inches to proponents of misinformation, the playbook is going to keep working.
Nyhan doesn't go quite far enough, though. The media should not only stop giving airtime and column inches to liars and the lies they tell, they should affirmatively and aggressively report the truth. And they need to do so over and over again. Once is not enough. (To those who would respond that repetition is, by definition, not "news": Are you really prepared to argue that newscasts and newspapers don't repeat the same ideas over and over again? Really?)
If news organizations want to produce health care reporting that actually has some value, some utility to their readers and viewers, they'll forget about the polls and the protests and the politics and focus on making the actual facts about health care, and efforts to change the system, as clear as they can.
I know what many journalists will say: This is how things are. Political intrigue, controversy, polling, strategy, demonstrations -- these are the things the media cover. That's how it works.
No. That's how it doesn't work. That's how we have a public that is so badly confused about health care reform that polling on the topic is basically a useless bundle of contradictory results. That's how we have a situation in which more than half of the Republican Party doesn't know Barack Obama was born in the United States. And how is this approach working for the media? Public trust in and respect for journalists are not exactly strong -- and, as I'm sure most reporters have noticed, news organizations across the country are shedding employees in a desperate struggle to stay afloat.
So who are the old ways working for?
Jamison Foser is a Senior Fellow at Media Matters for America, a progressive media watchdog and research and information center based in Washington, D.C. Foser also contributes to County Fair, a media blog featuring links to progressive media criticism from around the Web as well as original commentary.