Wednesday, February 09, 2011

Internets Polls and other Cons

The book I’m currently reading has a whole chapter on opinion polls. Specifically, it focuses on how systematic errors in polls can produce error bars so broad that the data is completely worthless. Unless, of course, your goal in the first place isn’t to measure public opinion but to shape it, in which case they aren’t worthless at all.
Yesterday on my Facebook feed I got a request to answer a survey about where I get my news. Sounds good on the surface, but then the problems start popping up.
What’s wrong with this kind of survey? Well, first off, it’s voluntary. They aren’t gonna get any lukewarm opinions; people don’t log on to a voluntary survey to say that they really don’t have an opinion. So right off the bat the results will be artificially polarized, since the survey will only capture responses from people passionate enough to participate.
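If that sounds abstract, here’s a toy simulation of what self-selection does to a poll. The opinion scale and the “how likely are you to bother responding” rule are completely made up by me; the point is the pattern, not the percentages.

    import random

    random.seed(1)

    # Toy population: opinions on a scale from -1 (strongly against) to +1
    # (strongly for). Most people are lukewarm; a few feel strongly either way.
    population = [random.gauss(0.0, 0.4) for _ in range(100_000)]

    # Voluntary online poll: the chance that you bother to respond grows with
    # how strongly you feel. (A made-up response model, just for illustration.)
    respondents = [x for x in population if random.random() < abs(x)]

    def share_strong(opinions, cutoff=0.5):
        """Fraction of opinions that are 'strong' (|opinion| > cutoff)."""
        return sum(1 for x in opinions if abs(x) > cutoff) / len(opinions)

    print(f"Whole population: {share_strong(population):.0%} hold a strong opinion")
    print(f"Poll respondents: {share_strong(respondents):.0%} hold a strong opinion")
    # The self-selected sample looks far more polarized than the population it
    # supposedly represents, and collecting more responses doesn't fix that.

Run it and the respondents come out looking roughly twice as polarized as the crowd they were drawn from. That’s the artificial polarization I’m talking about.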
Second, I didn’t see this same survey come across any other media: radio, TV, etc. This isn’t a problem by itself; they may have been specifically looking for the opinions of Facebook users. It’s only a problem if they then try to extrapolate from there out to the general population. Many surveys do exactly that.
But the real death knell for this survey’s credibility is the surveyed audience. This came across my NPR feed. Yup, this survey was only sent out to people who are already self-declared fans of NPR. Are you kidding me? You’re taking a survey of people who are already fans of NPR and want to know where they get their news? Gee, I wonder how that will turn out.
Of course this is nothing new. Fox News can’t seem to go a whole hour without asking you to log in and tell them what you think. Then they come back with some ridiculous misinterpretation of the data like, “55% of Americans think Obama is Muslim.” As if the opinions of their viewers make it reality. I’ve grown to expect this kind of meaningless polling from most news outlets; I was just a little bit surprised to see it from NPR. In fairness to them, I don’t think they were being partisan. They just built a poll whose audience disproportionately favors NPR itself.
So if you’re ever around me when somebody tells me about a recent poll, you’re liable to hear a sigh or a snicker, followed by a series of follow-up questions about things like statistical errors vs. systematic errors, controlling for sample bias, error bars, etc. You see, polls themselves aren’t news. At best, they are what news organizations talk about while they are waiting for real news to happen. At worst, they are an attempt to manipulate opinion or politics.
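And since I keep throwing around “statistical error” and “systematic error,” here’s a second toy sketch of the difference. The 30% and 85% figures are invented, not from any real survey; the point is what happens to each kind of error as the sample grows.

    import random
    import statistics

    random.seed(2)

    # Pretend 30% of the general public gets most of its news from NPR, but
    # 85% of NPR's Facebook fans do. (Invented numbers, purely for illustration.)
    P_GENERAL, P_NPR_FANS = 0.30, 0.85

    def poll(p, n):
        """Ask n randomly chosen people a yes/no question; return the yes rate."""
        return sum(random.random() < p for _ in range(n)) / n

    for n in (100, 1_000, 10_000):
        # Statistical error: repeat an honest random poll many times and see how
        # much the answer wobbles. It shrinks roughly like 1/sqrt(n).
        honest = [poll(P_GENERAL, n) for _ in range(200)]
        wobble = statistics.stdev(honest)

        # Systematic error: poll only NPR fans and report it as "Americans."
        # This bias stays put no matter how many people you ask.
        biased = poll(P_NPR_FANS, n)
        print(f"n={n:>6}: statistical error ~ {wobble:.3f}, "
              f"systematic error ~ {abs(biased - P_GENERAL):.3f}")

Bigger samples buy down the first kind of error. They do nothing at all about the second, which is why a million voluntary Fox News or NPR responses still tell you nothing about “Americans.”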

4 comments:

  1. Yeah, I occasionally check in on the 538 blog because Nate Silver really knows what's a real poll versus the media-generated fluff.

    I also enjoy it when PZ Myers gets his hordes of readers to wreck an internet poll. You'd think that would stop the practice, but it never seems to.

  2. The book actually mentioned things like Stephen Colbert crashing internet polls. I think Colbert understands this weakness and is exploiting it brilliantly.

  3. "this survey was only sent out to people who are already self-declared fans of NPR"

    It's only a problem if they are using the results improperly. If they genuinely want to know, for example, roughly how much NPR Facebook fans (presumably listeners, as you implied) make use of newspapers, TV, news outlet websites, blogs, etc., then what's the issue?

    Heck, there are probably some people who adore Car Talk and The Splendid Table but never listen to the news shows.

  4. I agree. That's why I qualified it by saying, "It's only a problem if they then try to extrapolate from there out to the general population."
