Are most polls useless?

The whole Nate Silver hoopla during the elections got me to thinking…

When a company, let’s say, Gallup, does a poll about something random; let’s say, Americans’ opinions on gun control…is that poll useful? I mean, considering that there aren’t tons of other polls to compare it with?

Am I just misunderstanding how polls work entirely? I find it very confusing, I’m embarrassed to admit.

There are two types of polls - those that say what you want them to say, and those that tell you what people think.

For the first, the problem is that how a poll is worded and how it is asked can make a big difference in how people answer. Ask people “on a scale of 1 to 10, how bad is it to torture a puppy?” first, and you will likely get a much bigger response when you then ask “how likely are you to adopt an orphaned puppy?”. Ask people how much they value education and opportunity, and they are more likely to next say they are in favour of more tax money for education.

Pollsters running the second type of poll know all about the first. To get a real answer, they generally will ask the same question a dozen different ways, buried in a slew of other questions, so they don’t get people skewing their answers, and so they can see whether the person’s views are consistent.

After all that, a person may be inclined one way before the election and another way when it finally rolls around.

The more valuable and important polling that political parties do tells them exactly why voters feel the way they do and what is important to them. This allows the parties to emphasize the message that is most likely to appeal to the group of voters they need to persuade.

It depends on the poll, of course; you could certainly conduct a poll in such a way as to render it useless.

But for an organization like Gallup, you can probably assume that they’re polling in a mostly intelligent way. The particular issue with American election polls is that there’s a big difference between:

“52% would vote for Candidate A, 48% would vote for Candidate B”
and
“48% would vote for Candidate A, 52% would vote for Candidate B”

So that’s why Mr. Silver has to jump through a bunch of meta-analysis hoops trying to get a very precise, unbiased measurement. On the other hand, “52% are in favour of gun control and 48% are opposed” tells you that opinion is roughly split, even if the percentages are off by a few points.

Any particular poll tells you how people responded to that particular poll. If you know what the poll questions were, how it was presented, and some information about who responded, you can make pretty good guesses about what the results mean. But even then, they often don’t mean much. And there are also ‘push polls’, where the results aren’t important. The idea there is to spread ideas through the questions, and the results only tell you whether your message is getting across.

There’s a very old story about polls where groups of seminary students were asked in one poll “Is smoking while praying a good thing to do?” The responses were 100% negative to that question. When asked “Is praying while smoking a good thing to do?” the responses were 100% positive.

Assuming that the poll is done properly–which is a big issue–it gives you a good but not exact idea of what the people being polled think about a question without having to ask everyone. It’s important to understand the limitations of polls, but they’re not useless by any means.

To make this a little more concrete, let’s imagine a situation where 53% of the population will answer yes to a question being asked. If I simulate 1000 polls that are all executed exactly the way they should be and survey 1000 people each, the average proportion saying yes is 52.9%, and 95% of the poll results fall between 50% and 55.9%. So as I said above, it’s a pretty good measurement of the truth, but it’s not exact.
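For anyone who wants to poke at that themselves, here’s a rough sketch of that simulation in Python (the 53% and the poll sizes are just the hypothetical numbers above; the exact output will wobble a bit from run to run):

```python
import random

TRUE_P = 0.53      # hypothetical share of the population that would say "yes"
N_POLLS = 1000     # number of simulated polls
N_PEOPLE = 1000    # respondents per poll

random.seed(1)     # fixed seed so the run is repeatable

results = []
for _ in range(N_POLLS):
    # each respondent independently says "yes" with probability TRUE_P
    yes = sum(random.random() < TRUE_P for _ in range(N_PEOPLE))
    results.append(yes / N_PEOPLE)

results.sort()
mean = sum(results) / N_POLLS
low = results[int(0.025 * N_POLLS)]    # 2.5th percentile of poll results
high = results[int(0.975 * N_POLLS)]   # 97.5th percentile of poll results
print(f"average: {mean:.1%}, middle 95% of polls: {low:.1%} to {high:.1%}")
```

The spread you see there is just sampling error; everything else that can go wrong with a poll comes on top of it.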

Why isn’t this post a poll?

As a general rule, single polls mean very little if anything. Many polls are a bit more reliable. Trend lines from a series of polls are the best indicator.

Single polls have a variety of known problems. A 1000-person representative sample will give a statistically valid reading of a population, but it has become notoriously difficult to get people to talk to pollsters. Some firms need to make 8000 calls to find 1000 people who will answer. We don’t know exactly what the differences are between those who will and those who won’t, but we do know that the 1000 normally do not correspond to known divisions from census data, so each subgroup has to be weighted to produce the right percentages. We also know that the exact wording of questions plays a major role in how they are answered.
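To make that weighting step concrete, here’s a toy sketch in Python (every number in it is invented): if the people who actually answer skew older than the census says the population is, each group’s answers get scaled so the sample matches the census proportions.

```python
# Toy post-stratification: reweight a lopsided sample so each group
# counts in proportion to its census share. All figures are made up.
census_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}   # target mix
sample_share = {"18-34": 0.15, "35-64": 0.45, "65+": 0.40}   # raw respondent mix

# weight = how much each group's answers must be scaled up or down
weights = {g: census_share[g] / sample_share[g] for g in census_share}

# hypothetical share answering "yes" within each group
yes_rate = {"18-34": 0.70, "35-64": 0.50, "65+": 0.30}

raw = sum(sample_share[g] * yes_rate[g] for g in census_share)
weighted = sum(sample_share[g] * weights[g] * yes_rate[g] for g in census_share)
print(f"raw: {raw:.1%}, weighted: {weighted:.1%}")   # 45.0% vs 52.0% here
```

Notice the headline number moves by seven points purely from correcting who happened to pick up the phone.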

Even with these problems, polls from a number of firms taken at the same time can be combined into averages that appear to yield better results, as people like Nate Silver showed during the election. If one poll undercounts Democrats and another undercounts Republicans, the two combined will have a more representative number. There’s no surety to this, but the technique has proven itself in practice. And while some variant wordings can produce hugely different results, good polls will provide fairly neutral wordings that home in on an opinion. Polling firms also now release their methodology, full questions, and results so that outside analysts can examine them and try to determine how reliable these methods are. Silver discarded a few firms’ polls because they were below the line. He kept some that others considered faulty. The end result was quite good, but maybe drawing a different line could have done better.
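The simplest version of that combining is just a sample-size-weighted average; the fancier versions, like Silver’s, also weight by each firm’s track record and lean. A toy sketch, with invented firms and numbers:

```python
# Toy poll average: combine several polls of the same question,
# weighting each by its sample size. Firms and figures are invented.
polls = [
    ("Firm A", 0.51, 800),    # (firm, share for Candidate A, sample size)
    ("Firm B", 0.48, 1200),
    ("Firm C", 0.53, 600),
]

total_n = sum(n for _, _, n in polls)
avg = sum(share * n for _, share, n in polls) / total_n
print(f"combined estimate: {avg:.1%} on {total_n} total respondents")
```

Individually those three polls disagree about who is ahead; combined, they behave like one much larger poll, which is the whole point of the averaging.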

Trend lines are the best indicators, because for most issues outside elections, it really doesn’t matter whether the population is 48% or 51% for or against something. It does matter if that issue is winning greater approval over time - or losing it. And combining polls over time has many of the same advantages as combining many polls taken at the same time.

You tend to hear about polls only when something newsworthy makes reference to them. The major pollsters poll continuously, though, literally every week. And they ask the same questions, with the same wording, year after year, so that a good sense of public opinion appears. Except for some silly stunt polls, every major policy and social issue has dozens of firms asking questions on everything all the time. There’s never a shortage of ways to check opinions.

By a weird coincidence, there’s a letter to the editor in this morning’s Rochester Democrat & Chronicle from someone at the Pew Research Center. She was correcting an earlier letter writer who took a Pew survey result out of context. She wrote:

Polls do usually pick up general attitudes well, and changes in attitudes even better. But they are also good to show that attitudes are not always rational and precise. That same letter noted that “18 percent say that abortion is immoral and say that Roe v. Wade should not be completely overturned.” That’s a more important finding than mere raw numbers would be.

The short version of all this is: never trust a poll; only trust polls.

Thank you everyone, for your responses.

Exapno Mapcase, yeah, that makes sense. When I see something in the paper that says, “90 percent of city residents are in favor of teen curfews” but then not a single resident I discuss it with is in favor of that curfew, I wonder how in the world could the pollsters get it that wrong?

Trust polls, not a poll. Sounds like sound advice.

The people you talk to aren’t a representative sample of the population of city residents. That’s not to say that the poll is right, but you shouldn’t expect to get a reasonable measurement of anything by going out and asking people you know.

Ok, I understand that, but is it unreasonable to expect that if 90 percent of the city residents feel that way, then I should run across maybe ONE out of the hundreds that I talk to on the topic?

Or do I still have no idea…

Is this an actual example (i.e., there’s an actual poll that says 90% of people in your city favour teen curfews, and you actually asked hundreds of people about it)? If so, I would think that was somewhat odd, depending on how homogeneous the group you were asking was.
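Just to put a number on “somewhat odd”: assuming, purely for illustration, that the people you talked to were a random draw from the city and the 90% figure were true, the odds of a long run of conversations turning up zero supporters collapse very fast:

```python
# If 90% favour the curfew, a random resident opposes it with probability 0.1.
# Chance that n straight random conversations find only opponents: 0.1 ** n.
p_oppose = 0.10
for n in (5, 10, 20):
    print(f"{n} conversations, no supporters: {p_oppose ** n:.0e}")
```

So either the poll number is off, or the people you were talking to were nothing like a random sample - which is the usual explanation.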

If it’s a hypothetical example, it’s impossible to say.

I don’t know if it was an actual poll, because I wasn’t the one citing the poll.

Some years ago, our city made the decision to put a teen curfew into place. During that time, I was very against it. I talked to many, many people about it. I live in the city…I spoke to all of my family members, my neighbors, my daughter and the people at her school, my co-workers, the people at the center I volunteered at…I just was very vocal about it, and everyone seemed to agree with me. I remember being very angry, because the curfew was a response to the murder of three kids in the city. I really believed that the city was being reactionary, and I didn’t like it one bit. Everyone I spoke with seemed uneasy with the whole thing.

I don’t recall anyone saying it was a good idea.

Now, recently, in an argument with someone who DID disagree about it, that person said that the people of the city agreed with him, and not me, and that a citywide poll backed him up. He said that the poll showed 90 percent thinking the curfew was a good idea.

This person never struck me as a liar before, but I swear to you, I couldn’t find a single person who thought that curfew was a good idea. I know there existed such people. I’m not a total idiot…

But I find it hard to believe that the number could be 90 percent without me bumping into a couple before this guy, years later. He says he will find the poll and prove it to me.

I’m thinking he needs to come at me with more than one poll, if he wants to convince me.

Because, oh I dunno, you can’t do polls in GQ?

See my earlier post.
See Tripolar’s post about smoking and praying.

A poll can be made to produce any answer you want with the proper leading questions, wording, etc. - IF that is the result you want.

One question:

“Last month, 3 teenagers roaming the streets for no good reason were savagely murdered in gang violence. Do you think teens should follow a curfew?”

The other:
“Should the law force teenagers to be indoors after 10PM no matter what?”

I bet the answers to those two polls would be wildly different. The poll in favour of the curfew was likely the type designed to back up the action; the politicians knew what they wanted to do, and all they wanted was a credible poll to wave in the face of anyone who objected. They didn’t care if people, upon reflection, thought it was bad; they just needed justification.

This is the opposite of the type of continuous polling done to tell whether a particular issue motivates voters, or how they are leaning. Polls about who is winning the election, for example, don’t mean much because news (or even a lack of news) can influence people from day to day.

Ok. I’m following you, md2000. Thanks.

Even if I had thought the curfew might be a good idea, I might not have had the cojones to say so to the face of a very angry, very vocal neighbor or family member or coworker who was dead-set against it.

This is a case where I’d want to see the original question. If the question is “Should teens be home by 1 am?” then I would definitely say Yes. Really, people of any age should be asleep at that time.

The catch is that the poll’s authors - or a writer for the paper - might interpret that response as “favors a curfew” even though that’s not what I meant by my response.

Had the question been worded “Should we pass a law prohibiting teens from being out at 1 am?” then I would answer No. Just because being home asleep is a good idea doesn’t mean we should make it a law.

Yeah, I’m pretty sure I’m not that scary. These were conversations with peers, not me bellowing at subordinates or something.

Dracoi, I don’t know how the poll was worded. I tried to google it myself, but I’m no good at it. I will have to wait until I speak with him about it again.