Nate Silver is systematically building a case for declaring pollster Strategic Vision, LLC a fraud

“Not an outright fraud”?

A sketchy pollster that operates out of a series of UPS mailbox addresses and has a high school poll with absurd results that are fairly obviously completely made up.

I’m not a statistics maven, but the results of that HS survey are so insane I’d like to know how you can characterize them as anything other than fraud.

I thought that type of accusation was just noise. It’s fairly common for companies to use UPS boxes as offices, or “virtual” offices, which consist of some outfit hired to answer the phone (for a host of companies using the same service). I’m hardly surprised that a PR firm does this - that’s probably exactly what they’re recommending to half their clients. It has zero bearing on the reliability of their polls.

I addressed this above. I suspect that they counted spelling mistakes and the like as incorrect answers. These were not multiple choice questions.

That said, bear in mind that other surveys have also shown very little knowledge of civics among HS kids. E.g. this survey of California HS kids which shows that

And these are things constantly in the news. The SV results were more egregious, but not so much more egregious that they are obviously completely made up.

This doesn’t seem to be the case. In the full results the totals seem to add up to 100. For example, the totals for the 2 political parties question were 43% Democrat and Republican, 11% Communist and Republican, 46% Don’t Know. Unless they counted everything but the first two as “Don’t Know” then it had to be a multiple choice (and why those were the choices will be left as an exercise for the reader).

Similarly for the Supreme Court justices question - nobody guessed 7 or 11, but all other choices from 5-12 were represented? Seems unlikely unless it was multiple choice.

Finally, notice that in your California poll the students “averaged a little over 60% correct on the commonly used survey items”. The average in the OK poll was 27.2%. So, either California students are more than three times as smart as OK students (I’d believe twice as smart…) or there is something dramatically wrong with this SV poll.

I assume that they showed the categories that had most responses and included the others with “don’t know”.

The descriptions of the survey say the questions are taken from the USCIS citizenship test, and the results are compared to the results of that test. AFAIK the USCIS citizenship test is not multiple choice. ETA: I see the OCPA article explicitly says it was not MC

The survey items are not directly comparable. I would say the California test is a lot easier, requiring name recognition and basic understanding versus recognition of terms like “executive branch” and relatively irrelevant things like length of senate terms or number of justices in the OK one.

[BTW, while I would guess that your civic knowledge exceeds both CA & OK students, I’m not so sure of your math … :)]

They are obviously completely made up (or at least grossly misrepresented) if you look at the raw answers. As you said, these are not multiple choice questions, and they were asked of 1000 students. Keep that in mind and take a look at this report.

  1. On the question, “Who wrote the Declaration of Independence?” seventy people said “Barack Obama” and twenty said, “Michael Jackson.” Would twenty different students independently pick “Michael Jackson” as a joke answer on an open-ended non-MC test? That doesn’t pass the laugh test.

  2. On the question, “We elect a US Senator for how many years?” 34% answered “don’t know” and 11% gave the correct answer of six years. The remaining students answered either ten, four, or two years. Not one of these students answered an odd number, and not one of them gave a crazy joke answer. What are the chances of that?

  3. On the question, “Who was the first president of the United States?” fully 140 students answered either “Barack Obama,” “George W. Bush,” or “Richard Nixon.” Other students gave presidential answers; oddly, none answered “Michael Jackson.” Again, doesn’t pass the smell test.

  4. The kicker: On the question “What are the two major political parties in the United States?” there were only three answers given: “Democrat & Republican,” “Don’t know,” and “Communist & Republican,” the last by fully 110 students. Does it seem plausible that, on a non-multiple-choice test, 11% of the respondents got exactly the same wrong answer, and all other students either got the right answer or admitted not knowing?

I’m surprised anyone with access to the raw data would think this is a legitimate poll.
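To put rough numbers on the “what are the chances” question about the Senate-term answers, here’s a toy back-of-the-envelope model. The assumptions are entirely mine, not anything from the poll: that the 55% of students who guessed a number (everyone but the 34% “don’t know” and 11% correct) guessed independently, and that each had some small chance q of naming an odd number of years.

```python
# Toy model (my own assumptions, not the poll's): ~550 of the 1000 students
# guessed a number of years.  If each guesser independently had even a small
# chance q of picking an odd number, the probability that not one of them
# did is (1 - q) ** 550 -- which collapses toward zero very fast.
n_guessers = 550  # 1000 students minus 34% "don't know" minus 11% correct

for q in (0.01, 0.05, 0.10):
    p_no_odd = (1 - q) ** n_guessers
    print(f"q = {q:.2f}: P(zero odd answers among {n_guessers}) = {p_no_odd:.3g}")
```

Even at q = 1%, zero odd answers out of 550 guesses is already well under a 1-in-100 event; at q = 5% it is essentially impossible. The model is crude, but it shows why “not one odd number” is the kind of detail that raises eyebrows.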

A dangerous assumption. From the previous link,

“I don’t know” is a very specifically defined answer. Lumping other answers in with it would be misleading at best. Interestingly, the Arizona test commissioned by the Goldwater Institute and performed by SV is reported by the Goldwater Institute using (for some questions) an “other” category. Jas09’s observation on the Supreme Court question is still apparent, albeit with revised numbers, as is mine on the Senate terms. I wonder, more than a little, what the “other” category represents on the political party question.

I don’t assume it was a joke answer. That poll was taken very shortly after MJ’s death, when he was all over the news and being lionized constantly.

Length of Senate terms doesn’t lend itself to jokes. I’m speculating that some of the smaller answers were grouped with “don’t know”.

I think people knew enough about MJ to know that he hadn’t been a president. But they knew that he wrote things.

It does look weird. But OK is a very conservative state, and I wonder if some people there associate Democrats with Commies and confuse the kids. Prior comments about “don’t know” apply.

Hah! And with a BS and MS in Electrical Engineering (with a minor in math) you’d think I’d avoid those kinds of mistakes. :mad:

I guess it’s plausible that they lumped everything that didn’t match the top few vote-getters with “Don’t Know” but that seems like a really stupid way of doing it. Especially if “Don’t Know” is a response explicitly given. “Other” seems a much more rational way of doing it.

Either way, the results of the OK survey make absolutely no sense. It’s just not plausible to me that not one of 1000 high-school students got even 8 of those questions correct.

The only thing that borders on a possible explanation is that the students just made shit up as a laugh.

And I’m fairly certain that’s incorrect, and certainly certain it’s incorrect in the Arizona poll. And “one million” is a perfectly appropriate joke answer.

Just how laughable do the results have to be before you’re willing to think something might not be kosher? “They knew Michael Jackson wrote things”? Really? “Some people in Oklahoma associate Democrats with Commies and confuse the kids?” Really? Enough so that fully eleven percent of the students actually believe that? That’s an extraordinary claim. Do you actually believe that, or are you speculating in a “well, if the stars aligned and maybe these other things happened, then if you cocked your head just right these numbers could have been produced without resorting to outright fraud” kind of way?

The reported result that 75% of high school students in Oklahoma don’t know that George Washington was the first president strikes me as implausible. Not implausible enough to disregard the whole poll, but enough to take a good hard look at it. And when you take a good hard look at it, a lot of other things look surprising. And implausible. And laughable. At some point you have to say, “yeah, something’s just not right here.” It’s certainly possible this isn’t outright fraud, but I can’t think of a more likely explanation.

It’s certainly possible that it’s outright fraud, but I still think shoddy is more likely.

I know if I were someone who had paid for one of these surveys I would be asking them for backup data around now.

Even if you grant that, you still have to address the fact that, according to their results, all students in Oklahoma are exactly the same. If they were counting off for spelling, or whatever, then the kids who were better spellers would be expected to do better than the kids who can’t spell. But if some groups of students did better than others, then you would have seen a different set of results on that test.

They checked for spelling mistakes and counted them as incorrect for a phone survey? That’s a pretty large leap.

There’s one non-fraudulent way to always get percentages that add up to 100%. Ask exactly 100 people. Of course, that results in a 10% margin of error…

Or 1000 people, if you then report results to a tenth of a percent.
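For what it’s worth, the “10% margin of error” figure follows from the standard 95%-confidence formula for a sampled proportion, z·√(p(1−p)/n) with p = 0.5 as the worst case. A quick sketch (textbook formula, nothing specific to the SV polls):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# n = 100 gives the ~10% figure; n = 1000 tightens it to about 3%.
print(f"n = 100:  +/- {margin_of_error(100) * 100:.1f}%")
print(f"n = 1000: +/- {margin_of_error(1000) * 100:.1f}%")
```

So asking exactly 100 (or 1000) people does make every reported percentage a whole number (or a clean tenth), but only at the cost of the sampling error the formula gives you.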

Here’s yet another article questioning SV, this time in regard to the OK student poll.

Several other articles at FiveThirtyEight on this since the last post.

What’s really interesting is that, according to Silver, Strategic Vision has not published any new poll results since he called them out.