I was reading an article about a pollster who’s predicting a Trump win. His credibility is based on the fact that he predicted Trump’s victory in 2016.
He explains that his predictions take “shy voters” into account. He says these are people who will vote for a candidate but who won’t admit how they are voting outside of the voting booth. So pre-election polls mark these people down as “undecided” based on how they answer surveys.
I agree that shy voters exist. I just question this guy’s claims that he can make predictions about them. How are you supposed to make an accurate assessment of the impact of people who are refusing to give you accurate data?
How does this guy know that there are people who plan on voting for Trump but won’t say so in a poll? How does he know that there aren’t shy voters who plan on voting for Biden? How can this guy tell the difference between a shy voter who’s pretending to be undecided and a voter who genuinely is undecided?
It seems to me that the only way you can determine how many shy voters there are is to form a pre-election model based on polls and then compare it to the outcome of the actual election. A significant difference between the polls and the actual outcome is a sign of shy voters. But the point is that you can’t measure shy voters until you have the data from the election — at which point it’s no longer a prediction.
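To make that concrete, here’s a tiny Python sketch (all numbers hypothetical) of the only calculation that can actually reveal shy voters: comparing the final poll average to the realized result — which, by definition, is only available after the election.

```python
# Hypothetical final poll average and election result for two candidates.
poll_avg = {"A": 0.46, "B": 0.48, "undecided": 0.06}
actual = {"A": 0.51, "B": 0.49}

# The "shy voter" signal is the gap between polled and realized support.
gap_A = actual["A"] - poll_avg["A"]  # A beat the polls by 5 points
gap_B = actual["B"] - poll_avg["B"]  # B beat the polls by 1 point

# If A's overperformance exceeds any plausible break of the undecideds,
# that is retrospective evidence of shy A voters -- but the calculation
# requires `actual`, which is exactly the point above.
print(f"A overperformed by {gap_A:+.2f}, B by {gap_B:+.2f}")
```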
If you go beyond just “who do you plan to vote for” and ask a lot of other questions about which policy issues are important to people, there’s probably some predictive power there about whether someone is lying. I suspect that most “shy” voters don’t have some sophisticated master plan prepared to lie consistently about their opinions on all the issues. But it would surely be difficult to predict with any accuracy.
But, of course, the idea that he has “credibility” from one correct prediction is nonsense.
So, like, what are good, solid reasons to assume there will be more shy Trump voters than shy Biden voters, or that the former will make more of a difference than the latter? Most discussions I’ve seen seem to take this for granted, but most of the reasons I can think of are basically just assumptions and best guesses based on personal pop psychology. Is there anything better to go on?
Ve statisticians haf vays to find out ze sings ve vant to know.
I think one way would be to take a sample from your target population and ask them who they will vote for. Then take a separate sample from the same population and ask them outright if they are “shy”. (Getting the question worded correctly is probably important.)
Even though the two samples consist of different people, you can probably apply some kind of statistical formula to the data to tease out the kind of information you want.
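For what it’s worth, statisticians do have a standard tool for exactly this kind of sensitive question: Warner’s randomized response technique. (To be clear, nothing suggests the pollster in the article uses this — it’s just an illustration of how you can get honest aggregate answers when no individual will answer honestly.) Each respondent privately randomizes which question they answer, so no single answer reveals anything, yet the population rate is recoverable:

```python
import random

def randomized_response_estimate(true_rate, p=0.7, n=100_000, seed=42):
    """Simulate Warner's randomized response.
    Each respondent secretly flips a biased coin (prob p of heads).
    Heads: answer "are you a shy voter?" truthfully.
    Tails: answer the opposite question ("are you NOT shy?").
    The interviewer only ever sees yes/no, never which question."""
    rng = random.Random(seed)
    yes = 0
    for _ in range(n):
        shy = rng.random() < true_rate
        heads = rng.random() < p
        yes += (shy if heads else not shy)
    p_yes = yes / n
    # Invert P(yes) = p*pi + (1-p)*(1-pi) to recover pi, the shy rate.
    return (p_yes - (1 - p)) / (2 * p - 1)

# If 10% of voters are truly "shy", the estimate lands near 0.10
# even though no respondent ever admitted anything directly.
est = randomized_response_estimate(true_rate=0.10)
print(f"estimated shy-voter rate: {est:.3f}")
```

The trick is that the deliberate noise protects each individual, while the known noise rate `p` lets you subtract it back out in aggregate.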
Mr Cahaly said voters in 2016 “didn’t want to admit they were voting for Trump”.
Similarly in the 2018 Florida governor’s race between Democrat Andrew Gillum and Republican Ron DeSantis, “people were saying they were for Gillum who had no intention of voting for Gillum”.
“The way you minimise the social desirability bias is you make the person feel the most anonymous,” Mr Cahaly said. “I’ve got to get past what you want to say in public and get to what you really feel, because what’s in your heart is what’s going to be on that ballot.”
Trafalgar Group uses a variety of different methods, including live calls, digital questionnaires, texts – which he says generate huge response rates from often difficult-to-reach younger voters – emails and a “proprietary digital platform” that Mr Cahaly won’t explain.
“And there’s different ways to do it when you do a live call, but really push the anonymous part – this is your anonymous say-so,” he said.
Mr Cahaly also points out another big problem with traditional polling – conservatives in general are less likely to participate.
“We see a five-to-one refusal rate among conservatives (versus progressives). You’ve got to work very hard to get a fair representation of conservatives when you do any kind of a survey,” he said.
To account for this, pollsters will give greater “weight” to a smaller Republican sample to make up for a shortfall in responses, but Mr Cahaly says one problem is that Republicans “who don’t like Trump can’t wait to answer a poll”.
“So immediately, within the 22 per cent, they’ve probably overrepresented it, the anti-Trump Republicans, the Never Trumper types,” he said. “Well, when you weight that up from 22 to 35, now you have skewed an already bad representation sample. So that’s kind of inherently how they can be so off.”
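Cahaly’s weighting argument can be sketched numerically. (Only the 22% and 35% figures come from the quote; the anti-Trump response rates below are made up purely for illustration.) Weighting a subsample up multiplies whatever bias it already contains:

```python
# From the quote: Republicans are 22% of respondents but get weighted
# up to 35% of the modeled electorate.
rep_sample_share = 0.22
rep_target_share = 0.35

# Hypothetical: anti-Trump Republicans answer polls eagerly, so they
# are 30% of *responding* Republicans vs. 10% of Republicans overall.
anti_trump_in_sample = 0.30
anti_trump_true = 0.10

# Each responding Republican is counted ~1.6 times.
weight = rep_target_share / rep_sample_share

# Share of the weighted electorate modeled as anti-Trump Republicans,
# versus the "true" share under these assumptions:
modeled = rep_target_share * anti_trump_in_sample  # 10.5%
true_val = rep_target_share * anti_trump_true      # 3.5%

# The same multiplier is applied to an already-skewed subsample,
# so the skew is scaled up along with it.
print(f"weight={weight:.2f}, modeled={modeled:.1%}, true={true_val:.1%}")
```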
He adds that all of the sampling issues are magnified in national polls.
“It’s easily skewable at that point,” he said. “You start making assumptions.”
Pretty sure I read the same article, and the pollster explained why he thought his methods were more accurate and how he took this into account, as well as acknowledging that there could be a swing of a couple more points that his method couldn’t account for. He explained pretty thoroughly (I thought) why, in his opinion, the “old style” of long polling encouraged shy voters to hide their true opinions or lie, and how even the polls that supposedly adjusted after 2016 still have blind spots.
Yeah, the pollster also said that if Bernie had been nominated he would have had shy voters, as moderate Dems wouldn’t want to admit to voting for someone seen as “socialist” or radical, but Biden is so moderate and mainstream there’s no shaming.
No, I don’t think shy voter theory is principally based on the opinions of those around you. It’s about the specific characteristics of Trump and his supporters.
(a) Some people voting for Trump have a “fuck the libs”, “burn everything down” mentality, just don’t want to cooperate with pollsters.
(b) Some people voting for Trump know full well that he’s despicable and are ashamed to admit who they are voting for, but will just always vote for any Republican however awful over a Democrat.
The UK has a similar concept, the shy Tory, and polls there have been reliably wrong since the early 90s. Frankly, polls have been so wrong in the UK that I’m surprised anyone takes them seriously at all, but I guess without them things get hard for the twenty-four-hour news cycle.
Based on the wikipedia article on the UK 2019 polls, it looks like the aggregate of the polls was pretty much spot-on. Why do you say they got it wrong?
Texas is pretty good evidence for the shy (Democratic) voter. For several cycles, Democrats have out-performed polling by 2-3% in that state.
I’d really like to see a study of polling psychology that was actually conducted by psychologists instead of pollsters (especially agenda-driven ones like Trafalgar). FWIW, FiveThirtyEight gives Trafalgar a grade of C-.