In any sort of survey research (be it political polling or consumer research), we always assume that a small number of respondents aren’t giving truthful / real answers, for whatever reason.
It’s why we make sure to have a sufficiently large sample (usually at least a few hundred respondents), and why the results are presented with a “margin of error”.
The fact that a significant number of people no longer have landlines (and even more use caller ID to screen out pollsters) is a much bigger issue for political pollsters, and for the reliability of the data they generate, and it’s an issue for which they have not yet come up with a great solution.
Bearing in mind that my area of expertise is in consumer research and advertising testing, not necessarily political polling, and that I live in the U.S., not England or Europe…
I didn’t pay super-close attention to the Brexit polls, though I do listen to and watch the BBC news fairly regularly. The sense I’d gotten in the weeks leading up to the referendum was that it really was going to be a close vote (which turned out to be true).
Looking at the Financial Times “poll of polls” page that pulykamell linked to earlier in this thread, it looks like the “Leave” numbers had trended up slightly over the past 6 weeks, while “Remain” stayed pretty flat. Seeing that suggests that Leave might have had some growing momentum in the weeks leading up to the vote.
But, further, most polls, even good polls (i.e., statistically rigorous ones), have a margin of error of +/- 3 to 5%. If polls are suggesting a 2-point margin of victory (even if many of them point the same way), that’s still likely within the margin of error. When I see polls that tight, I wouldn’t be surprised to see the actual vote go either way.
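For anyone curious where those "+/- 3 to 5%" figures come from, here's a minimal sketch of the standard margin-of-error formula for a simple random sample (the function name and the 95% confidence z-value of 1.96 are the conventional textbook choices, not anything specific to the Brexit polls):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person poll showing a roughly 50/50 split:
moe = margin_of_error(0.5, 1000)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3.1 points
```

So a 52-48 result from a 1,000-person sample sits inside the error band, which is exactly why a 2-point polling lead shouldn't be read as a confident prediction. (Real-world polls have additional error from non-response and weighting, so the true uncertainty is usually larger than this formula suggests.)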
Finally, as has been suggested, a lot of people associated the “Leave” side with xenophobia and racism. Even if a “Leave” proponent had reasons for supporting that side that had nothing to do with immigration, some poll respondents may not have wanted to admit where their vote was going to go.
I heard that many people made ‘protest votes’ for leaving the EU, thinking that the referendum would not pass. They could make their protest, but still be safe in the knowledge that Britain would remain in the EU. I think these voters would answer poll questions differently from how they voted. Add to that the xenophobe vote and the people who believed the bald-faced financial lies, and Robert’s your auntie’s live-in lover.
With all respect to the OP’s request not to digress on the bigotry issue, the fact that the Leave side had become associated with xenophobia may well have skewed the poll results and hence it has a direct impact on the question the OP is asking. All the more ironic, therefore, that someone is presenting a poll as evidence that the xenophobic attitudes to immigration were not the dominant factor!
Furthermore, ISTM that the Lord Ashcroft poll cited was very badly worded. In being asked what was most important to their decision, respondents got one choice that was a very narrowly worded single-issue statement about immigration that reeked of xenophobia, while the other was a generic motherhood-and-apple-pie statement about British autonomy that nobody could possibly disagree with, and that could be construed to also encompass concerns about immigration, the changing demographic landscape, and traditional values – all the things that xenophobes are always ranting about, but respectably stated in dog-whistle terms. And of course that would also be the choice of non-bigots, so at best, the Ashcroft poll tells us nothing meaningful about why voters really voted the way they did.

I think a more meaningful analysis was the one provided by John Cassidy at the New Yorker, who noted, among other things, that “the Leave side went up in the polls after it managed to shift the debate away from the likely [negative] economic impact of Brexit and onto immigration”.