As a general rule, a single poll means very little, if anything. Several polls taken together are a bit more reliable. Trend lines from a series of polls are the best indicator.
Single polls have a variety of known problems. A 1000-person representative sample will give a statistically valid reading of a population, but it has become notoriously difficult to find people willing to talk to pollsters. Some firms need to make 8000 calls to find 1000 people who will answer. We don't know exactly how those who will answer differ from those who won't, but we do know that the 1000 normally do not match the known divisions in census data, so each subgroup has to be weighted to produce the right percentages. We also know that the exact wording of a question plays a major role in how it is answered.
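To make that weighting step concrete, here is a minimal sketch in Python. The age groups and percentages are invented for illustration; real pollsters weight on several variables at once, but the arithmetic is the same basic idea.

```python
# Hypothetical census shares of the population, by age group.
census_share = {"18-29": 0.21, "30-44": 0.25, "45-64": 0.34, "65+": 0.20}

# Hypothetical shares of the same groups among the 1000 people who answered.
# Younger respondents are under-represented, older ones over-represented.
sample_share = {"18-29": 0.12, "30-44": 0.22, "45-64": 0.36, "65+": 0.30}

# Each respondent in a group gets weight = census share / sample share,
# so the weighted sample reproduces the census percentages.
weights = {g: census_share[g] / sample_share[g] for g in census_share}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
# 18-29 respondents count for about 1.75 people each; 65+ for about 0.67 each.
```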
Even with these problems, polls from a number of firms taken at the same time can be combined to create averages that appear to yield better results, as people like Nate Silver showed during the election. If one poll undercounts Democrats, another might undercount Republicans, and the two combined will give a more representative number. There's no guarantee here, but the technique has proven itself in practice. And while variant wordings can produce hugely different results, good polls use fairly neutral wordings that home in on an opinion. Polling firms also now release their methodology, full questions, and results, so outside analysts can examine them and try to judge how reliable the methods are. Silver discarded a few firms' polls because they fell below the line he drew; he kept some that others considered faulty. The end result was quite good, though drawing the line differently might have done better.
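Here is a rough sketch of that kind of averaging in Python. The firms and numbers are hypothetical, and serious aggregators like Silver's adjust for each firm's track record and house effects rather than just weighting by sample size, but the core calculation looks like this.

```python
# Several polls taken around the same time, combined into one estimate,
# weighting each poll by its sample size so larger polls count for more.
polls = [
    # (firm, sample size, percent approving) -- made-up numbers
    ("Firm A", 1000, 47.0),
    ("Firm B", 800, 52.0),
    ("Firm C", 1200, 49.0),
]

total_n = sum(n for _, n, _ in polls)
average = sum(n * pct for _, n, pct in polls) / total_n
print(f"Sample-size-weighted average: {average:.1f}%")  # ~49.1%
```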
Trend lines are the best indicators because, for most issues outside elections, it really doesn't matter whether the population is 48% or 51% for or against something. It does matter whether that issue is winning greater approval over time, or losing it. And combining polls over time has many of the same advantages as combining many polls taken at the same time.
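As a small illustration, the trend-line idea can be expressed as a rolling average over a series of polls. The months and numbers below are invented; the point is only that the smoothed series shows the direction of movement even when single readings bounce around.

```python
# Hypothetical monthly readings on the same question, smoothed with a
# three-poll rolling average so the trend stands out.
readings = [
    ("Jan", 46), ("Feb", 49), ("Mar", 45), ("Apr", 50),
    ("May", 51), ("Jun", 52), ("Jul", 51), ("Aug", 54),
]

window = 3  # average each poll with its two predecessors
for i in range(window - 1, len(readings)):
    month = readings[i][0]
    avg = sum(pct for _, pct in readings[i - window + 1 : i + 1]) / window
    print(f"{month}: {avg:.1f}%")
# The smoothed series rises steadily even though individual readings dip.
```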
You tend to hear about polls only when something newsworthy makes reference to them. The major pollsters poll continuously, though, literally every week. And they ask the same questions, with the same wording, year after year, so that a good picture of public opinion emerges. Except for some silly stunt polls, every major policy and social issue has dozens of firms asking questions about it all the time. There's never a shortage of ways to check opinions.
By a weird coincidence, there's a letter to the editor in this morning's Rochester Democrat & Chronicle from someone at the Pew Research Center. She was correcting an earlier letter writer who took a Pew survey result out of context. She wrote:
Polls do usually pick up general attitudes well, and changes in attitudes even better. But they are also good at showing that attitudes are not always rational and precise. That same letter noted that "18 percent say that abortion is immoral and say that Roe v. Wade should not be completely overturned." That's a more important finding than the raw numbers alone would be.
The short version of all this is: never trust a poll; only trust polls.