Sure, there have always been polls about presidential elections. But over the past year or so, it seems the media cannot report on any social or political issue without observing that x% of whoever feel one way or the other about it. NPR is especially reliant on polls. What exact news value are such polls supposed to have for the listener/reader?
I tend to be dubious of group opinion as expressed in polls, as it is easier to claim a preference than to act consistently with that purported preference. I think it would be preferable for the media to report on people’s actions, explaining what that says about their preferences, opportunities, etc. I think increased polling about all manner of topics is not necessarily a good thing.
George Gallup made the opinion poll news, and he started almost a century ago.
By 1944, he had established polling organizations in Canada, Australia and Sweden. All were syndicated by newspapers, as was his work in this country, beginning in 1935. ‘‘I’ve always thought of ourselves as simply a fact-finding organization,’’ he said in an interview with a researcher last year [1983]. ‘‘It’s our job to get the facts, and what people do about them is something else.’’
He polled about everything, not just politics, and I expect that the kind of daily horserace presidential polls we see today would have made no sense back when candidates weren’t set until after the conventions and the campaign started after Labor Day. He also had far less competition. When a hundred organizations do polling, they want their name out there all the time, and political polls seem to have a bottomless (or perhaps topless) appeal, even if they are less revealing.
That’s easy to explain. It started when judges stopped using facts, law, and precedents to base their decisions and started ruling according to their personal preferences and twisted the facts, law, and precedents to suit their own preferences. So it became important to know who appointed them.
Yeah, from my youth I remember Presidential approval polls - and the name Gallup. It just seems that the reporting of polls on so many issues was not previously as common as it is now. When they talked of public opinion for/against the Viet Nam War, I don’t recall it being presented in terms of polls, like a horse race. But perhaps I am misremembering.
So it goes back to when John Marshall was appointed (Marshall ruled on the legality of actions that he had been intimately involved in before his appointment to the Supreme Court, with a ruling that put Jefferson in a political headlock).
The Gallup and Harris Polls were already well-established and well-known public opinion polls, particularly for politics and presidential campaigns, by the time I was studying market research in college in the 1980s.
Hypothetically, citing numbers from a survey or poll provides a news story with context and quantitative support for what it’s reporting. If a story says, “most people aren’t happy with the economy right now,” and doesn’t back it up with a cite (like we ask people to do here on the SDMB), it can come across as just opinion or speculation on the part of the reporter or the network.
That said, as has been discussed many many times here in recent years, polling and market research are dealing with a lot of systemic/methodological issues, compared to decades ago, and their results may not always be as reliable as they would like to have people believe.
The plummeting number of landlines has upended the industry, true. But each of the major polling companies has developed its own techniques to cope with the new world. And their results usually fall within the same margin of difference from one another as they used to.
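For context on what those margins mean, here is the textbook 95% margin-of-error formula for a simple random sample (a generic illustration with made-up numbers, not any particular pollster’s method; real polls layer weighting and other adjustments on top of this):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an observed proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical poll of 1,000 respondents reporting 50% support:
moe = margin_of_error(0.50, 1000)
print(f"±{moe:.1%}")  # about ±3.1 percentage points
```

So two well-run polls of the same race can legitimately differ by several points from sampling error alone, before any methodological differences come into play.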
So either they fudge their figures after seeing a competitor or the various techniques approach the “correct” percentages from both sides.
The issue is that they all struggle, more than they used to, with getting truly representative samples of their populations. Again, it’s not just limited to political polls; all market research suffers from this.
There is an increasing percentage of Americans who simply refuse to participate in any such polling/research, and “non-response bias” is a continuing issue with generating quality polling results – the pollsters simply don’t know if the types of people who refuse to participate are demographically and attitudinally the same as the people who do respond.
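To make the non-response problem concrete, here is a small illustrative calculation (hypothetical numbers, invented for this sketch): if the people who refuse to answer differ systematically from those who respond, the reported percentage drifts away from the true one no matter how large the sample gets.

```python
# Non-response bias sketch with hypothetical numbers: suppose 60% of the
# population truly supports some measure, but supporters respond to the
# poll only 20% of the time while opponents respond 40% of the time.

def polled_support(true_support, response_rate_yes, response_rate_no):
    """Expected share of 'yes' answers among those who actually respond."""
    yes_respondents = true_support * response_rate_yes
    no_respondents = (1 - true_support) * response_rate_no
    return yes_respondents / (yes_respondents + no_respondents)

estimate = polled_support(0.60, 0.20, 0.40)
print(f"true support: 60%, polled estimate: {estimate:.0%}")  # prints 43%
```

Note that sample size never appears in the formula: collecting more responses under the same lopsided response rates just gives a more precise estimate of the wrong number. That is why pollsters worry so much about whether refusers resemble responders.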
As this article, from a market research industry publication, notes:
Noting the forum we are in, I am treating this as an opinion, and will just note that it has not been my experience with NPR. I listen to an NPR local outlet (KQED) and read their website news stories, and I at least have not noticed any particular “reliance” on polls.
Counter that with the number of times one is asked to fill out a “survey” after conducting business with anyone, where the survey seems mainly geared at the business patting itself on the back and/or something with which to club employees if they don’t score 10s on everything. Or the political polls that are clearly designed to promote one person (I’m looking at you, Governor Newsom, and all the “California Polls” I see where you are the intended beneficiary). So yes, I don’t do polls any more. The ones I see are not objectively phrased or constructed.
IMHO one of the issues with polls, or at least their reporting, is the infamous “is the country going in the right direction” question. That seems to me to be a worthless question, since the “wrong direction” people probably vary considerably in what they would judge to be the right direction, yet they get reported on as if they were a monolithic group.
Sure, I didn’t do any quantitative analysis. And since I pretty much only listen to NPR, I cannot compare it to other purveyors.
My use of the word “especially” was intended to refer to how frequently NPR itself seems to cite polls. Not comparative as in “NPR cites polls especially more than other outlets.”
All I have to go on is my impression that when I hear the NPR news on the hour/half-hour, a tremendous amount of the time when they report a policy issue, whether local or national, they seem to include, “x% of people like this.” I do not recall this being as frequent in the past, outside of upcoming elections and presidential approval ratings.
Thanks for the informative responses. I think my preference would be that they do less reporting of what “most people [are/aren’t] happy with.” Instead, I’d prefer that they report what the economy is doing, and I’ll decide whether that makes me happy or not. Several reasons. If I feel strongly about something, the fact that 48% or 52% of people surveyed agree with me is not likely to change my mind. As I mentioned upthread, I am dubious about people expressing opinions in polls, and whether their expressed views reflect their actions or their private feelings. Moreover, polling is quite challenging, requiring assessment of the specific poll questions, sampling, methodology, etc. Rarely do news stories provide that info.
Reporting polls and what people claim they like/dislike impresses me as lazy reporting. In the same way, I’m not interested in stupid stories about Joe/Jane Public and how they feel about something. Give me the freaking data, have experts help put it in context, and I’ll form my own opinion, thankyouverymuch. But I know I am unusual in that respect. I understand a lot of people find such personalized storytelling meaningful.