I read this article the day it came out and I haven’t stopped thinking about it.
I’m the kind of person who very much likes to engage people with whom I disagree, mostly because I enjoy challenging myself. I have seen my views change considerably over the years, and I have seen the views of other people I know change as well, so I know it’s possible.
Yet, studying international relations, I’ve read plenty about the ills of groupthink, confirmation bias, and crowd psychology to wonder how possible it is to actually get people to engage in meaningful debate and to change their minds. The article seems to reinforce that. Moreover, I’ve read enough by these guys to know that, at the very least, there is plenty of reason to believe that just giving people the facts does not mean they will act on them, no matter how credible those facts are.
I’m a huge fan of FactCheck and PolitiFact. I thought they were going to change the nature of debate in this country, but a lot of what I’ve been reading doesn’t support this belief. Judging from our current political debates, I certainly don’t think things are getting better.
I’m just curious what work is going on, and what good books or projects are showing promise, in creating a framework that discourages or nudges people away from their tendency to confirm, rather than challenge, what they already believe. What should someone who wants to work on this know, read, or watch? Or is it a lost cause? For example, the article mentions that shaming pundits and politicians who outright lie or sensationalize might be one possibility. How would that be done? (Ideally through the media, but much of the time they don’t seem particularly interested in that.)
I was talking about this to a French friend recently, and he said, “Is that just in the US?” I don’t know. I’m assuming that the same tendency exists everywhere, but I’m wondering if it’s more pronounced in different places. The article mentions the following:
Are places that tend to be collectively more “liberal” less likely to see these sorts of effects?
I’ve long believed that the vast majority of people first decide what they want to believe is true, then pick the “facts” that suit that belief. I admit that I do it too most of the time. Sometimes, when I really put some mental energy into it, I try to play the devil’s advocate (from my point of view) so I can really try to see the other side’s arguments with no bias. But it’s tough.
Well, it’s small scale, but the blue-eyes/brown-eyes experiment does seem to be a good cure for racism and sexism.
I think it was in Watzlawick’s book about the nature of reality where he mentioned a group-pressure experiment that he thought should be obligatory in all schools. The experiment goes like this:
A group is asked very simple, obvious math/geometry questions, such as which of two lines is longer (no optical tricks!). But one person keeps answering differently from the rest of the group, because the others were secretly instructed beforehand to answer the first question correctly and then always wrongly. The lone subject becomes increasingly uneasy; in a third to half of the cases he will switch to the group’s answer, even though the group is obviously wrong.
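The setup described above (the classic Asch line-judgment design) can be caricatured in a few lines of code. This is a purely illustrative Monte Carlo sketch; the per-round conformity probability is an assumed parameter picked to echo the roughly-one-third rate mentioned above, not real experimental data:

```python
import random

def asch_trial(n_critical=12, p_conform=0.37, rng=None):
    """Simulate one subject in an Asch-style session: on each 'critical'
    round the confederates give an obviously wrong answer, and the subject
    yields with probability p_conform (assumed, illustrative).
    Returns how many rounds the subject conformed on."""
    rng = rng or random.Random()
    return sum(rng.random() < p_conform for _ in range(n_critical))

def conformity_summary(n_subjects=1000, n_critical=12, seed=42):
    """Fraction of simulated subjects who ever conformed, plus the
    mean per-round conformity rate across all subjects."""
    rng = random.Random(seed)
    results = [asch_trial(n_critical, rng=rng) for _ in range(n_subjects)]
    ever = sum(r > 0 for r in results) / n_subjects
    per_round = sum(results) / (n_subjects * n_critical)
    return ever, per_round
```

One caveat worth noting: treating the rounds as independent coin flips makes almost every simulated subject conform at least once over twelve rounds, which overstates the real experiments, where subjects varied a lot (some never yielded, some yielded often). The sketch only shows the shape of the procedure, not the psychology.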
I think it’s not only about being liberal. Certainly, every country has its share of dumb people. But the US has this strong anti-intellectual bias, where intellectuals are those liberal elites in the ivory tower giving bad advice, while honest, hard-working people, the salt of the earth, have an instinctive grasp of everything.
Another factor is the lack of proper education: people believe that anything can be learned in a two-week course instead of a proper apprenticeship of two to three years. If things are simplified that much, not only do people lack background understanding, but the simplifying means dumbing things down to a black-and-white view instead of understanding how complicated and connected everything is.
Another factor is the strong authoritarian streak. There’s a good e-book somewhere (don’t have the link right now) that correlates conservatism with authoritarianism: the need to believe in a black/white, yes/no world instead of accepting complexity; the desire to hand over responsibility for decision-making to authority; the acceptance of whatever your pastor or pundit tells you…
Another factor is the media: you lack objective, truth-oriented public TV like the BBC or ARD and ZDF, which don’t care about ads and ratings but employ serious journalists and make long documentaries instead of sensationalism and celebrity gossip.
And the media over here don’t try to present “both sides.” They know that in an evolution/creationism debate, one side is obviously right and the other side is kooky, and they tell that to the viewers.
Another is the strong fundie current, where faith is more important than facts or intellect (often citing the “become like children” verse from the New Testament, ignoring the whole Jewish tradition of using your God-given brain to figure out good solutions to real-life problems).
Another factor is the extremely varied school system, not only from state to state but from city to city, with those children who most need exposure to other views and ways of thinking, the fundie ones, being able to opt out. This continues in the strange standards of colleges, which offer a lot of courses with few facts; see one of the recent threads on the abysmal knowledge of the professor of a summer course in college.
So: make high standards mandatory for all high schools and colleges, whether private or not, in all states, with enough funding to catch up the kids who fail tests; and distribute funds evenly, so that every school gets the minimum to operate, instead of one school handing out laptops to every kid while, 100 km away in the ghetto, the paint peels from the walls and there aren’t enough books for every pupil. Also, set standards for teachers, so you don’t have ignorant people teaching.
So I think lots of factors make this worse in the US than the normal human tendency of the brain.
But remember that in one of the Harry Potter books, Percy, who was proven wrong about Voldemort’s return, still hadn’t apologized to or made up with his family, and the wise old Dumbledore said it’s easier to forgive people for being wrong than for being right. So it’s a general human problem that needs to be combated with the right education (lots of effort and money) and the right attitude (that dumb people are dangerous, not heroes).
I often feel conflicted about this when I hear or read people uttering wrong or hateful opinions. On the one hand, the old maxim is “You can’t argue a person out of a position they didn’t argue themselves into.” (And with the teabaggers and dumb consies, they apparently have a deep need to feel like victims, because paranoia makes your own life more interesting and absolves you of responsibility. Can’t get a job? It’s because of those evil libruls’ affirmative action! Your life is boring? But you are a Christian praying to strengthen the spiritual fight going on all around you! Fred at Slacktivist has a lot of posts about the mindset of these bottom-dwellers, the hate and paranoia and lunacy.)
OTOH, because I know history, I feel the need to always stand up to hateful opinions, to let others know that this is wrong and won’t be tolerated, so it doesn’t happen again.
I also have a pet peeve regarding this whole “my opinion is based on facts, not feelings” thing. My opinion starts out with my personal ethics and philosophy, which is a matter of belief. Then I weigh the facts. So my opinion can change if new facts come in, depending on the underlying morals. E.g., I’m against the death penalty for ethical reasons, and the facts also support the view that it doesn’t work. But if the facts were different, I would still be against it.
For ethical reasons, I’m for social support and treatment of criminals. But which method works best depends on the facts. If new research shows that method B works better than method A, even though it sounds weird or counterintuitive at first, then I will favour method B (or combine both, if possible, to reach different groups and address different causes).
Sometimes I think that the problem of grey cats is caused by the moon (to make up an example), and then research figures out it’s caused by Jupiter, which means we need to change the method of dealing with it. But it’s my ethical philosophy that says we need to address the problem of grey cats in the first place.
Sometimes facts come in afterwards to validate my philosophy, like that preventing crime with youth clubs is cheaper than prisons, or that higher wages mean more spending and are better for the economy, but that’s not my first reason.
Isn’t it possible you have a pro-intellectual bias?
If the e-book was written by a liberal, isn’t that a possible source of bias?
You make a completely unwarranted assumption: that commercial pressures are the only sources of bias. And in any case, how do you prove the media you mention are “objective, truth-oriented”? Who guards them? And who guards those guards?
Orwell wrote, “One has to belong to the intelligentsia to believe things like that; no ordinary man would be such a fool.” (Notes on Nationalism, May 1945)
But remember, Voldemort was very smart and extremely knowledgeable. And you quote a fairy tale? Well, I’m convinced. :rolleyes:
Name-calling is SO effective. :rolleyes:
I don’t need that website to get a closer look at hate/paranoia/lunacy.
To answer the OP: I think the best way is to try to follow in your own life what Richard Feynman said about scientists (emphasis added):
Easy to say, hard to do (and I’m no better at it than anyone else). Impossible to do perfectly, but you must try as best you can. And once you’ve done that and continue to examine yourself, you’ll be better prepared to take on groupthink, confirmation bias etc. in others.
That’s C.S. Lewis (referring to J.B.S. Haldane). You may be of the opinion that he did not exercise this principle outside of his own scholarly field, and you may be right. And Aristotle made many errors of his own. But it sounds a bit like Feynman, doesn’t it?
I think the book in question is The Authoritarians by Bob Altemeyer. And yes, I’d definitely say it has a liberal bias: the treatment of right-wing authoritarians is a bit harsh and a bit heavy on the rhetoric, though the science (what little there is) seems good. Keep in mind my background is in sociology, not psychology, so I might be missing something.
I didn’t read it like that; it was simply one item in a list of possible factors. And considering that the BBC, ARD, and ZDF are non-commercial public media houses, presumably less inclined to yield to outside pressure than commercial enterprises, it has some validity. I think the point was about contrasting commercial news stations with non-commercial ones; in that sense, the difference in biases comes from funding pressures (ads and viewer shares).
For example, the BBC charter defines “The Public Purposes” of the BBC as:
a) sustaining citizenship and civil society
b) promoting education and learning
c) stimulating creativity and cultural excellence
d) representing the UK, its nations, regions and communities
And I presume the BBC Trust are the proverbial custodes who custodiet the custodes… then you could also ask who guards them, and who guards the people above them, but that’d be an infinite regression, and that way madness lies.
In essence the answer is to cultivate and habituate critical thinking via the education system.
It’s somewhat difficult to summarise concisely as a concept, so it’s worth reading the Wikipedia page as a jumping-off point, but Edward Glaser distilled it to the following:
Provision of these skill-sets should be one of the most important manifest aims of schooling from the outset, IMO. It’s widely recognised as a desired learning outcome at the undergraduate level, but it deserves greater emphasis during compulsory stages of education, since its implications are so far-reaching for the individual and society as a whole.
Good teaching practice already incorporates many of these elements, but (again IMO) it warrants adoption as a curriculum subject, in and of itself. This has fairly recently taken place in the UK, at the A level stage.
The reason I put it in GQ is that I was looking more for answers like “A group of psychologists did a study in which they grouped self-described conservatives and liberals in a room without them knowing who was who, then they [laid a certain groundrule] and found that it effectively reduced the likelihood that people would form groups with like-minded people,” or “The Danish government did a study in which they required [a certain check on decisions] and found that it reduced the ability of administration officials to assume the answers they wanted.”
Interesting. How would you describe a pro-intellectual bias?
I consider pro-intellectual bias to be equivalent to a bias toward rationality, which I would deny is a bias at all. That is, I would describe it not in terms of the people you listen to (intellectual as a noun) but in terms of the arguments that persuade you (intellectual as an adjective). Is that what you intended to describe, and if it is, would you consider it a problem?
I doubt that. People on the low end of the spectrum, who listen to radio propaganda or the authority of their pastor, have a psychological need to believe these things. It’s not only so simple that it’s hard to admit that you are wrong - normal adults have that tendency also, but with repeated and overwhelming evidence, can accept new information. Rational adults can even say “I accept the facts and reality, but I still have a different opinion on what we should do because …”
For the radicals, however, their personal self-image is bound to the paranoia and fear that some big (liberal) conspiracy is out there, often with a spiritual battle one level above, because that gives their lives meaning; to feeling like victims of said conspiracy/affirmative action/etc., because that absolves them of responsibility for taking their own lives in hand; to handing over the responsibility for decisions to their leaders, because it’s more comfortable to have somebody else do the heavy thinking; to feeling specially created and thus denying evolution as a threat to their status; and so on.
To make them accept the facts of reality, you would need to change all those emotional needs, and that would require therapy, which they can’t acknowledge they need.
Um, I think you are confusing two things: while sociologists and psychologists do study how to prevent people from coming to conclusions prematurely and then fitting their evidence to the conclusion, that is a different phenomenon from the one described above, a refusal to accept facts that contradict your worldview. For the specific problem of decision-making by considering all data, some counter-mechanisms exist, but they rely on having rational adults capable of acknowledging first that minds can play tricks and that counter-measures are therefore necessary. It’s like establishing a proper order and method for group discussion, so that shy people are heard too, or so that statements by women, despite social conditioning, are given the same consideration as statements from men. A good moderator and people without ego problems help a lot there.
To change someone’s mind they not only have to accept the data but they also have to accept the source. For example, many Dopers deride Fox News. That makes accepting truth from Fox that much more difficult for them.
Secondly, it can be very difficult to accept that there can be multiple correct angles and analyses. Consider three reports of the same incident: ‘Man kills woman’ vs ‘Husband kills wife’ vs ‘Man turns off wife’s life support’. All equally correct, but your response to each is very different. As another example, I know that while it won’t deliberately lie, the BBC is heavily biased on certain subjects, so I take reports on those subjects with knowledge of those biases in mind. (Contrariwise, I don’t know that Fox won’t deliberately lie.)
One good way to do this is to argue the other side. Take any controversial issue and list five good arguments for the other side. Honestly doing this will educate you about the issue and require you to see the other side as rational people. This is much more constructive than treating anyone who disagrees with you as an idiot. (I bet constance has never converted anyone.) Remember, though: most political opinions are signaling behavior, not rationally conceived.
I am a bit of a pessimist with regard to large groups of people revising misconceptions that are a bond between them. In my view public or group opinion usually changes not by the believers in an untenable position changing their mind, but by the cohort of believers of the untenable position leaving the discussion, being replaced by a newer cohort.
“The first principle is that you must not fool yourself–and you are the easiest person to fool.”
This is a principle that most people struggle with. We all tend to resist the idea that we can be fooled; only those other dumbasses can. The only thing I’ve seen really change people is a ton of education and time. Most people expect beliefs to change far too quickly, and to change in a public way.
I agree that being made to argue the opposite of your own views can work well, but few people can do it without a lot of practice. This is where formal debating can be a great educational tool.
The initial examples aren’t great, in my view, in that there may be different levels of adherence to each topic as a starting point.
This is standard advice on forming your own opinion. But the OP was about a different question: how to deal with people who don’t want to accept basic facts. Accepting facts and reality is the first step in forming an opinion.
Facts don’t have two sides. They can be interpreted differently, but they have to be acknowledged first.
Let’s take a simple example, nothing charged like evolution/ creationism or AGW. Let’s take chocolate, and three simple facts:
1. Chocolate contains a lot of calories and is therefore bad for your weight.
2. Chocolate contains a lot of sugar and is therefore bad for your teeth.
3. Chocolate cheers you up when you eat it.*
If person A and B agree on all three facts, they can still have different opinions on whether sad people should eat chocolate to cheer up or not, because they can assign different priorities to the facts, so A says “feeling better is more important than 1 and 2, because if the sad person feels better, they will take a walk and burn off the calories, and feeling well is more important”, while B will say “if they gain weight, they will feel even sadder, they should only take a walk and skip the chocolate”.
But if person C comes in and simply denies facts 1 and 2, he will come to the same opinion as A, but be an idiot for denying the facts. A and B can agree to disagree about their priorities, but C can’t.
*Many people believe that’s because of the serotonin it contains; newer studies link it to a psychological reaction to the sweetness instead. Regardless of the mechanism, it works.
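The structure of the chocolate disagreement can be sketched as a toy scoring model: shared facts, different personal weights. All names and numbers here are made up purely for illustration:

```python
# Shared facts about chocolate, scored as effects (negative = harm).
FACTS = {"weight": -1.0, "teeth": -1.0, "mood": +1.0}

def recommend(weights, facts=FACTS):
    """Weigh the same facts by personal priorities and decide."""
    score = sum(weights[k] * v for k, v in facts.items())
    return "eat chocolate" if score > 0 else "skip chocolate"

# A prioritizes mood; B prioritizes weight and teeth.
# Same facts, different weights, different (but both defensible) conclusions.
a = recommend({"weight": 0.5, "teeth": 0.5, "mood": 2.0})  # -> "eat chocolate"
b = recommend({"weight": 2.0, "teeth": 1.0, "mood": 0.5})  # -> "skip chocolate"

# C "denies" facts 1 and 2 by zeroing them out of the fact table.
# C reaches A's conclusion, but by rejecting the shared facts
# rather than by weighing priorities differently.
c = recommend({"weight": 1.0, "teeth": 1.0, "mood": 1.0},
              facts={"weight": 0.0, "teeth": 0.0, "mood": 1.0})
```

The point the model makes is the one in the post: A and B disagree only in the `weights` (a values question, open to legitimate debate), while C tampers with `facts` itself, which is a different kind of move entirely.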
Well, in my book, anybody who won’t accept basic, proven facts is an idiot. It’s one thing to be ignorant, that is, not knowing something; that can be remedied. But refusing to accept facts through stubbornness is being idiotic by intent.
You remind me of the Conservapedia entry about creationism: biologists have stopped arguing with creationists because creationists tend to win the debates. No, they don’t; it’s just that it’s impossible to argue with people who don’t want to accept facts.
So when my facts get ignored, that doesn’t make the other people right.
Which is a very bad sign. It’s one thing to base your opinions consciously on a system of ethics you have pondered and to consider the facts in light of that system. It’s quite another to react according to a gut feeling and then ignore contrary evidence. It makes reasonable political discourse impossible, because all you have is angry people stirred up by demagogues, against everything and without anything constructive. That’s bad for democracy and for a country dealing with complex problems.
You are getting the order wrong. It’s not that Dopers deride Fox News and therefore distrust it. It’s that Fox has been caught lying countless times, and that’s why Dopers distrust it. There are some media that lie so often that I wouldn’t trust them if they said the sky is blue.*
No, they are not equally correct. Each offers a different degree of specificity: “Man kills woman” applies to thousands of cases, “Husband kills wife” narrows it further, and “Man turns off wife’s life support” is the most specific.
Saying “all apple trees are trees” does not mean that “all trees are apple trees” is also correct.
I was going to ask you for more specifics, because the accusation of bias against the BBC is a serious one, close to slander. But since you have already blamed Dopers for mistrusting Fox, I think that would be useless.
Huh? The evidence already there that Fox lies isn’t enough for you?
Note for the nitpickers: yes, I know the sky is black and only appears blue because of the scattering of sunlight in the atmosphere.