No, he is calling your misinterpretation of Trenberth (and your quoting of him out of context) nonsense. Let’s take what Trenberth says and apply it to something less controversial, like the seasonal cycle.
What Trenberth is effectively saying is that if you run the models without carefully initializing them to current conditions, you are going to have a very hard time predicting whether this winter is going to be colder or warmer than average here in Rochester. (In fact, as it turns out, even if you do try to initialize them, making these predictions is a difficult task because they are quite sensitive to any errors in the initial conditions.)
What you seem to think he is saying is analogous to the claim that if you run the models without carefully initializing them to current conditions, you have no clue whether this winter in Rochester will be colder or warmer than the summer…or even roughly how much colder or warmer than the summer it will be.
That is a big distinction: There are things that are sensitive to the initial conditions and there are things that are not. By failing to understand distinctions such as this, you are making a complete mess of understanding what Trenberth is saying.
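If it helps, here is a minimal toy sketch of that distinction. It is not a climate model, just the chaotic logistic map, and every number in it is an illustrative choice I picked for this post: two runs that start almost identically quickly stop agreeing step by step (the initial-condition-sensitive “weather” part), yet their long-run averages come out nearly the same (the “climate” part).

[CODE]
# A toy illustration only (not a climate model): the logistic map with r = 4
# is chaotic, so a tiny error in the starting value ruins step-by-step
# prediction, but the long-run statistics of the two runs barely differ.

def logistic_trajectory(x0, steps, r=4.0):
    """Iterate x -> r * x * (1 - x) and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3, 1000)
b = logistic_trajectory(0.3 + 1e-10, 1000)  # tiny initial-condition error

# Step-by-step agreement is lost after a few dozen iterations ("weather")...
diverged = next((i for i, (x, y) in enumerate(zip(a, b)) if abs(x - y) > 0.1), None)
print("Runs disagree noticeably by step:", diverged)

# ...but the long-run means are nearly identical ("climate").
print("Long-run mean, run a:", sum(a[100:]) / len(a[100:]))
print("Long-run mean, run b:", sum(b[100:]) / len(b[100:]))
[/CODE]

“What will the value be at step 500?” and “what will the average over many steps look like?” are different kinds of questions, and that is exactly the distinction Trenberth is drawing.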
[There are a couple other points in Trenberth’s piece, such as the fact that the models make projections rather than predictions because we are talking about “What if…?” scenarios. A climate model can’t predict how much coal humans are going to burn, so we have to come up with various scenarios of how our societies evolve in the use of energy, etc., and then see what the climate models project will happen given these assumptions about our future greenhouse gas emissions.]
I wonder if that will change some minds at last then.
Other studies I have seen point to political affiliation as the most likely reason why the facts will not work; there is an ideology behind the denial.
And that shows another problem with the article*: it goes once again for the idea that the denial is based on religion. As usual, that is the accusation most used by proponents of woo when they cannot get the experts and scientists to agree with their woo.
The first problem was failing to notice the ideological reasons that are a big part of the denial.
I don’t know who the hell Vox Day even is. It’s the quote “Before some audiences not even the possession of the exactest knowledge will make it easy for what we say to produce conviction. For argument based on knowledge implies instruction, and there are people whom one cannot instruct.” that got my attention. (I also didn’t know what the hell an SJW was either.)
Which has nothing to do with what Aristotle wrote about. Or maybe it does.
Because here is another case where facts don’t matter: because the person who brought the information up is a bad man, we should ignore what Aristotle wrote thousands of years ago.
I’m fine with Aristotle. I just thought you might want to know that the guy you were quoting was a white supremacist. No big deal – use that information however you’d like.
I didn’t really care about him; it was his discussion of people on whom facts will make no impression that interested me. That he seems to be one of those people himself, now that is ironic.
Very interesting and prescient quotes from a wise old philosopher. It’s almost like Aristotle could see into the 21st century. Here are two of his observations from the above quote, followed by corresponding extracts from the cover story of New Scientist, 31 October 2011, titled “Science in America: Selling the truth”. The parallels are quite amazing, highlighted for convenient comparison.
[QUOTE=Aristotle]
… before some audiences not even the possession of the exactest knowledge will make it easy for what we say to produce conviction. For argument based on knowledge implies instruction, and there are people whom one cannot instruct.
[/QUOTE]
In March [2011], an impressive array of climate scientists [addressed Congress at the invitation of John Holdren], but their efforts seemed only to inflame the scepticism of Republicans opposed to regulation of emissions. For researchers who study how people form their opinions, and how we are influenced by the messages we receive, it was all too predictable. Holdren’s prescription was a classic example of the “deficit model” of science communication, which assumes that mistrust of unwelcome scientific findings stems from a lack of knowledge. Ergo, if you provide more facts, scepticism should melt away. This approach appeals to people trained to treat evidence as the ultimate arbiter of truth. The problem is that in many cases, it just doesn’t work. Perversely, just giving people more information can sometimes polarise views and cause sceptics to harden their line.
[QUOTE=Aristotle]
Here, then, we must use, as our modes of persuasion and argument, notions possessed by everybody, as we observed in the Topics when dealing with the way to handle a popular audience.
[/QUOTE]
These findings suggest that one way to change people’s minds is to find someone they identify with to argue the case … The appeal of this story to those on the political right illustrates another key finding: how the message is framed in relation to the cultural biases of the intended recipient is crucial to its persuasiveness.
Considering the new evidence from MRI studies, which show the brain effectively shutting off the relevant circuits, so that the facts never even reach the part of the brain that could weigh them, while the anger centers light up instead, his insights into human nature actually are backed up by science.
Here’s a bunch of links for anyone willing to consider facts in this matter. None of them will matter to some people.
This last one has a phrase that is amazingly insightful.
Just knowing that your brain is actively preventing you from seeing or hearing facts you don’t like does nothing to stop this from happening.
Of course, as noted in my first quote above, the persuasiveness of an evidence-based approach tends to be limited to those who understand it, and “… to people trained to treat evidence as the ultimate arbiter of truth.” Not so much for those who lack the education or scientific grounding to assess evidence, or for those financially motivated to reject facts that threaten their interests, like those in my example and the powerful constituency they represent.
It’s often more instructive and revealing to study the vested interests and motivations behind why people reject facts than to study the abstract psychology of how it works.