Arctic Ice Returning?

The fact is that the most reliable way to work out what the hypothesis actually predicts in detail is probably to run the GCMs with elevated CO2. And the best way to do that is with several GCMs produced by different groups, incorporating different assumptions (and run at multiple resolutions), so that one can get a feeling for how robust a particular prediction is. The best tests of AGW are predictions that are very robust to such changes. Predictions that are not robust to such changes are simply not good candidates for testing the AGW hypothesis (nor, presumably, for making future predictions about that particular aspect of climate).
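To make the multi-model idea concrete, here is a toy sketch in Python (the model names and numbers are invented purely for illustration, not real GCM output): collect the same projected quantity from several hypothetical models and treat a small inter-model spread as a sign that the prediction is robust enough to be worth testing.

```python
import numpy as np

# Hypothetical warming projections (degrees C) for one variable, from
# several different GCMs run under the same elevated-CO2 scenario.
# The model names and values are made up purely for illustration.
model_projections = {
    "model_A": 2.8,
    "model_B": 3.1,
    "model_C": 2.6,
    "model_D": 3.4,
    "model_E": 2.9,
}

values = np.array(list(model_projections.values()))
mean, spread = values.mean(), values.std()

# Crude robustness check: if the inter-model spread is small relative
# to the mean signal, the prediction is a better candidate for testing
# the hypothesis (the 25% threshold here is an arbitrary choice).
print(f"ensemble mean: {mean:.2f} C, spread: {spread:.2f} C")
print("robust" if spread < 0.25 * abs(mean) else "not robust")
```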

In other words, this isn’t some simplistic ideal one learns about in an intro science class. This is the real world, where hypotheses and what they predict are complex, and it takes a lot of skill and determination to find good ways to test them. Again, the best people to evaluate both how falsifiable a hypothesis is and how well it fits the data are the scientists in the field themselves. They understand notions of falsifiability and hypothesis testing, and they understand the intricacies of the particular problem at hand. You can start with the default assumption that they are stupid and you know better, but unless you are a freakin’ genius, that will just make you look silly.

In this regard, global average temperature is certainly one of the more robust variables, but even for that, the climate models clearly show that there will be fluctuations, and you cannot reliably determine a trend over just a year or a few years; you need at least a decade or so. So, no, a one-year drop in average global temperature does not undermine the AGW hypothesis. In fact, if the temperature rose very steadily with no noise, that would tend to suggest problems with the climate models, which show more variability (akin to what is actually observed).
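If you want to see why a year or two cannot pin down the trend, here is a toy sketch in Python (the trend and noise sizes are invented for illustration, not taken from any model): a small warming trend buried in year-to-year noise, fitted by least squares over windows of different lengths. The short windows give estimates all over the place, sometimes with the wrong sign, while the long window recovers the underlying trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "temperature anomaly" series: a 0.02 C/year trend buried
# in year-to-year noise of comparable size (numbers are invented).
years = np.arange(1980, 2010)
anomaly = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

def fitted_trend(y0, y1):
    """Least-squares trend (C/year) over the window [y0, y1]."""
    mask = (years >= y0) & (years <= y1)
    slope, _intercept = np.polyfit(years[mask], anomaly[mask], 1)
    return slope

# Short windows are dominated by the noise; the 30-year window is not.
for window in [(2000, 2002), (1995, 2000), (1980, 2009)]:
    print(window, f"{fitted_trend(*window):+.3f} C/yr")
```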

That’s nonsense. My statement was clearly hypothetical. I’m not necessarily claiming that anyone has taken the position I described.

That’s not an answer to the question I asked. I asked whether it supports the AGW hypothesis.

Ok, and if the average global temperature increases for the year, does that support the AGW hypothesis?

Same question I asked in my previous post: if the average global temperature increases for the year, does that support the AGW hypothesis?

A single data point neither supports nor refutes a hypothesis…ANY hypothesis. It’s the data taken as a whole that either supports or refutes it. So…a single data point showing local warming, taken by itself, supports nothing. Taken together with similar data points globally, however, it does support the hypothesis.

Again, no…it’s still a single, isolated data point. Taken in isolation, it doesn’t support or refute anything. Taken along with data over the course of several years, a trend emerges. Attempting to ‘prove’ anything with a single data point (in this case, the average global temperature over a single year) really shows nothing.
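To put the same point in toy-code form (Python, all numbers invented): compare how much evidence one year’s anomaly carries for a “warming” hypothesis over a “no warming” one with what the whole record carries.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two toy hypotheses about the yearly anomaly (invented numbers):
#   H0 "no warming": anomaly ~ Normal(0.0, 0.1)
#   H1 "warming":    anomaly ~ Normal(0.1, 0.1)
MU0, MU1, SIGMA = 0.0, 0.1, 0.1

def log_lik_ratio(x):
    """log P(x | H1) - log P(x | H0) for normal likelihoods."""
    return ((x - MU0) ** 2 - (x - MU1) ** 2) / (2 * SIGMA ** 2)

data = rng.normal(MU1, SIGMA, 30)  # thirty "years" generated under H1

# A single year's anomaly carries little evidence and can point either
# way; the same statistic summed over the whole record is decisive.
print(f"evidence from year 1:   {log_lik_ratio(data[0]):+.2f}")
print(f"evidence from 30 years: {log_lik_ratio(data).sum():+.2f}")
```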

-XT

Fine. As I said to enginerd, I wish you’d been around back in the 1990s, when a college professor of mine cited a couple of hot summers as evidence of man-made global warming. (Ok, so that’s two data points. But the basic argument should be the same.)

Are you saying that science has to work in accordance with what you recall your college professor saying back in the 1990s?

Lol, no. I’m saying that the professor overstated his case, at least according to the standard that’s been advanced in this thread.

You may not realize this, but college profs can be as full of shit (or just wrong) as anyone. I had Darwin’s Finch jerk me up (correctly) in another thread about evolution because I was basically repeating what I learned in college about micro vs. macro evolution, which was…well, wrong. I remember one gem of a college prof telling me (this would have been in the mid-80s), to paraphrase: ‘Micro computer networks are a fad. They will never catch on. Industry, business, the government, and anyone serious about processing will continue to rely on mini and mainframe networks for the foreseeable future.’ I had another prof tell me that wireless data networks would never work because they would never be able to arbitrate data and signal correctly to ensure reasonable bandwidth.

-XT

I’m much more aware of that now than then.