From a philosophical perspective, my problem is this: If the Arctic had had a very warm January, it seems likely that it would have been cited as evidence of catastrophic man-made global warming. So if a cold January is dismissed as a fluctuation, it seems like there is a bit of a double standard at work.
Not only that, but if a cold January is cited as evidence of catastrophic “climate change,” one wonders what would NOT be considered such evidence. Perhaps an average January. The trouble is that somewhat extreme weather is always happening somewhere on the globe. So whether catastrophic global warming theory is true or false, there will always be some evidence to support it. And that raises a falsifiability problem.
As others have already pointed out, the blog from which the Daily Tech got their data has already dissociated itself from their interpretation of the data as wiping out nearly all the warming over the last 100 years…and there are good reasons that they did. First of all, that claim is not even true numerically. While it may be true that the global average temperature dropped from January 2007 to January 2008 by an amount almost equal to the total quoted warming over the last 100 years (about 0.7 C or so), the former refers to temperatures averaged over one month, whereas the latter refers to temperatures averaged over a year (and, in fact, further averaged over a few years). In particular, the anomaly value for January 2007 was higher than the anomaly for that entire year. Moreover, we can see from the Daily Tech’s own figure that the anomaly value for January 2008 is still positive (using a 1951–1980 baseline), i.e., the January 2008 temperature was higher than the average over the period 1951–1980, which itself was already higher than 100 years ago.
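To illustrate the averaging point with purely synthetic numbers (made up for this sketch, not real temperature data): monthly means fluctuate far more than annual means of the same series, so a one-month swing can easily rival a century-scale trend without contradicting it.

```python
import random
import statistics

random.seed(42)

# Synthetic monthly "anomalies": no trend at all, just noise with a
# made-up month-to-month standard deviation.
monthly = [random.gauss(0.0, 0.2) for _ in range(120)]  # 10 years of months

# Annual means average away much of the month-to-month noise.
annual = [statistics.mean(monthly[i:i + 12]) for i in range(0, 120, 12)]

print("std of monthly values:", round(statistics.stdev(monthly), 3))
print("std of annual means:  ", round(statistics.stdev(annual), 3))
# The annual scatter comes out much smaller (roughly 1/sqrt(12) of the
# monthly scatter), so comparing one month's drop against a trend in
# multi-year averages is an apples-to-oranges comparison.
```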
Second of all, there are always going to be fluctuations in temperature over shorter timescales. To give an analogy: when it got above 60 degrees in Rochester a few weeks ago, did that wipe out nearly the entire drop in temperature associated with the season that we here call “winter”?
As for their argument attributing this correlation to reduced solar activity, the old “correlation is not causation” argument comes into play. (Worse yet, they haven’t even tried to show how good the correlation looks in time…and it may not look good at all.) If solar activity were a dominant factor and the climate could respond this quickly to such changes, then there ought to be a stronger climate signal due to the known fluctuations in solar activity associated with the 11-year solar cycle. As it is, studies suggest that any such signal seems to be pretty small.
A more likely explanation for the change is the dramatic shift in the last year from El Nino conditions to La Nina conditions. In fact, you can see in the data a drop that looks quite analogous in 1998/1999 when we had a similar shift from El Nino to La Nina.
The ‘double standard’ you refer to is that of making a distinction between illustrating an argument with an example, and providing an example without an associated argument.
An argument without an example, like a cat without a grin, stands on its own, though an example helps the argument connect. But an example without an argument, like a grin without a cat, doesn’t stand on its own - outside of Wonderland, anyway.
Well, okay, yes it is obvious that the ice will get thicker in the winter. This was still worth saying given that the National Post piece was pretty sloppy with exactly what they were describing.
And, unfortunately, you misread my statement. In this part I was no longer talking about the absolute amount of ice but about the anomaly, which means that the seasonal variation has been factored out.
My point is that such a partial recovery is not unexpected…and in fact, as the RealClimate piece that I linked to showed, at least a significant fraction of scientists at the AGU meeting seemed to expect a recovery such as this. [Actually, a significant fraction even expected a recovery on a year-upon-year basis rather than a partial-year basis…which is all the data we have at this point since this summer’s record low extent.] For one thing, I am pretty sure that the models predict a larger decrease in ice extent in the summer (relative to what it was before global warming) than in the winter (again relative to what it was before global warming)…which means that the anomaly itself will tend to get more negative in the summer and less negative in the winter. For another thing, global warming does not repeal the fact that there are significant fluctuations in the system. Just as the global temperature does not monotonically increase each year but has quite a bit of noise, so too will this be true of the sea ice.
The thing is (and correct me if I’m wrong here) that a year or even a couple of years (or in this case a season) of cooler temperatures in a region or even globally (though I’ve seen no indication of that) doesn’t disprove the theory of AGW/GW. It’s not like the temperature is always going to go up everywhere on the earth…or even that the average temperature is always going to go up. There will be all sorts of variations if I understand the theory correctly…even VERY cold local conditions. That’s why a lot of the folks I’ve talked to who work at Sandia prefer Global Climate Change to Global Warming…it better represents what is going on they say.
An anomalous data point (as referenced in the OP) is simply that…ONE data point. If we see a trend in the next decade or so then THAT may be something to think about. Maybe there is some other factor at work that wasn’t previously understood. Obviously the CO2 IS rising…and that is going to cause SOME effect. But right now this evidence doesn’t mean anything…it certainly doesn’t disprove or even dent the theory of climate change happening. In fact, quite the opposite…as from what I can tell it is EXPECTED that there will be a great deal of local variance as the climate shifts about due to the extra CO2 in the atmosphere.
I’m not sure I see your point . . . are you saying that if a pro-AGW individual cites a few melting glaciers, or a hot year, that he or she is merely providing an example to illustrate an argument?
On the other hand, if a skeptic cites a few instances of colder weather, are you saying that he or she has no underlying argument that’s being illustrated?
Anybody who cites only a hot year or a few instances of colder weather has no underlying argument, no matter what they’re trying to show. Climate change is a long term process, and the temperature for any given year is nothing more than a single data point. You can’t draw any conclusions from one data point, and you can’t use one data point to prove anything. You can do two things with a single data point: illustrate the continuation of a trend, or show that the point is outside the expected range and point out that the expected trend doesn’t account for that data. If you want to do the second, you’d better understand the hypotheses and models you’re trying to disprove.
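One way to make the “outside the expected range” test concrete is a simple z-score check. This is a minimal sketch with invented numbers (the trend prediction, residual scatter, and observation are all hypothetical), not any poster’s actual analysis:

```python
# Hypothetical numbers for illustration only.
trend_prediction = 0.45   # what the trend line predicts for this year (anomaly)
residual_sigma = 0.12     # typical year-to-year scatter around the trend
observation = 0.30        # the single new data point

# How many standard deviations is the observation from the prediction?
z = (observation - trend_prediction) / residual_sigma
if abs(z) <= 2.0:
    verdict = "consistent with the trend (just noise)"
else:
    verdict = "outside the expected range; worth a closer look"
print(f"z = {z:+.2f}: {verdict}")
```

The point of the sketch: before a single data point can count for or against a trend, you need the trend’s prediction and the expected scatter around it.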
Your conflation of “a few melting glaciers” with a “hot year” is misleading as well. A hot summer is one season, but a melting glacier (over a reasonable time frame) is a climatologically significant phenomenon. Because of the summer-winter cycles, most glaciers are constantly losing and gaining mass. Every summer, some of the glacier melts, and every winter the glacier accumulates more mass. Changes in glacier mass are quantified using an annual balance that takes into account both summer and winter processes. A “melting glacier” is one that has been losing water consistently over a multi-year period. The majority of glaciers on Earth fit into this category over the last 50 years, and the rate of water loss has increased significantly over the last 10-15 years.
And by the way, I wish you’d been around back in the 1990s when a college professor of mine cited a couple hot summers as evidence of man-made global warming.
I wasn’t necessarily thinking of any particular time frame. But anyway, to avoid confusion, please just assume I was talking about more limited events. Thank you.
Agree or disagree with what exactly? Your cited quote is showing a single data point to illustrate a TREND. As opposed to the single data point in the OP which illustrates…a single data point.
I agree, and I have no idea why you’d post this. It illustrates the point I made perfectly - realclimate is using the small group of data points to show both the ideas I mentioned. They’re showing that the 5 sigma event fits AGW models that predict a rising global temperature (illustrating the continuation of a trend) and that it’s phenomenally unlikely that they would occur in the absence of a warming trend (illustrating problems with an alternative hypothesis).
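For scale, a “5 sigma” event corresponds to a tiny tail probability under a normal distribution; a couple of lines of standard-library Python show it (this is just the textbook Gaussian tail calculation, not a reproduction of realclimate’s actual analysis):

```python
import math

def upper_tail(z):
    """P(Z > z) for a standard normal variable, computed via the
    complementary error function (no SciPy needed)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

p = upper_tail(5.0)
print(f"P(Z > 5) = {p:.2e}")  # about 2.9e-07, i.e. roughly 1 in 3.5 million
```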
You can’t discuss climate change without thinking of any particular time frame. I haven’t seen anybody cite single season summer glacial melt as evidence for AGW. If I make this assumption, your statement ceases to illustrate the point you were making - it’s clear that that’s not what you meant.
I’m not going to assume anything for you. If you want to discuss various time-sensitive phenomena, you’re going to have to define the time scales you’re talking about.
But you said that “Anybody who cites only a hot year or a few instances of colder weather has no underlying argument, no matter what they’re trying to show.”
Seems to me that’s basically what realclimate did.
But to make things clearer, let’s reverse things. Suppose that somebody cited a few months of unusually cold weather and said “yes, this doesn’t PROVE that AGW is a hoax, but it fits in with and illustrates my ideas.”
I bolded an important part of my quote for you. You can use a single data point in context to illustrate or contradict a larger idea. That’s what realclimate did, by going into the standard deviations and expected values and probabilities. The article posted by the OP didn’t do it - they just threw out the point about a cold winter and arctic sea ice without any scientific analysis whatsoever.
You’re making two different claims, I think. The first is with regards to the AGW hypothesis. If the data point that you’re citing fits into that model (and harsh winters certainly do fall into the range of expectations from many, if not most GCMs), there’s nothing to be said about that. The data point either does or doesn’t contradict those predictions.
As far as the claim that it supports your ideas… do you have the statistical analysis to back up the statement? Do you even have a model or a firm hypothesis that you’re working from? If you do, then I think it’s a fine claim to make. It’s certainly possible to come up with differing mechanisms and models to explain the same phenomenon, especially when it’s something with as much uncertainty as climate change. Every new data point provides another way to test the robustness of a model.
But you can’t do it in isolation. If you make that claim, you’ll then be expected to show how you explain other data sets as well.
It would be equivalent to saying ‘These things that look like human foot prints next to these obviously dinosaur foot prints proves that the theory of Evolution is incorrect’. Unless you have a body of data AND a (testable) theory to go with it you are basically just pointing to anomalies and then saying this proves the theory is wrong.
Show me an alternative theory to AGW and the data that backs it up (with the OP’s data being a single data point among many) and then we can talk. I’ve seen no such alternative theory with the corresponding data.
ETA: Posted in response to brazil84’s previous post…didn’t see the other reply until after. IOW…never mind. Was said better by Enginerd.
-XT
Thank you. And I think it illustrates my point, so to speak.
Maybe not. If the AGW hypothesis predicts that either (A) weather will trend warmer; or (B) weather will trend colder; or (C) weather will stay about the same in temperature, then I wouldn’t argue that it’s incorrect. I would simply argue that it’s non-scientific.
Well, if you put it like that then…sorry, there was too much straw blowing there. What were you saying?
Do you understand the difference between average global temperature and average local temperature? Because it doesn’t seem so based on what you are saying in this thread. Unless you can show that the average GLOBAL temperature is down FOR THE YEAR then nothing shown here contradicts the GW hypothesis.
The trend is an increase in average global temperatures. That doesn’t mean that every year will necessarily increase…nor that local conditions everywhere will increase. There are all kinds of factors that influence the variation in global average temperature. One local data point is not going to disprove anything. A global trend for the next decade downward MAY disprove the GW part of the hypothesis…but until that happens the hypothesis remains the best model we currently have. Since it IS science (unlike your man of straw) it’s testable. If the hypothesis is wrong it will eventually be discarded in favor of a new hypothesis that refines the data and builds a new model. Thus far that hasn’t happened despite a decade of testing. That doesn’t mean it won’t prove false in the end…but the data point given by the OP proves nothing.
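The “trend despite down years” point can be sketched with synthetic data (an assumed warming of 0.02 per year plus noise, with all numbers invented for illustration): an ordinary least-squares fit recovers a clearly positive slope even though many individual years are colder than the year before.

```python
import random

random.seed(1)

years = list(range(30))
# Assumed trend of +0.02 per year, buried in year-to-year noise.
temps = [0.02 * t + random.gauss(0.0, 0.15) for t in years]

# Ordinary least-squares slope, computed by hand.
n = len(years)
mx = sum(years) / n
my = sum(temps) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years, temps))
         / sum((x - mx) ** 2 for x in years))

down_years = sum(1 for a, b in zip(temps, temps[1:]) if b < a)
print(f"fitted slope: {slope:+.4f} per year")
print(f"years colder than the year before: {down_years} of {n - 1}")
```

Even with roughly half the year-to-year steps going downward, the fitted slope stays positive, which is the sense in which single cold years don’t touch the trend.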
The statement I quoted from you was obviously a strawman…and having seen you in your 1000 years thread I know that YOU know it.
It neither proves nor disproves anything. It is simply an isolated data point. People who use a single local rise in temperature or some freak weather system (like, say, a hurricane) to ‘prove’ AGW/GW is right (without reference to all the other data) don’t know what they are talking about. Same with people attempting to ‘prove’ AGW/GW is wrong using local lows. There will always be local variations.
No, of course not. Again, even globally there will be variations. Solar output for instance is a variable. It’s the trend that is important. You will get anomalies and variations. However if the trend is an increase (over time) and if it tracks with CO2 emissions rising then you have the basis of data pointing toward confirming the hypothesis.