The Decade, 1998 to 2007 - No Global Warming

The data I see there are monthly, not annual. Perusing their site further, I found a very nice graphing applet that lets you see monthly data from 1880 onward - that is, temperature anomalies for every January (or your month of interest) since 1880. I did not run across the annual temperature anomaly data. Here is what the NCDC site has for average global surface temperatures for four different months:

Year  Jan   Mar   Jul   Sep
2006  0.28  --    --    --
2005  0.51  0.55  0.52  0.58
2004  0.48  0.52  0.34  0.40
2003  0.56  0.40  0.39  0.48
2002  0.60  0.61  0.42  0.41
2001  0.33  0.47  0.41  0.37
2000  0.19  0.34  0.26  0.30
1999  0.38  0.17  0.27  0.25
1998  0.48  0.46  0.58  0.40
1997  0.21  0.28  0.38  0.50
1996  --    0.14  0.17  0.15

(Data were not available for Mar, Jul, and Sep 2006, and the Jan 1996 value was not included.)

These values are anomalies from the average monthly temperature as calculated over 1880-2005/2006 (depending upon when the latest measurements were taken). Looking at the July data, 1998 did indeed show the greatest anomaly from average. However, looking at other months, 2002 had the warmest January and March, and 2005 had the warmest September. It should be noted that these values are all positive - every measurement here is higher than the 125-year average. One can question whether that is statistically meaningful (and it can be calculated), but a good scientist will note that this is either a significant, long-term anomaly or an indication of a change in behavior. Either way, it is not something to be ignored.
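
The statistical question (whether this many consecutive positive anomalies could plausibly arise by chance around a zero-mean baseline) can be sketched with a simple sign test. The values below are hand-copied from the table above; treating the months as independent is an assumption, and an optimistic one, since adjacent months are correlated.

```python
# Sign test sketch: if monthly anomalies were equally likely to fall above or
# below the long-term mean, what is the chance that every one is positive?
# Values hand-copied from the NCDC table above (1996-2006; missing cells skipped).
anomalies = [
    0.28,                          # 2006 Jan
    0.51, 0.55, 0.52, 0.58,        # 2005
    0.48, 0.52, 0.34, 0.40,        # 2004
    0.56, 0.40, 0.39, 0.48,        # 2003
    0.60, 0.61, 0.42, 0.41,        # 2002
    0.33, 0.47, 0.41, 0.37,        # 2001
    0.19, 0.34, 0.26, 0.30,        # 2000
    0.38, 0.17, 0.27, 0.25,        # 1999
    0.48, 0.46, 0.58, 0.40,        # 1998
    0.21, 0.28, 0.38, 0.50,        # 1997
    0.14, 0.17, 0.15,              # 1996 (Jan missing)
]

n = len(anomalies)
positives = sum(1 for a in anomalies if a > 0)
# Under the null hypothesis (50/50 above/below), P(all n positive) = 0.5**n.
p_all_positive = 0.5 ** n
print(n, positives, p_all_positive)
```

Even granting the month-to-month correlation caveat, 40 out of 40 positive values is vanishingly unlikely under a "no change" null, which is the sense in which the consistent positivity matters.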

Monthly temperatures can also be more significant than annual averages, because they can signal significant changes in the annual maxima and minima. These values can play major roles in broader systemic behavior, such as glacier formation and melting, ecosystem behavior, and so on.

(Edit - removed an unnecessary comment)

Let me see here, so if I’ve got this right, all those photos I have of myself as a kid, growing up in Ohio in the early 1970s surrounded by several feet of snowfall, are somehow forgeries, since the region of Ohio I grew up in rarely gets more than a slight dusting of snow in the winter now. After all, since there’s little to no snow in the winter now, and there’s no global warming, there could only have been little to no snow then. (A change in snowfall would indicate that something screwy was going on with the climate, and the OP has proven that there’s nothing going on with the climate.) Therefore, the only logical conclusion is that in between running for President, being Vice President, and heroically ridding the world of ManBearPig, Al Gore must have not only broken into my house and replaced all those photos of me as a kid, but also run some kind of mind-control device on me, implanting memories of a snowy childhood that could never have possibly existed!

Man, that Gore’s one sneaky bastard!

The El Nino in 1997/1998 was much larger…the “El Nino of the century,” as it is often called. As a result, the global temperature in 1998 was head-and-shoulders above any previous year. The 2006 El Nino developed late in the year, and since temps tend to lag El Nino a bit, it is actually 2007 that will mainly have its average pushed up somewhat. However, as others have noted, this El Nino was pretty weak and short-lived.

So, in summary, the problem is that you have cherry-picked your start date…and in fact it was one hell of a cherry pick! Using the same logic, I could disprove the seasonal cycle too. After all, the high temperature over the Memorial Day weekend was close to 90 F here in Rochester but today was only in the 70s…and those silly climatologists claim we are heading toward summer!

And, by the way, climate models driven with increasing greenhouse gases show the same sort of fluctuations that occur in real climate…i.e., they don’t show a monotonic increase in temperatures.

Actually, the 8C is a bit of a high-end estimate for the global temperature change…I think it was more like ~6C. Also, that change occurred over a period of a few thousand years, which corresponds to rates of about 0.2 C per century at most. (It was then actually pretty flat, or even dropped a little, over the last several thousand years.) The rate of increase in temperature over the last ~30 years has averaged close to 0.2 C per decade.
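
The rate comparison can be made concrete with a bit of arithmetic. The 3,000-year span below is an illustrative stand-in for the “few thousand years” mentioned, not a measured figure:

```python
# Compare the glacial-to-interglacial warming rate with the recent rate.
# The 3000-year span is an assumed, illustrative duration for the transition.
glacial_warming_c = 6.0        # C, ~6 C total change (estimate from the post)
glacial_span_yr = 3000.0       # yr, assumed duration of the deglaciation
recent_rate_per_decade = 0.2   # C/decade, over the last ~30 years

glacial_rate_per_century = glacial_warming_c / glacial_span_yr * 100  # C/century
recent_rate_per_century = recent_rate_per_decade * 10                 # C/century
ratio = recent_rate_per_century / glacial_rate_per_century
print(glacial_rate_per_century, recent_rate_per_century, ratio)
```

That is, under these assumed numbers the recent rate is roughly ten times the sustained rate of the deglaciation.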

The standard answer is in fact that these cycles are triggered by orbital oscillations (so-called Milankovitch cycles). These result mainly in a redistribution of radiation over the globe rather than much of a change in the total radiation. However, because of the different distribution of land and water in the northern and southern hemispheres, this allows glacial ice in the northern hemisphere to shrink or grow over time…and the resulting change in the earth’s albedo causes warming or cooling. The effect is magnified further by the fact that warming causes a release of greenhouse gases from both the oceans and the land/biosphere…and this causes further warming.

And, by the way, by estimating the “radiative forcings” and the resulting global temperature change during the glacial – interglacial cycles, one can obtain an estimate for the so-called climate sensitivity to CO2 (i.e., how much rise in global temperature a doubling of CO2 concentrations should cause) because we know quite accurately the radiative forcing due to changes in CO2 concentrations. The estimate obtained is in good agreement with the climate sensitivities predicted by the climate models.
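
The paleo-constraint argument can be sketched numerically. The inputs below (a glacial-interglacial temperature change of ~5 C, a total forcing of ~6.6 W/m2, and the standard ~3.7 W/m2 forcing for a CO2 doubling) are illustrative estimates of the kind discussed in this thread, not settled values:

```python
# Paleo estimate of climate sensitivity: lambda = dT / dF, then scale by the
# radiative forcing of a CO2 doubling. All inputs are illustrative estimates.
delta_t_glacial = 5.0      # C, glacial-to-interglacial temperature change (assumed)
delta_f_glacial = 6.6      # W/m^2, total glacial-to-interglacial forcing (assumed)
f_2x_co2 = 3.7             # W/m^2, standard forcing for a doubling of CO2

sensitivity = delta_t_glacial / delta_f_glacial  # C per (W/m^2), ~0.76
t_per_doubling = sensitivity * f_2x_co2          # C per CO2 doubling, ~2.8
print(round(t_per_doubling, 2))
```

With these inputs the implied warming per doubling lands near the middle of the commonly quoted 2 to 4.5 C range, which is the sense in which the paleo estimate and the model-based sensitivities agree.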

Here is the OP’s table projected back further, to 1990, allowing one to see what a dramatic anomaly 1998 was and how looking at the bigger picture changes the sort of conclusions you draw!

NCDC Annual Global Temperature Anomaly and Difference vs 1998, Degrees C

1990 +0.07C -0.43C
1991 +0.14C -0.36C
1992 -0.19C -0.69C
1993 -0.15C -0.65C
1994 -0.01C -0.51C
1995 +0.13C -0.37C
1996 +0.02C -0.48C
1997 +0.05C -0.45C
1998 +0.50C +0.00C
1999 +0.03C -0.47C
2000 +0.02C -0.48C
2001 +0.19C -0.31C
2002 +0.30C -0.20C
2003 +0.26C -0.24C
2004 +0.19C -0.32C
2005 +0.33C -0.17C
2006 +0.28C -0.22C
2007 +0.30C -0.20C Projected
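
The cherry-picking point can be quantified by fitting an ordinary least-squares trend to the table above over two different windows. This is a sketch using the tabulated values (including the projected 2007 figure), not an official analysis:

```python
# Fit a least-squares linear trend to the annual anomalies above, once over
# the full 1990-2007 window and once starting at the 1998 peak.
years = list(range(1990, 2008))
anoms = [0.07, 0.14, -0.19, -0.15, -0.01, 0.13, 0.02, 0.05, 0.50,
         0.03, 0.02, 0.19, 0.30, 0.26, 0.19, 0.33, 0.28, 0.30]

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    xm = sum(xs) / n
    ym = sum(ys) / n
    num = sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
    den = sum((x - xm) ** 2 for x in xs)
    return num / den

full = ols_slope(years, anoms) * 10               # C/decade, 1990-2007
from_1998 = ols_slope(years[8:], anoms[8:]) * 10  # C/decade, 1998-2007
print(round(full, 2), round(from_1998, 2))
```

Even starting at the 1998 spike the fitted trend stays positive (~0.09 C/decade), but it is less than half the full-window trend (~0.21 C/decade); that sensitivity to the start date is exactly what the cherry-pick exploits.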

If you do a comparative analysis to the most recent peak, you’re guaranteed to find a decrease since then. That’s what makes it a peak. Likewise, if you do a comparative analysis to the most recent trough, you’re guaranteed to find an increase. How would either be of scientific use?

Lamar, interesting questions. The difference in land mass shouldn’t matter, since in theory, anyway, they are getting the same heating per square meter.

And CO2 is generally considered to be a “well mixed” greenhouse gas because of its fairly long residence time in the atmosphere. Average concentration of CO2 at the South Pole is not much different from that at Mauna Loa, for example.

w.

Citation?

w.

What does it mean? I mean, what is the debate here?

The average annual global temperature anomaly since 1998 is +0.23C. Obviously someone is including the temperature change needed to get back to 0.00C to come up with this +0.2C increase per decade.

This is like someone who had a temperature of 96F and then got a fever that brought their temperature to 98.9F. The question is obvious: did the person’s temperature rise +2.9F due to the fever? The answer is no. Returning to the average state can only be considered natural. Thus, the temperature increase due to global warming, if it exists, has been nowhere close to +0.2C per decade, since the temperature anomaly over the past decade is only +0.23C.

The IPCC 2001, Woods Hole data shows the rise in temperature since the last ice age due to the current warming cycle to be approximately +8C.

Well, here is Hansen’s Scientific American article (or the extended version thereof) where he gives the argument

That’s just silliness. A temperature “anomaly” just means that you measure the temperature relative to some reference period. However, which reference period one uses is a matter of convention. My guess is that the reference period they use might be 1961-1990…but I am not sure. (Actually, it might start later, because the satellite data record doesn’t exist until 1979.)

[There is also the fact that the UAH version of the satellite data record has a somewhat lower trend in temperature since 1979…about 0.14 C / decade, I believe…as compared to either the surface temperature record or the value that is derived by another group, RSS, analyzing the same satellite data.]

Whatever…it is not worth arguing about a few degrees. There are error bars on it anyway. I should note, though, that in the calculation I just cited, Hansen used an estimate of 5C for the change from glacial to interglacial. If he had used 8C and the same estimate for forcings, he would have gotten a proportionally larger estimate for the climate sensitivity. So, in other words, unless you can come up with some big correction to the forcing estimates he made, you are not going to do the argument that rising CO2 levels won’t cause significant warming much good by adopting a high estimate of the temperature change between glacial and interglacial. You’d be better off arguing that the temperature change was actually on the low side of what is estimated!

Thanks, jshore. Hansen is being a bit optimistic about his error values, and has left out a key number.

First, the IPCC FAR says the current uncertainty for the forcing of CO2, CH4, and N2O, with all of our accurate measurements, is 0.3 W/m2. Hansen gives the uncertainty for the same data for 50,000 years ago as 0.5 W/m2, which seems doubtful.

Second, the IPCC uncertainty in the modern aerosol data is +0.6, - 1.2, while Hansen says the accuracy for 50,000 years ago is ±1.0 … again, this level of accuracy does not seem to be possible, it’s almost the same as the modern numbers.

Third, he does not include any error estimate for the temperature. The error in the HadCRUT data set from only 150 years ago is 0.8°C, so the error on 50,000-year-old data must be more than that. Let’s very conservatively call it ±1°C, although it may be larger.

Using his (dubious) figures, and including the conservatively estimated ±1° temperature error, gives a 95% confidence interval for the temperature change for a doubling of CO2 as 1.1 to 7.6°C … which certainly encompasses the IPCC estimated range (2°C to 4.5°C) for a CO2 doubling, but can hardly be claimed to be in “good agreement” with the IPCC estimate as you say.
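
The width of such a confidence interval can be sketched with standard quadrature error propagation. Treating the quoted uncertainties as one-sigma values, and using a 5 C temperature change with ±1 C error and a 6.6 ± 1.5 W/m2 forcing, are assumptions made here purely for illustration:

```python
import math

# Quadrature propagation of relative errors for S = 3.7 * dT / dF.
# Treating the quoted uncertainties as 1-sigma values is an assumption.
dt, dt_err = 5.0, 1.0      # C, glacial-interglacial temperature change
df, df_err = 6.6, 1.5      # W/m^2, total glacial-interglacial forcing
f_2x = 3.7                 # W/m^2, forcing of a CO2 doubling

s = f_2x * dt / df                                    # central estimate, ~2.8 C
rel = math.sqrt((dt_err / dt) ** 2 + (df_err / df) ** 2)
s_err = s * rel                                       # ~0.85 C (1 sigma)
lo, hi = s - 2 * s_err, s + 2 * s_err                 # rough 95% interval
print(round(s, 2), round(lo, 2), round(hi, 2))
```

This linearized interval (~1.1 to ~4.5 C) is narrower than the 1.1 to 7.6°C range quoted above; the ratio dT/dF has a skewed distribution, so its upper tail is longer than the symmetric approximation suggests.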

w.

jshore, we posted at the same time, so I wrote my post about Hansen and uncertainties without reading the post immediately above where you say:

You are correct, there are error bars on the ice age temperatures … so why didn’t Hansen see fit to use them?

I must confess, the general reluctance of AGW believers to use error bars, or when they use them to underestimate them, is a continuing frustration to me …

w.

This statement is untrue and can only be made with the aid of hindsight. Using the UK graph as an example, we proceeded to establish new peaks every couple of years up until 1998. There has not been a new peak since 1998.

The most common argument against the OP is to discount the recent 1998 peak as something unusual. This has not been done so far based on statistical analysis, nor by discounting the many other large historical temperature anomaly changes. The 1878, 1879, 1942, etc. anomaly changes were more pronounced than 1998.

No, silliness is moving the 0.00C anomaly to some negative anomaly in order to indicate a much greater “change” in the anomaly over time. The change from a negative anomaly back to 0.00C can only be looked at as natural; that is the whole idea.

The idea that the starting negative anomaly is the datum point for establishing changes in the anomaly defeats the whole concept of establishing the 0.00C reference point. Basically, measuring the change from a negative anomaly is the same as saying that the anomaly should have remained negative throughout time instead of at 0.00C.

I am not sure I follow you. If we are looking at the change in global temperature due to mankind, the change from the glacial period to when mankind could have had an effect on global temperature, the larger the better. Certainly, man did not generate CO2 in sufficient quantities over most of the current warming cycle to account for the +8C change. At best, based on the NCDC data, +0.23C, which is the average annual global temperature anomaly from 1998 to 2006, and this, again, includes the 1998 anomaly.

The Petit/Vostok data which the 2001 IPCC used is shown here:

What are you proposing…that the radiative properties of the various gases have changed in the last 50,000 years, or that the measurements from the ice core data aren’t sufficiently accurate?

In order to get the number at the low end of your range (1.1°C), we would need, for example, the temperature change between the ice age and interglacial periods to have been only 3°C while the forcings that Hansen estimates as 6.6 ± 1.5 W/m2 are really ~10.6 W/m2. If the temperature change was really 4°C then the forcings would have to be ~14 W/m2. This sounds rather unlikely. Furthermore, I think Hansen’s estimate of 5°C for the temperature change is already toward the low end of estimates. ClimateGuy has been claiming 8°C while I have generally heard numbers like 5-7°C.

And, of course, if the global temperature change between glacial and interglacial periods were only like 3°C and we know what kinds of changes that caused for the earth’s climate, it doesn’t really support ideas of skeptics that changes of a degree or two or three C are no big deal. Much better to find out that the forcings were really much higher than that the temperature change was really much lower!

In the end, we are back at the usual situation: You essentially want to bet the “farm” on Hansen and most of the rest of the scientific community being wrong. Yes, it is conceivable that if all the uncertainties about the climate system break strongly in “your direction”, we may not be in as much hot water as we think. But, some of us would prefer not to gamble on such a hope.

All you have to do is look at the data to see that it was something unusual. It stands out like a sore thumb. That doesn’t mean we “discount” it, i.e., I know of no scientific studies that omit the 1998 data from a running average of the temperature. It just means that we don’t give it the sort of special treatment that you give it by conveniently starting all of your measurements from there…and drawing straight lines from that year to another year, when the general way of handling noisy data is to use some sort of averaging technique.

Okay, let’s try this again…What year you use as the base is merely a matter of convention. What we are interested in is how fast the temperature is changing. The reason we look at anomalies and not absolute temperatures is not that we believe there is any special meaning to the period that we arbitrarily define as the baseline period. Rather, it is because the “temperature anomaly field” has nicer properties than the temperature field. For example, consider a mountainous region: The surface temperature field in such a region varies quite rapidly with distance because of the elevation changes. However, if you look at temperature anomalies so that you are comparing the temperature at one time to the temperature at another time in the same place, you eliminate this effect.
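
The point that the baseline is merely a convention can be demonstrated directly: shifting every anomaly by a constant (i.e., changing the reference period) leaves the fitted trend untouched. The series below is made up purely for illustration:

```python
# Shifting all anomalies by a constant (changing the baseline convention)
# does not change the fitted trend. The series here is invented for illustration.
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    num = sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
    return num / sum((x - xm) ** 2 for x in xs)

years = list(range(1990, 2008))
anoms = [0.02 * (y - 1990) + (0.1 if y % 3 == 0 else -0.1) for y in years]

slope_a = ols_slope(years, anoms)                     # one baseline convention
slope_b = ols_slope(years, [a + 0.3 for a in anoms])  # baseline shifted by 0.3 C
print(abs(slope_a - slope_b) < 1e-9)
```

Only the rate of change carries physical meaning; where the zero line sits is bookkeeping.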

The question is not how much of the change from the last ice age to the current interglacial we are responsible for. We know that we didn’t cause the ice age – interglacial oscillations. The question is this: Since we know quite accurately the “radiative forcing” that a given change in CO2 levels will produce…and we know very accurately the change in CO2 levels we have produced over the last couple hundred years (and, with somewhat less accuracy, how these levels will change in the future given certain assumptions about emissions), what we are left to ask is how sensitive the climate is to a given amount of radiative forcing. One way to determine this is to estimate the radiative forcing and the resulting temperature changes that produced these past changes in climate. In that regard, for any given estimate of the forcing, a larger temperature change indicates a higher sensitivity of the climate to a given amount of forcing.

By the way, just to note, one should keep in mind that the temperature scale shown for this data gives the estimated temperature change in that region which, because of polar amplification, is larger than the estimated global temperature change.