Has the world actually been cooling since 2002?

Nature published a peer-reviewed paper showing that since 2001 the planet has not warmed.

I checked all the data, and even GISS shows slight cooling since 2002. Looking at the GISS data, it is clear that the cooling NH cold season, and especially the winter, is the reason the global mean trend is negative; that is for the GISS analysis, which is based on station readings. The satellite and NCDC data show cooling for the annual mean as well, and the cold-season signal shows up in all data sets, whether ocean, land, or surface readings.

You can easily check the GISS data yourself. The others are not so easy, but if anyone isn't convinced by the GISS, we can give them a try. (You can't link directly to the evidence; you have to create it yourself.) All GISS trends are for the full time period; NCDC trends are per decade.
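Since you have to create the evidence yourself anyway, here is a minimal Python sketch of how a monthly trend like the ones below can be computed. The anomaly values in it are placeholders, not real data; substitute the actual monthly column from the GISS GLB.Ts+dSST table (the raw file lists anomalies in hundredths of a degree C).

```python
# Minimal sketch: ordinary least-squares trend for one calendar month.
# The anomaly values are PLACEHOLDERS -- substitute the January column
# from the GISS GLB.Ts+dSST table before trusting any output.
import numpy as np

years = np.arange(2002, 2014)  # 2002..2013 inclusive
jan_anom = np.array([0.70, 0.69, 0.56, 0.68, 0.54, 0.92,
                     0.23, 0.59, 0.66, 0.47, 0.39, 0.63])  # placeholder, deg C

slope, intercept = np.polyfit(years, jan_anom, 1)  # slope in deg C per year
print(f"January 2002-2013 trend: {slope:+.3f} C/yr ({10 * slope:+.2f} C/decade)")
```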

GISS trend for January 2002-2014

GISS trend for February 2002-2013

GISS trend for March 2002-2013

GISS trend for April 2002-2013

GISS trend for December 2002-2013

And to save time, the GISS trend for the NH warm season shows warming: +0.06 C.

The GISS trend for the NH cold season shows cooling, -0.10 C, which is greater in magnitude than the warm-season warming; hence the GISS trend for the entire year, 2002-2013, shows cooling.

The NCDC, HadCRUT, MSU, RSS, and CRUTEM data all agree with GISS, though some show much greater cooling for the NH cold season, and especially winter: -0.16 C for the global DJF trend, 2002-2013, from the NCDC.

You can check the major data sets at woodfortrees, but not at any great level of detail.

CRUTEM3 NH clearly shows the cooling for DJFM, and even the RSS MSU satellite data shows the winter trend at a global level.

Woodfortrees key
Red is December
Green is March
Blue is January
Purple is February

And for good measure the GISS LOTI data from woodfortrees

If you disagree with any of this, we can have a debate.

If you start the trend in 1997, you can get an annual trend of +0.11 C.

However, the winter trend starting in 1997 is already negative.

The NCDC data show the NH winter trend from 1998 at -0.17 C per decade. The SH summer trend is positive: +0.01 C.

That is why I used 2002: even GISS shows a global cooling trend starting then. But many other data sets show cooling since 2000, or before.

NH boreal winter trends have been negative since 1988 in large areas; all are negative since 1992. Alaska has negative trends starting in 1979, but drastic cooling since 2004 in almost every station (see the December trend for a shocking example). But an eight-year trend isn't very important. Unless you live in Alaska, of course.

Various ideas are being discussed to explain WHY it’s happening, like the paper in the first line of this topic. Yet some people are still arguing it isn’t actually happening.

How about you?

First.

I thought many contrarians said that it had paused. So never mind that, huh?

In any case, the answer is no, it has not cooled.

http://www.giss.nasa.gov/research/news/20140121/

I’m not sure why I’m going to bother, but the first question to my mind is “Why the year 2002 in particular?” Why not some other year?

Despite the attempt at justification, it smacks of cherry-picking. Why not a decade earlier, at 1992? Starting there, there's a general increase. OK, how about 2000? Still an increase (you should check your numbers again). 2003? A slight decrease. 2001? A slight decrease.

Year-to-year variations and all that, so why bother cherry-picking? Take time-series averages and don't use arbitrary start years, as any half-trained statistician would know to do.
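To see how much a short trend swings with the chosen start year, here's a toy Python sketch. The series in it is synthetic (a mild warming trend plus noise), not real data, so only the qualitative point carries over:

```python
# Toy sketch: how sensitive a short trend is to the chosen start year.
# The series is SYNTHETIC (mild warming plus noise), not real data.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1992, 2014)
anoms = 0.015 * (years - 1992) + rng.normal(0, 0.10, years.size)

for start in range(1992, 2009):
    m = years >= start
    slope = np.polyfit(years[m], anoms[m], 1)[0]
    print(f"{start}-2013 trend: {10 * slope:+.2f} C/decade")
```

Run it and the "trend" flips sign depending on where you start, even though the underlying series was built with a steady warming rate.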

The debate is not over the long term trend. The OP makes that clear.

It's an interesting aside, of course, that a hundred-year trend shows cooling for parts of the Pacific and the Atlantic. The Pacific is important, as this includes much of the area the Nature paper discusses with regard to causing the cooling since 2001. The map doesn't really show the area correctly, but it's a very large amount of ocean that has not warmed at all in the last hundred years.

The NH warm season also shows areas that have seen no warming in a hundred years.

But that is another topic.

I must have left that out of the OP. 2002 is the year even the GISS data shows global cooling. Other data sets show cooling since 1998, but that is an obvious huge anomaly, a "super El Nino", which warmed the world more than any other event, except maybe the 2010 one.

Why is it happening? Short term variations in climate happen all the time for any number of reasons. This includes the El Nino and La Nina phenomena, solar activity, etc. There’s no need for special pleading.

And it's somewhat disingenuous, as I'm sure you realize as you put this in GD. Why discuss short-term variations except as a launching point into general climatic trends?

The numbers are the numbers. There’s not a lot of reason to discuss them in isolation of broader trends or as a discussion at all except to make a specific point about cherry picked data and general climate science.

So, cherry picking by a different name. Ok.

Far be it from me to interrupt this fantasy with facts, but… well, not really. I like facts. Here are some.

  1. The paper doesn’t actually “show” that the planet hasn’t warmed in that period, and neither did you (tracking specific months is not the same as tracking annual average temperatures), nor – most importantly – is such a short-term eventuality of any particular relevance. The abstract contains the rather loosely worded statement “the annual-mean global temperature has not risen in the twenty-first century” but the references it cites in support of that statement say no such thing. Here is what the cites actually say:
  2. The intent of this Nature paper has nothing to do with attributing asinine causes or false significance to the present temperature gradient, but rather to examine possible causes for the fact that the rate of temperature increase has temporarily changed. It attributes this to internal variability related to changes in ocean heat uptake, which I and others already described in another thread here, and which energy budget considerations dictate must ultimately result in an accelerated rate of warming in the future. That is no doubt the opposite of what you would prefer to claim. Much as the regional cooling of a negative ENSO phase is compensated by unusual warmth in a positive one. Also of humorously ironic note: the paper's conclusions, as well as those of the Easterling et al. paper it cites, were based on the results of climate models, which you just finished trashing in another thread as useless.

Skip it.

Never do this again in Great Debates. Never.

Frankly, it doesn't matter. The variance in temperature is large, and any 10-20 year period could easily show net warming or cooling even when the longer-term trend points the other way.

Look at the 100 year trend on this chart: Global temperature trends

No one would argue that the chart shows anything but a trend of rising temperatures. But notice that temperature hit a peak around 1938, and then began a multi-decadal period of general decline, and temperature didn’t again hit that peak until about 1977. So you could make a true statement in 1976 that the earth’s temperature as measured from a start in 1938 had declined for almost 40 years! It’s just that the statement, while true, would be ‘lying with statistics’ if you used it to ‘prove’ there was no overall warming trend.
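To see how easily a decades-long stretch below an early peak can coexist with a steady warming trend, here's a toy simulation. The trend and noise levels are invented for illustration, not fitted to the actual record:

```python
# Toy simulation: a steady +0.1 C/decade trend plus year-to-year noise
# can still leave temperatures below an early peak for decades.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2001)
temps = 0.01 * (years - 1900) + rng.normal(0, 0.15, years.size)

peak = temps[:40].argmax()  # an early local peak, a la 1938
run = longest = 0
for t in temps[peak + 1:]:
    run = run + 1 if t < temps[peak] else 0
    longest = max(longest, run)
print(f"Peak year: {years[peak]}, longest run below that peak: {longest} years")
```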

On the other hand, the opposite is also true: people who trot out factoids like "Four of the five hottest years on record occurred since 2000, and therefore it proves warming!" are committing exactly the same fallacy. As are the people who are frantically trying to tie individual weather events to global warming to make the case for current damages.

It also opens the question of whether even a 100-year trend means anything in this regard. The Vostok ice core temperature reconstructions show that we're still well below the peak temperatures other interglacial periods hit before the next ice age kicked in, and the data shows variance much greater than the difference in temperature we've measured in the last 100 years, with that variance occurring over much longer periods.

If you were looking at this chart as an outside observer, what would you conclude about the current temperature? That it was unprecedented? Hotter than it should be? Colder? If you look at the previous peak about 125,000 years ago, you can see it surrounded by several other smaller peaks, all of which represent temperature variation greater than what we’ve measured currently.

All this doesn’t prove or disprove global warming. I just present it to warn against using short term data alone to prove or disprove that abnormal warming is happening. You need to present that data in context, and hopefully provide testable hypotheses for why it’s doing what it’s doing. A mere trend does not an argument make when the data is this noisy.

Even though I linked to that first chart, it shows another way to lie with statistics: the chart is designed to make the change in temperature look severe by choosing a very narrow scale. It's a split (truncated-axis) chart and should be labeled as such. If it included the full range of temperatures the Earth experiences, the change would look much smaller.

What sort of trend would?

One reason is the unpredictability of solar output. From this graph, we see that part of this plateau in global temperatures coincides with an unexpectedly low peak in radiation from the sun. The OP didn't ask for this, but here's a longer-term graph of solar output. Although the written description of the graph doesn't say so, I believe the data shown in red (before 1750) is unconfirmed. Both graphs are from Wikipedia; please handle with care.

Over 12 years, climate data is "noisy"; it is better to compare 50-year averages with other 50-year averages.

But variation in solar activity has been ruled out as a causal factor here.

A few points of clarification, or maybe “perspective” would be a better word.

The fact that other inter-glacial periods hit higher peak temperatures than we have today might lead someone to conclude that we were due for higher temps all along, and that this is just a natural series of events.

I would first of all ask if you question this basic approximation of CO2 forcing. I hope not, because it’s basic physics, and the present net effects have been fairly well assessed.

Now, given the strength of CO2 forcing, and understanding that it's the primary driver of glaciation and its termination, we need to understand that the maximum CO2 levels of interglacials have been around 280 ppm (and never more than 300 ppm), and that the transition from a low of 180 ppm to a high of around 280 ppm takes around ten thousand years, plus thousands more to reach equilibrium temperature. We are now at 400 ppm, and we've achieved that in just a couple of hundred years. The graph looks like this, and it's even worse than that: CO2 is increasing so fast that the graph is already outdated, even at that rough scale, simply because it's six years old.
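To put a rough number on that "basic physics": the widely used logarithmic approximation for CO2 radiative forcing (Myhre et al., 1998) gives about +1.9 W/m^2 for the rise from 280 to 400 ppm. A minimal sketch:

```python
import math

# Logarithmic approximation for CO2 radiative forcing (Myhre et al., 1998):
#   dF = 5.35 * ln(C / C0)  [W/m^2]
C0 = 280.0  # approximate pre-industrial CO2, ppm
C = 400.0   # approximate present-day CO2, ppm
dF = 5.35 * math.log(C / C0)
print(f"Forcing from {C0:.0f} -> {C:.0f} ppm: {dF:.2f} W/m^2")  # ~1.91
```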

We've increased the atmosphere's CO2 level more in the past couple of hundred years than natural glacial terminations do in ten thousand years, and moreover, we've introduced this increment *on top of* what had been a normal interglacial level of 280 ppm at the beginning of industrialization. Climatically, this puts us in an entirely new climate era never before seen in human history, and indeed never seen at all since the onset of rapid glaciation at the mid-Pleistocene transition 1.2 million years ago, with CO2 levels possibly already the highest in 15 million years and no end in sight.

Hopefully that helps introduce a more realistic perspective. The 100-year trend isn’t the “proof” of what we’re doing to the climate; it’s the first emergent empirical evidence of the consequences of the measured forcings that we know are occurring, and which have a very long way to go to equilibrium even if we stopped all GHG emissions tomorrow.

Well, that’s the problem, isn’t it? How big a trend do you really need to be certain?

If all you have to go on is the trend, the answer is "A really freaking big one." The last global maximum followed a trend of constantly rising temperature that lasted for maybe 5,000 years. Then there was a very sudden peak and a constant decline that lasted almost as long. The problem with extrapolating trends into the future is that you never know where the inflection points are. The problem with using trends in very noisy data is that you never know whether the trend represents an underlying cause or is just noise.

So mere trends aren’t enough. You need to establish causation, establish hypotheses, come up with tests for them, make predictions, and in general provide proof that the trend is not mere correlation or an accident of noisy statistics or due to an unknown common cause. That’s where hard science comes in, and that’s where the debate should be centered - not a debate centered around pulling conclusions out of short-term noisy data when the direction of the noise just happens to coincide with your theory.

For example, I think it is very compelling that the forcing effect of CO2 is well established, and that we have good measurements of atmospheric CO2 concentrations that would lead one to suspect that the CO2 is causing a temperature increase. And since we're measuring increasing temperatures, that helps strengthen the argument for anthropogenic global warming. So we have not just the trend, but a theory for why temperature is doing what it's doing.

However, that’s a long way from certainty, and it’s a long way from the kinds of long-term predictions being made by some AGW proponents. Again, the variance in the system is such that even if the AGW theorists are correct and man is warming the earth by 2-5 degrees in the next century, we could STILL see a century of colder-than-expected temperatures due to natural variance or unexpected causes.

The proper way to state the case, then, would be that AGW adds a positive bias to global temperature that makes it more likely to be hotter than cooler. If temperature were otherwise flat, the next century would be equally likely to average hotter or colder. With the bias of the AGW signal, all outcomes are shifted by the amount of forcing: extreme cold is less likely, extreme heat more likely, etc.
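A toy way to picture that bias (the magnitudes here are invented for illustration, not estimates of the real forcing or variance):

```python
# Toy illustration: a warming bias shifts the whole distribution of
# decadal outcomes, making cooling stretches rarer but not impossible.
import numpy as np

rng = np.random.default_rng(1)
natural = rng.normal(0.0, 0.15, 100_000)  # decadal trend, no forcing (made up)
forced = natural + 0.10                   # same noise plus a +0.1 C/decade bias

print(f"P(decadal cooling), unforced: {(natural < 0).mean():.2f}")  # ~0.50
print(f"P(decadal cooling), forced:   {(forced < 0).mean():.2f}")   # ~0.25
```

Cooling decades still happen in the forced case; they are just less likely, which is the whole point about bias versus certainty.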

For example, look at that first chart I posted. There was almost a 40-year period where the temperature remained lower than the peak in 1938, despite an average warming over the entire period of close to 0.1 degrees per decade. If it could happen before, it could happen again. And even if we had doubled the rate of warming back then, we still would have seen a decline over a period of almost 20 years, purely due to variance (or, if you like, unknown causes). So nothing is certain over a short time frame.

But that’s a harder sell than to just tell everyone there’s absolutely no doubt that tragedy will fall on us if we don’t do something NOW.

Now, I think we can do better than that. As causes are determined and measured, we can start removing them from the trend, and hopefully increase the signal-to-noise ratio to the point where we can make definitive claims. But that carries its own dangers. We may eliminate the sources of variance that are easiest to detect but not the most important, giving us a false sense of what we know. Or confirmation bias, publication bias, disparities in funding, or political pressure could result in cherry-picking, or in a research focus on causes that make the case for warming stronger while ignoring ones that would make it weaker.

You're missing my point. I don't necessarily disagree with anything you said. I'm just trying to point out that temperature trends alone don't prove a thing unless they cover a long enough period that the real trend can be discerned from the noise. Temperature data is very noisy, and as the past 100 years showed, even a 40-year period of 'cooling' does not disprove an overall warming trend.

As to your other points, I pretty much agreed with you in my last message. Yes, CO2 forcing is compelling. Yes, I believe it has contributed to warming. Going beyond those 'basic facts' into extrapolating severe positive feedbacks and estimating future temperatures and economic effects on a civilization that doesn't yet exist is where the real debate is for me, but we never seem to get to that debate because we're stalled out at step 1.

Your citation doesn't back your claim; the title is "Solar activity not a key cause of climate change, study shows". The study didn't rule out solar activity as a part of climate change.

From your source, “How do we talk about climate change? The need for strategic conversations”. Interesting thoughts …

Perhaps, but:

But your claim that solar flux variations have been completely and utterly ruled out as having any effect of any kind on climate change has been shown to be in error.