So who can answer Jerry Pournelle's questions about Global Warming?

There has been a lively discussion over at www.jerrypournelle.com about Global Warming.

It boils down to this, and I have to agree it’s a fair question: how can we draw conclusions about small differences in climate when the purported change in global temperature is smaller than the likely error in the measured data?

Can anyone here refute his skepticism? As a society, should we commit trillions of dollars based on this kind of thing?

Short answer: any climate scientist can respond.

Longer answer: IANACS, but the changes are larger than the measurement errors, there are trends larger than same, and multiple independent data sources confirm climate change.

Didn’t Pournelle want to build Star Wars/SDI as a meteor defense before there was any census of the number or type of earth-crossing asteroids? He’s willing to piss away trillions to no known effect, yet unwilling to accept the demonstrated empirical facts of climate change. :rolleyes:

Say, is it Pournelle himself debating, or other people on his board? That changes the target of my rant.

Your rant is targeted correctly.
http://www.jerrypournelle.com/view/2010/Q4/view643.html#dialog

From there he goes on at length trying to justify his refusal to read the report.

Well, he has a fairly simple question. Saying he should read a rather complex 900-page report isn’t really a satisfactory answer. There ought to be a simpler answer than that.

I will address what I see as your three underlying issues: (a) scientific debate, (b) common-sense understanding, (c) trillions of dollars.

(a) scientific debate can be subdivided into (1) scientists vs scientists, (2) scientists vs intelligent layman, (3) scientists vs liars.
(a1) been there done that. anyone who thinks scientists are in doubt is a moron.
(a2) it’s good to debate non-specialist geniuses. Freeman Dyson is a skeptic and his views deserve (and have received) attention. (Is Pournelle in the same league as Dyson? :smiley: )
(a3) for several months public attention focused on a “scandal” in which, for example, faulty computer code was printing negative variances and the maintenance programmer complained with expletives. That negative variances are necessarily a software bug, not a model failure, was ignored.

(b) your question (why should small changes in CO2 or temperature have big effects?), is an interesting one. There are cyclical effects much larger than the small AGW increase (most obviously the 24-hour and 365-day cycles :D). A “low-pass filter” is needed just to observe the small effects of AGW. But there are low-pass filters telling a dismal story. Glacier melting should be a bigger story than it is. The pH change in the ocean just in recent years is huge (and already having profound ecological effects). Measurements match models. Frankly the anti-AGW noise is mostly as nonsensical as the Obama-born-in-Kenya noise, but more “credible” because better financed.

(c) detailed comment about your “trillion dollars” might hijack the thread, but I can’t resist noting that
The same people worried about a trillion dollars to save the environment are often the very same people who happily spent a trillion pursuing non-existent weapons of mass destruction! Google “hypocrisy AND greed” if you’re still confused.

I was just answering Typo Knig’s question about whether or not his rant was targeted correctly. I think the link shows that Pournelle is a denier and that Typo Knig’s rant against deniers is therefore properly targeted.

To address your point, I think Pournelle is wrong in his opinion about AGW. However, I can see where his argument about not having to read the report if he doesn’t accept its basic assumptions and methodology can be defended. I think his non-acceptance of the data is mistaken, but his logic for not reading it, based on that non-acceptance, isn’t totally unreasonable.

Is it, though?:confused: How can one know if one refuses to read the frigging scientific report?

Perhaps Pournelle’s read “Footfall”, and knows what “random death in the life support system” is supposed to look like, and this isn’t it?

Sure:

Jacobshavn Glacier Retreating

A bunch of glaciers retreating.

Athabasca glacier retreating.

McCarty Glacier.

Nunavut Glacier

Retreating glaciers in Bhutan (Himalayas)

Franz Josef Glacier

Polar Ice Cap

We can do the above all day. This is happening in almost every case wherever you look in the world.

If this is not the earth getting warmer is there another explanation?

I haven’t read the thread at Pournelle’s site, but is he claiming the error bars on measurement of temperature are too large, or the error bars on measurement of time?

Or concentration of carbon dioxide?

There’s not enough information here to know what his objection is.

Nitpick: the Franz Josef is actually advancing again. However, several of the Himalayan ones I tried to visit in Tibet had disappeared.

Any individual data point has large error bars on it, but if you take information from a large number of data points, with the proper statistical techniques, you can significantly improve your signal-to-noise ratio.
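A quick way to see this is to simulate it. Here is a minimal Python sketch (the true temperature and the uniform ±1°C noise model are made up for illustration): even though each individual reading is only good to within a degree, the mean of many readings lands within hundredths of a degree of the true value.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

true_temp = 21.8  # the "real" temperature, chosen for illustration
n = 10_000        # number of independent readings

# Each reading is off by up to +/-1 deg C (uniform instrument noise, an assumption)
readings = [true_temp + random.uniform(-1.0, 1.0) for _ in range(n)]

estimate = statistics.mean(readings)
sem = statistics.stdev(readings) / n ** 0.5  # standard error of the mean

print(f"estimate = {estimate:.3f} C, standard error ~ {sem:.4f} C")
```

The standard error shrinks like 1/√n, which is exactly the signal-to-noise improvement described above.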

What kind of thing? The objections of science fiction writers?

Look, even assuming that the AGW deniers are on to something - an assumption I find it sad that we have to make - the other benefits of reducing our greenhouse gas emissions stand on their own.

It’s not as though we’ll wake up one morning in 2035 and go, “oh, we spent $50 trillion and got nothing.” It would be more like, “well, we spent $50 trillion and made no appreciable impact on the climate, but hey, we did reduce particulate emissions by 49% and no longer depend on imported oil.”

I looked into it a little, and I have bad news for Mr. Pournelle:

His objection was about error bars on temperature:

As Chronos mentioned, it’s trivial to take a large number of measurements with, say, an accuracy of ±1°C and use those measurements to generate an average with a much smaller uncertainty than ±1°C.

Here’s how it works:

Temperature Values:
1 measurement at 14°C
1 measurement at 19°C
6 measurements at 20°C
3 measurements at 21°C
9 measurements at 22°C
5 measurements at 23°C
2 measurements at 24°C
2 measurements at 25°C
1 measurement at 27°C

The mean (average) of these values is 21.8°C and the sample standard deviation is 2.31°C.

But it’s the mean of a sample of temperatures, not the population, so what is the standard deviation of the sample mean (in other words, how close is the population mean likely to be to that sample mean?)

We know that the sample mean is distributed with mean µ and standard deviation σ/√n, where µ is the population mean (which we don’t know) and σ is the population standard deviation (which we also don’t know, but can estimate from the sample standard deviation, using Student’s t-distribution with the appropriate degrees of freedom to account for the extra uncertainty), and n is the sample size. Degrees of freedom will be n-1.

Instead of calculating σ/√n we calculate t*(s/√n), where t* is the Student’s t-distribution critical value for a 95% confidence interval and s is the sample standard deviation. This lets us work entirely with known sample quantities rather than unknown population parameters.

In the example given, t* = 2.045 (can be looked up in any table of Student’s t-distribution,) s = 2.31, and n = 30
Therefore we know with 95% confidence that the population average for that set of temperatures is 21.8°C ±0.86°C.
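The worked example above can be reproduced with a few lines of Python (standard library only; the t* value is read from a t-table, exactly as in the post):

```python
import math
import statistics

# The thirty measurements from the example, expanded from the frequency table
temps = ([14] + [19] + [20] * 6 + [21] * 3 + [22] * 9
         + [23] * 5 + [24] * 2 + [25] * 2 + [27])

n = len(temps)                 # 30
mean = statistics.mean(temps)  # 21.8
s = statistics.stdev(temps)    # sample standard deviation, about 2.31

t_star = 2.045                 # 95% critical value, 29 degrees of freedom (from a t-table)
margin = t_star * s / math.sqrt(n)

print(f"{mean:.1f} C +/- {margin:.2f} C")  # 21.8 C +/- 0.86 C
```

Note how the ±0.86°C confidence interval is much tighter than the ±1°C accuracy of any single measurement.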

This is not secret information. Mr. Pournelle is basically claiming that inference about a population mean, taught in every college-level statistics course on the planet, is impossible.

The other bad news for Mr. Pournelle is that temperature measurements in the modern era have an accuracy much better than ±1°C. In my own work, I use cheap (<$100 if you buy in bulk) temperature loggers (found here) that have an accuracy of ±0.2°C, but since I need better accuracy than that, I simply set them to take a temperature measurement every 15 seconds and average the large number of measurements to get a mean with an accuracy better than ±0.2°C, and I work with that mean. If you collect a measurement every 15 seconds, you collect 240 measurements an hour, which gives you a lot of data to average from. It turns out that NOAA reports temperatures to 0.1°C every 10 minutes, example here, so either they use a better sensor than I do or they take many measurements in that 10 minutes.

To be fair, older temperature measurements will not have the same accuracy. But if you have a large number of measurements, you can still do inference about the population mean.
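For the logger numbers quoted above, the gain from averaging is easy to quantify (a sketch; note that averaging only shrinks random noise, it cannot remove a systematic bias in the sensor):

```python
import math

sensor_sigma = 0.2       # logger accuracy from the post, in deg C (treated as random noise)
readings_per_hour = 240  # one reading every 15 seconds

# Standard error of the hourly mean shrinks by sqrt(n)
sem = sensor_sigma / math.sqrt(readings_per_hour)
print(f"standard error of the hourly mean ~ {sem:.3f} C")  # ~ 0.013 C
```

That is, an hour of 15-second readings turns a ±0.2°C sensor into an hourly mean good to roughly a hundredth of a degree, as far as random noise is concerned.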

I wish I knew exactly what day Larry Niven met Jerry Pournelle. I’d like to mark it on my calendar, in a “Black Tuesday” sort of way.

Aliens?

I like the mote series. I generally like hard sci fi.

A novel approach to escape velocity, sticking a series of nuclear roman candles up your ass. That’s not hard SF, that’s hard stupid. Feh!, as they say on We Made It.

There are no empirical facts to support climate change. It’s a hypothesis that may or may not be true based on computer models that cannot possibly contain all the variables. The only technical solution being pursued is the reduction of carbon dioxide and not technology aimed at lowering atmospheric temperatures.