Please explain Climategate.

Climate models are meant to be predictive of what is going to happen where. The basic math to determine whether things will heat up or cool down can be done with pencil and paper.

There’s a reason why there is no climate model that doesn’t show heating when CO2 is introduced. Half of all climate scientists are employed by big business and have (in theory) been under pressure to disprove global warming. The theory of global warming was introduced 103 years ago, so it’s not as though there hasn’t been plenty of time for challengers to offer an alternative to anthropogenic warming and pay some college kid to code up something that explains where we are. But like I said, the basic idea of the greenhouse effect is pretty simple. Different molecules absorb and radiate heat at different rates. The earth scrubs particular molecules from the air at particular rates. Add those together and you’ve got an answer. No one has been able to come up with a climate simulator that reaches a different result, because to do so you would have to toss out our basic understanding of the greenhouse effect.
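For what it’s worth, here is roughly what that back-of-the-envelope arithmetic looks like, using the widely quoted simplified expression for CO2 forcing; the sensitivity value is an illustrative assumption, not a measured constant:

```python
import math

# Toy calculation of equilibrium warming from a CO2 increase, using the
# commonly quoted simplified forcing expression dF = 5.35 * ln(C / C0) W/m^2.
# The sensitivity parameter is an illustrative assumption, not a measured value.

C0 = 280.0          # pre-industrial CO2 concentration, ppm
C = 560.0           # doubled CO2, ppm
SENSITIVITY = 0.8   # assumed equilibrium sensitivity, K per (W/m^2)

forcing = 5.35 * math.log(C / C0)   # radiative forcing from the CO2 change, W/m^2
warming = SENSITIVITY * forcing     # equilibrium temperature change, K

print(f"Forcing: {forcing:.2f} W/m^2 -> equilibrium warming: {warming:.2f} K")
```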

Is there even any evidence yet that the e-mails said any such thing, or are we still all just taking the word of the crackers who broke in in the first place? Until that’s resolved, it’s kind of silly to even ask any other questions.

And they did, and that is one of the reasons many say that this “gate” does not change much. Your point also ignores the fact that models from other nations arrive at almost the same conclusions. (That is just another reason why many scientists agree that models are a good tool to use.)

http://www.nec.com/global/environment/featured/warming/04_esim_iv01.html

Of course they do. As documented in the emails, they were sharing tricks to get the models to match up to each other’s results.

They came to the same conclusions, but not by independent means.

As explained before, repeating the claim that “trick” meant something nefarious is a silly assertion.

Sorry, you need better evidence than a reckless say-so.

That’s ridiculous. There is far too much data (gigabytes!), far too many inputs, far too many algorithms, and far too many forcings being applied to even come close.

Who said anything about nefarious? If the scientists use the same technique to get the same results, that is not independent verification of either the technique OR the results.

And for what it’s worth, that explanation is complete BS. It’s very clear he’s talking about merging the instrument record into his data series in order to fudge the results, not just plotting the two side by side. Plotting them side by side would not have the effect of ‘hiding the decline.’

Everyone can see that you are ignoring the fact that that action was explained, and in the open.

And the researchers who took a look before saw nothing improper. Once again, you have no evidence to support your say-sos.

I’ll bet the really tricky part for the Climategate conspirators was faking things like the melting of the glaciers and polar icecaps, especially in the Arctic; the changes in the latitudes where certain animals and plants are found; and stuff like that.

But it’s amazing what you can do these days with a few lines of computer code. :slight_smile:

Whatever might have been meant by “hiding the decline,” is there an underlying problem with the use of tree-ring data, if it does not match up with the instrumental record in recent years? What is the basis for considering tree-ring data a useful indicator?

(I’m not trying to argue that it’s not; I’m just looking to understand this point).

Actually, no. The technique has been pretty thoroughly discredited. See Mann’s Hockey Stick, etc. But you already know that from previous threads.

:dubious: That’s funny. I’m pretty sure I acknowledged their explanation. Oh, it’s right here:

Just because an explanation is offered doesn’t mean it’s a good one. And this one is a whopper.

It looks like climate change deniers are not just trying to attack the science. There’s also

Break-ins, computer hacking, as well as nasty emails and phone-call harassment

Oh, that. That’s just scientific research, denialist-style.

My worry about the science is this. The Harry_read_me file is probably a symptom. It shows appalling problems in the science. Yes, it is about recreating the database. But what it does is tell us the mechanisms by which the database is created and maintained, and how the synthetic data is created. And it doesn’t smell good. There is a whole range of assumptions and approximations used to create synthetic data, to interpolate data, and for very rudimentary culling of bad data. Some parts of the process will be pretty robust to coding errors. Others will not. But there has been no analysis or testing to determine the level of robustness.
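To illustrate the kind of thing I mean (a made-up toy, not anything from the CRU code): even something as mundane as how missing months are handled can shift a derived annual mean, and the choice amounts to an assumption that ought to be tested rather than assumed away.

```python
import numpy as np

# Made-up toy, not anything from the CRU code: the derived annual mean for a
# station depends on how missing months are handled. The only point is that
# infilling / synthetic-data choices carry assumptions whose impact needs to
# be tested, not assumed away.

months = np.arange(12)
seasonal = 10.0 + 8.0 * np.sin(2 * np.pi * (months - 3) / 12)   # idealised monthly temps, deg C
record = seasonal.copy()
record[[0, 1, 11]] = np.nan                                     # winter months missing from the record
climatology = seasonal                                          # long-term monthly averages (from other years)

annual_true = seasonal.mean()
annual_skip = np.nanmean(record)                                # option 1: just average what is there
annual_infill = np.where(np.isnan(record), climatology, record).mean()  # option 2: infill from climatology

print(f"true annual mean:           {annual_true:5.2f} C")
print(f"skipping missing months:    {annual_skip:5.2f} C")      # biased warm: the gaps were cold months
print(f"infilling from climatology: {annual_infill:5.2f} C")
```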

The question that must be answered is this: is global warming anthropogenic? The argument that global warming is clearly happening and thus this stuff isn’t important misses the point. It isn’t enough to show it is happening. It is critical that the data be good enough to allow computational climate models to be validated. And the data isn’t good enough.

The mathematics of global heating is appallingly complex. You have a number of coupled models, plus a range of parameters that affect the system, many of which are not well known. To give a few examples: sea surface temperature is probably the most important determinant of much of the climate. It is affected by the global ocean currents, with cyclic phenomena such as El Niño and long-term flows such as the North Atlantic Conveyor being utterly critical. The amount of sunlight an ocean receives and the inflow of fresh water also affect things. It is a huge 3D computational fluid dynamics problem, with the flows running around a model of the continents and ocean beds, modelling all the energy flows as far as possible.

Then you have the atmospheric modelling. Same idea, except this time you have a multi-mode flow, where you need to understand how clouds and rain work as well. Once you get to clouds you get into the problems that become contentious. This means modelling aerosol contributions to energy flow, where the parameters are not well understood and are subject to argument. There is also the manner in which cloud cover forms a feedback system. Clouds are formed by oceanic evaporation: more heat in the oceans means more cloud, but also less sunlight. So what are the parameters? How do you validate your model? It isn’t enough to get the model to the point where the oceans cease to boil. You need to get it to the point where it matches all the measured data and energy flows.

And you need to test the model for stability. These coupled multi-mode models contain many feedback paths. You have to know what the stability margins are, and you have to know what stability issues your various approximations cause. Just gridding an algorithm can cause things to go unstable (a toy example is sketched below). Some of the stability margins are what we are trying to find out; the positive feedback loop for CO2 is the most important of these. It isn’t good enough to say that the climate is warming. We have to know where, when, and by how much.

These guys were responsible for the database telling us what the observed data was. And there is enough of a question mark over the manner in which that data was massaged and corrected to seriously worry that the critical gap - the ability to compare theoretical models with observed data - is compromised. I don’t know what the state of play with other countries’ work is. But the impression I get is that the CRU was the main gatekeeper for the world climate data. They were the reference point, there was no independent duplication of their work being done, and no separate database was maintained elsewhere.
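To make the “just gridding an algorithm can cause things to go unstable” point concrete, here is a toy that has nothing to do with climate models specifically: the explicit finite-difference scheme for simple 1-D heat diffusion blows up as soon as the time step exceeds the scheme’s stability limit.

```python
import numpy as np

# Toy illustration of a discretization-induced instability: the explicit scheme
# for 1-D heat diffusion is stable only while dt <= dx^2 / (2 * kappa). This is
# not a climate model; it just shows the kind of stability margin meant above.

def diffuse(dt, steps=100, nx=50, kappa=1.0, dx=1.0):
    u = np.zeros(nx)
    u[nx // 2] = 1.0                                    # initial hot spot
    for _ in range(steps):
        u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return np.max(np.abs(u))

print("dt = 0.4 (within the limit):", diffuse(0.4))     # heat just spreads out
print("dt = 0.6 (past the limit):  ", diffuse(0.6))     # the solution explodes
```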

And this is where we get into trouble. Two weeks ago I would have defended the science being done. Now I am deeply worried.

But again, all of that is only necessary to tell us what is going to happen where. You can create a significantly simpler model if all you care about is whether CO2 will raise the global temperature. You keep a few variables with the PPMs of the molecules in the atmosphere, you have a few constants with the radiative ability of each molecule, you have the rate at which the ocean sucks up heat and CO2, and you have the rate at which CO2 is scrubbed from the air. You don’t need wind or currents or anything more than that. You don’t need to track regions or movements or anything. It’s all just arithmetic in a loop.
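Something like this is what I mean - a zero-dimensional sketch where every rate and constant is an illustrative assumption, chosen only to show the structure of the loop:

```python
import math

# Zero-dimensional "arithmetic in a loop" sketch. Every rate and constant here
# is an illustrative assumption chosen only to show the structure, not a tuned
# or validated value.

co2 = 280.0            # ppm
warming = 0.0          # K relative to the start
EMISSIONS = 2.0        # ppm of CO2 added per year
UPTAKE = 0.02          # fraction of the excess CO2 scrubbed per year (ocean/biosphere)
SENSITIVITY = 0.8      # K of equilibrium warming per (W/m^2) of forcing
LAG = 0.03             # fraction of the remaining disequilibrium closed per year (ocean heat uptake)

for year in range(1, 201):
    co2 += EMISSIONS - UPTAKE * (co2 - 280.0)          # add emissions, scrub some of the excess
    forcing = 5.35 * math.log(co2 / 280.0)             # simplified CO2 forcing, W/m^2
    target = SENSITIVITY * forcing                     # where the temperature is headed
    warming += LAG * (target - warming)                # slow approach due to thermal inertia
    if year % 50 == 0:
        print(f"year {year:3d}: CO2 = {co2:6.1f} ppm, warming = {warming:.2f} K")
```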

There are several different climate simulators. The analysis of where the peaks and lows of temperature and weather will occur is based on looking at the output of all of them, not just the CRU’s. If all of them were just spitting out random information, there’d be no way to synthesize that data. The fact that they have been able to do so is evidence that the data is meaningful.

No, you missed the point: your saying that “that explanation is complete BS” was itself BS. Do you have any support for your assertion other than ‘because I say so’?

But don’t these simulators have difficulty modeling the last ten years? Isn’t that mentioned in one of the CRU emails? CO2 has continued to rise but the correlation with temperature increase has fallen off.

Hey, I’ve learned a valuable lesson here. You can really cripple an organization with Freedom of Information requests if you time them right and are persistent.

Well, we need a cite now to see where you are coming from.

So far I have only seen that assertion out of context; please provide a cite so we can see if it is the same issue.

It has been understood for a long time that the increase was not going to be steady.

Latif explained why there will be periods where it looks like the temperature is going down. In this case, it is clear that the misleading move is to grab a very hot year, 1998, and claim that warming has stopped. The explanation here mentions that the variation would most likely be used by contrarians to say that there is no more warming. And he was 100% correct.
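You can see the 1998 effect with nothing but made-up numbers: take a noisy series with a steady underlying warming trend, make one year exceptionally warm, and a trend computed from that year onward will usually come out much weaker than the long-term trend, and can even go negative, without the warming having stopped at all.

```python
import numpy as np

# Made-up numbers, purely for illustration: a noisy series with a steady
# underlying warming trend and one exceptionally warm year. A trend that starts
# at the warm year will usually come out much weaker than the long-term trend,
# and can even go negative, even though the underlying warming never stopped.

rng = np.random.default_rng(42)
years = np.arange(1979, 2009)
anomaly = 0.02 * (years - 1979) + rng.normal(0.0, 0.1, size=years.size)
anomaly[years == 1998] += 0.3                       # one exceptionally warm El Nino year

def trend_per_decade(start_year):
    mask = years >= start_year
    slope = np.polyfit(years[mask], anomaly[mask], 1)[0]
    return 10.0 * slope

print(f"Trend 1979-2008: {trend_per_decade(1979):+.3f} K/decade")
print(f"Trend 1998-2008: {trend_per_decade(1998):+.3f} K/decade")
```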