My profound apologies, jshore, indeed I mixed up posters.
I was responding to SentientMeat’s incredulity when I said that there was such a thing as exponential decay of CO2, and that exponential decay means that if we freeze our emissions at current levels, the atmospheric concentration will stabilize and not increase indefinitely.
My point about the IPCC remains - exponential decay is accepted by everyone, including the IPCC. The question relates only to the timescale, not to the mechanism; the mechanism is common scientific knowledge, despite SentientMeat’s protestations to the contrary.
I am not asking you to believe anything. I am reporting that my research into the question came up with the same answer, independently, that Mark Jacobson came up with. This is an e-folding decay time on the order of 35 years. The IPCC says that it is “nominally about 150 years”, which doesn’t help a lot. In addition, they say “No single lifetime can be defined for CO2 because of the different rates of uptake by different removal processes.”
This is a common misconception, which you repeat, saying " the actual decay of CO2 levels back to their pre-industrial values (once we essentially cease our emissions) is not expected to be a simple exponential, but will have a long tail, as noted in IPCC report."
In fact, the combination of any number of exponential decay processes with different e-folding times, operating in parallel on the same well-mixed pool, is exactly equivalent to a single exponential decay process whose rate is the sum of the individual rates. I have posted a mathematical proof of this here.
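The parallel-sinks case is easy to check numerically. Here is a minimal sketch (my own illustration, with made-up rates) of two removal processes acting on the same well-mixed pool, showing that the combined decay is a single exponential whose rate is the sum of the two rates:

```python
import numpy as np

# My own illustration with hypothetical rates: two removal processes
# acting in parallel on one well-mixed pool, dC/dt = -(k1 + k2)*C,
# behave as a single exponential whose rate is the sum of the rates.
k1, k2 = 1 / 50.0, 1 / 120.0        # hypothetical sink rates (1/years)
C0 = 100.0
t = np.linspace(0.0, 200.0, 2001)
dt = t[1] - t[0]

# Integrate the two-sink ODE with a small explicit Euler step.
C = np.empty_like(t)
C[0] = C0
for i in range(1, len(t)):
    C[i] = C[i - 1] - dt * (k1 + k2) * C[i - 1]

single = C0 * np.exp(-(k1 + k2) * t)  # one exponential, rate k1 + k2
print(np.max(np.abs(C - single)))     # small: the two curves agree
print(1.0 / (k1 + k2))                # effective e-folding time (years)
```

With these particular (made-up) rates the effective e-folding time comes out close to 35 years, but that is a coincidence of the chosen numbers, not a derivation.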
Thus, what both Mark Jacobson and I are calculating from the instrumental data is the effective e-folding lifetime.
The IPCC uses what’s called the Bern model of CO2 decay. This model assumes that 25% of the airborne CO2 decays with an e-folding time of 171 years, 28% with 18 years, 32% with 2.6 years, and 15% stays in the atmosphere forever … I’ve never met anyone who can explain to me how that can happen physically.
For example, the proposed 2.6 year e-folding time operating on a third of the CO2 molecules assumes that there is some mechanism that only works on new CO2, and once a particular CO2 molecule has been in the air for about ten years or so, that mechanism no longer works on that molecule, but continues to work on CO2 molecules that have been in the air for only a few years … and what’s up with the 15% that is never absorbed by any decay mechanism? How do the decay mechanisms recognize those molecules? What immunizes them from decay? Perhaps you can explain all of that … I can’t, nor can anyone I’ve asked. CO2 is a well mixed gas, so all molecules are available to all decay mechanisms.
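For reference, here is a sketch of the multi-exponential response described above, using the fractions and e-folding times quoted in the post (treated as illustrative, not as the official Bern parameter set). Whatever one thinks of the physics, the mathematics is clear: the instantaneous e-folding time, -C/(dC/dt), is not constant, which is what distinguishes such a sum from a single exponential:

```python
import numpy as np

# Fractions and e-folding times as quoted in the post above
# (illustrative values, not the official Bern parameter set).
fracs = np.array([0.15, 0.25, 0.28, 0.32])
taus = np.array([np.inf, 171.0, 18.0, 2.6])   # years; inf = "forever"

def response(t):
    """Airborne fraction remaining t years after a unit pulse."""
    return float(np.sum(fracs * np.exp(-t / taus)))

# The instantaneous e-folding time grows as the fast pools empty.
for t in (1.0, 10.0, 100.0):
    dt = 1e-3
    c = response(t)
    dcdt = (response(t + dt) - c) / dt
    print(t, round(c, 3), round(-c / dcdt, 1))
```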
Thanks for the post, lazybratsche. Actually, the question is quite relevant, because the mechanism (exponential decay) works whether the emissions are increasing or not.
The IPCC assumes, based on what I see as a flawed, physically impossible model, that the CO2 e-folding lifetime is on the order of 150 years. The data says more like 35 years. The difference is very significant for the future CO2 concentrations, regardless of the rate of emissions. We estimate future concentrations based on two things - emission levels, and e-folding lifetime. If the lifetime is shorter, more CO2 is sequestered, and the concentration does not rise as fast.
Thus, the question of the e-folding lifetime is very important.
What you show here is correct as far as it goes, i.e., if you have a number of exponential decay processes operating in parallel then the total decay will also be exponential with a rate equal to the sum of the rates of the individual processes. However, I think in the real world, it is more complicated than this. For one, while some of the rate processes are exponential, some probably are not (e.g., the rate may be practically independent of concentration). However, more importantly, some processes don’t operate in parallel but in series…For example, the surface ocean waters absorb a certain amount of CO2 quite quickly but once they get saturated, the rate-limiting step is then the much slower process of the overturning of the ocean waters to bring deep waters with a lower concentration of CO2 dissolved in it back up to the top.
Well, since I am not really “up” on this carbon cycle modeling, I am afraid I am not qualified to give an explanation. However, I assume it accounts for issues such as the one I explained above involving the oceans. I’ll try to look into it when I find the time.
At any rate, even if we make the draconian assumption that you are right and the IPCC, which represents the combined views of hundreds of experts, is wrong here, we are still left with the fact that lazybratsche has noted that the worldwide emissions of CO2 are continuing to increase…which means the level at which CO2 will top out in your simple model is continuing to increase. In order for CO2 to stabilize at 420 or 440 ppm or whatever even in your model, we would have to stabilize our CO2 emissions rather than allowing them to continue to increase. And, like I said, there is absolutely no reason that I can see to believe that your simple assumptions are right and the combined expertise of those in the field is wrong (especially given that your argument is basically phenomenological and they are actually looking at the mechanisms involved). It’s just that I don’t have the background to explain precisely why your model is wrong.
jshore, thanks as always for your thoughtful replies. The situation you describe certainly can occur, where a reaction gets saturated, and slows down. The problem is that CO2 is what is called a “well-mixed ghg”, that is, it is evenly mixed throughout the atmosphere. Therefore, all of the molecules are subject on average to the same sinks. If a sink slows down for one, it slows down for all. This makes the fractioning of the CO2 into parts, each of which is subject to a different e-folding time, quite problematic. This is the problem I described below:
There are several difficulties with this point of view:
The IPCC does not represent the combined views of hundreds of experts. It represents the views of scientists who have been selected because they agree with the point of view previously expressed by the IPCC. There are a number of experts who have quit in disgust because their views were being ignored, or changed in committee. There are other experts who have been refused by the IPCC, despite their government recommending them for inclusion. There are other experts who have never been recommended because their views disagree with the IPCC.
Your statement is as nonsensical as saying “The National Rifle Association represents the combined views of hundreds of gun experts” … THERE IS NO CONSENSUS VIEW ON CLIMATE SCIENCE. The field is far too poorly understood. It’s not my view against everyone else. Mark Jacobson, one of the premier climate modelers in the world, agrees 100% with my view, I’ve given you the citation. How do you explain that?
My argument, like Jacobson’s, is based on an examination of the instrumental data. The Bern model, as the name implies, is based on a climate model. This is a bit different than “phenomenological” vs “looking at the mechanisms” as you imply. They are looking, not at the mechanisms, but at models of the mechanisms. The Bern model has been “validated”. Care to know how? By comparing it with other models … for example, “the parameterization of surface-to-deep transport by eddy diffusion was evaluated for anthropogenic CO2 and radiocarbon using the GFDL ocean transport model [Joos et al. (1996)]”
As I pointed out to lazybratsche, the e-folding time is central to the discussion of the effects of future emissions. Let us say that CO2 emissions continue to rise at the current rate of ~0.5%/year for the rest of the century. This seems very doubtful - exponential increases rarely continue for that long in the face of evolving technologies, and the population is slated to level out by about 2050 - but let’s take that as an extreme assumption.
If the e-folding time is ~35 years, in the year 2100 we will have 460 ppmv of CO2. Using the Bern model, on the other hand, gives us 580 ppmv by 2100. You can see why the question is important. (Note that neither of these are as large as SentientMeat’s figure of 600 ppmv by the end of the century.)
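The qualitative difference between the two lifetimes can be illustrated with a toy one-box model. This is my own sketch, not the calculation either Jacobson or the IPCC actually performed; the 380 ppmv present-day and 280 ppmv pre-industrial baselines and a current net rise of ~2 ppmv/yr are my assumptions, so the printed numbers will not exactly reproduce the 460/580 figures above:

```python
# Toy one-box model (my own sketch): dC/dt = E(t) - (C - C_pre)/tau,
# with net emissions growing 0.5%/yr. The 380/280 ppmv baselines and
# the ~2 ppmv/yr current rise are assumptions for illustration.
def project(tau_years, years=94, c0=380.0, c_pre=280.0,
            rise_now=2.0, growth=0.005):
    # Pick the initial emission rate (ppmv/yr) so that today's rise
    # matches rise_now under the assumed lifetime.
    e = rise_now + (c0 - c_pre) / tau_years
    c = c0
    for _ in range(years):
        c += e - (c - c_pre) / tau_years   # yearly Euler step
        e *= 1.0 + growth
    return c

# The shorter lifetime ends the century far lower than the longer one.
print(round(project(35.0)), round(project(150.0)))
```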
There is an interesting discussion of the climate sensitivity here. Basically, it makes a good case that the sensitivity is 0.35 ± 0.09 °C per W/m2.
I mentioned before that the two estimates (exponential decay and Bern model) of the continuation of the current 0.5%/yr increase in emissions are 460 and 580 ppmv respectively.
This gives us a projected increase of 0.3-0.4°C for the exponential decay, and 0.6-1.0°C for the Bern method.
The good folks over at NASA, James Hansen and all, say the sensitivity is ~0.6° per W/m2. This would make the projected increase 0.6°C for the exponential decay, and 1.4°C for the Bern model.
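These warming figures follow from the standard simplified forcing expression ΔF = 5.35·ln(C/C0) W/m2. A quick check of the arithmetic (the 380 ppmv present-day baseline is my assumption):

```python
import math

# Checking the arithmetic above with the standard simplified forcing
# formula dF = 5.35 * ln(C/C0) W/m2; the 380 ppmv baseline is my
# assumption for the present-day concentration.
def warming(c_final, sensitivity, c_base=380.0):
    """Equilibrium warming (deg C) for a sensitivity in deg C per W/m2."""
    return sensitivity * 5.35 * math.log(c_final / c_base)

for c in (460.0, 580.0):            # the two projected concentrations
    for lam in (0.35, 0.6):         # the two sensitivities quoted above
        print(c, lam, round(warming(c, lam), 2))
```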
You can see why the exact values of these two parameters are crucially important to the discussion. It is also clear that there is no agreement in the scientific community about the values.
w.
PS - The discussion also provides additional evidence that cosmic rays (modulated by the sun’s magnetic strength) are an important driver of the climate. These are one of a number of climate drivers and feedbacks totally ignored by the GCMs. It is not unrelated that the sun’s magnetic strength is currently the highest in the last 8,000 years …
Oh, for fuck’s sake … Okay, okay, I lost the bet. I don’t question your data that more carbon dioxide is being sequestered (in fact, I’ve already stated that I never meant to in the first place). I therefore unreservedly apologise to you, for making you think that that was what I wanted to bet on. Sorry.
So, let’s now move on and talk about something which isn’t so frigging irrelevant: the fact that carbon dioxide emissions are increasing even more, such that the concentration increased by 2.6 ppm in 2005.
If you remember what kicked this line of debate off, I said that the CO2 concentration would hit 600ppm this century at this rate. You chided me for assuming simple physics (and compared aluminium and human beings) and said that, actually, it would top out at 425 ppm, citing Le Chatelier’s principle. You also said that even if such a high concentration as >550ppm was reached, this would only cause a half-degree or so temperature rise.
I would like to thank you for telling me about the increased sequestration - I had wondered whether we were actually past the point of no return already, and it seems that the climate could recover with decisive action on emissions ASAP. But there’s still a whole host of questions remaining which you seem to have ducked out of so far:

Why are you assuming that the global climate can be represented by a simple equilibrium reaction? Like you said yourself, a complex system like a human being simply cannot be modelled like this. Just as it won’t show the heat-conducting properties of aluminium, it may very well not show the reaction kinetics of an equilibrium reaction when you keep feeding uniform shots of ethanol into it.

Why are you using only the absolute most optimistic estimate of sensitivity to CO2, given what you say yourself about how little we know about climate modelling? Surely that uncertainty must admit that worst-case scenarios are possible?

Why are you assuming zero time lags in your estimate of the warming due to the twentieth century’s emissions? Surely that uncertainty again makes it possible, if not likely, that we are only measuring the warming from years, perhaps decades, ago?

Why are you only including negative feedbacks in your prediction of future stabilisation, given … that’s right … the uncertainty you repeatedly emphasize? Felled rainforests, acid-affected phytoplankton and melting tundra surely won’t have a beneficial effect on sequestration rates, will they?

If you could find time to reply to these points I’d be very grateful. However, I’ve got one more question to add, which is really to do with the bet I was trying (and failing - sorry again) to ask you about.
What results would you need to see to convince you that the CO2 concentration won’t stabilise as you predict?
You see, for it to top out at 430 ppm or so as you suggest, that annual increase (2.6 ppm for 2005, but I know you suggest nearer 2 ppm) is going to have to start shrinking pretty soon. Another 2.6 ppm increase this year will leave only 20 years or so before your predicted limit, and for every year that the increase stays above 2 ppm the brakes are going to have to be slammed on ever more abruptly.
So, to rephrase the question (and I’d be very grateful if you could address both phrasings) would another, say 3 years of >2.5 ppm increases have you saying “Actually, yes, now we do need some kind of Emissions Protocol.”? If not, what would? It seems that your stabilisation depends on us stabilising emissions, but you don’t seem to advocate us doing so!
First of all, while the IPCC process may not be perfect, it does in fact represent the current views in the peer-reviewed literature. In fact, one finds that most papers appearing in the peer-reviewed literature reference it in regards to the current view in the field. Even some of the so-called skeptics have grudgingly admitted that the IPCC report represents a pretty good overview of the science and have focussed their ire on the “summary for policymakers”, which they claim oversimplifies the discussions in the full report and understates or underdiscusses uncertainties. However, this is irrelevant to the discussion at hand since we aren’t looking at that summary.
Second of all, Mark Jacobson, whatever you may think of his climate modeling skills, is not as far as I can tell an expert on the carbon cycle.
Third of all, he is using his simplified model of the carbon cycle to do a different calculation than the one you are using it for…i.e., he needs it just to be good enough to get a handle on the relative effect of CO2 vs. black carbon particulates, mainly over the near term. He does not make the claim that his model is a particularly good predictor of how CO2 levels would eventually level out if emission rates were held constant. It is not at all clear to me that Jacobson would endorse you using his model in the way that you have.
Fourth of all, you are taking Jacobson’s predictions further than he seems willing to. I.e., in that paper, he takes the range of lifetime for CO2 to be between 30 and 95 years. He thinks the upper bound is generous and a more likely one would be 50 or 60 years but he still chooses to keep the broader range because he understands that there are lots of uncertainties associated with his calculation in Fig. 1. If you use the upper limit of 95 years, then in fact his Fig. 2 assuming a constant anthropogenic emission rate of CO2 gives a CO2 concentration of somewhere around 540-570ppm in 2100 and still rising significantly with time. And, this is, I emphasize, under the optimistic assumption that we hold emission rates constant.
I didn’t know this zombie was still walking. You can theorize all you want, but I really don’t believe all the doom and gloom of global warming. The climate on Earth is variable anyway. I think we have more important things to worry about, frankly - running out of oil as a viable fuel source being near the top of the list. I think we should be exploring alternate fuels not because of some insignificant climate change, which may or may not be happening because of man’s influence, but because the whole damned global economy is based on oil!
You are a gentleman and a scholar, sir, and I salute you.
This is a perfect example of why you should never get your scientific information from the media. So before we discuss this claimed increase, let’s look at the actual data. According to NOAA, the peak CO2 increases year-over-year were as follows:
In other words, your claim that CO2 is “increasing even more” is simply not true. Nor is the claim that the increase in 2005 was 2.5 ppmv, heck, it didn’t even beat the increase in 1973. Never trust the media, always go to the source.
In order of the questions:
I do not “assume” that atmospheric CO2 levels can be described by exponential decay. I use exponential decay because it is the calculation which best fits the data. Which is also why everybody in the field uses exponential decay. The only discussion is about the exact parameters of the decay, not its existence.
Actually, the most optimistic sensitivity is that of Idso, which is about 0.1°C per Watt/m2. If you have not read it, I strongly suggest you do so. Certainly, worst-case scenarios are possible. I have detailed some of them here, just a few posts above this one.
I am not assuming zero time lags. The climate sensitivity figures are estimates of the equilibrium response to the forcings, including the lags. The most fundamental of these estimates relates the total greenhouse forcing (~325 W/m2) to the total temperature rise due to the greenhouse effect (~33°C). As you can see, this is about 0.1°C per Watt/m2, and includes all time lags as well as all feedbacks and parasitic losses.
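A quick check of that back-of-envelope division, using the two figures quoted above:

```python
# Back-of-envelope check: ~33 deg C of total greenhouse warming
# divided by ~325 W/m2 of total greenhouse forcing (both figures
# as quoted in the post above).
sensitivity = 33.0 / 325.0
print(round(sensitivity, 3))   # deg C per W/m2
```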
While there are both negative and positive feedbacks in the system, the negative feedbacks must predominate. The earth’s temperature has varied by about ±6°C over the last couple of billion years, a swing of about ±2%. During that time, the sun’s intensity has increased about 30%, without a corresponding change in the earth’s temperature. In addition, there have been meteor strikes, volcanoes, and a host of other things that destabilize the temperature. This is clear evidence that the negative feedbacks on average exceed the positive feedbacks.
In this regard, it is important to distinguish between feedbacks and parasitic losses. One of the consequences of thermodynamic theory is that parasitic losses increase with deltaT, the temperature difference. Thus, they will always act to reduce any potential temperature increase.
I point out the negative feedbacks because by and large, they are not included in the climate models, while the positive feedbacks are included.
I don’t “suggest” nearer 2 ppm, that’s the average (the actual average over the last ten years is 1.94 ppmv). Note that 2.15 is the actual figure for 2006, and 2005 was at 2.20 ppmv. Since your theorized “another 3 years of >2.5 ppm increases” has already failed on two counts, does that have you saying we don’t need an emissions protocol?
Also, I don’t think you’ve quite gotten the idea of exponential decay and e-folding times, because you say “Another 2.6ppm increase this year will leave only 20 years or so before your predicted limit …” which is a linear extrapolation. You can’t do that with exponential decay.
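To make that point concrete, here is a sketch with my own illustrative numbers (tau = 35 years, constant emissions consistent with a 2.6 ppm/yr rise today): under exponential decay the concentration relaxes toward an asymptote rather than climbing linearly into a ceiling, so dividing the remaining gap by this year's increase badly misestimates the approach.

```python
# Sketch (my own illustrative numbers): with constant emissions E and
# e-folding time tau, the one-box model relaxes toward an asymptote
# C* = C_pre + tau*E instead of rising linearly into a ceiling.
tau, c_pre = 35.0, 280.0
c0, rise_now = 380.0, 2.6            # assumed present state (ppmv, ppmv/yr)
E = rise_now + (c0 - c_pre) / tau    # emissions consistent with that rise
c_star = c_pre + tau * E             # the asymptotic concentration
c = c0
for year in range(100):
    c += E - (c - c_pre) / tau       # yearly Euler step
print(round(c_star, 1), round(c, 1)) # c approaches c_star from below
```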
Let me make it clear where I stand on the issue of your two questions. Here is what I believe:
A) In order of importance to the general temperature of the earth, I would say the major factors are:
The sun, including both the change in intensity and more important, the change in the sun’s magnetic strength minus the earth’s geomagnetic strength.
Land use changes, in particular converting forest to pasture and pavement.
Changes in greenhouse gases.
B) I would place the current relative weight of these at something like 60% : 30% : 10%. These are very rough figures. See NASA for more information.
C) The major effect of these changes has been a slight spreading of the isotherms away from the equator. The tropical regions haven’t been getting much warmer.
D) Instrumental evidence shows that the major temperature increases have been during the winter, outside the tropics, and at night.
E) Cold kills more humans than heat does, especially the poor.
F) Ice cores show that the world has been warmer than it is today, both in the last thousand years, and warmer yet in the last ten thousand years (during the Holocene).
G) China is currently building eight 500MW coal-fired power stations per week.
H) Underground coal fires in China alone put more CO2 into the air than the entire US fleet of cars and light trucks.
I) The results of computer models vary so widely as to be useless for purposes of forecasting 100-year time horizons.
And my conclusions are:
A) There is no practical emissions control system that will make the slightest bit of difference to the future temperature. In part this is because China and India and most of the developing world won’t sign on, but mostly it’s due to the small impact of GHGs on the temperature.
B) Overall, slight warming will be a benefit. Warming winter nights is not going to do much harm.
C) All of the evils predicted by AGW adherents are with us today - floods, droughts, sickness, hurricanes, rising sea levels, heat waves, cold spells, we have them all. Any of these may increase whether or not the globe warms up by a couple degrees.
D) The “no regrets” strategy is to figure out how to better protect ourselves from an increase in any of these hazards, no matter what the cause of the increase might be.
jshore, always a pleasure to hear from you. You say:
While I agree that the IPCC represents the opinion of the majority of climate scientists, it gives short shrift to the opinions of the many scientists who do not agree with the majority. In part this is in the IPCC process, and in part it is due to the exclusion of many people who do not agree with the revealed IPCC wisdom.
One of the most egregious examples of this is their treatment of the economic scenarios. Despite being notified a couple of years ago that the use of market exchange rate (MER) analysis in place of purchasing power parity (PPP) analysis badly skews their results, and despite agreeing that in fact it does skew their results, and despite the fact that every other major agency in the world (all the rest of the UN, the EU, the World Bank, and everybody else) uses PPP because MER gives bad results, the IPCC have said they “don’t have time” to change the economic scenarios (e.g., A1, B2, etc.), but that they will change them for the fifth assessment report.
Riiiight …
Mark actually is quite expert on the carbon cycle, and is recognized as such, which is why he writes about things like the effect of black carbon on the global temperature. Making a model as sophisticated as his, far beyond what other modelers are doing, requires extensive knowledge of a variety of processes, including the carbon cycle and e-folding times.
Say what? He gives a graph (Figure 2) of how CO2 levels would eventually level off if emission rates are held constant, says that the IPCC calculations are wrong, and discusses why. In other words, he does everything you say he doesn’t do.
While the instantaneous value varies between 25 and 100 years, the average during the period is 31-43 years, in line with his confidence interval of 30-60 years. Gaffin found an estimate of 30 years over the period. However, it is possible to refine this further by looking at a longer time frame.
When we look at the emissions from 1850-2003, an e-folding time of 60 years does not fit the data at all. The best fit with the data is an e-folding time of ~35 years, so this is the time I have used in my analysis above.
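For readers curious what such a fit looks like in practice, here is a sketch of the general procedure (entirely my own construction, with synthetic "observations" generated from a one-box model with tau = 35 plus noise, purely to show that the fitting machinery recovers the generating value; it uses none of the actual 1850-2003 data):

```python
import numpy as np

# Sketch of the kind of data fit described above: given an emissions
# series and an observed concentration record, find the e-folding
# time that best reproduces the record under a one-box model.
# The "observations" here are synthetic (tau = 35 plus noise).
rng = np.random.default_rng(0)
true_tau, c_pre = 35.0, 280.0
E = 1.0 * 1.005 ** np.arange(150)        # toy rising emissions (ppmv/yr)

def run(tau):
    c = np.empty(len(E))
    c[0] = c_pre + 10.0                  # assumed starting concentration
    for i in range(1, len(E)):
        c[i] = c[i - 1] + E[i - 1] - (c[i - 1] - c_pre) / tau
    return c

obs = run(true_tau) + rng.normal(0.0, 0.3, len(E))

taus = np.linspace(10.0, 150.0, 281)     # 0.5-year grid search
errs = [np.mean((run(t) - obs) ** 2) for t in taus]
best = float(taus[int(np.argmin(errs))])
print(best)                              # close to the generating tau
```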
I’m not at all expert on climatology, but as a psychologist, I must commend intention on his mastery of evasion and distraction, and his ability to interweave incredible complexity with the simplicity of a toddler. In particular, I found your “clear” response to SentientMeat’s primary questions had all the clarity of a David Lynch film.
Correct me if I’m wrong, but your points are this:
The complexity of the climate makes forecasting nigh on to impossible.
We can nevertheless say with certainty that there isn’t a problem with global warming.
Whether or not there is, we can say for sure that we will nevertheless have unavoidable problems with drought, flooding, locusts, telemarketing, etc.
The best we can do is make the sign of the cross on ourselves, buy stocks of flotation devices and duct tape, and hope we can ride the storm out.
And you’ve never published in a peer-reviewed journal?
Very well. I retract my statement that you ran with your tail between your legs. Nevertheless, I will now make a new statement, that you have given up pretending that there is any real-world basis for your position, and instead are restricting yourself to claims starting with “I think” and “believe”. To the thousands who have already been evacuated because of human-caused global warming and the millions more who will be, the fact that you think they aren’t being evacuated will not be much comfort. Presumably even your thoughts and beliefs would agree with that.
This is not true. The IPCC report does not assume simple exponential decay. The discussion in the Jacobson paper may focus only on the exact parameters for this exponential decay and not its existence. However, this is a weakness of that paper (although probably not a fatal one for the main topic that he is addressing…i.e., the approximate relative effects of black carbon vs. CO2).
You are taking what Sentient Meat said about time lags out-of-context. His point is that you are assuming zero time lags when making estimates of the climate sensitivity from empirical data, in particular, based on the observed temperature increases during the 20th century.
The calculation you give above that gives that woefully low sensitivity of 0.1 °C per Watt/m2 has other problems unrelated to time lags.
Unfortunately, as often seems to be the case, your interpretation of the lessons from the past, and the conclusions you draw from them, are at odds with what is being published in the field. See, for example, this paper in Science, whose summary says:
While it is interesting to see you lay out what you believe, most of these statements seem to be at odds with what most of the scientific community believes.
Whatever. The Earth has always been changing and always will. You want to live on the coast, near sea level, that’s your business. You think every coastline has always looked the way it does now? This isn’t global warming; it’s normal variation.
Actually, it looks to me like you’re playing a familiar game amongst ACC Deniers – what I call the El Nino Switcheroo. It goes like this: When your opponent suggests that temperatures or concentrations are increasing, point to 1998, the year of the highest increases on record in both respects, and say Nuh nuh, wrongomundo!. But whenever a new record is set (and we’re expecting one this year or the year after), say Nothing to worry about – that’s just old El Nino.
You cannot have it both ways. 2005 & 2006 saw the biggest increases in non-Nino years on record, and over the last 30 years the average annual increase in concentration has undeniably shot up. This is the trend I consider most worrying, and while you’ve done much to set my mind at rest that we haven’t irreversibly fucked up the climate already, you’re doing nothing whatsoever to allay my worries in this respect with what appear to me to be pedantic games on your part.
No, those decade by decade figures of 34:22 Gt, 47:29 Gt, 63:36 Gt, 76:43 Gt and 86:53 Gt can have any old curves and lines fitted to them (in fact, I’d say the most sensible is one in which the gap between emitted and sequestered continues to get larger). They might be modelled in the way you suggest, I suppose, but you keep saying that we don’t understand how to model the climate very well at all, and so this does constitute an assumption on your part, and yet another optimistic one at that.
Even if this is the case, again: Why are you assuming only amongst the most optimistic parameters of all?
And you agree that your estimates are definitely optimistic, yes? Why so, given the uncertainty you repeatedly emphasize?
Very well – can you tell me which year you suggest we should measure the temperature in order to gain an accurate estimate of the warming due to the 20th century’s emissions? If you say 2000, that’s a zero time lag.
Over the long term, maybe. But why are you ignoring completely all positive feedback mechanisms over this very very short term? Are you saying it is impossible for phytoplankton to become less efficient in increasingly acidic water, or that rainforests simply can’t shrink at an incredible rate, or something?
They’re included in your Le Chatelier-style estimate of a 430ppm top-out, are they? It seems to me like they would represent a reaction solution near saturation, where Le Chatelier’s principle might not apply.
Let’s just see what the next three years brings, shall we? (Remember, you’ve already played your El Nino joker in correcting me about increasing CO2 concentrations – you can’t have it both ways)
Very well, when do you think we’ll see an annual increase of less than, say, 1.5ppm? Because if it’s not soon, that 430ppm top-out looks like extremely wishful thinking.
Thanks, but that (A)-(I) list (which was frankly bizarre in places) rather evaded the specific questions I asked you. The first was specifically about your estimated top-out concentration and what results would make you abandon it. Clearly, by definition, an actual concentration of 435 ppm would, but is there anything you might see in, say, the next ten years which would convince you that your prediction is just plain wrong?
And secondly, am I right in saying that literally nothing would have you advocating any kind of Emissions Protocol? (Of course, you already know what I think of the hysterical doom-mongering surrounding the Kyotogenic Economic Cooling lie.)
Leaffan, please point to an instance of the CO2 concentration increasing by an enormous 30% in mere decades. The changes you speak of were very slow - slow enough for us to deal with comfortably. This change is very very fast. Do you understand?
I am not saying he is not a smart guy. However, his area of strength seems to be in aerosols (including black carbon) and their transport and climatic effects. I am simply saying that his particular expertise does not seem to be the carbon cycle (by which I mean the cycle of CO2 through the atmosphere, biosphere, hydrosphere, and land). For the comparison he wanted to make between the effects of black carbon and the effects of CO2, he needed to assume something about the carbon cycle…so he made the simplest estimate he could and assumed it just obeyed an exponential decay.
My point stands. Yes, he has a graph that shows CO2 levels (at least in principle) eventually leveling off under the assumption of emission rates held constant. However, that graph is shown for many different lifetimes…and for the longest lifetimes it still hasn’t leveled off by 2100. Furthermore, this is not the crux of his paper. It is not one of the conclusions of the paper that CO2 will likely level off at levels say below 460ppm if emissions are held constant. It is simply one of the predictions (if he keeps the lifetime short enough) of the simple model that he uses for another purpose than to predict exactly how CO2 will behave under a constant emissions scenario.
Maybe he has a lot of confidence in his model and believes it to be correct for this but he doesn’t clearly say so and doesn’t clearly state why he thinks it is superior to other models such as the Bern carbon cycle model. Look, I have a proposal for you: You seem to like to communicate with various people in the climate science field, so why don’t you send Mark an e-mail and ask him directly how much confidence he has in his model for CO2 levels being used to predict the level at which CO2 will level off under a scenario of constant emissions. You could say, for example, “With what confidence do you believe your model and data-derived estimates of CO2 lifetime enough to say that CO2 levels under a constant emissions scenario would level off below 450ppm? What about below 500ppm? Below 550ppm?” It would be an interesting question and I can say I honestly don’t know what answer he would give you. (I.e., I am not suggesting that you ask the question because I believe he will give the answer I want him to give…I honestly am not sure what his answer will be.)
The point is that there is no mechanistic reason to believe that the CO2 decay will exhibit simple exponential behavior. In fact, there are apparently reasons to believe otherwise. While the data, as far as I know, is not violently incompatible with an exponential decay, it is not incompatible with other assumptions either…some of which have a more mechanistic basis behind them.
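To make the contrast concrete, here is a quick numerical sketch (my own, in Python) comparing a Bern-style multi-exponential decay with a single 35-year exponential. The coefficients are the ones quoted in this thread (25% at 171 years, 28% at 18 years, 32% at 2.6 years, 15% permanent), not necessarily the exact parameters of any published version of the Bern model:

```python
import math

def bern_fraction(t):
    """Fraction of an initial CO2 pulse still airborne after t years,
    using the multi-exponential coefficients quoted in this thread."""
    return (0.15
            + 0.25 * math.exp(-t / 171)
            + 0.28 * math.exp(-t / 18)
            + 0.32 * math.exp(-t / 2.6))

def single_exp_fraction(t, tau=35.0):
    """Same quantity for a single exponential with a 35-year e-folding time."""
    return math.exp(-t / tau)

# Compare the airborne fractions at a few horizons.
for t in (10, 50, 100, 300):
    print(t, round(bern_fraction(t), 3), round(single_exp_fraction(t), 3))
```

The two curves are broadly similar over the first few decades, which is why both can look consistent with the instrumental record, but they diverge badly on century timescales; that divergence is the “long tail.”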
SentientMeat, thank you for your post. You had said:
To this, I pointed out that the concentration did not increase by 2.6 ppm in 2005, it only increased by 2.2 ppm.
Bizarrely, having made a totally untrue claim, rather than going “oops, my numbers and my claims were both wrong”, you respond:
Sentient, I give up fighting ignorance with you. Did you notice that 2006 was an El Nino year? That’s why all you AGW folks said there were so few hurricanes. Did you notice that, despite your claim that 2006 emissions would be greater than 2005’s, they weren’t? Did you notice that your number for 2005 was totally bogus? Did you notice that the atmospheric CO2 growth rate was 0.37% per year in the 1970s and has averaged 0.53% since 2000, a whopping 43% increase over a third of a century? You call that “shot up”? If your income went up 43% over the last 35 years, would you say your income had “shot up”?
Have you actually done any research to see that everyone agrees that the decay of atmospheric CO2 is exponential? You claim that any curve could fit the data of “34:22 Gt, 47:29 Gt, 63:36 Gt, 76:43 Gt and 86:53 Gt”, and that the best is one where “the gap between emitted and sequestered continues to get larger” … dude, have you heard of percentages? Exponential decay works on percentages, not on absolute figures. Your level of ignorance is too deep to fight here. Ask jshore to explain it to you.
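To spell out the point about percentages, here is a toy sketch (my own illustrative numbers, not the Gt figures quoted above). Removal is a fixed fraction of the airborne excess, which is exactly exponential decay, and yet the absolute gap between emitted and sequestered still widens as long as emissions keep growing:

```python
def gap_series(tau=35.0, excess=100.0, emission=5.0, growth=0.02, years=5):
    """Yearly emitted-minus-sequestered gaps when removal is a fixed
    fraction (1/tau) of the airborne excess while emissions grow.
    All values are arbitrary illustrative units."""
    gaps = []
    for _ in range(years):
        removed = excess / tau        # removal works on a percentage ...
        gaps.append(emission - removed)
        excess += emission - removed  # ... yet the absolute gap grows
        emission *= 1 + growth        # emissions grow 2%/yr
    return gaps

print([round(g, 2) for g in gap_series()])
```

A widening absolute gap is therefore perfectly compatible with exponential (percentage-based) decay.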
Finally, you asked for the answers to two questions:
Well, I’m not sure what you mean by the “hysterical doom-mongering surrounding the Kyotogenic Economic Cooling lie.” If you mean that the Kyoto protocol won’t cost much money, it has already cost many billions of dollars in direct costs, plus plenty of indirect costs, with very little to show for it. I don’t favor emission controls because 1) they don’t work in a world where China, India, and the developing world are demanding energy, 2) they cost billions of dollars, 3) they hurt the poor worst, and 4) any practical change will not make a measurable difference in future temperatures, no matter whose assumptions you use.
Regarding my projection for where CO2 will be at the end of the century: if we continue to emit about half a percent more each year for the rest of the century, it will be at about 470 ppmv. If we continue to increase our emissions at the 1970-2006 rate of increase, we’ll be increasing at about 1% per year by 2100, and it will be about 500 ppmv. That’s using a 30-year e-folding time. With a 40-year e-folding time and the increasing emissions, it will be 530 ppmv.
With the increasing emissions, the Bern model says 570 ppmv, and with emissions rising steadily at 0.5%/year, the Bern model says 550 ppmv. That gives a range of 470-570 ppmv.
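For anyone who wants to play with these projections, here is a minimal one-box sketch of the kind of calculation described above: the airborne excess decays toward a baseline with a single e-folding time while emissions grow at a fixed annual rate. The starting values (380 ppmv in 2006, a 280 ppmv baseline, 8 GtC/yr of emissions, 2.13 GtC per ppmv) are my own round-number assumptions, so the outputs will differ by some tens of ppmv from the figures quoted above:

```python
# One-box CO2 projection sketch; all starting values are my own
# round-number assumptions, not figures from this thread.
PPM_PER_GTC = 1 / 2.13   # roughly 2.13 GtC of airborne carbon per ppmv

def project(tau=35.0, growth=0.005, years=94,
            conc=380.0, baseline=280.0, emissions_gtc=8.0):
    """Step year by year from roughly 2006 out to 2100: add this
    year's emissions, remove a fixed fraction of the excess."""
    for _ in range(years):
        conc += emissions_gtc * PPM_PER_GTC - (conc - baseline) / tau
        emissions_gtc *= 1 + growth
    return conc

print(round(project(tau=30, growth=0.005)))   # 0.5%/yr emissions growth
print(round(project(tau=40, growth=0.01)))    # faster emissions growth
```

Longer e-folding times and faster emissions growth both push the 2100 concentration up, which is the pattern the figures above show.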
However, the increasing emission rate means that we’d be emitting about four times the current amount in the year 2100, and I’d say the odds of that are pretty slim … so I’d say 450-550 ppmv is a reasonable value for 2100. This is a forcing increase of about 1-2 W/m2 on top of the current downwelling radiation of 325 W/m2 …
Am I worried about that? No. I think we’re doing more to change the climate by cutting down forests than by changing CO2, and I don’t see a half-percent change in the downwelling radiation as a big issue.
Finally, suppose we could magically freeze our emissions at the current rate. The Bern model, which some scientists think overestimates residence times, says that in that case the 2100 concentration would be about 500 ppmv instead of, say, 570 ppmv for constantly increasing emissions … which is about seven tenths of a W/m2 difference in the forcing. Using the NASA/James Hansen sensitivity of 0.6°C per W/m2, that’s a difference of about four tenths of a degree over 100 years … and freezing our emissions is not doable. This means that any achievable reduction in emissions will earn us less than half a degree by 2100. That’s why I think they’re a waste of time.
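The arithmetic behind that 500 vs. 570 ppmv comparison can be checked with the standard simplified CO2 forcing expression, dF = 5.35 ln(C/C0) W/m2 (my assumption; the post does not say which expression was used):

```python
import math

def forcing(c_new, c_ref):
    """Standard simplified CO2 forcing approximation, in W/m2.
    An assumption on my part, not necessarily the formula used above."""
    return 5.35 * math.log(c_new / c_ref)

dF = forcing(570, 500)   # constant vs rising emissions in 2100
dT = dF * 0.6            # 0.6 deg C per W/m2 sensitivity, as quoted
print(round(dF, 2), round(dT, 2))
```

With those inputs the difference comes out to roughly 0.7 W/m2 and about 0.4°C at the quoted 0.6°C per W/m2 sensitivity.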
Fear of a natural disaster is being used to induce a state of hysteria, and that is being exploited to introduce highly artificial economic structures and additional forms of taxation.