Wednesday, January 6, 2010

Smoothing Splines and Law Dome CO2 Data

I've been reading about a paper by Ernst-Georg Beck where it is claimed that CO2 levels in the past 150 years have fluctuated widely. Apparently, when Mauna Loa measurements started, CO2 levels magically became more stable. It's not surprising these claims have been made fun of. But that's not what I really want to discuss.

Beck's paper got me thinking about the Law Dome ice-core data I've been using to associate CO2 with temperatures. I suspected it might actually be too smooth. You see, Etheridge et al. (1998) provide two ready-to-use data sets: a 20-year smoothed series spanning 1832-1978, and a 75-year smoothed series spanning 1010-1975. Both are obtained with smoothing splines.

You can actually see that the Etheridge et al. 20-year smoothed data is even smoother than the Mauna Loa data (e.g. in this figure, compare before and after 1978). The problem with smoothing out noise is that you can easily lose information.

I went ahead and calculated the natural spline interpolation of the raw data from Etheridge et al. (which I'm making available HERE). The natural interpolation only uses the spline to estimate values for years that are missing; it does not smooth the raw data points that are known. For the Etheridge et al. data set, this means recent data will retain most of the original noise, while older data will still look smooth simply because the samples are sparse, though hopefully not 75-year smooth.
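A minimal sketch of how such an interpolation can be done with SciPy follows. This is not necessarily the exact procedure used here, and the input file name is just a placeholder for the raw data linked above:

# Natural cubic-spline interpolation of irregularly spaced ice-core CO2
# samples onto an annual grid. Assumes a hypothetical two-column text file
# "law_dome_raw.txt" with one "year co2_ppmv" pair per line.
import numpy as np
from scipy.interpolate import CubicSpline

year, co2 = np.loadtxt("law_dome_raw.txt", unpack=True)

# Average any duplicate years so the abscissae are strictly increasing.
uyear, idx = np.unique(year, return_inverse=True)
uco2 = np.bincount(idx, weights=co2) / np.bincount(idx)

# bc_type="natural" gives an interpolating spline (zero second derivative at
# both ends) that passes through every raw sample, so known values are not
# smoothed at all, unlike the 20-year and 75-year smoothing splines.
spline = CubicSpline(uyear, uco2, bc_type="natural")

annual_years = np.arange(int(uyear.min()), int(uyear.max()) + 1)
annual_co2 = spline(annual_years)  # one interpolated value per year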

Let's take a look at the 1850-1978 20-year smoothed series provided by Etheridge et al. (1998), along with the natural interpolation of the raw data.



Interesting, isn't it? I think it makes a lot of sense. In particular, notice the flat CO2 trend between 1933 and 1952. That's 20 years without an increase in atmospheric CO2. With the smoothed series, this is not so evident; you see perhaps 12 years of pause.

Is this important? Let's take a look at the HadCRUT3 temperature series along with the logarithm of the new CO2 series, from 1850 to 2008. (Mauna Loa data is added after 1978, minus a small offset.)



It would appear that the climate reacts rapidly to CO2 fluctuations, which, again, argues for a small amount of "warming in the pipeline."
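For concreteness, the splice mentioned above (Mauna Loa appended after 1978, minus a small offset) can be done along these lines; the join year and function below are illustrative rather than the exact procedure used:

import numpy as np

def splice(ice_years, ice_co2, ml_years, ml_co2, join_year=1978):
    # Ice-core CO2 up to join_year, Mauna Loa afterwards, with the Mauna Loa
    # record shifted so the two series agree at the join year. Assumes both
    # records are annual numpy arrays that include join_year.
    offset = ml_co2[ml_years == join_year][0] - ice_co2[ice_years == join_year][0]
    years = np.concatenate([ice_years[ice_years <= join_year],
                            ml_years[ml_years > join_year]])
    co2 = np.concatenate([ice_co2[ice_years <= join_year],
                          ml_co2[ml_years > join_year] - offset])
    return years, co2

# For the figure, the temperature series is plotted alongside the logarithm of
# the spliced CO2 series, e.g. np.log(co2), rescaled to fit the same axis.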

There are other things you can't see very well with the overly smooth data. Let's take a look at the time span between 1700 and 1900. For temperature, I will use the NH-SH average from the CPS temperature reconstruction of Mann et al. (2008).



Keeping in mind that both the temperature and CO2 data in the figure are reconstructed, I think this is pretty interesting. It suggests there might have been a slight warming of about 0.1°C right at the beginning of the Industrial Revolution. This is consistent with estimates of climate sensitivity.

Tuesday, January 5, 2010

Warming in the Pipeline

How much "warming in the pipeline" is there? Is there a way to be sure? NASA GISS findings suggest it's about 0.6C.

I was contemplating a method of estimating "warming in the pipeline" from available temperature and CO2 data. It's sort of a heuristic method, with all that implies, but it's interesting that I get somewhat different results.

CO2 appears to be the main causal agent of recent temperature shifts. I've demonstrated that temperature fluctuations lag CO2 fluctuations by about 10 years. It stands to reason that observed temperature fluctuations lag equilibrium temperature fluctuations by about 10 years as well.
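One way to estimate such a lag, for what it's worth (not necessarily how it was done for that earlier analysis), is to remove a low-order polynomial trend from each annual series and pick the shift that maximizes the cross-correlation between them:

import numpy as np

def detrend(y, order=3):
    # Remove a fitted 3rd-order polynomial, leaving the fluctuations.
    y = np.asarray(y, dtype=float)
    x = np.arange(len(y), dtype=float)
    return y - np.polyval(np.polyfit(x, y, order), x)

def best_lag(co2, temp, max_lag=30):
    # Lag (in years) at which detrended temperature best matches detrended CO2.
    a, b = detrend(co2), detrend(temp)
    corrs = [np.corrcoef(a[:len(a) - k], b[k:])[0, 1] for k in range(max_lag + 1)]
    return int(np.argmax(corrs))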



Imagine you have an equilibrium temperature series that looks like a sinusoid. Observed temperature will lag the hypothetical series, also looking like a sinusoid, by some number of years. If we use Newtonian cooling as an approximation, the expected rate of temperature change (R) will be given by:

R = r·(T' - T)

T' is the equilibrium temperature and T is the observed temperature. The constant r is something I will call the rate coefficient.

The question is: If you know the lag between the two sinusoids, can you estimate the value of the rate coefficient r? Then, if you know r, can you estimate "warming in the pipeline"? I think the answer is yes.

I gave up trying to solve it with calculus. Perhaps a reader can give it a shot. I instead solved it by means of a Monte Carlo simulation.

It turns out that the rate coefficient r depends on the period of the sinusoid and the lag between the series, but not the amplitude of the sinusoid.

The period of the CO2 sinusoid that results from the 3rd-order detrending of the CO2 series is about 85 years (an angular frequency, or pulsation, of about 0.074 yr⁻¹). You can see that in the figure above. For this period, and a lag of 10 years, my simulations indicate that the rate coefficient r should be just about 0.08 yr⁻¹.
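The numerical experiment does not have to be elaborate. Below is a sketch of the kind of simulation that can recover r from the period and the lag; it is a deterministic sweep rather than a Monte Carlo run, and not the code actually used, but it illustrates the same relationship. (Incidentally, for a pure sinusoid the steady-state lag under Newtonian cooling works out to arctan(ω/r)/ω, which for an 85-year period and a 10-year lag also gives r of roughly 0.08.)

import numpy as np

def simulated_lag(r, period_years=85.0, n_years=600, dt=0.05):
    # Integrate dT/dt = r*(T' - T) with T' = sin(w*t) and measure how far the
    # steady-state response T lags the forcing T', in years.
    w = 2.0 * np.pi / period_years
    t = np.arange(0.0, n_years, dt)
    forcing = np.sin(w * t)
    temp = np.zeros_like(t)
    for i in range(1, len(t)):                      # simple Euler integration
        temp[i] = temp[i - 1] + dt * r * (forcing[i - 1] - temp[i - 1])
    last_cycle = t > n_years - period_years         # transient has died out here
    t_peak_forcing = t[last_cycle][np.argmax(forcing[last_cycle])]
    after = t >= t_peak_forcing
    t_peak_response = t[after][np.argmax(temp[after])]
    return t_peak_response - t_peak_forcing

# Sweep r and look for the value that produces a 10-year lag (about 0.08).
for r in np.arange(0.05, 0.12, 0.005):
    print(round(r, 3), round(simulated_lag(r), 1))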

Let's assume that the current rate of temperature change (without weather noise) is about 0.018 degrees Celsius per year. Then we have that:

0.018 = 0.08·(T' - T)

So the temperature imbalance is:

ΔT = T' - T = 0.018 / 0.08 = 0.23°C

This is not too bad. If correct, I'd take it as good news.

Climate Sensitivity

I'm essentially claiming that the global equilibrium temperature is knowable and that it's perhaps 0.7°C relative to the HadCRUT3 baseline (that is, roughly the current observed anomaly plus the 0.23°C imbalance estimated above). It also appears (based on various reconstructions) that the temperature in the 18th century was fairly stable at about -0.4°C. We can probably assume that's an equilibrium temperature. The difference is 1.1°C.

The concentration of CO2 in the 18th century was about 277 ppmv. The concentration as of 2008 is more like 385 ppmv. This means that:

1.1 = k·ln(385/277)

Therefore:

k = 3.34

We can estimate the climate sensitivity to CO2 doubling as follows:

λ = k·ln(2) = 2.32°C
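For the record, these back-of-the-envelope numbers (and the pipeline estimate above) are easy to check:

import math

pipeline = 0.018 / 0.08                    # warming still in the pipeline, ~0.23 °C
delta_t = 0.7 - (-0.4)                     # equilibrium warming since the 18th century, 1.1 °C
k = delta_t / math.log(385.0 / 277.0)      # ~3.34
sensitivity = k * math.log(2.0)            # ~2.32 °C per CO2 doubling
print(pipeline, k, sensitivity)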

This is actually a tad lower than the best estimates available. There are some uncertainties in the estimate, to be sure. For example, were temperatures really stable at -0.4°C in the 18th century? Then there are some biases that are immediately obvious. Some of the warming could be due to methane and other greenhouse gases, which would make this estimate too high. You also have cooling due to aerosols, which would push it the other way, and whose magnitude is not well understood.