February 9, 2008 | 14 Comments

How to look at the RSS satellite-derived temperature data

It’s already well known that Remote Sensing Systems (RSS) has released the January figures for its satellite-derived temperature data: the finding is that it was colder this January than it has been for some time. I wanted to look more carefully at this data, mostly to show how to avoid some common pitfalls when analyzing time series data, but also to show you that temperatures are not linearly increasing. (Readers Steve Hempell and Joe Daleo helped me get the data.)

First, the global average. The RSS satellite actually divides the earth into swaths, or transects, which are bands across the earth whose widths vary as a function of the instrument that remotely senses the temperature. The temperature measured at any transect is, of course, subject to many kinds of errors, which must be corrected for. Although this is not the main point of this article, it is important to keep in mind that the number you see released by RSS is only an estimate of the true temperature. It’s a good one, but it does have error (usually depending on the location of the transect), which most of us never see and few of us actually use. That error, however, is extremely important to take into account when making statements like “The RSS data shows there’s a 90% chance it’s getting warmer.” Well, it might be 90% before taking into account the temperature error: afterwards, the probability might go down to, say, 75% (this is just an illustration; but no matter what, the original probability estimate will always go down).
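A toy Monte Carlo makes the point concrete. Everything here is made up (the trend, the natural variability, and the measurement-error levels are assumptions for illustration, not RSS values); the only claim is the qualitative one: once you acknowledge measurement error, the probability of a positive trend moves back toward 50/50.

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials = 120, 2000        # ten years of monthly anomalies, many replications
true_trend = 0.001           # degrees per month; a made-up, not RSS, figure
t = np.arange(n)

def prob_positive_slope(measurement_sd):
    """Fraction of simulated series whose fitted slope comes out positive."""
    hits = 0
    for _ in range(trials):
        y = true_trend * t + rng.normal(0, 0.1, n)        # natural variability
        y = y + rng.normal(0, measurement_sd, n)          # instrument error
        hits += np.polyfit(t, y, 1)[0] > 0
    return hits / trials

p_ignoring_error = prob_positive_slope(0.0)
p_with_error = prob_positive_slope(0.2)
# p_with_error < p_ignoring_error: admitting the measurement error
# always pulls the probability of "it is warming" back toward 50/50
```

The extra error term fattens the sampling distribution of the slope, so a fixed true trend is harder to distinguish from noise.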

Most people show the global average data, which is interesting, but taking an average is, of course, assuming a certain kind of statistical model is valid: one that says averaging transects gives an unbiased, low-variance estimate of the global temperature. Does that model hold? Maybe; I actually don’t know, but I have my suspicions it does not, which I’ll outline below.

So let’s look at the transects themselves. The ones I used are not perfect, but they are reasonable. They are: “Antarctica” (-60 to -70 degrees of latitude; obviously not the whole south pole), “Southern Hemisphere Extratropics” (-70 to -20 degrees; there is a 10 degree overlap with “Antarctica”), “Tropics” (-20 to 20 degrees), “Northern Hemisphere Extratropics” (20 to 82.5 degrees, a slightly wider transect than in the SH), and “Arctic” (60 to 82.5 degrees; there is a 22.5 degree overlap with NH Extratropics). Ideally, there would be no overlap between transects, global coverage would have been complete, and I would have preferred more instead of fewer transects, which would have allowed us to see greater detail. But we’ll work with what we have.

Here is the thumbnail of the transects. Click it (preferably open it in a new window so you can follow the discussion) to open the full-sized version.
[Figure: RSS transects]
All the transects are in one place, making it easy to do some comparisons. The scale for each is identical, each has only been shifted up or down so that they all fit on one picture. This is not a very sexy or colorful graph, but it is useful. First, each transect is shown with respect to its mean (the small, dashed line). Vertical lines have been placed at the maximum temperature for each. The peak for NH-Extratropics and Tropics was back in 1998 (a strong El Nino year). For the Arctic, the peak was in 1995. For the Antarctic, it was 1990. Finally, for the SH-Extratropics it was 1981.

You also often see, as I have drawn on the plot, a simple regression line (dash-dotted line), whose intent is usually to show a trend. Here, it appears that there were upward trends from the Tropics to the north pole, no trend to speak of for the SH-Extratropics, and a downward trend for the Antarctic (recall there is overlap between the last two transects). Supposing these trends are real, they have to be explained. The obvious story is to say man-made increases due to CO2, etc. But it might also be that the northern hemisphere is measured differently (more coverage), or that there is obviously more land mass in the NH, or—don’t pooh-pooh this—the change of the tilt of the earth: the north pole tipped closer to the sun and so was warmer, the south pole tipped farther and so was cooler. Well, it’s true that the earth’s tilt has changed, and will always do so no matter what political party holds office, but effects due to its change are thought to be trivial at this time scale. Of course, there are other possibilities, such as natural variation (which is sort of a cop out; what does “natural” mean anyway?).

To the eye, for example, the trend-regression for the Arctic looks good: there is an increase. Some people showing this data might calculate a classical test of significance (don’t get me started on these), but this is where most analysis usually stops. It shouldn’t. We need to ask, what we always need to ask when we fit a statistical model: how well does it fit? The first thing we can do is to collect the residuals, which are the distances between the model’s predictions and the actual data. What we’d like to see is that there is no “signal” or structure in these residuals, meaning that the model did its job of finding all the signal that there was. The only thing left after the model should be noise. A cursory glance at the classical model diagnostics would even show you, in this case, that the model is doing OK. But let’s do more. Below is a thumb-nail picture of two diagnostics that should always be examined for time series models (click for larger).
[Figure: RSS transect regression diagnostics]

The bottom plot is a time-series plot of the residuals (the observed temperatures minus the regression line). Something called a non-parametric (loess) smoothing line is over-plotted. It is showing that there is some kind of cyclicity, or semi-periodic signal, left in the residuals. This is backed up by examining the top plot, which is the auto-correlation function. Each time-series residual is correlated with the one before it (lag 1), with the one two before it (lag 2), and so on. The lag-one correlation is almost 40%, again meaning that the residuals are certainly correlated, and that some signal is left in the residuals that the model didn’t capture. (The “lag 0” is always 1; the horizontal-dashed lines indicate classical 95% significance; the correlations have to reach above these lines to be significant.)
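Both diagnostics are easy to sketch on synthetic data (the trend, cycle, and noise levels below are invented, not the RSS numbers): fit a straight line to a series that secretly contains a slow cycle, then look at what the line leaves behind.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(240)                       # 20 "years" of monthly values (made up)
# a linear trend plus a slow 60-month cycle the straight line cannot capture
y = 0.002 * t + 0.3 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 0.1, 240)

slope, intercept = np.polyfit(t, y, 1)
resid = y - (slope * t + intercept)      # observed minus fitted

# lag-1 autocorrelation: correlate the residuals with themselves shifted by one
lag1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
# a large lag1 (well above the ~2/sqrt(n) significance band) says the line
# left structure -- here the 60-month cycle -- in the residuals
```

If the line had captured everything, `lag1` would sit near zero; here it is large, which is exactly the failure the auto-correlation plot above reveals.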

The gist is that the ordinary regression line is inadequate and we have to search for something better. We might try the non-parametric smoothing line for each series, which would be OK, but it is still difficult to ask whether trends exist in the data. Some kind of smoothing would be good, however, to avoid the visual distraction of the noise. We could, as many do, use a running mean, but I hate them and here is why.
[Figure: running mean example]
Shown in black is a pseudo-temperature series with noise: the actual temperature is dashed blue. Suppose you wanted to get rid of the noise using a “9-year” running mean: the result is the orange line, which you can see does poorly, and shifts the actual peaks and troughs to the right. Well, that is only the start of the troubles, but I won’t go over any more here except to say that this technique is often misused, especially in hurricane research (two weeks ago a paper in Nature did just this sort of thing).
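The rightward shift is easy to reproduce. This sketch (all values invented) applies a trailing 9-point running mean, the kind that averages the latest 9 observations, to a noisy sine wave: the smoothed peak arrives about (9 − 1)/2 = 4 steps after the true one.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100)
truth = np.sin(2 * np.pi * t / 40)           # the dashed-blue "actual temperature"
noisy = truth + rng.normal(0, 0.2, 100)      # the black noisy series

k = 9
smoothed = np.convolve(noisy, np.ones(k) / k, mode="valid")
t_smoothed = t[k - 1:]                       # trailing mean uses the k latest points

# the first true peak is at t = 10; the running-mean peak shows up a few
# steps later, shifted to the right exactly as in the figure
first_true_peak = t[np.argmax(truth)]
first_smooth_peak = t_smoothed[np.argmax(smoothed[:30])]
```

A centered running mean removes the lag but still blunts the peaks; either way, some distortion is the price of this smoother.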

So what do we use? Another thing to try is something called Fourier, or spectral analysis, which is perfect for periodic data. This would be just the thing if the periodicities in the data were regular. They do not appear to be. We can take one step higher and use something called wavelet analysis, which is like spectral analysis (which I realize I did not explain), but instead of analyzing the time series globally like Fourier analysis, it does so locally. Which means it tends to under-smooth the data, and even allows some of the noise to “sneak through” in spots. This will be clearer when we look at this picture (again, just a thumb-nail: click for larger).
[Figure: RSS wavelet smoothing]

You can see what I mean by some of the original noise “sneaking through”: these are the spikes left over after the smoothing; however, you can also see that the spikes line up with the data, so we are not introducing any noise. The somewhat jaggy nature of the “smoothed” series has to do with the technicalities of using wavelets (I’ll have to explain this at a later time: but for technical completeness, I used a Daubechies orthonormal compactly supported wavelet, with soft probability thresholding by level). Anyway, some things that were hidden before are now clearer.
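I can’t reproduce the wavethresh fit here, but a bare-bones sketch shows the mechanics. A Haar transform stands in for the Daubechies basis, and a single hand-tuned soft threshold stands in for the by-level thresholding; the signal and noise values are made up.

```python
import numpy as np

def haar_dwt(x):
    """Full Haar wavelet decomposition of a length-2^k series."""
    details, approx = [], x.astype(float)
    while len(approx) > 1:
        smooth = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        detail = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        details.append(detail)
        approx = smooth
    return details, approx

def haar_idwt(details, approx):
    """Invert haar_dwt exactly (perfect reconstruction)."""
    for detail in reversed(details):
        out = np.empty(2 * len(detail))
        out[0::2] = (approx + detail) / np.sqrt(2)
        out[1::2] = (approx - detail) / np.sqrt(2)
        approx = out
    return approx

def soft_threshold(w, lam):
    """Shrink coefficients toward zero; small (mostly-noise) ones die."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 256)
clean = np.sin(4 * np.pi * t)                 # pretend temperature signal
noisy = clean + rng.normal(0.0, 0.3, 256)

details, approx = haar_dwt(noisy)
lam = 2 * 0.3                                 # twice the noise sd, hand-tuned for this toy
smoothed = haar_idwt([soft_threshold(d, lam) for d in details], approx)
# "spikes sneak through" wherever a noise coefficient happens to exceed lam
```

The denoised series tracks the signal better than the raw data, and the occasional surviving noise coefficient produces exactly the local spikes described above.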

It looks like there was an increasing trend for most of the series, starting between 1996 and 1998 and ending in late 2004, after which the data begin trending down: for the tropics to the north pole, anyway. The signal in the southern hemisphere is weaker, or even non-existent at Antarctica.

This analysis is much stronger than the regression shown earlier; nevertheless, it is still not wonderful. The residuals don’t look much better, and are even worse in some places (e.g. early on in the Tropics), than the regression. But wavelet analysis is tricky: there are lots of choices of the so-called wavelet basis (the “Daubechies” thing above) and choices for thresholding. (I used the, more or less, defaults in the R wavethresh package.)

But the smoothing is only a first start. We need to model this data all at once, and not transect by transect, taking into account the different relationships between each transect (I’m still working on a multivariate Bayesian hierarchical time-series model: it ain’t easy!). Not surprisingly, these relationships are not constant (shown below for fun). The main point is that modeling data of this type is difficult, and it is far too tempting to make claims that do not hold up upon closer analysis. One thing is certain: the hypothesis that the temperature is linearly increasing everywhere across the globe is just not true.

APPENDIX: Just for fun, here is a scatter-plot matrix of the data (click for larger). You can see that there is little correlation between the two poles, and more correlation, though less than you might have thought, between bordering transects.
[Figure: scatter-plot matrix of the transects]

How to read this plot: it’s a scatter plot of each variable (transect), with each other. Pick a variable name. In that row, that variable is the y-axis. Pick another variable. In that column, that variable is the x-axis. This is a convenient way to look at all the data at the same time.
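The same pairwise information can be boiled down to a correlation matrix. This sketch fakes the structure with invented series (a shared component for overlapping bands, an independent series for the far pole) just to show the computation:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 336                                    # e.g. monthly values over ~28 years
shared = rng.normal(0, 1, n)               # signal common to overlapping bands
arctic = shared + rng.normal(0, 1, n)
nh_extra = shared + rng.normal(0, 1, n)    # overlaps the Arctic band
antarctic = rng.normal(0, 1, n)            # independent of the north

r = np.corrcoef([arctic, nh_extra, antarctic])
# r[0, 1] is sizable (the shared overlap); r[0, 2] is near zero (the two poles)
```

Each off-diagonal entry of `r` summarizes one panel of the scatter-plot matrix: row variable on the y-axis, column variable on the x-axis.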

February 6, 2008 | 36 Comments

Has atmospheric CO2 decreased? A different way to look at CO2 changes

Joe Daleo, the number one guy over at, recently sent me the CDIAC (ice core) CO2 data as criticized in Beck (2007) and asked me what I made of it. Now, this data has been pored over by the great and small, so should we expect any revelations along the lines of “Has CO2 actually decreased?” Well…see below. I don’t often see this data pictured in one particular way that I find instructive, so I wanted to show it to you.

You’re probably used to seeing CO2 through time in plots very much like this cartoon.

[Figure: CO2 through time]

The black line is the actual CO2 data, the two background colors representing, before 1958, estimates of CO2 based on ice cores, and after that on measurements from Mauna Loa. The green line is a suitably normalized estimate of the human population. Both increase at what looks like roughly the same rate. Right?

Side note: these are estimates, and, ideally, both lines would have a “plus or minus” band plotted above and below so that we can see a graphical representation of the uncertainty in the numbers, which might, or might not, be substantial. I don’t know what the error is for either curve, but we’ll ignore this not inconsequential problem today.

Now let’s take the exact same data and plot it in a slightly different way.

[Figure: log CO2 change through time]

Click the image for larger version: you may wish to right click and “Open Link In New Window” (or words like that) so that you can view the graph and read its description at the same time. Or download a printable pdf version here.

This graph is complicated, so let’s take our time to understand it. The horizontal, or bottom, axis is still time. But now the black line is the yearly change in CO2. For example, in 2007 the CO2 was measured to be 383.32 parts per million (ppm) and for 2006 it was 381.83 ppm. The change, which was an increase, was 1.49 ppm. We measure this change for each year and keep the results, so that we can see the rate of increase (or possible decrease or no change) for each year. We could plot this raw change through time, but a lot of detail is hidden because the increase is exponential (the same shape as the cartoon plot above).

Instead of a raw plot, we take the log of all values so that detail can emerge. This should not change conclusions based on the data in any way, and it does allow us to see it better (technical note: the value of 1.2 was added to all values because some changes were negative and, without using complex numbers, we cannot take logarithms of negative numbers).
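The transformation is two lines of code. The series below is a stand-in exponential curve, not the real CDIAC or Mauna Loa record; the 1.2 offset is the one from the text, and the two real values quoted above check out to the stated 1.49 ppm increase.

```python
import numpy as np

# a stand-in exponential CO2 series (not the real record)
years = np.arange(1959, 2008)
co2 = 315.0 * np.exp(0.004 * (years - 1959))

change = np.diff(co2)                 # year-over-year change, in ppm
log_change = np.log(change + 1.2)     # the offset keeps the log's argument positive

# the two real values quoted above reproduce the 1.49 ppm increase:
increase_2007 = round(383.32 - 381.83, 2)

# a constant addition (0.12 ppm/yr, the 1750-1800 rate) plots as a flat line:
flat = np.log(np.diff(315.0 + 0.12 * np.arange(50)) + 1.2)
```

For the exponential stand-in, `log_change` rises steadily, the signature of acceleration; for the constant-addition series it is perfectly horizontal.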

The detail pops now, doesn’t it? The first thing to notice is the marked qualitative and quantitative differences in the Mauna Loa and Ice Core estimates. The two methods are obviously not directly compatible, a fact which was hidden in the raw (non-differenced) plots. This makes decisions about the rate of increase of CO2 across the two regimes trickier than is commonly thought.

First concentrate just on the Mauna Loa regime. The rate of change has been over-plotted by a simple regression line, which fits rather well (I’ll spare you the formal statistical tests: but trust me). That is, the model of exponential acceleration of CO2 into the atmosphere is well supported over this range. This is acceleration because, recall, this is a plot of the rate of increase of CO2: if the rate of increase is itself increasing, the amount in the atmosphere is accelerating upward.

To explain that further: suppose, every year, the exact same amount of new CO2 is added to the atmosphere. The graph for that would then be a flat, horizontal line on our plot, which is roughly the case for the dates 1750 to 1800. During that time, about 0.12 ppm of new CO2 was added each year. At least, according to the estimates from ice cores.

To emphasize: if our graph shows a (rough) increasing line, as it does in the Mauna Loa regime, then the rate at which CO2 is being added to the atmosphere (according to the chemical measurement method used) is increasing. If the graph shows a flat line for certain periods, then those periods contributed the same amount of new CO2 each year. But if the graph shows a (rough) decreasing line, as it does in several places in the ice core regime, then the amount of new CO2 being added to the atmosphere is decelerating: new CO2 is still being added, but at a slower rate.

There are even times when CO2 has decreased, i.e., been removed from the atmosphere, from year to year (according to the measurements used): these are the points below the dotted-dashed line at 0. These times were roughly 1820, 1831-1838, times before wide-scale industrialization, and 1942-1944. 1942 to 1944? This was certainly a time in which the entire world, if you recall, was intent on adding as much of everything to the atmosphere as it possibly could. So this result is strange. One possibility is measurement error: something might have gone wrong in the way the ice cores were processed.

It is usually thought that the measurement method used for ice cores is accurate and unbiased and so on. So suppose that is true. Then it cannot have been the war that accounts for this dip in the mid-1940s, because there is no similar dip around the years of The Great War. In fact, during that time, the rate of new CO2 was accelerating, as indicated by the regression fit over the years 1898 to 1941.

Just for fun, I have drawn the two regression lines, for the ice core and Mauna Loa regimes, extending forwards and backwards through time (these are the light dotted lines). What I learn from this, again, is that the two measurement methods are probably not compatible.

On to the human population, again pictured in green, but here, like CO2, we are looking at logged differences in year to year population, suitably normalized for ease of comparison. Data from 1950 to 2007 was available for each year from the U.S. Census Bureau; from before that, I used, Lord help me, Wikipedia. Estimates before 1950 were sparse, generally only available every 50 years or so. I fit a variety of splines (B-splines, polynomial, etc.) and even a strict linear interpolation to estimate the missing values: all methods gave substantially the same results. What we have to say about human population isn’t that crucial, anyway.
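The gap-filling step can be sketched with the simplest of the methods mentioned, strict linear interpolation. The anchor values below are rough textbook world-population figures chosen for illustration, not the Census Bureau or Wikipedia numbers actually used:

```python
import numpy as np

# sparse historical estimates (illustrative values, in billions)
known_years = np.array([1750, 1800, 1850, 1900, 1950])
known_pop = np.array([0.79, 0.98, 1.26, 1.65, 2.52])

# strict linear interpolation, one of the methods mentioned above
every_year = np.arange(1750, 1951)
pop = np.interp(every_year, known_years, known_pop)

growth = np.diff(pop)    # year-over-year change, analogous to the CO2 differencing
```

Splines would smooth the corners at the anchor years, but as noted, for this rough comparison all the methods give substantially the same results.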

You first see two dips, one around 1915 or so and another in the late 1940s. These dips certainly are from the two World Wars. Population was still increasing then, but, obviously, at a slower rate. The deceleration from the late 1960s to the present time is well known to demographers: while population is still increasing, the rate at which it is doing so is dramatically decreasing, particularly in Enlightened countries. The odd dip around 1960 is probably due to the utopian joys of communism: Mao’s Great Leap Forward (into the grave, apparently).

Ok, that’s the data. But it only takes us so far: human population numbers are only a rough, very rough, proxy of our ability to add CO2 to the air. For example, during 1831-1838 the human population was accelerating but the CO2 was decelerating! Human population growth also slowed during World War II, the same time as a measured drop of atmospheric CO2, as mentioned above. But a similar deceleration in human population in World War I was not matched by a concomitant deceleration of CO2.

What to make of plots of people versus CO2? My guess: not much. Particularly since current rates of population growth are decelerating and will continue to do so, yet CO2 rates are accelerating. The correlation between human population and CO2 is just too noisy and inexact to be of much use.

I am not an expert in measuring atmospheric CO2, but I will make three conclusions which I believe are well supported statistically. (1) The two methods of measuring CO2, ice core reconstruction and air-chemical measurement, are not compatible. One is over-estimating or one is under-estimating. I have no idea which is which; whether, that is, historical numbers should be adjusted higher or current numbers should be estimated lower. Beck (2007) and Jaworowski (2007) argue that the historical numbers are low.

(2) We should increase our uncertainty in models, such as global climate models, that use this CO2 data as input, particularly if they use the data which spans the two measurement regimes.

(3) There are odd discrepancies, unexplainable through human population correlations, in the ice core data. At times CO2 has been measured to actually decrease. This might be true, but the times of the decreases are not consonant with human activities. Clearly, measurement error is a likely possibility and should be investigated.

Lastly, of course, there is Beck’s paper, which is essential reading on this subject. I do not have Beck’s data, just the ice core data: some of the same signals, though not the same in magnitude, that appear in the CDIAC data are also in Beck. His contention, supported by data, is that CO2 has been higher in the recent past: a peak around 1940 or so, declining afterwards, which is the same decrease we see here.

I might be wrong about all this, so I welcome comments and discussion.

Beck, E.G., 2007. 180 years of atmospheric CO2 gas analysis by chemical methods. Energy and Environment, 18, No. 2, 259-282.
Jaworowski, Z., 2007. CO2: the greatest scientific scandal of our time. EIR: Science, 16 March, 38-53.
February 5, 2008 | 7 Comments

Mandatory suicide to reduce carbon footprint no joke

In an interview with Stephen Wright at, the comedian tells a favorite joke: You never know what you have until it’s gone, and I wanted to know what I had, so I got rid of everything. He lamented, “I really like that one, but it didn’t really get a laugh.”

Every comedian has a story of a beloved joke that never gets a laugh, and of other quips that everybody inexplicably likes.

I tell you this my friends because I worry about you. My number two son and I posted a “story” about Zombie Attacks Increasing Due to Global Warming, and the thing is linked at hundreds, and at a growing number, of websites. But the next day’s post—in my opinion, my most hilarious—about people willfully turning themselves into Soylent Green to battle climate change didn’t even rate a chuckle. Many of you even took it seriously! You can’t go wrong with Zombies, I guess. (By the way, check out today.)

The posting on the Soylent Corporation’s government contract to encourage people to Go Home—i.e., commit suicide—to reduce their “carbon footprint” was, of course, a satirical observation on the zany lengths to which people will go when swayed by ideology. But it actually wasn’t too far off the mark.

How do I know this? Well, according to this fine article by Brad Allenby at, a “recent study from the Swedish Ministry of Sustainable Development argues that males have a disproportionately larger impact on global warming” because “women cause considerably fewer carbon dioxide emissions than men and thus considerably less climate change.” So we need fewer men.

Think the worst of sins is driving an SUV? Not a chance. Being obese and having children also up people’s carbon output. Eating meat is bad, too. These behaviors obviously have to be curtailed, if not voluntarily, then at some point by force—force of law, of course.

It might not come to that. There might be enough deeply concerned volunteers to pull the load for the rest of us. Says humble citizen Erik Daehler, in an article about how we can all do our part: “You do have to sacrifice. I think a lot of people are going to have to soon assess themselves and figure out that what they give up now may allow their kids to have it, or their kids’ kids to have it. It’s sort of a selfish relationship we have with the environment right now.”

But even reducing your [carbon] footprint to zero and living a so-called carbon neutral life may not be enough, said the [director of the Natural Resources Defense Council’s climate change program John] Steelman.

“You can take yourself out of the equation,” he said…(emphasis mine)

[Ordinary citizen] Tony Napolillo said he won’t wait for politicians to act.

“Everybody has to realize they have personal responsibility,” he said. “They can’t just wait for the government or the corporate world to do something about it. If everybody could strive to be carbon neutral, this would be a greater world.”

It’s never too long these days before reality overtakes parody, so I should take my own advice and leave well enough alone, before somebody does think “Going Home” is a good idea.

February 4, 2008 | 6 Comments

San Francisco mandatory carbon-footprint reduction program begins

Mayor Gavin Newsom announced that San Francisco’s mandatory carbon-footprint reduction program will begin as scheduled on the first of March.

“There never was a problem as serious as global warming and we must take action now,” said mayoral spokesman William Simonson. “San Franciscans are among the most enlightened people of the world and they are eager to do their part,” he continued.

Phase One of the program requires all citizens to cease jogging and other aerobic activities. “Each time a San Franciscan exhales, they add to the already over-burdened carbon dioxide load of the atmosphere.” Simonson explained that “jogging increases the amount of CO2 in people’s breath to unacceptable levels.” All jogging paths will be converted to green space which will also help absorb CO2. Conversion is expected to last at least three years.

The more controversial part of the program is Phase Two, which is expected to remain voluntary. “Each citizen must decide whether Phase Two”—which the mayor has dubbed Going Home—“is right for them.” A public square, featuring a monument engraved with a list of the volunteers, will open downtown by late summer. All work on the square has been donated by Gore Enterprises.

“We hope that this beautiful place will encourage more people, not just here in San Francisco, but all over the world to do their part,” said the mayor.

Phase Two is not without controversy. Bob Thorn of Let Us Breath, a non-profit group, said, “This program will never remain purely voluntary. This is just the mayor playing politics.”

Simonson has been quoted as saying that there are no plans to make Phase Two mandatory. “We will visit that issue if our carbon sequestration goals have not been met.” He added that Let Us Breath’s “scare tactics” were typical of “climate denialists” and that everybody so far has expressed “nothing but support” for the program.

Gore Enterprises is a subsidiary of the Soylent Corporation, makers of Soylent Green®.