In an article published at The Register earlier today, the author states:
NASA staff have done some recent bookkeeping and refined the data from 1930-1999. The issue has been discussed extensively at science blog Climate Audit. So what is the probability of this effort consistently increasing recent temperatures and decreasing older temperatures? From a statistical viewpoint, data recalculation should cause each year to have a 50/50 probability of going either up or down - thus the odds of all 70 adjusted years working in concert to increase the slope of the graph (as seen in the combined version) are an astronomical 2 raised to the power of 70
The paragraph goes on to make a huge fuss about how unlikely this is, given the author's (incorrect) expectation of a 50/50 probability for each year. But he never mentions the published reasons why the adjustments were made: corrections applied for a documented, systematic cause are not independent coin flips, so there is no reason to expect each adjusted year to have a 50/50 chance of going either way. I wouldn't have a problem with him questioning the accuracy of the adjustments, or more specifically the methodology used to calculate the current graphs; I do have a problem with him blindly attacking the mere fact that pre-1970 temperatures have been nearly uniformly adjusted downwards and post-1970 temperatures upwards. His crude overlay of one rotated graph on top of the other isn't very scientific or accurate either.
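To see why the coin-flip framing falls apart, here is a minimal sketch in plain Python. The numbers and the "correction" are invented purely for illustration; this is not NASA's actual adjustment procedure. The point is only that a single systematic correction, applied for one documented reason, moves every year in a predictable direction by design, so the "one chance in 2^70" calculation doesn't apply.

```python
# If each year's adjustment really were an independent 50/50 coin flip, the
# chance of all 70 adjusted years moving in a way that steepens the trend
# would indeed be astronomically small:
p_independent = 0.5 ** 70
print(f"independent-flip model: {p_independent:.1e}")  # roughly 8.5e-22

# But a systematic, documented correction is not 70 independent flips.
# Toy example with invented numbers: a single bias estimate that varies
# linearly with time is applied to the whole record, so every early year
# moves one way and every late year moves the other -- one decision,
# not 70 independent coincidences.
def toy_adjustment(year):
    """Hypothetical bias correction, linear in time purely for illustration."""
    return 0.005 * (year - 1965)

directions = [1 if toy_adjustment(y) > 0 else -1 for y in range(1930, 2000)]
print("years adjusted upward:  ", directions.count(1))
print("years adjusted downward:", directions.count(-1))
```

Whether any particular correction is justified is a separate question - and the right one to ask - but its mere direction tells you nothing about fraud.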
Surface temperature measurements have a low signal-to-noise ratio. Additionally, NASA do not just use the raw temperature measurements from each station; they apply many layers of adjustments, and these adjustments tend to be quite large: "nearly all the reported warming in the USHCN data base, which is used for nearly all global warming studies and models, is from human-added fudge factors, guesstimates, and corrections". It wasn't until last year that NASA finally released their algorithm:
Reto Ruedy has organized into a single document, as well as is practical on a short time scale, the programs that produce our global temperature analysis from publicly available data streams of temperature measurements. These are a combination of subroutines written over the past few decades by Sergej Lebedeff, Jay Glascoe, and Reto. Because the programs include a variety of languages and computer unique functions, Reto would have preferred to have a week or two to combine these into a simpler more transparent structure, but because of a recent flood of demands for the programs, they are being made available as is. People interested in science may want to wait a week or two for a simplified version. The documentation/programs are at: http://data.giss.nasa.gov/gistemp/sources/
They admit in the next paragraph that "one aspect of our procedure where subjectivity could come into play is the choice of which stations are eliminated from the record" - although it should be noted that there haven't been any claims that anyone is intentionally picking "bad" stations, just that there appear to be fewer "good" stations. NASA have a wide variety of sources and they choose which sources to include when generating the pretty graphs that the general public get to see.
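To make the station-selection point concrete, here is a toy sketch. It is emphatically not the GISTEMP code linked above; the station names, anomalies, and the "keep stations with long enough records" rule are all invented. It simply shows why the elimination step is a place where judgement affects the final number.

```python
from statistics import mean

# Invented station data: annual temperature anomaly (degrees C) and record length.
stations = {
    "rural_a":   {"anomaly": 0.2, "record_years": 75},
    "rural_b":   {"anomaly": 0.1, "record_years": 40},
    "urban_c":   {"anomaly": 0.6, "record_years": 90},  # possible urban heat island
    "airport_d": {"anomaly": 0.5, "record_years": 55},
}

def combined_anomaly(min_record_years):
    """Average anomaly over the stations whose record is long enough to keep."""
    kept = [s["anomaly"] for s in stations.values()
            if s["record_years"] >= min_record_years]
    return mean(kept)

# A stricter elimination threshold keeps fewer stations and shifts the result.
print(combined_anomaly(min_record_years=30))  # all four stations -> 0.35
print(combined_anomaly(min_record_years=60))  # only rural_a and urban_c -> 0.40
```

Neither threshold is "wrong"; the point is simply that a defensible-looking choice at this step changes the headline figure, which is why the selection criteria deserve scrutiny.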
So what does this all mean? It's still not entirely clear either way whether global warming is happening. It is clear that the author's article doesn't add anything meaningful or useful to the debate. But what can you expect from statistics and relatively short trends?