Kevin Cowtan Debunks Christopher Booker's Temperature Conspiracy Theory
Posted on 27 January 2015 by Kevin C, dana1981
In The Telegraph, Christopher Booker accused climate scientists of falsifying the global surface temperature data, claiming trends have been "falsified" through a "wholesale corruption of proper science." Booker's argument focuses on adjustments made to raw data from temperature stations in Paraguay. In the video below, Kevin Cowtan examines the data and explains why the adjustments in question are clearly justified and necessary, revealing the baselessness of Booker's conspiracy theory.
The video features a prototype tool for investigating the global temperature record. This tool will be made available with the upcoming MOOC, Making Sense of Climate Science Denial, where we will interactively debunk myths regarding surface temperature records.
The link to the NOAA directory @50 goes elsewhere. Here is where it should go.
MA Rodger @50, I'm waiting for them to turn their attention to all the adjustments nature has been making to sea ice extent in the Arctic. All those natural thermometers must be in on the conspiracy along with NASA and NOAA and the former temperature record skeptics at BEST.
MA Rodger and Tom Curtis, many thanks for your recent posts, which have given me the real meat I was after: just maybe enough to change a potential UKIP voter's mind, or at the very least make him change the subject! However true it is that Booker is unworthy of his prominence, the trouble is that a lot of local decision makers in my neck of the woods take his output as the truth until something more convincing is offered instead.
JH inline @49, an algorithm is just a procedure that can (in principle) be automated. The algorithms used for quality control and homogeneity adjustment in GHCN v3 are described in Lawrimore et al (2011). (Of course, sometimes the description consists of a reference to an earlier paper.) Essentially, for homogeneity adjustments, they come down to comparing a station's temperature series to a set of nearby stations that correlate highly with it. If there is a sudden, large shift in temperature at a particular station that is not found in its neighbours, it is assumed that there has been some change in circumstances at that station, and an adjustment is made. The essential point is that there is a hard rule as to when, and by how much, an adjustment is made, based on the number of nearby stations and the level of divergence. The algorithm does not look at the date, nor at the geographical region, when making an adjustment. The detailed description of the method can be found in Menne and Williams (2009), along with a description of a test of the method using artificial data with random change points.
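To make the idea concrete, here is a minimal sketch of that neighbour-comparison logic. To be clear, this is NOT the actual Menne and Williams (2009) pairwise algorithm; the 0.5 °C threshold, the single-breakpoint search, and the fabricated station data are all invented for illustration.

```python
import numpy as np

def detect_and_adjust(target, neighbours, threshold=0.5):
    """Shift `target` where it diverges sharply from its neighbours.

    target:     1-D array of annual means for the station under test
    neighbours: 2-D array (stations x years) of well-correlated neighbours
    threshold:  minimum step (deg C) in the difference series to act on
    """
    # Difference series: a local change (station move, new screen) shows
    # up here, while a real regional climate signal largely cancels out.
    diff = target - neighbours.mean(axis=0)

    # Find the single largest mean shift in the difference series.
    best_step, best_idx = 0.0, None
    for i in range(2, len(diff) - 2):
        step = diff[i:].mean() - diff[:i].mean()
        if abs(step) > abs(best_step):
            best_step, best_idx = step, i

    adjusted = target.copy()
    if best_idx is not None and abs(best_step) > threshold:
        # Bring the earlier segment into line with the recent segment,
        # i.e. treat the most recent state of the station as accurate.
        adjusted[:best_idx] += best_step
    return adjusted

# Quick check on fabricated data: a station with a -1 C jump in year 30
# of 60, compared against five clean neighbours, gets its earlier
# segment shifted to match the recent level.
rng = np.random.default_rng(1)
clean = rng.normal(0, 0.2, (5, 60))
station = rng.normal(0, 0.2, 60)
station[30:] -= 1.0
fixed = detect_and_adjust(station, clean)
```

The real method is far more careful (pairwise comparisons, multiple breakpoints, significance tests), but the hard-rule character Tom describes is the same: the decision to adjust depends only on the divergence from neighbours, not on the date or the region.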
Further to @54, here are the relevant figures for the test of the algorithm for GHCN v3 from Menne and Williams (2009):
As you can see, the algorithm reproduces the correct zero trend regardless of whether the original data were distorted by upward or downward shifts. In other words, it shows no bias with regard to trends. The shift in the mean arises because the algorithm always makes adjustments with reference to the final year (i.e., it effectively assumes the most recent measurement is accurate). It is inconsequential.
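A tiny numerical check of that point, using an invented station series (the dates and the 1 °C step are arbitrary):

```python
import numpy as np

# Invented example: a zero-trend station suffers a +1 C step in 1980
# (say, a site move). Anchoring the adjustment to the recent data
# restores the zero trend but offsets the whole series by +1 C.
years = np.arange(1950, 2011)
true = np.zeros(len(years))               # true climate: flat, 0 C anomaly
broken = true.copy()
broken[years >= 1980] += 1.0              # spurious step at the station

adjusted = broken.copy()
adjusted[years < 1980] += 1.0             # raise earlier data to recent level

print(np.polyfit(years, adjusted, 1)[0])  # slope: ~0.0 (trend recovered)
print(adjusted.mean() - true.mean())      # mean: +1.0 (harmless offset)
```

Since trends, not absolute levels, are what matter for measuring warming, the offset in the mean has no effect on the result.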
Can anyone provide an example of a graph showing a negative adjustment in the trend at an individual station? I need an example to reply to a comment on another site.
The sheer recklessness of the denialists' accusations here can be used against them. They do not appear to have even tried to understand the adjustment process before attacking it. This is something they can be pinned down on.
No need, found what I needed.
Lloyd, do you really think midnight is going to see his error? (I've been following the posts at DC.) He thinks that temp series are adjusted to match predetermined GCM output, that the greenhouse effect is negated because of Boyle's Law, and that LW radiation can't warm the oceans. He's completely convinced of his superior knowledge (about everything) and will never, ever admit a mistake. He's in complete denial. He isn't interested in understanding how adjustments are made. He already "knows".
I can show him up. There are others there who want to believe the denialists are right but who will reluctantly accept facts. He is a lost cause.
On Feb. 9, 2015, Tom Curtis (#34) wrote, "I intend to fully digitize both the GHCN3 and HadSST3 adjustments on a publicly available spreadsheet, as I think the results will be interesting independently of this discussion. That may, however, take a couple of weeks..."
Tom Curtis, did that ever get finished? Can you provide a link, please?
On Feb. 10, 2015, Kevin C (#38) wrote, "The GHCN tool is unfinished, frequently broken, and not ready for release, I'm not sure how you got the link. I've removed it now."
Has it been released yet, Dr. Cowtan?
daveburton @60, I had forgotten about that promise, so thank you for reminding me. I will try to follow through shortly. In the meantime, Kevin C has published his far more useful temperature tool. There is a brief introduction to the temperature tool here. The only thing my spreadsheet will hopefully add to the tool is the unadjusted ocean data, but we already know their impact from Zeke Hausfather's graphs.
However, as I noted in my email, the Denial101x tool is primarily a teaching tool, and I had to make a lot of simplifications to make both the downloads and the calculations fast enough. Many stations have been omitted, and all have been reduced to annual data, which introduces its own bias. It is useful for demonstration purposes and some preliminary analysis, but for serious research you need to be using something like Clear Climate Code, or at the very least the SkS tool.
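One concrete example of the kind of bias annual aggregation can introduce; this is my gloss on the point, not Kevin's own analysis, and the monthly temperatures below are invented. If months are missing, a simple annual mean of raw temperatures is pulled toward the seasons that remain:

```python
import numpy as np

# Invented monthly mean temperatures (deg C) for a mid-latitude station.
monthly = np.array([-3.0, -2.5, 1.0, 7.0, 13.0, 18.0,
                    21.0, 20.0, 15.0, 9.0, 3.0, -1.5])

print(monthly.mean())       # full-year mean: ~8.3 C

# Drop January and December (e.g. missing reports) and average the rest:
incomplete = monthly[1:11]
print(incomplete.mean())    # ~10.5 C, biased warm by the lost winter months
```

Working with anomalies relative to each month's own baseline avoids this particular problem, which is one reason the serious codes operate on monthly anomalies rather than raw annual means.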
Here are some more resources:
A map of all stations with and without adjustments over the past 40 years (corresponding roughly to the period of dominant human warming):
The green crosses overlay the red ones, so here is a huge version of the same map for more detailed investigation.
Note that there is a general split in the need for adjustments between more and less stable and developed countries, as you would expect, with one very obvious exception: the US. The reasons are of course well known: the volunteer observer network and the resulting issues with time-of-observation (TOBs) changes and the introduction of MMTS instruments.
Here's a nice comparison by Zeke Hausfather of the skill of the NOAA and Berkeley algorithms at reconstructing synthetic US data with realistic errors. The NOAA method does a great job in the US; however, on the basis of my own work I think that over the rest of the world, while the NOAA method generally improves things, the Berkeley adjustments are more robust. It is not clear whether this is just because Berkeley have more stations; we'll find out with the switch to GHCN v4.
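For anyone who wants to poke at this themselves, here is a minimal sketch of that kind of skill test. Everything here is invented for illustration (the noise levels are arbitrary, and `my_adjust` is a hypothetical stand-in for whichever algorithm you want to score); the real benchmarks use far more realistic error models.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1975, 2015)
true_trend = 0.02                     # deg C / year, known by construction

def synthetic_station():
    """A station with the true trend, noise, and one artificial break."""
    series = true_trend * (years - years[0]) + rng.normal(0, 0.3, len(years))
    brk = rng.integers(5, len(years) - 5)
    series[brk:] += rng.normal(0, 1.0)    # inhomogeneity of random size
    return series

stations = [synthetic_station() for _ in range(100)]

def trend(series):
    return np.polyfit(years, series, 1)[0]

# Skill metric: mean absolute trend error across the synthetic network.
raw_error = np.mean([abs(trend(s) - true_trend) for s in stations])
print(f"trend error, raw data: {raw_error:.4f} C/yr")
# ...and the same after adjustment, for whichever method is under test:
# adj_error = np.mean([abs(trend(my_adjust(s)) - true_trend) for s in stations])
```

A good adjustment algorithm should pull the trend error well below the raw-data figure without introducing any systematic bias in either direction.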
Finally, here are some resources from the Denial101x bonus material:
Actually, I don't think the 1890 Ellis paper is open access. It's a beautiful period piece, so I've attached the first couple of paragraphs below: