Another global warming contrarian paper found to be unrealistic and inaccurate
Posted on 21 October 2014 by John Abraham
It’s hard to find a reputable scientist who denies that human emissions of greenhouse gases are warming the planet and that there will be consequences for human society and the biological health of the planet. There are a few holdouts who, for various reasons, either think humans are not causing warming or that the warming will not have much consequence.
Some members of this vocal minority spend a lot of time trying to convince the public that they are right. They write letters to newspapers, appear in slick movies, give press conferences, promote their views to Congress, and so on. Their high profile gives the public a false sense that there are two relatively equal-sized bodies of experts that cannot agree on climate change; this is not true.
An even smaller subset also tries to publish their views in the scientific literature – the dueling ground for experts. Sometimes these contributions have been useful, adding some nuance to the discussion, but all too often they have proven to be of very poor quality when other scientists have had a chance to dissect them.
A few months ago, I co-authored an article which charted the difference in quality of scientific output from the Dwindling Few contrarians compared to the majority of experts. My colleague, Dana Nuccitelli, summarized the article here. What we show is that the Dwindling Few have had a very poor track record – having papers rebutted time after time after time because of errors they have made. The low quality of their research has caused journal editors to resign, and they have wasted the time of their colleagues who have had to publish the rebuttals to their work.
Well, again this year, I’ve wasted my time (and my colleagues’ time) by rebutting a 2014 paper published by the darling of the Dwindling Few, Roy Spencer. Dr. Spencer wrote a paper earlier this year that used a very simple ocean model to suggest that standard climate models overestimate the Earth’s sensitivity to carbon dioxide increases in the atmosphere. You can see his manuscript here although it is behind a paywall so you will have to shell out about $40 to read it.
Dr. Spencer and his colleague Danny Braswell made a number of basic math and physics errors in the article that call into question their conclusions.
Before we get into the errors, let’s talk about what their model does. They basically treated the ocean like a non-moving fluid and allowed heat to diffuse into the ocean depths. They did allow some mixing in the upper layers through added terms in a one-dimensional equation. The model neglects down-welling or up-welling of waters which occur particularly at the poles. In the end, they end up with a bunch of tunable parameters, which they adjusted so that the model output matches the measured temperature history.
So, what were the errors and poor modeling choices?
- The model treats the entire Earth as entirely ocean-covered
- The model treats an ocean process (the El Niño cycle), which covers a limited geographic region of the Pacific Ocean, as a global phenomenon
- The model incorrectly simulates the upper layer of the ocean in the numerical calculation
- The model incorrectly insulates the ocean bottom at 2000 meters depth
- The model leads to diffusivity values that are significantly larger than those reported in the literature
- The model incorrectly uses an asymmetric diffusivity to calculate heat transfer between adjacent layers
- The model incorrectly determines the diffusivity at element interfaces
- The model neglects the effect of advection (water flow) on heat transfer
- The model neglects latent heat transfer between the atmosphere and the ocean surface
Now, simple models like this one can still be useful, even though they necessarily gloss over some details. But some of these errors and omissions are pretty obvious, and would have been easy to fix. For instance, by treating the entire Earth as water covered, Spencer and Braswell omit 30% of the surface of the Earth that’s land-covered, and which heats up faster than the oceans. They then compare the CO2 sensitivity of their ocean-only model to those obtained from more realistic models — apples and oranges. Furthermore, the application of a very local phenomenon (El Niño) to the entire globe just doesn’t make much sense.
But, I here want to talk about the numerical errors, in particular items 3, 4, 6, and 7. In order to explain what went wrong, I need to talk about the underlying math.
The diffusion equation Spencer and Braswell used has a second derivative of temperature with respect to depth in the water. To solve this equation, the common approach is to break the ocean into a number of finite slabs of water and approximate the derivatives by finite differences. So far, so good. The problems arise when you apply what are called boundary conditions. That is, conditions at the ocean surface and the bottom of the ocean. At both locations, Spencer and Braswell’s approach fails.
First, at the ocean surface, you are required to make calculations at the exact surface. In fact, the physical phenomena which Spencer and Braswell introduce require actual surface temperatures. However, in their computer program, no surface temperatures were ever determined. They basically transcribed a temperature from 25 meters deep in the ocean onto the surface (and no, they didn’t do this because of ocean mixing). At the ocean bottom, Spencer and Braswell insulated the ocean, and thereby did not allow any energy exchange there.
Finally, Spencer and Braswell incorrectly used upstream element-diffusivity values in their heat transfer term. They were obligated to use mean values representing adjacent elements. When we implemented the corrected numerical scheme, the quality of the results dissolved. Once again, Roy Spencer has failed in his attempt to show the Earth is not very sensitive to climate change.
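To make the numerical issues concrete, here is a minimal sketch of one explicit time step of a textbook 1-D ocean diffusion scheme. This is purely illustrative (it is not Spencer and Braswell's actual code, and the function name, layer values, and flux convention below are my own choices). The key points are that the interface diffusivity is built from both adjacent layers — here the harmonic mean, a common textbook choice — rather than taken from the upstream element, and that the bottom boundary flux is an explicit input rather than being silently insulated:

```python
import numpy as np

def step_diffusion(T, kappa, dz, dt, q_surface=0.0, q_bottom=0.0):
    """Advance layer temperatures T by one explicit step of 1-D vertical
    heat diffusion.  Index 0 is the top layer; fluxes are positive downward.

    T         : layer temperatures (C)
    kappa     : per-layer diffusivities (m^2/s); may vary with depth
    dz        : layer thickness (m)
    dt        : time step (s)
    q_surface : kinematic heat flux imposed at the surface (K m/s)
    q_bottom  : flux through the bottom; 0.0 reproduces an insulated bottom
    """
    # Interface diffusivity: harmonic mean of the two adjacent layers
    # (a standard textbook choice), NOT the upstream layer's value.
    k_iface = 2.0 * kappa[:-1] * kappa[1:] / (kappa[:-1] + kappa[1:])
    # Diffusive flux across each of the n-1 interior interfaces.
    flux = -k_iface * (T[1:] - T[:-1]) / dz
    dT = np.empty_like(T)
    dT[0] = (q_surface - flux[0]) * dt / dz       # top layer budget
    dT[1:-1] = (flux[:-1] - flux[1:]) * dt / dz   # interior layers
    dT[-1] = (flux[-1] - q_bottom) * dt / dz      # bottom layer budget
    return T + dT
```

With both boundary fluxes set to zero the scheme conserves total heat exactly, which makes implementation errors of the sort described above straightforward to catch.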
These errors are the sort of thing that could have been avoided by consulting any elementary textbook on heat transfer, or any number of papers that have published similar ocean diffusion models. My colleague and co-author, Dr. Barry Bickmore from BYU described the situation like this,
Aren't these the kind of inaccuracies that peer review is supposed to weed out? I guess not always; I remember back when cold fusion made it through.
"The Science isn't settled" One of the advantages of denial is that once Curry and Spencer publish their (predictably) super-low estimates of climate sensitivity, the necessary rebuttal process (by illuminating their mistakes) supports the central denial claim ("The Science isn't settled"). When your object is to sow confusion, even losing an argument allows you to win: there was an argument. Certainly Curry and Spencer, and their backers, know this, and don't care. Any more than someone tagging a bridge with X-rated graffiti cares that someone else will eventually show up to, slowly and laboriously, repaint the bridge. Their purpose is anarchy, like so much of modern conservatism. They are saying nothing gets done, if it can't get done their way.
@ubrew12, agree! I noticed this quote in particular:
"In the end, they end up with a bunch of tunable parameters, which they adjusted so that the model output matches the measured temperature history."
I think Spencer is too smart to not know how wrong this is. But no matter how wrong it is, the casual observer will notice the tuning part and just assume that all models do this, it's just a matter of how sophisticated they are in doing it. That's why any kind of denial is successful.
Whether the science is or isn't settled seems irrelevant. As the VC of the Australian National University has just commented to a Senate estimates hearing, "The reality is our economy and indeed the world economy is going to be dependent on fossil fuels for decades to come." That fact seems totally overlooked in the bickering that occurs between those who are convinced climate change is entirely due to humans and those who are less certain of that. It is surprising that the VC of the ANU, who a couple of weeks ago jettisoned the University's holdings in fossil fuel related companies, has that view. But it does show that it is not impossible to simultaneously hold what appear to be opposing opinions.
Ashton, what view is that opposing? Has someone said, "There's no way that the global economy is going to be dependent on FF over the next few decades"? There are certainly people who want that dependency to be gone within a few decades (don't you?), but the vast majority understand that fundamental change for an integrated, global system that has cultural, political, and economic momentum isn't going to happen overnight. I fail to see what this has to do with the two groups you identify.
Don't give him too much credit there. He seems to have a woefully simple (mis)understanding of what's involved in modeling the climate, and applies that misunderstanding consistently in his own papers. From his blog post How Do Climate Models Work?:
This looks very much like the kind of "modeling" Spencer's papers with Braswell have been about.
DSL Quite obviously I am not making it plain that my comment "holding what appear to be opposing views" referred only to the VC of ANU, who on the one hand realises that the world will be dependent on fossil fuels for decades, yet on the other thinks that the companies involved with fossil fuels should not be a part of the ANU investment portfolio. Surely if these companies are necessary to the world, and will be for "decades", why should ANU not have these companies in its investment portfolio? Not having these companies could well impact unfavourably on the portfolio, but having them would not mean ANU is supportive of these companies; rather it would be recognising what is, to some, an unpalatable truth.
I disagree with your assumptions about what divestment means. In order for change to take place, a shift away from business as usual must occur. Divestment is a strategy aimed at facilitating transition. Divestment is not an immediate, debilitating attack on the fossil energy economic structure. This is the message: energy corporations/companies can and should help transition toward the next human energy regime. It's coming. It's unavoidable. It's easier to get to the next stage when we still have abundant fossil reserves. Today, not tomorrow. That's the unpalatable truth.
This may be straying off topic, but divestment must have a chilling effect on the FF industry.
Essentially, up until recently, fund managers have kept a certain percentage of their holdings in the FF industry because it's been a safe, steady place to keep part of your assets. The message is starting to come across that, well, that may not necessarily be the case.
FF industry stocks are priced on the assumption that they can continue to extract oil, at projected rates, for the next ~15 years. And the industry is investing in exploration to support those projections. So, in buying and/or holding those positions you're essentially agreeing to those forward-looking assumptions. And for the past, what, 80+ years those have been good assumptions, with the occasional hiccup here and there.
Now, what happens when we start to see broad international agreements that we can't continue to extract and burn FF's the way we have for the past 80+ years? What happens if we see the next 10 years of accelerating surface temps confirming beyond anyone's sense of doubt that AGW is, in fact, real?
It wouldn't take much of a shift in the extraction rate to significantly change the value of those equities. Suddenly, that safe location for your assets looks a lot less stable. I think fund managers are starting to see the light here.
FF companies are already starting to bake a price for carbon into their financials. As long as the FF industry and money markets can start accounting for the potential financial risks, that's a good thing. We can have a fairly soft landing over the coming 30 years. If they all ignore these risks and toddle along like nothing is going to change, then we're going to see a pretty ugly correction in the future.
Like DSL says, it is coming and it is inevitable. We can do it the easy way or the hard way. But it is coming.
I would have to question the wisdom of anyone claiming that, "The reality is our economy and indeed the world economy is going to be dependent on fossil fuels for decades to come..."
That's political posturing, not rational thinking. I think the FF industry is very concerned about the direction things are headed and they're trying to stave off the inevitable as long as they can. If they were that sure of their market, they wouldn't be buying politicians the way they are.
Think of it. Even just with the auto fuel market, electrics have a significant potential to be cheaper and better than internal combustion cars... in just a few years. Gasoline accounts for 45% of oil use (LINK). Over a decade one of their most important revenue streams may start to significantly dry up.
The FF industry is literally staring down the abyss.
I don't think ethical divestment sends much of a meaningful message to oil companies. For coal it's a bigger issue, because it could eventually create difficulties in getting needed capital, but that doesn't apply to oil. If fund managers divest because they don't think the companies will make much of a profit, then that's a message, but ethical choices? I can't see how that can translate into any behaviour change that would reduce CO2. Viable electric cars, on the other hand, are a serious risk.
scaddenp... But it's the risk element that should be driving fund manager decisions.
Maybe some of these folks are making the decision on ethics basis, but I really doubt it's many. I would think their decisions are more influenced by risk assessments.
Remember, it really doesn't take that much of a shift in long term projections to profoundly alter the value of a stock when you're talking about a security, like XOM, that's trading at 12X earnings.
The total market value for the FF industry is huge. A collective repricing of the value of that industry could erase a significant amount of total value in the global economy, with a fairly significant ripple effect. The housing bubble had pretty significant impacts. If markets don't take these risks into account in pricing FF stocks, a shift in value could be worse.
What I don't see is how a drop in the share price affects the amount of oil flowing, or the size of the dividend per share for that matter. The recent drop in oil prices, largely due to fracking, would have a bigger impact on share price/dividend than any ethical divestment.
Looking at that statement about models from Spencer, it would seem that Spencer wants modellers to have just one opportunity to get things right. No second chances.
On that basis, we must evaluate Spencer and Christy's work on MSU satellite derivations of atmospheric temperature by restricting ourselves to their first version. To look at how they have "curve fitted" their results, take a look here. (It's in figure 2.)
Of course, in the real world of science, perfection is rarely obtained on the first go. You start with a model, see what it gets wrong, and then try to improve the model. Even Newton didn't quite get the laws of motion right, and physicists are still trying to figure out how to connect gravity with all the other forces in a Universal Theory.
Dropping share prices makes it harder to borrow money and fund new projects. What bank would have lent money to Bre-X the week after the scandal hit?
Perhaps major oil companies have enough loose cash to not worry about banks, but dropping share prices would not be good.
I am new to this site. Please, someone point me to the unfiltered empirical evidence that global warming is happening. The raw data, if you will. I just want to understand the issue from the purely empirical evidence, without anyone else's interpretations involved. Is that out there anywhere?
The big impacts on oil production are going to be on the demand side rather than anything to do with divestment, as suggested earlier with a major shift to electric vehicles. But what divestment does do is start to price FF securities more accurately relative to the inherent risks that lie ahead, which can soften the potential impact on global economies.
Part of my own concern is, I think disruptive new market elements, like EVs and solar & wind, are actually going to start replacing FF sources faster than we anticipate. The market value of these new elements needs to effectively balance out the massive market value of the legacy FF industry.
It's nice to think one will just nicely supplant the other, but my guess is the transition will get kinda messy instead.
lonelysalmon, not exactly sure what you mean by unfiltered, uninterpreted?
Do you have the skill to analyze the raw oceanographic data showing the oceans' steady rise?
The land temperature record is more straightforward. If you look here you can find links to the raw climate data record. Note the link to glacial monitoring data for indirect evidence. Somewhat more manageable data from the BEST project is here.
If you want purely unadjusted data, how are you going to compare data from when temperature was taken at 3pm to data from when they changed to 9am? What about when the station moves, or adds a Stevenson screen? What about adjusting for the urban heat island effect (if you don't adjust, then temperatures appear to warm as the city grows)? Note that the adjustments don't change the picture that much on a global basis. You might like to look at the BEST methodology for dealing with that (code and data on the site).
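The time-of-observation problem can be made concrete with a toy diurnal cycle (all the numbers here are made up purely for illustration):

```python
import math

def diurnal_temp(hour, mean=15.0, amp=8.0, peak_hour=15):
    """Idealized diurnal temperature cycle peaking mid-afternoon (toy values)."""
    return mean + amp * math.cos(2 * math.pi * (hour - peak_hour) / 24)

# Identical climate, different reading times: the 3pm reading sits near the
# daily maximum, while the 9am reading falls well below it.  Splicing the
# two records together without adjustment manufactures a spurious step.
read_3pm = diurnal_temp(15)
read_9am = diurnal_temp(9)
bias = read_3pm - read_9am
```

A station that switched its reading time would show this bias as an apparent climate shift, which is exactly what homogenization adjustments are designed to remove.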
I would strongly suggest you go over the information presented here as well for the empirical basis.
Rob Honeycutt @11: "it really doesn't take that much of a shift in long term projections to profoundly alter the value of a stock" Witness: Netflix. Ouch.
@16: "disruptive new market elements...[will] start replacing FF sources faster than we anticipate...the transition will get kinda messy" Wait! You mean this is a bubble economy? Who woulda thunk it?!? The old Chinese curse is "May you live in interesting times". My fear: thanks to our deniers, that is our fate, and that of our children.
lonelysalmon@15: anybody got the 'unvarnished truth' out there? Bueller? Ask Earth, lonelysalmon. It's a classic thermometer: a liquid contained in a solid bowl. Is the liquid rising? Ta Da.
scaddenp
You can have so much fun with the US siting data, as Steve Goddard likes to do. Everyone can pretend to be a scientist!
While I believe there is an ethical dimension to disinvestment, the risk factor is just beginning to peek its way above the noise. So, I expect to continue seeing a trend away from fossil fuels. What are the risk factors? (1) Cost of replacement of fossil fuel inventory, (2) increased price volatility, (3) competition for energy markets by renewables, (4) reduction of fossil fuel subsidies, and (5) carbon taxes or carbon limitations.
Of the fossil fuels, coal is the most vulnerable since it has competition from natural gas as well as the renewables. We are just getting into the economies of scale for renewables, so expect that the prices will continue to drop.
The stone age didn't end for lack of stones.
Nice comment over at the Guardian about what is wrong with the Spencer/Braswell model:
"This isn't really my field, but I did study enough about Thermohaline circulation. and it's role in climate, to know that a model that treated the oceans as a non-moving fluid, is about as much use as a chocolate fireguard. "
And that commenter forgot to add Ekman Transport as well.
If you are a novice in a field it's OK to try some stuff out as a part of how you learn. But you don't do a simple version like this then run off to Congress to tell them about it!
Roy Spencer has posted an initial response to Abraham et al. on his blog.
Pierre-Normand @22, much of Roy Spencer's response depends on asserting the adequacy of 1 dimensional models for assessing climate sensitivity. That, in one respect, is a fair line of defence. Spencer and Braswell (2014) used a one dimensional model, ie, a single vertical profile of the top 2000 meters of the ocean using globally averaged values. Because it uses globally averaged values, it necessarily treats all points of the global ocean as having the same values, and so much of Abraham's critique amounts to a critique of the adequacy of such models in this application.
Spencer defends the adequacy of his model on the grounds that Hansen has purportedly claimed that, "... in the global average all that really matters for the rate of rise of temperature is (1) forcing, (2) feedback, and (3) ocean mixing." Following the link, however, I find no such claim by Hansen. He does claim that the global energy imbalance determines (in part) the final temperature rise from a forcing, but that is a far cry from asserting that treating only averaged values in a model will adequately determine when that will be (ie, determine the climate sensitivity factor).
Interestingly, Hansen did say, "Ocean heat data prior to 1970 are not sufficient to produce a useful global average, and data for most of the subsequent period are still plagued with instrumental error and poor spatial coverage, especially of the deep ocean and the Southern Hemisphere, as quantified in analyses and error estimates by Domingues et al. (2008) and Lyman and Johnson (2008)." It follows that, according to Hansen, Spencer's one dimensional model must be essentially useless over the period prior to 1970. Indeed, Hansen goes on to write:
Based on that, given the monthly data required for the empirical validation of Spencer's model, according to Hansen the model would be useless for all periods prior to 2004 at the earliest. (Note, long term averages are more accurate than monthly variations. It is the latter, required by Spencer, that are inadequate prior to 2004; whereas estimates of the former would still be reasonable, although with wide error margins.)
This brings us to the second basis on which Spencer claims adequacy, a claimed superior empirical fit to that of GCMs. That superior fit, however, is unimpressive both because it is purely a function of having tunable parameters, and because it does not take into account that while GCMs produce ENSO-like fluctuations, they do not produce them in sync with the observed ENSO fluctuations. In contrast, Spencer imposes the observed ENSO fluctuations onto his model (which is not superior empirically until he does). Thus, the purported superior empirical fit is not an outcome of the model but an input.
All this, however, is beside the point. While nearly all climate scientists would see a use for one dimensional models, very few (other than Spencer) would consider them adequate to determine climate sensitivity with any real accuracy. They give ballpark figures only, and are known to lead to significant inaccuracies in some applications.
Turning to more specific points, one of Abraham's criticisms is the use of an all ocean world, a point Spencer responds to by appealing to the adequacy of one dimensional models. However, in using an all ocean world, Spencer assumes that the total heat gain by the Earth's surface equals the ocean heat gain from depths of 0-2000 meters. That is, he underestimates total heat gain by about 10%, and consequently overestimates the climate sensitivity factor by about the same margin (ie, underestimates ECS by about 10%).
That is important because his estimated climate sensitivity factor with ENSO pseudo-forcing (Step 2) is 1.9 W m⁻² K⁻¹. Correcting for this factor alone it should be 1.7 W m⁻² K⁻¹, equivalent to an ECS of 2.2 C per doubling of CO2. The step 3 ECS would be much lower, but it only gains a superior empirical fit to step 2 on one measure, and obtains that superior fit by the tuning of eight different parameters (at least). With so many tuned parameters for a better fit on just one measure, the empirical support for step 3 values is negligible.
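The arithmetic behind those numbers can be checked directly, taking the canonical 3.7 W/m² forcing for a doubling of CO2 (a standard value, assumed here) and the ~10% correction for the omitted land fraction:

```python
# Step 2 sensitivity factor and the ~10% correction for the omitted
# (faster-warming) land fraction, per the discussion above.
lam_reported = 1.9                    # W m^-2 K^-1
lam_corrected = lam_reported * 0.9    # ~1.7 W m^-2 K^-1

# Equilibrium climate sensitivity: ECS = F_2x / lambda, using the
# canonical radiative forcing for doubled CO2.
F_2xCO2 = 3.7                         # W m^-2
ecs = F_2xCO2 / lam_corrected         # ~2.2 C per doubling
```
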
A second of Abraham's criticisms is the failure to include the effects of advection. Spencer's response, that his model includes advection as part of the inflated diffusivity coefficients, would be adequate if (1) they varied between all layers instead of being constant for the bottom 26 layers, and (2) they were set by empirical measurement rather than being tunable parameters. The first point relates to the fact that advection may differentially carry heat to distinct layers, and hence the effects of advection are not modelled by a constant ocean diffusivity between layers, even on a global average.
There may be other such nuances in relation to particular criticisms that I am not aware of. The point is that the appeal to the features of a one dimensional model does not justify Spencer and Braswell in ignoring all nuance. Therefore some of Abraham's criticisms, and possibly all of them, still stand.
Finally, I draw your attention to Durack et al (2014). If their results are borne out, it will result in Spencer and Braswell's model with current parameter choices predicting an ECS 25-50% greater than the current estimates, ie, 2.7-3.3 C per doubling of CO2. Of course, the parameters are tunable, and Spencer and Braswell will without doubt retune them to get a low climate sensitivity once again.
Thank you Tom for a well-argued article, with accompanying references, on this subject, refreshingly free (almost) of opponent bashing. John Abraham would do well to copy your style. Spencer's rebuttal is likewise written, and I am glad to see that the arguments can still be discussed for what they are in a genuine attempt to expose the truth, rather than a war between believers and non-believers.
I suspect all climate models have tunable parameters, just as models in other sciences do. Climate is, to put it mildly, complicated, and we have much to learn. We would therefore do well to use model results from perturbing one tunable parameter at a time to establish and calibrate its effect, rather than believing the absolute predictions.
MartinG @24, General Circulation Models for the most part in CMIP3 (IPCC AR4), and in all cases for CMIP5, do not have "tunable" parameters. They have parameters for such things as the turbulence in surface air flow due to surface type (grassland, forest, mountains etc) where exact expression of the physical laws is impossible at the resolution required. However, these parameters are determined by empirical observations, so they are not "tunable". Further, even for earlier GCMs which did have some tunable parameters, the number of predicted quantities was large relative to the number of parameters so that they constituted a substantial constraint.
I will further note that nobody, least of all the IPCC, makes "absolute" predictions from climate models. The IPCC do make predictions from the ensemble of models, with large and explicit error margins, but predicting a value within a range is not making absolute predictions.
Tom@24. That's very interesting, and it surprises me, and I bow to your superior knowledge. But we have seen in the climate debate that "empirical observations" turn out to be different depending on who is doing the observations. There is an uncertainty range for most such observations. My point is that I think models are useful for testing sensitivity to parameter change much more than for their predictive value.
Those that "debate" climate science (ie, not the ones actually doing it) haven't shown much interest in the empirical observations that are part of the parameterization of models. These are things like the relationship between wind speed and evaporation. There is more on this here.
"My point is that I think models are useful for testing sensitivity to parameter change much more than their predictive value."
That is not at all clear to me. What is your basis for that thinking?
MartinG @26, the problem faced with modelling (still) is the large computer resources needed for any given run. Couple that with the fact that short term fluctuations in climate states (at least) are chaotic, and it is currently prohibitive to explore the impact of particular parameters by varying them over a number of runs, however desirable that would be. The IPCC does the next best thing, and the best option available given current computer capabilities and resources. That is, they generate predictions from an ensemble of models which, de facto, explore a range of parameters within the constraints generated by empirical studies. The resulting ensemble mean predictions are not precise, but plausibly they cover the likely range of the actual outcome for a given future forcing history, and give a reasonable indication of the error range of the estimate. I agree it is far from ideal, but it is better than not exploring the issue at all.
Having said that, I believe there have been recent attempts to use models of intermediate complexity to generate larger ensembles of runs for slight variants of parameters, ie, to do as you recommend but with less complex models. Because of the reduced complexity, these studies are not definitive by any means, but they do help explore a greater range of issues in greater depth.
That was very useful Tom Curtis.
The following article contains some very interesting information about one dimension of GCMs, ie. spatial coverage.
Researchers resolve the Karakoram glacier anomaly, a cold case of climate science, Phys.org, Oct 22, 2014
Since some of you know a thing or two about models, I would like to ask you a question that I just can't find an answer to. I'm looking for some credible explanation of why 1990 is used as a baseline year to align model projections with measurements. Since misaligning them is one of the favorite contrarian tactics, I would like to have a technical description of why it is wrong. Thank you!
BojanD it would help if you gave an example where a baseline of 1990 was used.
A single year baseline is sometimes used for visualisation of warming rates, but it isn't something you would do to compare model output with observations, instead you would use a baseline period of say thirty years. Different baseline periods are used for largely historical reasons, however if the results depend heavily on the exact baseline you have chosen, then the baseline period is likely to be too short (30 years ought to be about right).
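A synthetic example shows why a single-year baseline misleads (every number here is invented purely for illustration): two series with the identical underlying trend can be made to look offset simply by aligning them on one noisy year, while a 30-year baseline averages the noise away:

```python
import numpy as np

years = np.arange(1981, 2011)                    # 30 years of annual values
trend = 0.02 * (years - years[0])                # identical underlying warming
noise = 0.15 * np.where(years % 2 == 0, 1, -1)   # toy year-to-year variability
obs = trend + noise                              # "observations"
model = trend - noise                            # "model": same trend, wiggles out of phase

# Single-year baseline (1990): the alignment hinges on one noisy year,
# so the two series appear offset by twice the noise amplitude.
i = list(years).index(1990)
offset_single = obs[i] - model[i]

# 30-year baseline: the variability averages out and the offset vanishes,
# correctly showing that both series share the same trend.
offset_30yr = obs.mean() - model.mean()
```

Aligning model and observations on a single warm (or cool) year shifts the entire comparison by that year's noise, which is exactly the contrarian trick the commenter asks about.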
BojanD, I replied to you on the Models Are Unreliable thread.