In Climate clippings 46 I thought the most important segment was the last, on Deep heat. I don’t think it attracted a single comment.
To recap, the world’s oceans have a total mass of 1.37 billion gigatonnes of water; a gigatonne of water occupies one cubic kilometre. The average temperature is, I understand, about 3.5°C, so the capacity for storing energy in the oceans is truly massive.
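Just how massive can be sketched with some back-of-envelope arithmetic. The ocean mass is from the figures above; the specific heat, Earth’s surface area and the planetary energy imbalance are my own assumed round numbers, so treat the output as indicative only:

```python
# Back-of-envelope: how much energy does it take to warm the whole ocean?
# Ocean mass is from the text; the other figures are assumed round numbers.

ocean_mass_kg = 1.37e9 * 1e12        # 1.37 billion gigatonnes; 1 Gt = 1e12 kg
specific_heat = 4000.0               # J/(kg*K), roughly right for seawater
earth_surface_m2 = 5.1e14            # total surface area of the Earth
imbalance_w_m2 = 0.6                 # assumed planetary energy imbalance

joules_per_degree = ocean_mass_kg * specific_heat              # ~5.5e24 J per 1 C
joules_per_year = imbalance_w_m2 * earth_surface_m2 * 3.156e7  # seconds per year

print(f"Energy to warm whole ocean 1 C: {joules_per_degree:.2e} J")
print(f"Current imbalance, per year:    {joules_per_year:.2e} J")
print(f"Years of imbalance per degree:  {joules_per_degree / joules_per_year:.0f}")
```

On those assumptions it would take several centuries of the present imbalance to warm the whole ocean by a single degree, which is why the deep ocean can soak up so much heat while barely changing temperature.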
Around 90% of heat trapped by greenhouse gases ends up in the ocean.
A post by Kevin Trenberth on the earth’s energy balance tells us of a study showing that energy can easily be “buried” in the deep ocean for over a decade.
Skeptical Science now has a post on this study, by Meehl (2011), including this graph showing periods of more than 10 years in which ocean heat content in the top 0-700m is roughly static:
They think heat was sequestered in deeper layers of ocean during “hiatus decades”:
So far so good.
Now Gavin Schmidt at RealClimate has a post, and wonderfully clear it is, but it raises some problems. Joe Romm at Climate Progress quotes Trenberth in emphasising that the heat is not lost in the deep and can come back quite fast to warm us at the surface. Schmidt, by contrast, estimates the impact on the deep temperature at something less than 0.1°C or so, and:
Neither is this heat going to come back out from the deep ocean any time soon (the notion that this heat is the warming that is ‘in the pipeline’ is erroneous).
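A rough sum makes Schmidt’s point plausible. The numbers below are my own illustrative round figures, not taken from any of the papers discussed, but they show how little the deep ocean’s temperature moves even when a decade’s worth of heat is parked there:

```python
# Rough check that burying heat in the deep ocean changes its temperature
# very little. All figures are my own illustrative round numbers.

deep_mass_kg = 1.1e21      # assumed mass of ocean below 700 m (~80% of total)
specific_heat = 4000.0     # J/(kg*K), roughly right for seawater
buried_joules = 5e22       # assumed heat sequestered over a hiatus decade

delta_t = buried_joules / (deep_mass_kg * specific_heat)
print(f"Implied deep-ocean warming: {delta_t:.3f} C")  # a few hundredths of a degree
```

A few hundredths of a degree spread through a vast mass of cold water: comfortably inside Schmidt’s “less than 0.1°C or so”.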
Of relevance here is a long paper by Hansen et al, Earth’s Energy Imbalance and Implications (in press). They say that the deep heat is not well mixed and at least some of it can come back. What really counts is what is called the climate response function. They believe that climate models, including the NASA GISS model, assume a response which is too slow. They come up with this graph:
The phenomenon of decadal variance in ocean heat storage is a natural variation, and the climate system doesn’t always produce the expected result over short timescales, which is what decades are. Hansen et al say that paleoclimate data on what happened since the Last Glacial Maximum indicate that deep ocean heat and surface temperature will reach equilibrium within a millennium. They are confident that 60 to 90% of the equilibrium response within 100 years covers the possible range, and that the mid-range of 75% “is plausible for climate sensitivity 3°C for doubled CO2.”
More importantly, though, Hansen et al do their own calculation of global energy imbalance and come up with different results from those of Trenberth and Fasullo (2010) (Panel A) which identified a “missing energy” problem. Here are the two graphs compared:
In Hansen et al (panel B) there is no missing energy. The graph represents a 6-year moving trend.
It needs to be emphasised that the graphs don’t indicate total heat content, but rather the rate of warming, measured in watts per square metre. So there is no cooling from 2003, merely a slowdown in warming. For the continued increase in ocean heat content, see here.
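The rate-versus-content distinction is worth a toy example. The rates below are invented numbers purely for illustration; the point is that heat content keeps rising even while the rate dips:

```python
# The graphs show a *rate* (W/m^2), so a dip in the rate still means
# accumulation. A toy series of invented annual rates:

rates = [0.9, 0.8, 0.5, 0.4, 0.6]   # invented W/m^2 values, one per year
heat = 0.0
for r in rates:
    heat += r                        # heat content keeps rising...
    print(f"rate {r:+.1f} W/m^2 -> cumulative {heat:.1f} (arbitrary units)")
# ...even while the rate itself falls: a slowdown in warming, not cooling.
```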
The thick red line represents where around 7% of the extra heat ends up, according to Skeptical Science:
So it is easy to see that a small percentage change in ocean uptake can result in a large percentage change in the remainder.
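A quick sketch of that arithmetic, using illustrative round figures (roughly 93% to the ocean, 7% to everything else, in the spirit of the Skeptical Science breakdown):

```python
# A few percentage points' change in ocean uptake translates into a large
# proportional change in what's left for atmosphere, land and ice.
# Figures are illustrative round numbers.

baseline = 1.0 - 0.93                # remainder when the ocean takes 93%
for ocean_share in (0.93, 0.90, 0.87):
    remainder = 1.0 - ocean_share
    change = (remainder - baseline) / baseline * 100
    print(f"ocean {ocean_share:.0%} -> remainder {remainder:.0%} "
          f"({change:+.0f}% vs the 7% baseline)")
```

Drop the ocean’s share by just three percentage points and the heat left for the surface jumps by more than 40%.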
From the posts it may be gathered that Trenberth and Fasullo only considered the heat in the top 700 metres. From reading the text, I take their graph to represent the top 2000 metres. Hansen et al have included the whole ocean, using sources they cite.
The top line in both cases represents the top of atmosphere (TOA) net energy balance.
One of the most concerning aspects is the issue of measurement. The Argo system only measures temperature in the top 2000 metres, whereas the average ocean depth is something like 4000 metres. Apparently Argo measures the top 700m better than the rest.
Hansen et al detail difficulties in other areas. The CERES (Clouds and the Earth’s Radiant Energy System) instrument measuring the TOA energy budget produced a result so implausible that it was calibrated using climate models. But they say it is simply not capable of producing the required accuracy to pick up small variations.
With aerosols the situation is even worse. They are not measured at all. Hansen et al:
We also must quantify the causes of changes of Earth’s energy imbalance. The two dominant causes are changes of greenhouse gases, which are measured very precisely, and changes of atmospheric aerosols. It is remarkable and untenable that the second largest forcing that drives global climate change remains unmeasured. We refer to the direct and indirect effects of human-made aerosols.
The Glory satellite mission would have measured the direct effects of aerosols as well as solar irradiance. Unfortunately the rocket launch failed earlier this year. A replacement will not be available until 2015-2016, but this will not measure the indirect effects of aerosols, the effect of clouds, for example. To do this you would need to make “simultaneous measurements of reflected solar and emitted thermal radiation fields” by looking at the same area at the same time. What’s needed is to implement a mission concept defined by Hansen back in 1992. As he describes it in Storms of My Grandchildren you actually need four co-ordinated instruments. He couldn’t persuade Al Gore and he’s still looking for funding, which is estimated at a mere $100 million.
Hansen has 10 pages of explanation detailing how they brought information together to arrive at their estimates. Suffice it to say that observations from particular studies as well as models, basic science and paleoclimate data are brought to bear. Climate change obscurantists will make much of the uncertainties, but the links between trace gases such as CO2 and other GHGs and such phenomena as surface temperature and sea level rise are quite robust and well-established from the basic science, from the observational record and from paleoclimate data.
Trenberth and Fasullo used two standard measurements and found an anomaly. Hansen et al gave solving the anomaly a red hot go. The measurement difficulties and the uncertainties are perhaps the main point. There is much to be done and we need funding for more measurement.
Gavin Schmidt tells us that Roger Pielke Sr claims that the Meehl paper ‘torpedoed’ the use of the surface temperature anomaly as a useful metric of global warming. Schmidt says no-one has claimed it is the whole story. In logic Pielke is right to emphasise the larger picture, but the Earth’s surface is where we live and grow our tucker. It’s the pointy end of the warming story and it’s where our story will be played out. Moreover, Schmidt points out that if “global warming” doesn’t mean “surface temperatures” then confusion will reign. Perhaps Pielke would be happy with that.
Finally, I want to comment on the surface temperature record of the last two decades. This is HadCRUT and NASA GISS from Skeptical Science:
Two comments. First, HadCRUT and NASA GISS may be parting company. HadCRUT assigns the global average to the polar regions, whereas NASA makes an estimate based on the nearest measuring stations. Warming at the poles is several times greater than warming at the equator. The difference may finally be showing.
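A toy calculation shows why the two methods diverge. The station coverage and warming figures below are invented purely for illustration:

```python
# Toy illustration of the HadCRUT vs GISS difference described above.
# Suppose 95% of the globe is covered by stations and has warmed 0.5 C,
# while the unsampled polar 5% has "really" warmed 1.5 C (polar
# amplification). All figures invented for illustration.

covered_frac, covered_anom = 0.95, 0.5
polar_frac, polar_anom = 0.05, 1.5

hadcrut_style = covered_anom         # poles assigned the covered-area mean
giss_style = covered_frac * covered_anom + polar_frac * polar_anom

print(f"HadCRUT-style global mean: {hadcrut_style:.3f} C")
print(f"GISS-style global mean:    {giss_style:.3f} C")
```

On these made-up numbers the GISS-style estimate runs a tenth of a degree warmer, and the gap grows as polar amplification grows.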
Second, looking at NASA GISS there are two exceptional events in the 1990s which I’d suggest don’t contribute to the underlying trend, the Pinatubo effect and the El Nino of 1998. Take them out and you’ve completely lost the pause in warming.
The Meehl article shows us what can happen, according to the models, when the heat goes deep (see Figures 1 and 2 above). It seems that the heat has indeed been going deep for the last few years. We’ll have to wait and see what happens next, but to truly understand it we will need to measure more than we are currently doing.
That’s how I see it, but this stuff is reasonably complicated for my aged but untutored brain, so I could be wrong.