A world map with climate anomalies during the World Climate Change Conference 2015. Photo: Reuters/Stephane Mahe

This is the last of a five-part series on climate science. Read part 1, part 2, part 3 and part 4.

The just-published book Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters has raised a furor. Authored by Steven Koonin, physicist and former chief scientist of the US Department of Energy, the volume contains a systematic critique of present-day climate science, concluding – among other things – that it provides no basis for the notion that the world faces a “climate emergency.” 

Voices can already be heard attempting to dismiss the book on the grounds that “Koonin is not a climate scientist.” In fact, Koonin was deeply involved, alongside his work on astrophysics, nuclear physics and computational physics, in one of the most important areas of climate science: the study of the radiative balance of the atmosphere.

Here his work concentrated on the Earth’s albedo, or the proportion of incoming solar energy that is reflected back into space. The albedo is a critical parameter for climate modeling.

Koonin has another important qualification: his career-long involvement with computer modeling. Although he has not been directly involved in climate models, Koonin authored one of the standard textbooks on computer modeling of complex physical systems, which puts him in a good position to judge some of the key difficulties and pitfalls of climate modeling.

Another supposed ground for dismissing Koonin – in environmentalist circles at least – is his position as chief scientist at British Petroleum (BP) from 2004 to early 2009. But his main focus there was on the transition to clean energy technology, especially large-scale production of renewable biofuels, and long-term strategies for the “era after oil.”

While at BP Koonin played a key role in establishing the Energy Biosciences Institute (EBI) based at the University of California. Founded with a $500 million grant from BP, at its inception the EBI was one of the largest academic-industrial partnerships in history. From the initial focus on biofuels, the EBI has since expanded to many other scientific and technological areas, among them next-generation batteries and other energy storage systems and sustainable chemicals production.

Far from being a “climate denier” – as attacks attempt to portray him – Koonin has all along been a promoter of technologies for reducing carbon emissions to combat global warming, not least of all during his DOE stint under President Obama.

In that capacity, and in other official positions going back at least to his period as provost of the California Institute of Technology (1995-2004), Koonin has frequently been engaged in evaluations of research programs across a wide range of fields.

That is doubtless one reason why the American Physical Society (APS) asked him in 2013 to lead an update of its official position on climate. It was the results of the “Climate Change Statement Review Workshop,” which he organized in 2014, that convinced Koonin of the need to carry out a no-holds-barred “Red Team” examination of climate research and claims made in its name.

The following final installment of my interview with Koonin deals with climate modeling, and may be a bit more difficult in places for some readers. For more explanation, I highly recommend Chapter 4 of his book, where the ABCs of climate modeling are presented in non-technical language.

To get a basic idea about climate modeling – but without Koonin’s specific critiques – readers might also consult, for example, an article by the French climate expert Jean-Marc Jancovici. The last installment of the interview follows:

Jonathan Tennenbaum: One of the things that has always bothered me is the variety of models and the large discrepancies between their predictions. Are these models based on first principles? Or is there more to them than physics?

Examples of climate model predictions for global warming under the SRES A2 emissions scenario of the IPCC, which assumes no special actions to combat global warming, high total energy use, and a world population growing to 15 billion in 2100. Source: Wikimedia Commons

Steven Koonin: The models are based on physics, but that’s not the whole story. And one of the other parts of the story has to do with sub-grid-scale parameterizations.

JT: What does that mean?

SK: In order to build one of these large computer models, you cut the ocean and the atmosphere up into rectangular boxes. And because the ocean is very big and the atmosphere is very big, you need a couple of hundred million boxes, both going up and down in the atmosphere and ocean and then covering the whole globe horizontally.

Even when you get to 100 million boxes, it turns out that the size of each box in the atmosphere is about one hundred kilometers on a side. If you make it much smaller, you’ve got too much computer work. So you don’t get a description of any phenomena in the climate system that occur on a scale much smaller than one hundred kilometers.

That includes clouds, most importantly, but also the topography and so on. The models have to make assumptions about what happens at those smaller scales – winds, temperature, humidity. And, above all, how many clouds do you have? How opaque are they, how much heat do they absorb? Things of that sort.
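To get a feel for the numbers Koonin quotes, here is a back-of-envelope sketch in Python. The grid spacings and layer counts are illustrative assumptions on my part, not the settings of any particular model:

```python
# Back-of-envelope estimate of grid-box counts. The resolutions and
# layer counts below are illustrative assumptions, not the settings
# of any particular climate model.

EARTH_AREA_KM2 = 5.1e8   # total surface area of the Earth
OCEAN_FRACTION = 0.71    # fraction of the surface covered by ocean

def n_boxes(area_km2, horiz_km, n_layers):
    """Grid boxes for a given horizontal spacing and number of vertical layers."""
    return area_km2 / horiz_km**2 * n_layers

# Atmosphere: ~100 km squares, ~20 vertical layers (assumed)
atmosphere = n_boxes(EARTH_AREA_KM2, 100, 20)
# Ocean: usually gridded more finely, say ~10 km squares, ~30 layers (assumed)
ocean = n_boxes(EARTH_AREA_KM2 * OCEAN_FRACTION, 10, 30)

print(f"atmosphere: ~{atmosphere:.0e} boxes")  # ~1e+06
print(f"ocean:      ~{ocean:.0e} boxes")       # ~1e+08
# Of the order of 10^8 boxes in total -- and anything smaller than a
# box (clouds, convection, topography) must be parameterized.
```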

JT: How sensitive are the models to these factors?

SK: Very sensitive, on the scale that we care about. Remember that we care about 1% effects, because that’s the size of human influences. Particularly we have to care about the clouds in the tropics, where you get the most convection. The warm ocean water evaporates, you get moist air going up. That’s a big source of heat transfer and moisture transfer from the surface into the atmosphere. And it happens on scales that are very small.

JT: So you have to make assumptions about a large number of parameters.

SK: Different models will make different assumptions. And so they get different answers.

JT: In this connection, I want to ask you: what is really meant when climate modelers speak about “tuning” their models?

SK: Whether it’s the convection, or what clouds there are, and so on, you have lots of parameters in there. And sometimes you adjust those parameters from what you thought they were to other values, in order to make the model in equilibrium look the way you think it should.

I’ll give you a concrete example. A very important thing you have to get right is the energy balance of the Earth in equilibrium.

About 240 watts per square meter are absorbed by the planet. That is essentially optical radiation, sunlight, and that has to come back out as heat radiation filtered through the atmosphere. You have to get the outgoing power equal to the ingoing power to within less than half a watt.

If you take them out of the balance by one watt and you run the model for one hundred years, you discover the temperature will suddenly get a lot warmer, or a lot colder than what has been observed. So you’ve got to tune the model in order to get the energy balance just right.
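Why such a small imbalance matters can be illustrated with a toy zero-dimensional energy balance – my own sketch, not a calculation from the book. The heat capacity (roughly a 100-meter ocean mixed layer) and the feedback parameter are assumed values near the middle of published estimates:

```python
# Toy zero-dimensional energy-balance sketch: how far does the surface
# temperature drift in 100 years under a persistent 1 W/m^2 imbalance?
SECONDS_PER_YEAR = 3.156e7
C = 4.2e8        # J m^-2 K^-1, assumed ~100 m ocean mixed layer
LAMBDA = 1.3     # W m^-2 K^-1, assumed net climate feedback parameter
IMBALANCE = 1.0  # W m^-2, the hypothetical tuning error

dt = SECONDS_PER_YEAR / 100.0   # time step of ~3.65 days
T_anom = 0.0                    # temperature anomaly in kelvin

for _ in range(100 * 100):      # integrate over 100 years
    # dT/dt = (imbalance - feedback restoring term) / heat capacity
    T_anom += (IMBALANCE - LAMBDA * T_anom) / C * dt

print(f"drift after 100 years: {T_anom:.2f} K")
# ~0.77 K -- comparable to the entire warming observed since 1900,
# which is why the net balance must be tuned to within ~0.5 W/m^2.
```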

JT: So how do the climate modelers do that? And how do you get to the point that you can evaluate the human influences?

SK: The main way in which the response to human influences is computed in the models is the following (I describe it in the book): You start the model off in some sort of plausible configuration. Then you run it for a thousand years to bring it into equilibrium under conditions with no human influence at all: pre-industrial concentrations, no human-produced aerosols and so on.

So you have got it cooking along at this nice equilibrium state. But in the course of trying to achieve that equilibrium, you might see, “Ah! It’s drifting up, I have got a power imbalance of half a percent or two percent.” And so you’re going to have to adjust some parameters in the model in order to get it to sit nicely at 288 Kelvin (approximately the present value) for the average surface temperature.   

Once you have brought the planet into reasonable equilibrium, where it is not gaining or losing energy and the surface temperature is about right, then you start to impose human influences on it.
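That workflow can be caricatured with a one-layer “grey atmosphere” toy model – again a sketch of the principle, not of a real general circulation model. Here a single effective emissivity stands in for the many tunable parameters:

```python
# "Tuning" caricature: adjust one effective emissivity until the toy
# model's equilibrium surface temperature sits at the observed ~288 K.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_ABS = 240.0     # absorbed solar flux, W m^-2 (global average)

def equilibrium_T(emissivity):
    """Temperature at which outgoing flux equals the absorbed 240 W/m^2."""
    return (S_ABS / (emissivity * SIGMA)) ** 0.25

eps = 1.0                        # start with no greenhouse trapping at all
for _ in range(20):              # simple fixed-point tuning loop
    eps *= (equilibrium_T(eps) / 288.0) ** 4

print(f"tuned emissivity: {eps:.3f}")                # ~0.615
print(f"equilibrium T: {equilibrium_T(eps):.1f} K")  # ~288.0
```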

JT: In discussing human influences on the energy balance of the atmosphere you cite the figure of one percent. That still seems relatively small.

SK: It is 1-2 watts per square meter. In the diagram [below], the greenhouse gases in the upper half of the plot are warming; human aerosols and volcanic eruptions – the spikes – are cooling.

Courtesy Steven Koonin

JT: So a very small change in albedo would already be very large in this context.

SK: That’s right. Now we can look at the spectrum of heat that’s radiated [from the Earth through the atmosphere into space] as a function of wavelength [see figure below], for a clear sky over a desert surface, as just one example. The area under these curves is the cooling power of the heat radiation.

If the cooling power is more, the surface temperature will be lower. The jagged curve labeled zero ppm corresponds to an atmosphere that has no CO2 in it.

Courtesy Steven Koonin

If we now crank CO2 up from zero to 400 parts per million (ppm), which is about what it is today, the dark solid line shows that the cooling power has gone down. That is the greenhouse effect associated with CO2. It decreases the cooling power and hence warms the surface.

If we now increase the CO2 to 800 ppm, we get the dashed line. That difference in area is the additional greenhouse effect from doubling the CO2.

JT: My goodness, it’s almost exactly the same! But now I remember your metaphor in the book about “painting over a black window.” At 400 ppm nearly all of the greenhouse effect from CO2 is already there, so you don’t get much more from doubling the concentration.

SK: Now, this is only a component, the radiative transfer part, of what goes into the big climate models. But it shows you what changes are imposed on the climate models by doubling CO2. And the answer is it is a very, very small effect.
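The scale of this can be cross-checked against the widely used simplified CO2 forcing formula of Myhre et al. (1998) – not a calculation from Koonin’s book, but consistent with the magnitudes he describes:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Approximate radiative forcing (W/m^2) of CO2 relative to c0_ppm,
    using the simplified expression of Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

ABSORBED = 240.0  # W/m^2 of absorbed sunlight, global average

delta = co2_forcing(800.0, 400.0)  # doubling from roughly today's level
print(f"forcing from 400 -> 800 ppm: {delta:.1f} W/m^2")               # ~3.7
print(f"as a share of the 240 W/m^2 balance: {delta / ABSORBED:.1%}")  # ~1.5%
```

A doubling thus perturbs the energy balance by only about one and a half percent – the order of the “1% effects” Koonin mentions above.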

JT: Summing up in simple language: assessing the human contribution to global warming involves very delicate effects, and the climate modelers do a great deal of playing around with parameters before they make their projections. 

SK: Yes. People have the impression that this is a science. It has some science in it, certainly, but there is a good bit of art. The main thing is we don’t have a really good way of knowing which model, if any, is right.

Jonathan Tennenbaum received his PhD in mathematics from the University of California in 1973 at age 22. Also an author, linguist and pianist, he is a former editor of FUSION magazine. He lives in Berlin and travels frequently to Asia and elsewhere, consulting on economics, science and technology.