28 November 2012

Against "Modern Energy Access"

Access to energy is one of the big global issues that has hovered around the fringes of international policy discussions, such as the Millennium Development Goals or climate policy, but which has been getting more attention in recent years. In my frequent lectures on climate policy I point out that 1.3 billion people worldwide lack any access to electricity and 2.6 billion more cook with wood, charcoal, tree leaves, crop residues and animal waste (an additional 400 million cook with coal).

The "success" scenarios of climate advocates hoping to power the world with carbon-free energy almost always leave a billion or more people in the dark and several billion cooking with dirty fuels. Sometimes, magic is invoked to suggest that "electricity can be brought to everyone" without appreciably increasing carbon emissions. Of course, if we could bring electricity to the 1.3 billion without any access with no effect on emissions, then we could probably do it for 6 billion others.

There is a devil in the details that helps keep the energy poor out of view while we debate issues important to rich people, like climate change. That devil is the very definition of "energy access." The International Energy Agency explains some of the difficulties in defining energy access and gives its definition as follows:
There is no single internationally-accepted and internationally-adopted definition of modern energy access. For our energy access projections to 2030, the World Energy Outlook (WEO) defines modern energy access as “a household having reliable and affordable access to clean cooking facilities, a first connection to electricity and then an increasing level of electricity consumption over time to reach the regional average”. By defining it at the household level, it is recognised that some other categories are excluded, such as electricity access to businesses and public buildings that are crucial to economic and social development, i.e. schools and hospitals.

Access to electricity involves more than a first supply connection to the household; our definition of access also involves consumption of a specified minimum level of electricity, the amount varies based on whether the household is in a rural or an urban area. The initial threshold level of electricity consumption for rural households is assumed to be 250 kilowatt-hours (kWh) per year and for urban households it is 500 kWh per year. The higher consumption assumed in urban areas reflects specific urban consumption patterns. Both are calculated based on an assumption of five people per household. In rural areas, this level of consumption could, for example, provide for the use of a floor fan, a mobile telephone and two compact fluorescent light bulbs for about five hours per day. In urban areas, consumption might also include an efficient refrigerator, a second mobile telephone per household and another appliance, such as a small television or a computer.
I have found that when you start talking in terms of "kilowatt-hours per year," people's eyes glaze over. And when I am lecturing about "energy access," students might look up from their smartphone, tablet or laptop to register a look of understanding: "Energy access -- yeah, I have that, gotcha."

Actually I want to tell them, you have wayyyyy more than that. To better explain this issue I have put together the following graph.
When "energy access" is used by organizations like the IEA, they mean something very different than what you, I or my students might take the term to mean in common parlance. (And note, this is no critique of the IEA, they have done excellent work on energy access issues.) The graph above provides a comparison of the 500 kWh per year household threshold for "energy access" used by the IEA to a comparable number for the United States (both numbers are expressed in per capita terms, so 100 kWh per person from IEA and data on US household electricity consumption here and people per household here).

A goal to secure 1.3 billion people access to 2.2% of the electricity that the average American uses might be characterized as an initial step toward more ambitious goals, but it is not a stopping point (and again, the IEA recognizes that energy access is a process, but this gets lost in broader discussions).

We do not label those who live on $1 per day as having "economic access" -- rather, they are desperately poor, living in extreme poverty. Everyone understands that $1 a day is not much. Very few people get that 100 kWh per year is a pitifully small amount of energy. Therefore, I suggest that we start talking in terms of "energy poverty" measured as a percentage of the average American (or European or Japanese or Australian or whatever energy-rich context you'd prefer as a baseline; the results will be qualitatively the same). To use the IEA numbers, one would be in "energy poverty" with access to less than 2% of the energy access enjoyed by those in the rich world.

It is bad enough that the energy poor are largely ignored in our rich world debates over issues like climate change. It is perhaps even worse that our "success stories" often mean creating scenarios where the energy poor attain just 2% of the access to energy that we enjoy on a daily basis. The frustrating irony of course is that the issues that rich world environmentalists most seem to care about might be best addressed by putting energy poverty first, but that is a subject for another time.

27 November 2012

US Hurricane Intensity 1900-2012

The figure above comes courtesy of Chris Landsea of the US National Hurricane Center. It shows the annual intensity of US landfalling hurricanes from 1900 to 2012. The figure updates a graph first published in Nature in 2005 (Figure 2 here in PDF; details described there).

The red bars show the annual data. The grey straight line is the linear trend (there is none) and the black line shows the five-year average. The most recent five years have the lowest landfalling hurricane intensity of any five-year period back to 1900. By contrast, 2004 and 2005 saw the most intense seasons of landfalling storms.
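For readers who want to reproduce this kind of summary with their own series, here is a minimal sketch of how the two overlays are typically computed -- an ordinary least-squares trend and a five-year average. The intensity values below are random placeholders, not the data behind the figure.

```python
import numpy as np

# Placeholder annual intensity values -- NOT the actual landfall intensity data.
years = np.arange(1900, 2013)
intensity = np.random.default_rng(0).gamma(shape=2.0, scale=50.0, size=years.size)

# Ordinary least-squares linear trend (slope in intensity units per year).
slope, intercept = np.polyfit(years, intensity, deg=1)

# Trailing five-year average, one simple way to smooth the annual values.
five_year_avg = np.convolve(intensity, np.ones(5) / 5, mode="valid")

print(f"Linear trend: {slope:+.3f} intensity units per year")
print(f"Most recent five-year average: {five_year_avg[-1]:.1f}")
```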

The data shown above include both hurricanes and post-tropical cyclones which made landfall at hurricane strength (i.e., storms like Sandy). In addition to Sandy, there have been three other such storms to make landfall, in 1904, 1924 and 1925. The addition of these storms does not have a significant impact on the graph.

26 November 2012

Inequity Among Nations in the Climate Negotiations: A Guest Post

Editor's Note: This is a guest post by Heike Schroeder, University of East Anglia, and Max Boykoff, University of Colorado, who along with Laura Spiers of PwC have co-authored a new piece in Nature Climate Change on the international climate negotiations (available here in PDF). Please feel free to comment on their paper or the climate negotiations more generally, as this is likely to be the only post here on them. Thanks!

Another round of climate negotiations is starting today. On the agenda are two main objectives: the implementation of a second commitment period under the Kyoto Protocol to start right away – on 1 January 2013 – and to make progress toward a new climate agreement to be finalised by 2015. Issues to be discussed include, among others, adaptation finance, strengthening mitigation efforts by developed countries and reducing deforestation.

While it may be viewed as good news that the Kyoto Protocol is moving into a new phase, only the EU countries, Australia and likely Norway and Switzerland will take part in this second commitment period, covering only some 10-12 percent of global emissions. Thus, Kyoto raises the age-old conundrum of whether to focus on a few willing countries to lead, even if their efforts are wiped out by massive emission rises elsewhere, or to wait until a critical mass of countries is ready to mitigate seriously.

Our study in the current issue of Nature Climate Change (PDF) looks into embedded questions of who represents the interests of a global populace, by way of considering who attends and participates in climate negotiations. Based on our results, we argue that a restructuring of UN rules and practices around state representation at UN climate conferences is urgently needed. Current practice, which gives countries a free hand to send as many delegates as they wish to the COPs, representing mainly vested national interests, results in serious differences in negotiating power between rich and poor countries. Overall participation increased from 757 individuals representing 170 countries at the first Conference of the Parties (COP) in 1995 in Berlin to an all-time high of 10,591 individuals from 194 countries at COP-15 in 2009 in Copenhagen (a 14-fold increase).

Because there are so many parallel negotiating tracks and so much technical detail, small delegations cannot participate in every session while larger delegations can. We also find significant differences in delegation composition across countries. Moving forward, we recommend that countries consider capping national delegations at a level that allows broad representation across government departments and sectors of society while maintaining a manageable overall size. We also argue for a stronger role for constituencies in the UNFCCC (e.g. business, environmental non-governmental organizations, local government, indigenous peoples, youth and so on). Finally, formal and informal arenas – negotiations and side events on specific topics at COPs, for example adaptation finance or addressing drivers of deforestation – could be joined up in innovative ways to facilitate exchange of ideas and foster dialogue among various stakeholders.

21 November 2012

Science Academies and the L'Aquila Earthquake Trial

The science academies of the US and UK have responded very differently than several of their European counterparts to the recent verdict in an Italian court against government scientists involved in the L'Aquila affair. The French, German and Italian academies have adopted a much more sophisticated -- and ultimately more constructive -- approach to understanding the implications of the lawsuit for the practice of science advice in government. This contrasts with the ill-informed snap judgement offered by the US and UK academies. This post provides some details on the different approaches.

The US National Academy of Sciences and the UK Royal Society were quick to criticize the Italian court verdict in somewhat hyperbolic terms. Here is the statement in full:
Oct. 25, 2012
Joint Statement Regarding the Recent Conviction of Italian Earthquake Scientists
by Ralph J. Cicerone, President, U.S. National Academy of Sciences, and Sir Paul Nurse, President, The Royal Society (U.K.)

The case of six Italian scientists sentenced to be jailed for failing to warn of the L'Aquila earthquake in Italy in 2009 highlights the difficult task facing scientists in dealing with risk communication and uncertainty.

We deal with risks and uncertainty all the time in our daily lives. Weather forecasts do not come with guarantees and despite the death tolls on our roads we continue to use bikes, cars, and buses. We have also long built our homes and workplaces in areas known to have a history of earthquakes, floods, or volcanic activity.

Much as society and governments would like science to provide simple, clear-cut answers to the problems that we face, it is not always possible. Scientists can, however, gather all the available evidence and offer an analysis of the evidence in light of what they do know. The sensible course is to turn to expert scientists who can provide evidence and advice to the best of their knowledge. They will sometimes be wrong, but we must not allow the desire for perfection to be the enemy of good.

That is why we must protest the verdict in Italy. If it becomes a precedent in law, it could lead to a situation in which scientists will be afraid to give expert opinion for fear of prosecution or reprisal. Much government policy and many societal choices rely on good scientific advice and so we must cultivate an environment that allows scientists to contribute what they reasonably can, without being held responsible for forecasts or judgments that they cannot make with confidence.
As I explained two days before the statement above, the idea that the scientists were being punished for a failure to predict did not reflect the actual complexities of the case.

Fortunately, the Italian, German and French science academies have taken a more measured look at this situation. The Italian Academy has set up a commission to examine the issues raised by the L'Aquila lawsuit, and the French and German academies offered the following statement in support of the Italian commission.

Here is the full statement from the French and German academies, issued last week:
Statement on the handling of risk situations by scientists

In late October, Italian scientists have been sentenced for supposedly not having warned sufficiently against the severe earthquake of L'Aquila 2009. On occasion of this verdict, the German National Academy of Sciences Leopoldina and the French Académie des sciences publish a statement concerning the handling of risks situations by scientists. We forward the statement in the exact wording.

Joint Statement of the German National Academy of Sciences Leopoldina and the French Académie des sciences, 12 November 2012

On the science-based communication of risks following the recent sentencing of Italian scientists

On 22 October 2012, a court in L'Aquila sentenced seven members of the Italian National Commission for the Forecast and Prevention of Major Risks to prison terms of several years. The verdict has sparked a worldwide discussion on the legal aspects of the accountability of scientists who advise government institutions. Scientists must participate in this discussion actively and as objectively as possible. The German National Academy of Sciences Leopoldina and the French Académie des sciences therefore expressly support the Accademia Nazionale dei Lincei, the Italian National Academy of Sciences, in its endeavours to set up an independent expert commission of geologists and legal experts. The role of this commission will be to examine the scientific and legal aspects of the L'Aquila verdict.

Scientific research is substantially motivated by the aim of providing greater protection against natural disasters. In the case of uncontrollable events such as cyclones, earthquakes and volcanic eruptions, scientific forecasting methods are becoming increasingly important. Scientists and representatives of state institutions must work together with mutual trust in order to inform the public responsibly, and on the basis of reliable data, about possible risks.

In their risk forecasts, scientists assess the probabilities of future events. Probability-based statements are per se fraught with uncertainty. At all times, scientists must communicate this fundamental fact as clearly as possible. This is no easy task when it involves communicating with public-sector decision-makers and concerned members of the public who expect clear forecasts. However, scientists cannot – and should not – absolve themselves of this responsibility.

It is very unfortunate when the trust between scientists, state institutions and the affected members of the public is profoundly damaged. This occurred as a result of the devastating earthquake in L'Aquila on 6 April 2009.

It is thus in the interests of all those involved that the events are reconstructed comprehensively, precisely and objectively. Only in this way is it possible to evaluate on a reliable basis whether the persons involved performed their duties appropriately in the situation in question.

The scientific community must also take an active part in the necessary examination process from the start. The decision of the Accademia Nazionale dei Lincei to set up an independent expert commission to examine the L'Aquila verdict is a clear and decisive signal in this regard.
###
It is not too late for the National Academy of Sciences and Royal Society to join the German and French academies in offering support for the Italian commission, and to correct their earlier misinterpretation of the L'Aquila lawsuit. There are difficult and complex issues involved in this case, and scientists everywhere will benefit from the drawing of lessons.

20 November 2012

Anne Glover on EU Science Policy

Today, I had the pleasure to meet Anne Glover, Chief Scientific Advisor to the European Union, in Berlin at an interesting science policy workshop organized by the German Federal Institute for Risk Assessment. Like just about every science advisor to governments that I have met, she is an impressive individual. 

Here are a few comments that she made in oral testimony to Parliament in the UK a few weeks ago:
I started as CSA and was the first person to take up that post in the European Commission in January of this year. I will finish at the end of 2014; so I have three years. I will start off in a slightly light-hearted way. I would say that in the first week or two at the European Commission I set myself the target that at the end of two weeks I would understand how the Commission worked. I now realise that if I can understand part of it by the end of 2014 I’ll be very lucky. There is a lot involved in understanding procedure and how the Commission and Parliament works, and that, in itself, has an impact on what I hope to achieve.

The one single thing that I think would be very important to achieve is how people regard evidence and policy making. For me, that is absolutely central. I would like to develop that a little bit more. From my point of view, science has an obligation to generate the knowledge and the evidence that can be used in policy making. That should be the fundamental platform on which policy is built. That is just as appropriate for every member state as it is for the European Commission.

At the moment, although the policy making process in the European Commission is very robust-if I look at how it is structured, how evidence is gathered and how impact is assessed, it is very impressive-when it gets to the stage where individual member states look at it and Parliament addresses it, the evidence is often unpicked and bits of it are removed in order to find consensus around a particular policy. Although that is part of the democratic process and so I think and expect that that would happen, there is not a great deal of transparency around why the evidence is not being followed.

At the end of 2014 I would like there to be an understanding that, if the evidence is not adhered to in policy making, there would be a statement to say that we accept the evidence, that it is robust and that the evidence is true, but for various reasons we are reducing our reliance on evidence; and that could be social, economic, ethical or whatever. We need that transparency and also accountability so that, if people vote against something where clearly the evidence supports it, there should be a degree of accountability there, and then, for me, we would be in a much better place. At the moment, I think, sometimes evidence is disregarded in policy and, quite rightly, citizens would feel that there is something wrong with the evidence then, and that is not the case in many instances. For me, that is a very important thing.

The second thing would be to try and raise more awareness across Europe about just how impressive the knowledge is that we generate in Europe. In my mind it is really second to none. If you look at the impact of the knowledge that we generate, the infrastructures that we have and the things that we can do as a European Union that no individual member state or indeed any other nation outside Europe could deliver-I am thinking there of things such as the Large Hadron Collider at CERN or the European Fusion for Energy project, for example, with the European Space Agency-they are all examples of where Europe absolutely excels. I would feel that we were in a much better position if citizens understood that and also could appreciate that science is culture. It is not accessible enough and we don’t celebrate it enough. I would like every one of us to be less modest about our achievement in science, engineering and technology in Europe because it is one thing we can truly shout about, claim we are the best and actually be the best.
 My views on the strengths and limitations of any science advisor to governments can be found here.

16 November 2012

The End of Economic Growth?


Listen above to a debate I participated in, airing on the CBC tomorrow. Comments welcomed. Here is how Brent Bambury introduces the program:
In the Deep Sixed series, we examine aspects of life we take for granted today that might not survive tomorrow. This week: economic growth. We consider the future of growth with help from Matthew Lazin-Ryder, host of CBC radio's The Invisible Hand, Jeff Rubin, former chief economist for CIBC World Markets and author of The End of Growth, and Roger Pielke Jr, Professor of Environmental Studies at the University of Colorado at Boulder, and author of The Climate Fix. To vote on the future of growth, visit the Deep Sixed page

15 November 2012

Little Change in Drought Over 60 Years

UPDATE: Courtesy @jfleck a pointer to another new paper from Hoerling et al. in JOC: "We conclude that projections of acute and chronic PDSI decline in the 21st Century are likely an exaggerated indicator for future Great Plains drought severity."

A new paper out in the current issue of Nature finds little evidence to support claims that drought has increased globally over the past 60 years. The authors write:
Drought is expected to increase in frequency and severity in the future as a result of climate change, mainly as a consequence of decreases in regional precipitation but also because of increasing evaporation driven by global warming1–3. Previous assessments of historic changes in drought over the late twentieth and early twenty-first centuries indicate that this may already be happening globally. In particular, calculations of the Palmer Drought Severity Index (PDSI) show a decrease in moisture globally since the 1970s with a commensurate increase in the area in drought that is attributed, in part, to global warming4,5. The simplicity of the PDSI, which is calculated from a simple water-balance model forced by monthly precipitation and temperature data, makes it an attractive tool in large-scale drought assessments, but may give biased results in the context of climate change6. Here we show that the previously reported increase in global drought is overestimated because the PDSI uses a simplified model of potential evaporation7 that responds only to changes in temperature and thus responds incorrectly to global warming in recent decades. More realistic calculations, based on the underlying physical principles8 that take into account changes in available energy, humidity and wind speed, suggest that there has been little change in drought over the past 60 years.
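The crux of the methodological argument is the model of potential evaporation. The classic PDSI uses a temperature-only (Thornthwaite-type) formulation, whereas the paper's "more realistic calculations" use a Penman-Monteith-type formulation that also responds to available energy, humidity and wind. As a rough illustration only, here is the standard FAO-56 Penman-Monteith reference form -- a common variant, not necessarily the exact formulation used in the paper:

```python
import math

def fao56_penman_monteith(t_mean_c, net_radiation, soil_heat_flux,
                          wind_speed_2m, actual_vapour_pressure,
                          psychrometric_const=0.066):
    """Daily reference evapotranspiration (mm/day), FAO-56 Penman-Monteith.

    t_mean_c: mean air temperature (deg C)
    net_radiation, soil_heat_flux: MJ m-2 day-1
    wind_speed_2m: wind speed at 2 m height (m/s)
    actual_vapour_pressure: kPa
    psychrometric_const: kPa per deg C (roughly 0.066 near sea level)
    """
    # Saturation vapour pressure (kPa) and the slope of its curve (kPa per deg C).
    es = 0.6108 * math.exp(17.27 * t_mean_c / (t_mean_c + 237.3))
    delta = 4098.0 * es / (t_mean_c + 237.3) ** 2

    numerator = (0.408 * delta * (net_radiation - soil_heat_flux)
                 + psychrometric_const * (900.0 / (t_mean_c + 273.0))
                 * wind_speed_2m * (es - actual_vapour_pressure))
    denominator = delta + psychrometric_const * (1.0 + 0.34 * wind_speed_2m)
    return numerator / denominator

# Same temperature, different radiation/wind/humidity -> different evaporative
# demand, which a temperature-only index like the classic PDSI cannot distinguish.
print(fao56_penman_monteith(25.0, 15.0, 0.0, 2.0, 1.8))
print(fao56_penman_monteith(25.0, 12.0, 0.0, 4.0, 1.2))
```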
What does this mean?

For one, it means that a widely accepted and oft-repeated consensus position expressed in the 2007 IPCC AR4 now appears to have been incorrect. This should not be unexpected, as a consensus position is a snapshot of perspectives, and in science, perspectives can change based on new evidence and study. The IPCC SREX, published earlier this year, had already stepped back from the conclusions of the IPCC AR4.

A second important conclusion from this paper is that we simply don't know whether drought has become worse over the past 60 years. This places drought into a category with tropical cyclones, floods, tornadoes and other phenomena where the evidence does not support claims that things are progressively getting worse -- with more frequent and intense extreme events on climate time scales. Once again the lesson is that if you are looking for a signal of human-caused climate change, it is best not to look at such extremes.

Finally, the paper leads to questions about predictions of future changes in drought. A companion essay in Nature explained:
[The] findings imply that there is no necessary correlation between temperature changes and long-term drought variations, which should warn us against using any simplifications regarding their relationship.
All this said, human-caused climate change remains a reality. However, what is also a reality is that there is very little evidence to support claims that the influence of such changes can be observed in the observational record of extreme events. Advocates who justify action on climate change by appeals to the latest extreme event go well beyond what science can support, and in the process undercut the very cause that they are advocating for.

Roger Pielke Sr. Retires His Blog

Congrats to my dad, who has retired from blogging after more than 7 years. I expect we'll see him comment here and elsewhere as the occasion arises.

I know that his steady efforts to discuss science and challenge claims have been widely appreciated. Please feel free to offer any comments for him here!

14 November 2012

President Obama and the Iron Law of Climate Change

You could not ask for a clearer expression of the iron law of climate change and its implications than that which was given by President Obama today in a press conference (emphasis added):
There’s no doubt that for us to take on climate change in a serious way would involve making some tough political choices, and you know, understandably, I think the American people right now have been so focused and will continue to be focused on our economy and jobs and growth that, you know, if the message is somehow we’re going to ignore jobs and growth simply to address climate change, I don’t think anybody’s going to go for that.

I won’t go for that.


If, on the other hand, we can shape an agenda that says we can create jobs, advance growth and make a serious dent in climate change and be an international leader, I think that’s something that the American people would support.
Well said.

13 November 2012

Ozone Histories of Convenience: Grundmann on Sunstein

NOTE: This is a guest post by Reiner Grundmann, a professor of Science & Technology Studies at the University of Nottingham.

Last weekend the eminent legal scholar Cass Sunstein commented in the New York Times expressing his optimism that the new Obama administration will finally embark on a policy leading to climate change mitigation.

He draws a parallel to a previous global challenge, ozone depletion. The Montreal Protocol for the protection of the ozone layer, signed in September 1987, is commonly regarded as the only successful international treaty in matters regarding global environmental risks. Sunstein thinks there is an important lesson to be learnt. The lesson is the application of cost-benefit analysis (CBA). Ronald Reagan, who was at the helm at the time of the treaty negotiations, is the unlikely hero in this story. Sunstein writes:
“The Reagan administration was generally skeptical about costly environmental rules, but with respect to protection of the ozone layer, Reagan was an environmentalist hero. Under his leadership, the United States became the prime mover behind the Montreal Protocol, which required the phasing out of ozone-depleting chemicals.”
We are told that Reagan (like Obama!) embraced CBA. Because the trick worked the first time round, we should expect similar success this time, or so we are made to believe.

Sunstein correctly points out that the US, and other countries as well, would be well advised to explore energy efficiency gains for economic reasons alone. This makes perfect sense and no one in their right mind should object to these aims (albeit Sunstein is silent on necessary long term climate policies). However, it gets a little more complicated than this. In effect Sunstein is trying to rewrite the history of ozone politics, reducing the chain of events that led to the successful reduction of CFCs to an imaginary application of the principle of CBA. Below I will show that this is not borne out by the facts. But before I do so, I should make clear that there are two crucial issues involved here: the problem of international agreement, and the problem of its implementation.

Montreal was – by and large – successful on both fronts. As we know, Kyoto is failing on both. One problem is the non-participation of important countries, like China, India and the United States. The other is the problem of implementation. Even countries in the Kyoto club do not succeed in cutting their emissions (they can only claim modest success because of shifting production overseas). Sunstein has an immediate interest in the prospects for a more active role of the newly elected US government, and there is nothing wrong with that. It gets problematic when he invokes a historical precedent in order to nudge Obama towards a policy which seemed to have worked in the past. This imaginary precedent is CBA. The only problem is that it didn't play a decisive role in ozone politics. It was not the reason for the US driving an ambitious treaty in Montreal.

Here are the historical facts. I am not referring only to my own research, which you can read here. The widely accepted 'official' story of the ozone negotiations has been provided by Richard E. Benedick in his book Ozone Diplomacy. He lays out in great detail how a coalition of different actors from within atmospheric sciences, EPA, NASA, NOAA, UNEP and the State Department was able to advance an ambitious agenda for international controls of ozone-depleting substances, despite resistance from the White House. We must not forget that in 1977 the amendment of the Clean Air Act had regulated 'non-essential use' of CFCs without scientific evidence (of lower ozone concentrations, higher UV radiation, or actual harm). The Republicans in government did not want to see this policy repeated.

In its eagerness to prevent further CFC regulations, the Reagan government applied tactics that backfired. Benedick provides many historical details about these mistakes (I recommend reading especially chapter 5 of his book, called 'Forging the US position'). If anything, Montreal was achieved not because of Reagan's support, but despite his long-time resistance. An advocacy coalition in favor of stringent CFC controls eventually prevailed in the US, and internationally. Many historical contingencies played a role, such as the US CFC manufacturers trying to level the playing field with their European competitors (after all, there was nothing comparable to the Clean Air Act in Europe). Another contingency was the fact that the EU gave up its resistance to an international treaty after German Greens were elected to parliament for the first time. By the way, few of these contingencies had much to do with getting the science settled, still a prevailing myth in ozone history (propagated by Benedick himself and Mostafa Tolba). For example, the UK, a hardliner against regulations until after Montreal, was simply outmaneuvered at the EU level (quoting uncertain science as a reason for its opposition). It came to accept the fait accompli after the 'greening of Margaret Thatcher'.

No matter what led eventually to the agreement in Montreal, the phasing out of the problematic gases was not such a big deal as had been claimed by industry. Process and product substitutes became quickly available and the initially ambitious-looking targets could be over-fulfilled within years. No such prospects lie in wait with climate policy. Even if the international community were somehow to agree to ambitious mitigation targets, these would be just a piece of paper, a dead treaty. At present, there are no prospects for the successful implementation of ambitious mitigation targets, as Sunstein acknowledges.

We know of one famous attempt to apply CBA to climate change. This is Lord Stern's report, which tried to make the case for climate mitigation to the UK Chancellor of the Exchequer, arguing that it pays to pay for prevention. After some years of official endorsement, this approach seems to be losing its political credibility (having lost its academic credibility some time ago, some would say from the very beginning).

All of this is no argument in principle against CBA. And it is no argument against increasing energy efficiency everywhere, on the contrary—such efforts are good and should be seen as 'no-brainers'. All I am arguing is that we should stick to the known historical facts when claiming 'to learn from past success'. If the story does not stack up, it is not a good starting point.

Sunstein tries to appeal to Republicans, saying “Look, your great former president did this with ozone, not based on crazy environmental principles, but on the basis of sober analysis of costs and benefits.” Again the political aim of building bridges is commendable. But the historical foundations are not sound.

08 November 2012

I Am "Roughly" 18 Feet Tall: A Critique of Grinsted et al. 2012

UPDATE 18 March 2013: Today Grinsted et al. have another paper out in PNAS in which they follow up the one discussed below. They make the fantabulous prediction of a Katrina every other year. They say in the new paper:
[W]e have previously demonstrated that the most extreme surge index events can predominantly be attributed to large landfalling hurricanes, and that they are linked to hurricane damage (20). We therefore interpret the surge index as primarily a measure of hurricane surge threat, although we note that other types of extreme weather also generate surges such as hybrid storms and severe winter storms. . .
As I showed in this post, which Grinsted commented on, the surge record does not accurately reflect hurricane incidence or damage. Another poor showing for PNAS in climate science.

Last month the Proceedings of the National Academy of Sciences published a paper by Grinsted et al. titled, “Homogeneous record of Atlantic hurricane surge threat since 1923.” In what follows I provide a critique of their paper and offer my argument for why it does not actually tell us much about hurricanes, much less about damage.

The paper looked at 6 tide gauge stations along the US Gulf and Atlantic coasts to develop an annual index of storm surges in the United States. The paper explains why this is important:
[F]rom the economic damage perspective the hurricanes that remain far away from shore in the Atlantic are much less important than those closer to land. Hence in constructing an unbiased record of storms we need to ask what we want to measure. The strong winds and intense low pressure associated with tropical cyclones generate storm surges. These storm surges are the most harmful aspect of tropical cyclones in the current climate (1, 12), and wherever tropical cyclones prevail they are the primary cause of storm surges. A measure of storm surge intensity would therefore be a good candidate measure of cyclone potential impact.
My attention was drawn to the paper because unlike other studies and data, which have found no trends in US landfalling hurricane numbers or intensities, Grinsted et al. do find a trend. They write:
We have constructed a homogeneous surge index on the basis of instrumental records from six long tide-gauge records. We demonstrate that the surge index correlates with other measures of Atlantic cyclone activity and that it responds in particular to major landfalling cyclones. The surge index can be used to identify and estimate potential remaining biases in other records of cyclone activity.

We detect a statistically significant increasing trend in the number of moderately large surge index events since 1923.
They also compare their surge index with our record of normalized damage (Pielke et al. 2008). Because their dataset has an upward trend and the normalized loss dataset has no trend, they conclude that our dataset is “suspect.”

I contacted Dr. Grinsted, who has been extremely responsive in providing data and exchanging views, for which I thank him. He and his co-authors (along with Kerry Emanuel, who edited the paper for PNAS) were provided a chance to offer comments on a first draft of this blog post -- they have not done so as yet, though the offer will remain open.

The first thing I noticed about their paper is that their surge dataset contains 465 surge events from 1923 to 2008 (July through November) yet over that same time period there were only 147 landfalling hurricanes (data from NOAA). You can see a comparison of the two datasets in the following graph.

Clearly, there has been no trend in hurricane events, yet there has been an increase in surges. I am not sure what this means, but logically it does seem pretty obvious that there have not been more hurricane-related surges, since there have not been more landfalling hurricanes.

However, the paper says something different:
To estimate the trend in landfalling storm counts, we count the number of large surge events greater than 10 units in 1 y, which is roughly equivalent to hurricane categories 0–5. This threshold was chosen as a compromise between looking at large events and having sufficiently many events to obtain robust statistics. Since 1923 the average number of events crossing this threshold has been 5.4/y …
The actual number of hurricanes to make landfall averaged 1.7 per year over 1923-2008. So to claim that their selection threshold is “roughly equivalent” to hurricane landfalls is to provide a very generous interpretation of the term “roughly.” It is like me saying that I am "roughly" 18 feet tall (5.4 is to 1.7 about as 18 feet is to my actual height).

The lack of precision in event specification was a point that Dr. Grinsted has admitted is a weakness of the study, as discussed in a news article which covered the paper:
There’s one obvious caveat about the new results: not every hurricane creates a storm surge, since they don’t always hit land. And not every storm surge is caused by a hurricane. “The storm surge index,” Grinsted said, “is sensitive to strong winter storms as well.” And it’s quite possible, he said, that the intensity of a given storm surge could be made greater or less by the angle at which a hurricane hits land.

Surges aren’t, in short, a perfect stand-in for hurricanes, but Grinsted said that they’re pretty good. In cases where they could do so, the team has lined up hurricane data with surge data, and, he said, “there are clear correlations. So while our paper might not explain everything, it is still useful."
I reject the notion that the 465 surges used in the paper are a "pretty good" stand-in for 147 hurricanes, and the data supports such a rejection. That this claim made it through the PNAS review process does not help this journal's reputation with respect to the quality of recent papers on climate. But I digress.

To address the discrepancy between surges and hurricanes, during our exchange Dr. Grinsted performed an additional analysis that did not appear in the paper: looking only at the top 150 events in the database, to explore whether this subset of the dataset would better capture landfalling hurricanes. This data is shown in the following graph.

You can see that the scale is obviously more appropriate and the trend is reduced (it is significant at the 0.1 level but not at 0.05), but the datasets still do not match up well. The correlation between the 465 events and the 147 hurricanes over 1923 to 2008 is 0.54, but when the surge dataset is reduced to the top 150 events the correlation with the 147 hurricanes actually drops to 0.49. This means that hurricane events explain only about 25% of the variance in the surges, telling us that there is a lot more going on in this database than just hurricane-related surges.
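For anyone wanting to replicate this sort of comparison, here is a minimal sketch of the correlation and variance-explained calculation. The two annual count series below are made up for illustration; they are not the Grinsted et al. surge index or the NOAA landfall record.

```python
import numpy as np

# Made-up annual counts for illustration only.
surge_events = np.array([3, 7, 5, 9, 4, 6, 8, 2, 5, 7])
hurricane_landfalls = np.array([1, 3, 2, 2, 1, 2, 3, 0, 1, 2])

r = np.corrcoef(surge_events, hurricane_landfalls)[0, 1]
print(f"Pearson correlation r = {r:.2f}")
print(f"Variance explained r^2 = {r**2:.0%}")
```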

The situation with damage is similar -- most damage is caused by a further subset of the 147 hurricanes, far removed from the 465 surge events, the vast majority of which are not associated with any damage. However, the top 15 years in terms of surge events from Grinsted et al. account for 46% of the normalized damage from 1923-2008. So there is some valuable information in the surge dataset at the most extreme end of the scale.

It is very important to note that the median date of these top 15 years is 1969, almost exactly the mid-point of the period examined by Grinsted et al. (1923 to 2008), which actually supports the finding of no trend in the normalized loss dataset. Thus, a closer reading of the data presented in Grinsted et al. finds that the normalized loss dataset, rather than being “suspect,” is actually pretty robust.
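As a simple robustness check of the kind described above, one can take the largest surge seasons and ask where their median year falls; if extreme seasons were becoming more common, the median would be pushed toward the end of the record. A minimal sketch with made-up counts (not the actual index):

```python
import numpy as np

# Made-up annual surge-event counts for 1923-2008, for illustration only.
years = np.arange(1923, 2009)
counts = np.random.default_rng(1).poisson(lam=5.4, size=years.size)

top15_years = years[np.argsort(counts)[-15:]]
print(f"Median year of the 15 largest surge seasons: {np.median(top15_years):.0f}")
```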

To summarize, Grinsted et al. have created a dataset of storm surges which they have sought to associate with landfalling hurricanes and then further link to hurricane damage. Unfortunately, the connection between the surge dataset and hurricanes is, in their words, “rough,” and, as shown here, tenuous. A further linkage to damage doesn't stand up. However, a closer look at the most extreme years in the surge dataset and their relation to normalized losses does find value. Here the surge dataset actually helps to confirm the finding of no trend in normalized losses.

Thanks again to Dr. Grinsted for engaging.

UPDATE: After writing this post I have been pointed to similar criticisms of Grinsted et al. by Tom Knutson and Gabe Vecchi of NOAA at DotEarth.

06 November 2012

The Real Lessons of Ozone Depletion

Over at ChinaDialogue I have an essay up on the real lessons of the successful response to ozone depletion, motivated by the 25th anniversary of the Montreal Protocol. Here is how I start:
Twenty-five years ago, the Montreal Protocol on Substances that Deplete the Ozone Layer was introduced for signature by nations around the world. Since that time, the treaty has become arguably the greatest international environmental success story in history. It may also be the one which historians and policy analysts have argued about the most in an effort to draw lessons relevant to the climate debate.

Conventional wisdom holds that action on ozone depletion followed the following sequence: science was made certain, then the public expressed a desire for action, an international protocol was negotiated and this political action led to the invention of technological substitutes for chlorofluorocarbons.

Actually, each link in this sequence is not well supported by the historical record.
I argue that public opinion was not an important factor driving action, the presence of scientific uncertainty (and those touting it) was not an obstacle to action, and the invention of technological substitutes for chlorofluorocarbons helped reconfigure ozone politics to enable effective action. In this instance the role of governments was essential, but it was not what you might think -- I quote Karen Litfin:
“The issue resembles a chicken-and-egg situation: without regulation there could be no substitutes but, at least in the minds of many, without the promise of substitutes there could be no regulation.”
As will be the case with efforts to decarbonize the global economy, technological substitutes for carbon intensive energy generation will pave the way for international action. The question is, how do we accelerate technological innovation? If your answer is "targets and timetables for emissions reductions negotiated via a binding international treaty" then I'd suggest that you revisit the lessons of the ozone experience where innovation preceded a treaty.

ChinaDialogue has a companion piece from NRDC's David Doniger, which is largely compatible with my own essay, though he starts his story in 1986 when substitutes were already available.

Comments welcomed!

05 November 2012

Loss Normalization Methodologies: Technical Thread

UPDATE: Skeptical Science has written a bazillion-word post "responding" to my WSJ op-ed in which they (a) do not contest a single empirical claim made in the op-ed, and (b) demonstrate a complete lack of understanding of what a loss normalization seeks to accomplish.

There have been a lot of questions about the methodologies of loss normalization. As might be expected, there are also some poorly informed understandings of this area of research. Unfortunately (and tellingly), some of the activist blogs on climate purportedly interested in science have banned me from commenting or edited my comments to delete substantive content. This post and the comment thread that accompanies it are for technical questions and discussions related to loss normalizations (and you can see my related research here).
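For readers new to this literature, the general form of a loss normalization is straightforward, even though the details (county-level exposure, the choice of inflation and wealth measures) are where the work lies. A minimal sketch of that general form -- not the exact Pielke et al. 2008 implementation -- is below.

```python
def normalize_loss(loss, inflation_factor, wealth_per_capita_factor, population_factor):
    """Scale a historical loss to present-day societal conditions.

    Each factor is the ratio of the base-year value to the event-year value
    for the affected area (e.g. population_factor = population_now /
    population_at_event). Published methodologies build these factors from
    county-level data; the numbers used below are purely illustrative.
    """
    return loss * inflation_factor * wealth_per_capita_factor * population_factor

# Illustrative only: a $1 billion loss in an area where prices doubled,
# real wealth per capita grew 1.5x, and population doubled since the event.
print(normalize_loss(1.0e9, 2.0, 1.5, 2.0))  # 6.0e9 in today's dollars
```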

A few points to note to start:
  • Human-caused climate change is real
  • Some measures of climate extremes have changed, notably temperature and precipitation extremes, and have been linked to human forcings
  • However, on climate time scales there has not been detection (much less attribution) of increasing disasters (intensity or frequency) to human-caused climate change
  • This conclusion holds globally and regionally
  • The peer reviewed literature and the IPCC SREX are consistent on this point 
  • If you are looking to see changes in the climate system, then look at climate data, not loss data (climate data can however be used to test the fidelity of loss normalization methodologies)
I may add to this list as needed.

Anyone is welcome to participate here and if you see claims made elsewhere that you'd like to inquire about, just enter a comment. Please do respect the focus of this thread.  Thanks!

02 November 2012

Back to Regularly Scheduled Programming


Next week, it's back to regularly scheduled programming here.

To get back on track here is a reminder of my column at The Breakthrough Institute on R&D and economic growth:
It is a claim that you hear often in discussions of the role of research and development in the economy: “Federal investments in R&D have fueled half of the nation’s economic growth since World War II.” This particular claim appeared in a recent Washington Post op-ed co-authored by a member of the US Congress and the chief executive of the American Association for the Advancement of Science. It would be remarkable if true. Unfortunately, it is not.
To read the rest head over here, and please feel free to come back and tell me what you think.

Enjoy your weekend and the Beastie Boys above giving a shout out to NYC!

Mayor Bloomberg's Deft Climate Politics

Whatever the motivations behind Mayor Michael Bloomberg's decision to cite Sandy and climate change as a reason for his endorsement of President Obama, it has the effect of relocating responsibility for Sandy's devastation from NYC City Hall to Washington, DC.

As New Yorkers (and others) affected by Sandy's wrath pick themselves back up and recover, attention will soon focus on the broader reasons for the disaster. While some will continue to link Sandy with energy policy decisions, important questions will have to be asked about why NYC was not better prepared, and what can be done in the months and years ahead to fix that, before the next storm barrels up the coast.

To that end, a few excerpts from the New York City Natural Hazard Mitigation Plan (April, 2009, here in PDF) will indicate that absolutely nothing about Sandy and its impacts should have been a surprise to anyone. It would be fair to ask NY politicians why the city was not better prepared for a disaster that it saw coming.

The report is clear on the general characteristics that make the region susceptible to large storm surges:
Coastal storms, including nor'easters, tropical storms, and hurricanes, can and do affect New York City. New York’s densely populated and highly developed coastline makes the City among the most vulnerable to hurricane-related damage. . .

New York City is particularly vulnerable to storm surge because of a geographic characteristic called the New York Bight. A bight is a curve in the shoreline of an open coast that funnels and increases the speed and intensity of storm surge. The New York Bight is located at the point where New York and New Jersey meet, creating a right angle in the coastline.
The figure immediately above comes from the report and shows that New York is no stranger to hurricanes. The report notes:
According to hurricane probability models, there is a 2.6% chance a hurricane will impact the New York City area (New York City, Westchester, and Long Island) during any given hurricane season. During a 50-year period there is a 13.6% chance a hurricane will impact the New York City area and a 3.3% chance an intense hurricane (Category 3 or higher) will affect the City.
These numbers suggest that NYC had every reason to believe that it would be just a matter of time before a storm of Sandy's magnitude (with a surge equivalent to a Category 2 strength storm) hit the city (and indeed there are numerous experts who said as much). According to the report, a Category 3 strength storm could bring a surge of 25 feet or more to NYC -- Sandy plus 10 ft. -- and such a storm has a 3.3% chance of striking over 50 years.

Mayor Bloomberg said in his endorsement of President Obama:
Our climate is changing. And while the increase in extreme weather we have experienced in New York City and around the world may or may not be the result of it, the risk that it may be — given the devastation it is wreaking — should be enough to compel all elected leaders to take immediate action.
Deft politics. Note how responsibility for Sandy is subtly shifted away.

Yet, Mayor Bloomberg is also an elected leader. What is he going to do about the fact that his city was less prepared than it should have been for a disaster that was expected and of a sort that will certainly recur, climate change or not?

If the media devotes 10% of the energy to this topic that it is devoting to the climate change connection, New Yorkers will be well served.

A Summary of Sandy Discussions

Here is a short guide to the various discussions of Sandy on this blog and from a few of my Tweets this week.
Thanks!

What Role Do Emissions Reductions Have in Reducing Future Hurricane Losses?

Not long ago I wrote a paper exploring the relative sensitivities of future hurricane (tropical cyclone) damage to changes in climate and changes in society, titled, "Future economic damage from tropical cyclones: sensitivities to societal and climate changes" (PDF).

The paper was peer-reviewed and appeared in the Philosophical Transactions of the Royal Society A, as part of a special issue based on an earlier workshop. The damage that it analyzes is comprehensive, inclusive of wind, flood, sea level rise and other factors that might increase losses in the future.

Here is the abstract:
This paper examines future economic damages from tropical cyclones under a range of assumptions about societal change, climate change and the relationship of climate change to damage in 2050. It finds in all cases that efforts to reduce vulnerability to losses, often called climate adaptation, have far greater potential effectiveness to reduce damage related to tropical cyclones than efforts to modulate the behaviour of storms through greenhouse gas emissions reduction policies, typically called climate mitigation and achieved through energy policies. The paper urges caution in using economic losses of tropical cyclones as justification for action on energy policies when far more potentially effective options are available.
And here is the conclusion:
This paper finds that under a wide range of assumptions about future growth in wealth and population, and about the effects of human-caused climate change, in every case there is far greater potential to affect future losses by focusing attention on the societal conditions that generate vulnerability to losses. Efforts to modulate tropical cyclone intensities through climate stabilization policies have extremely limited potential to reduce future losses. This conclusion is robust across assumptions, even across unrealistic assumptions about the timing and magnitude of emissions reductions policies on tropical cyclone behaviour. The importance of the societal factors increases with the time horizon. This does not mean that climate stabilization policies do not make sense or that policy makers should ignore influences of human-caused climate change on tropical cyclone behaviour. It does mean that efforts to justify emissions reductions based on future tropical cyclone damages are misleading at best, given that available alternatives have far greater potential to achieve reductions in damage. The most effective policies in the face of tropical cyclones have been and will continue to be adaptive in nature, and thus should play a prominent role in any comprehensive approach to climate policy.
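To see why the societal terms dominate in this kind of sensitivity calculation, here is a minimal sketch using round numbers of my own (they are not the scenarios from the paper): exposure growth compounds over decades, while even a generously assumed climate-driven change in storm-related losses enters as a comparatively modest multiplier.

```python
# Illustrative only -- assumed round numbers, not the paper's scenarios.
years = 2050 - 2010
exposure_growth_rate = 0.03              # assumed annual growth in exposed wealth
societal_factor = (1 + exposure_growth_rate) ** years   # about 3.3x by 2050

climate_factor_large = 1.40              # assumed 40% climate-driven increase in losses
baseline_loss = 10.0                     # average annual loss today, arbitrary units

print(f"Societal factor by 2050: {societal_factor:.1f}x")
print(f"2050 loss, no climate effect:    {baseline_loss * societal_factor:.0f}")
print(f"2050 loss, large climate effect: {baseline_loss * societal_factor * climate_factor_large:.0f}")
# The compounding societal factor dominates the projection; emissions policy can
# at best trim the (already smaller) climate multiplier by 2050, while adaptation
# acts directly on the much larger exposure and vulnerability terms.
```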
A literature review by Laurens Bouwer of the Institute for Environmental Studies, published in Risk Analysis just a few months ago, came to substantially similar conclusions looking out to 2040. Interestingly, he also found that the sensitivity analysis that I conducted included the largest climate change effects on tropical cyclones of any published study (which makes sense, as my analysis was a sensitivity analysis and did not offer projections).

Bouwer concluded:
Climate policy through the abatement of greenhouse gas emissions is important, given the likelihood that continued warming of the planet could lead to other (sometimes irreversible) impacts in second half of the 21st century. Mitigation policy therefore seems warranted for avoiding impacts beyond 2050. Also, changes in the frequency of other, smaller scale weather extremes, notably droughts, heat waves, wildfires, and extreme rainfall, although they have not been specifically assessed here, can occur. But changes in risk from major weather hazards (storms and floods) in the short term, up to the middle of the 21st century, are likely to be dominated by changes in exposure and vulnerability. This indicates the very important role for adaptation and risk reduction in strategies for reducing the impacts from weather natural hazards that are expected to occur in the short term.
In short, arguments that climate mitigation policies are a useful tool for addressing disasters have essentially no basis in the scientific literature. This is not an argument against mitigation. It is an argument against justifying mitigation by the notion that disaster losses can be usefully modulated by energy policy.

Papers cited:

Bouwer, L.M. 2012. Projections of future extreme weather losses under changes in climate and exposure. Risk Analysis, DOI: 10.1111/j.1539-6924.2012.01880.x

Pielke Jr R. A. 2007. Future economic damage from tropical cyclones: Sensitivities to societal and climate changes. Philosophical Transactions of the Royal Society A, 365:2717–2729. (PDF)

01 November 2012

Normalized US Hurricane Damage 1900-2012, Including Sandy

The graph above shows normalized US hurricane damage, based on data from ICAT, which applies an extension to the methodology of Pielke et al. 2008. The 2012 figure for Sandy is an estimate from Moody's. The red line represents a linear best fit to the data -- it is flat.

Running Faster on the Climate Treadmill

The Newsweek cover above comes from January, 1996. The Bloomberg Businessweek cover is current.

Efforts to motivate change in energy policy based on linking disasters to human-caused climate change have a long history. How has that worked out?

This time, surely, is different, right?