What’s the Right Target – 350 or 450 ppm?
Thu, 09/17/2009 – 18:08 — maggie_zhou
This is a more concise and somewhat drier version of the analysis of greenhouse gas stabilization targets that first appeared in this original post. Take your pick!
Greenhouse gas concentration stabilization targets, and consequently emission reduction targets, should be guided by what science says is necessary, not by what the politics of the day deems feasible. Three converging lines of compelling scientific evidence now argue strongly for reducing atmospheric greenhouse gas concentrations to below 350 parts per million carbon dioxide equivalent (ppm CO2-eq):
- The IPCC’s 2007 synthesis of a suite of climate models indicates that developed countries need to reduce emissions to 25–40% below 1990 levels by 2020, and 80–95% below 1990 levels by 2050, in order to stabilize atmospheric greenhouse gases at 450 ppm CO2-eq. That concentration, however, would provide only a roughly 50/50 chance of staying below a catastrophic global average temperature rise of 2°C above pre-industrial[2-4,14]. In March 2009, the Copenhagen Climate Science Congress, attended by over 2,000 scientists, concluded: “the worst-case IPCC scenario trajectories (or even worse) are being realized.” This means that even accepting the 450 ppm stabilization target requires at least the higher end of the IPCC’s recommended reductions.
- New paleoclimate analysis indicates that 450 ± 100 ppm was the boundary between an ice-free planet (where sea level was more than 220 feet higher than today) and one with ice sheets 35 million years ago. This and other studies[7,8] point to 350 ppm as the safe upper limit for atmospheric greenhouse gases, a level well below the current 387 ppm. During the past 800,000 years for which ice core data are available, and possibly much longer, atmospheric CO2 concentration oscillated periodically as positive feedbacks responded to changes in the Earth’s orbit, driving the glacial/interglacial cycles of temperature swings. At the peak interglacial warmth of each cycle (also the peak of CO2 concentration), CO2 never exceeded 300 ppm[6,11]. Many climate scientists therefore caution that, given our insufficient understanding of climate feedback mechanisms, allowing CO2 to rise well beyond the range experienced over human evolutionary history is too risky.
- Current observations clearly indicate that at 387 ppm the Earth is already in a dangerous energy imbalance, absorbing more heat than it radiates into space. Polar ice cover is melting far more rapidly than anticipated; methane, a potent greenhouse gas, is bubbling out of warming sea beds and escaping from thawing permafrost; alpine glaciers are retreating at record rates worldwide; deserts are expanding on every continent except Antarctica; extreme weather events are increasing dramatically around the globe; coral reef bleaching is becoming ever more frequent under the dual stress of ocean warming and acidification (due to CO2 absorption); and we are approaching tipping points that trigger positive feedbacks which may push the warming beyond human control.
The United States must convene a comprehensive evaluation by climate, ecological, marine, soil, forestry, agricultural, geological and other scientists, as well as energy and technology experts, to identify viable and safe pathways to below 350 ppm CO2-eq as quickly as possible. It must be recognized that even the more ambitious end of the IPCC target range, namely emissions reductions of 40% below 1990 levels by 2020 and 95% below 1990 levels by 2050, is not compatible with a 350 ppm target. Far more stringent emission reductions, coupled with a radical change in agricultural and land-use practices to sequester carbon (i.e., the opposite of emitting it), are needed to return below 350 ppm CO2-eq[6,12].
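To give a feel for how steep even the ambitious end of the IPCC range is, the milestones quoted above can be turned into an implied constant annual cut. This is illustrative arithmetic only: the constant-rate assumption is mine, not an IPCC pathway.

```python
# Illustrative arithmetic only: the implied constant annual cut between
# the IPCC milestones quoted above (1990 emissions normalized to 1.0).
# The constant-rate assumption is a simplification, not an IPCC pathway.
def annual_reduction_rate(level_start, level_end, years):
    """Constant yearly fractional cut taking emissions from
    level_start to level_end over the given number of years."""
    return 1.0 - (level_end / level_start) ** (1.0 / years)

# Ambitious end of the range: 40% below 1990 by 2020 and 95% below by 2050,
# i.e. emissions fall from 0.60 to 0.05 of the 1990 level over 30 years.
rate = annual_reduction_rate(0.60, 0.05, 30)
print(f"Implied cut per year, 2020-2050: {rate:.1%}")  # roughly 8% per year
```

Sustaining a cut of that size every year for three decades is far beyond anything achieved historically, which underlines why a 350 ppm target also demands sequestration, not reductions alone.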
The emissions cap in the US climate legislation that passed the House in June, the American Clean Energy & Security Act (ACESA, a.k.a. Waxman-Markey), when properly compared with the 1990 baseline used in the IPCC reports, aims to reduce US emissions by only 1% by 2020. Even that small domestic reduction will be severely compromised, as massive, hard-to-verify offsets allow polluters to avoid making real reductions, on top of other problems such as the unreliable and fraud- and influence-prone cap-and-trade scheme, the uncounted emissions from biomass incineration, and more. Such a target is incompatible with the safety and wellbeing of life on this planet. See also my blog on ACESA here.
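The baseline shift is easy to sketch. In the hedged example below, the 17% figure is ACESA’s headline cut below 2005 for capped sources, while the assumption that 2005 emissions ran about 16% above 1990 is illustrative; neither reproduces the detailed accounting (covered sectors, offsets) behind the 1% figure above.

```python
# Hedged sketch: restating an emissions target from a 2005 baseline to the
# 1990 baseline used by the IPCC. The 1.16 ratio (2005 emissions assumed
# ~16% above 1990) is an illustrative assumption, not an inventory figure.
def cut_vs_1990(cut_vs_2005, ratio_2005_to_1990):
    """Fractional reduction below 1990 implied by a cut stated vs 2005."""
    return 1.0 - (1.0 - cut_vs_2005) * ratio_2005_to_1990

print(f"{cut_vs_1990(0.17, 1.16):.1%} below 1990")  # ~3.7% before offsets
```

Even before offsets and uncovered sectors are accounted for, a headline 17% cut shrinks to a few percent against 1990; the fuller accounting cited in the notes brings it down to roughly 1%.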
1. Intergovernmental Panel on Climate Change (IPCC), Fourth Assessment Report, Working Group III: Climate Change 2007: Mitigation of Climate Change (Cambridge University Press, Cambridge, 2007), chapter 13, Box 13.7, p. 776.
2. IPCC, Fourth Assessment Report, Working Group I: Climate Change 2007: The Physical Science Basis (Cambridge University Press, Cambridge, 2007), chapter 10, Table 10.8, p. 826: 450 ppm CO2-eq corresponds to a best estimate of 2.1°C above pre-industrial, a rise “very likely above” 1°C and “likely in the range” of 1.4–3.1°C. (In other words, a 2.1°C rise has the highest probability of being realized, with roughly 50/50 odds of the actual rise falling above or below it.)
3. IPCC, Fourth Assessment Report, Working Group I: Climate Change 2007: The Physical Science Basis (Cambridge University Press, Cambridge, 2007), chapter 10, Supplementary Figure S10.4, p. SM.10-8. (This figure uses a range of parameter choices and shows that 450 ppm stabilization has a medium chance, neither likely nor unlikely, of around 50%, of staying below 2°C.)
4. Meinshausen, M. (2006): “What does a 2°C target mean for greenhouse gas concentrations? A brief analysis based on multi-gas emission pathways and several climate sensitivity uncertainty estimates”, pp. 253–280 in Avoiding Dangerous Climate Change, H.J. Schellnhuber et al. (eds.), Cambridge University Press, Cambridge. (A 450 ppm CO2-eq stabilization concentration has a 26–78% (most likely 54%) probability of exceeding 2°C warming relative to pre-industrial.)
5. Associated Press, March 12, 2009; http://climatecongress.ku.dk/
6. Hansen, J., M. Sato, P. Kharecha, D. Beerling, R. Berner, V. Masson-Delmotte, M. Pagani, M. Raymo, D.L. Royer, and J.C. Zachos, 2008: Target atmospheric CO2: Where should humanity aim? Open Atmos. Sci. J., 2, 217–231.
7. Ramanathan, V., and Y. Feng (2008): On avoiding dangerous anthropogenic interference with the climate system: Formidable challenges ahead. PNAS 105: 14245–14250.
8. Stabilization at or below 350 ppm CO2-eq provides a 93% probability of staying below 2°C above pre-industrial, with a best guess of 1°C and a likely range of 0.6–1.4°C above pre-industrial[2,4]. Note that the IPCC findings suggest that a rise of 1°C in mean global temperature, and correspondingly in sea surface temperature, above pre-industrial levels is the maximum that should be aimed for if the global community wishes to protect coral reefs.
9. IPCC, Fourth Assessment Report, Working Group II: Climate Change 2007: Impacts, Adaptation and Vulnerability (Cambridge University Press, Cambridge, 2007), pp. 12, 321, and 853.
10. Hans Joachim Schellnhuber, head of the Potsdam Institute and climate adviser to the German Chancellor and the EU, one of Europe’s leading climate scientists, believes that operating well outside the historic realm of CO2 concentrations is risky as long as we have not fully understood the relevant feedback mechanisms.
11. IPCC, Fourth Assessment Report, Working Group I: Climate Change 2007: The Physical Science Basis (Cambridge University Press, Cambridge, 2007), chapter 6, Figure 6.3, p. 444.
12. See my calculations using data from Table ES-2 of the Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990–2006 (published April 15, 2008), confirmed by a World Resources Institute analysis.
14. The following excerpt from the IPCC report highlights that the conclusions in the report assumed linear responses in the climate system and did not adequately consider the (now apparent and pervasive) non-linear responses, which leads to an underestimation of climate sensitivity (how quickly temperature rises in response to increased CO2). Such fine print in an academic publication is, of course, completely lost on policy makers, the media and the general public worldwide. From IPCC AR4 WGI (2007): Solomon, S.D., D. Qin, M. Manning, Z. Chen, M. Marquis, K.B. Averyt, M. Tignor, and H.L. Miller (eds.), Climate Change 2007: The Physical Science Basis (Cambridge University Press, Cambridge, 2007), chapter 10, Table 10.8, p. 826 (highlighting mine):
“It is emphasized that this table does not contain more information than the best knowledge of S and that the numbers are not the result of any climate model simulation. Rather it is assumed that the above relationship between temperature increase and CO2 holds true for the entire range of equivalent CO2 concentrations. There are limitations to the concept of radiative forcing and climate sensitivity (Senior and Mitchell, 2000; Joshi et al., 2003; Shine et al., 2003; Hansen et al., 2005b). Only a few AOGCMs have been run to equilibrium under elevated CO2 concentrations, and some results show that nonlinearities in the feedbacks (e.g., clouds, sea ice and snow cover) may cause a time dependence of the effective climate sensitivity and substantial deviations from the linear relation assumed above (Manabe and Stouffer, 1994; Senior and Mitchell, 2000; Voss and Mikolajewicz, 2001; Gregory et al., 2004b), with effective climate sensitivity tending to grow with time in some of the AR4 AOGCMs. Some studies suggest that climate sensitivities larger than the likely estimate given below (which would suggest greater warming) cannot be ruled out (see Box 10.2 on climate sensitivity).”