Antarctica holds about 27 million cubic kilometers of ice that is constantly flowing, driven downhill and outward by its own immense weight. If just part of that ice – the West Antarctic Ice Sheet – were to melt into the ocean, it would raise global sea level by about 6 meters. That’s more than a theoretical problem. West Antarctica is losing ice mass, and scientists are worried.
Warming air temperatures and warming water both play a role. So does geography.
“As our planet warms, the polar regions are warming faster than anywhere else on our planet and the ice sheets are changing. They’re melting and they’re sliding faster toward the ocean. Global sea level is going up, and we expect that to go up faster as more of the ice melts,” said Robin Bell, a glaciologist at Columbia University’s Lamont-Doherty Earth Observatory who is leading the Changing Ice, Changing Coastlines Initiative with paleoclimatologist Maureen Raymo.
To understand how a massive ice sheet can become destabilized, we need to understand the structure of the land that holds the ice on Antarctica today.
Bell and her colleagues engineered a way to do that in some of the most remote regions on the planet. They took radar and other technology normally used to study the sea floor and attached it to a C-130 cargo plane in a capsule called the IcePod. By flying over the ice sheets – as they’re doing right now over Antarctica’s giant Ross Ice Shelf – they can see where the ice enters the ocean and map the ice layers and the terrain hidden beneath it.
Ice shelves, like Ross, are particularly important to the West Antarctic Ice Sheet’s stability. They jut out over the water ahead of flowing glaciers and slow the glaciers’ flow into the ocean. The biggest threat to ice shelves is warmer water brought in by ocean currents that flow low along the continental shelf and eat away at the base of the ice shelf. The line where ice, water and rock meet is called the grounding line. As the ice erodes, the grounding line moves inland, and geography comes into play: In West Antarctica, most ice shelves sit on bedrock that slopes inward toward the center of the continent. As the grounding line moves inland and into deeper water, the ice shelf becomes unstable and can break apart.
After the Larsen B Ice Shelf broke off from Antarctica and disappeared over the span of a few weeks in 2002, the glaciers it held back started flowing at eight times their previous speed. It was a wake-up call, as Bell explains in the video above.
The Ross Ice Shelf is much larger than Larsen B and is an outlet for several major glaciers from the West Antarctic Ice Sheet. And it’s only one area of West Antarctica that has scientists concerned.
To the east of the Ross Ice Shelf, on the Amundsen Sea, scientists see evidence that the massive Thwaites and Pine Island Glaciers are also moving faster as their grounding lines recede. At the Pine Island Glacier, the grounding line receded about 31 kilometers between 1992 and 2011, contributing to the glacier’s increasing speed and ice loss starting around 2002. One recent study used computer modeling to look at what might happen and suggests that if the Amundsen Sea glaciers were destabilized, a large part of the West Antarctic Ice Sheet would discharge into the ocean. Another study, based on satellite data, found that the rate of thinning in West Antarctic ice shelves had increased 70 percent over the past decade, and that some ice shelves lost as much as 18 percent of their volume between 1994 and 2012. (To learn more about changing ice sheets, look for the Polar Explorer app being released by Lamont-Doherty Earth Observatory this fall.)
These and other findings led the National Academies of Sciences, Engineering, and Medicine to issue a recommendation this summer that the U.S. Antarctic Program at the National Science Foundation make changing ice sheets and their contribution to sea level rise one of its top research goals for the next 10 years, particularly in West Antarctica. The fate of the ice sheets has a direct impact on humanity: as land-based ice melts, it raises sea level, and that can threaten coastal communities and economies worldwide.
“Our planet’s large ice sheets contain secrets that will be uncovered by studies of the changing ice and changing coastlines,” Bell said. “New expeditions to the poles will decode how the ice sheets work – what makes them flow, deform and melt – while new studies of ancient shorelines will inform how fast the change occurred in the past. We envision a new phase of exploration and discovery to inform our future.”
Learn more about West Antarctica and the impact of rising temperatures on marine life, part of a video series.
As the Paris climate summit approaches, a new study shows in detail that it is technologically and economically feasible for the United States to sharply reduce greenhouse gas emissions in line with the international goal of limiting global warming to 2 degrees Celsius or less. The report says it is possible to revamp the energy system in a way that reduces carbon dioxide emissions from 17 tons per person today to 1.7 tons in 2050, while still providing all the services people expect, from driving to air conditioning.
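As a rough check on those headline numbers (a back-of-the-envelope sketch, not a calculation from the report itself), the drop from 17 to 1.7 tons per person is a 90 percent cut, and if spread evenly over the 35 years to 2050 it implies a steady decline of roughly 6.4 percent per year:

```python
def constant_annual_rate(start, end, years):
    """Compound annual rate of change that takes `start` to `end` over `years`."""
    return (end / start) ** (1.0 / years) - 1.0

overall_cut = 1.0 - 1.7 / 17.0                  # fraction of per-person emissions removed
annual = constant_annual_rate(17.0, 1.7, 35)    # assumes a constant yearly decline

print(f"overall cut: {overall_cut:.0%}")            # 90%
print(f"implied annual decline: {-annual:.1%}")     # about 6.4% per year
```

The constant-rate assumption is only illustrative; the report lays out year-by-year trajectories rather than a single smooth decline.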
The two-volume report is from the Deep Decarbonization Pathways Project. The project is led by the Sustainable Development Solutions Network, a United Nations-sponsored initiative whose secretariat is at Columbia University’s Earth Institute, and the Institute for Sustainable Development and International Relations. The analysis itself was conducted by the San Francisco-based consulting firm Energy and Environmental Economics Inc., in collaboration with researchers at Lawrence Berkeley National Laboratory and Pacific Northwest National Laboratory.
The first volume describes the technology requirements and costs of different options for reducing emissions. An update of a study released last year, it lays out in detail the changes in the U.S. energy system needed year by year to meet the target, looking at every piece of energy infrastructure—from power plants and pipelines to cars and water heaters—in every sector and every region of the U.S.
The report says this can be done using only existing technology, assuming continued incremental improvements but no fundamental breakthroughs, and without premature retirement of current infrastructure, at a net cost equivalent to about 1 percent of Gross Domestic Product in 2050.
The report finds multiple technology pathways capable of reaching the target, presenting choices that can be made based on innovation, competition and public preference. Passenger cars, for example, could be switched to battery-powered electric vehicles or fuel-cell vehicles. Low-carbon electricity could be provided by renewable energy, nuclear power, or carbon capture and storage. The authors looked closely at the reliability of a power grid with high levels of intermittent wind and solar energy, using a sophisticated model of the electric system’s operation in every hour in every region.
“I think our work throws down a gauntlet to those who claim that decarbonization of the U.S. energy system is impractical and out of reach,” said report lead author Jim Williams, chief scientist at Energy and Environmental Economics and director of the Deep Decarbonization Pathways Project. “Arguments that the U.S. can’t achieve this technologically or economically don’t hold water.”
Williams said, “The challenges are often not what people think. The public has been conditioned to think of climate policy in terms of costs, burdens, loss of services. But if we get it right, we will create a high-tech energy system that is much more in sync with a 21st century economy, and there will be many more economic winners than losers.”
The second volume provides a roadmap for what policy makers at the national, state and local levels need to do to enable a low carbon transition. It describes how businesses and whole regions could benefit in an energy economy where the dominant mode shifts from purchasing fossil fuel, with historically volatile prices, to investment in efficient, low carbon hardware, with predictable costs.
The U.S. study is part of a series by the Deep Decarbonization Pathways Project, an international collaboration of research teams from the world’s 16 highest-emitting countries. This year it has also issued country-specific strategies for deep decarbonization in Australia, Brazil, Canada, China, France, Germany, India, Indonesia, Italy, Japan, Mexico, Russia, South Africa, South Korea and the United Kingdom.
“The DDPP has taken an essential step in low-carbon energy policy, and the work of the U.S. team points the way forward for the Paris summit,” said Earth Institute Director Jeffrey Sachs. “Happily, the U.S. government has also endorsed the idea of preparing deep decarbonization pathways as a critical tool for achieving the transformation to low-carbon energy systems worldwide.”
In September, a joint statement on climate change cooperation by President Obama and President Xi Jinping of China stressed “the importance of formulating and making available mid-century strategies for the transition to low-carbon economies.”
In the run-up to Copenhagen, there was widespread hope that the conference would lead to a legally binding agreement that would include commitments that would keep global temperatures within tolerable levels. None of that happened. Copenhagen did lead to widespread agreement that an increase in global average temperatures of more than 2 degrees Celsius would be intolerable (though the small island states wanted a 1.5 degree goal, which they need to survive). The developed countries also agreed to come up with $100 billion annually, starting in 2020, to assist in mitigation and adaptation measures in the developing countries.
As we head to Paris, the expectations are profoundly lower. The national commitments that countries are putting on the table (“Intended Nationally Determined Contributions”) do not add up to nearly enough to keep us within 2 degrees; instead the plan is to come back every five years and hopefully do better. Nor will they be legally binding; fulfillment of them will be monitored and reported, but there will be no sanctions for missing them, and no one can sue to enforce them. (To be fair, even though the Kyoto reduction requirements were legally binding in theory, there were no meaningful sanctions available for missing them, either.)
It is still mathematically possible to stay within 2 degrees, but the odds of actually doing so seem to be receding by the month.
The $100 billion plan is still on the books, but the pledges made so far are well short of what is needed even for the first year. And there is growing evidence that, even if that amount of money were found every year, it would not be nearly enough to meet the needs—especially if temperatures are on a pathway well above 2 degrees.
One encouraging development involves China: before and during Copenhagen, it was unclear whether China would commit to reducing its greenhouse gas emissions. China’s emissions are continuing to grow at a rapid pace and dominate the world picture, but the central government has made serious promises to cap emissions (though not until 2030), and China is participating much more fully in the international climate regime than it did in 2009.
On the other hand, in 2009 there were still real prospects for U.S. climate legislation; the White House and both the Senate and the House were controlled by Democrats who favored such legislation. Today, however, both the Senate and the House are controlled by Republicans who reject the basic science of climate change and are doing everything they can to stand in the way of President Obama’s use of existing statutory authority to fight climate change.
Thus the results of the U.S. national election in November 2016 will be even more important for the future of the global climate than the outcome of the Paris conference.
This post is one in a series reflecting on what has changed since the climate talks of 2009 in Copenhagen. Gerrard was among those writing for State of the Planet about those talks back in 2009, contributing several reports, which you can find in this compilation of stories about the Copenhagen talks. Here is an excerpt:
Many people, including myself, are now looking through the documents and trying to figure out just what they mean. But it is clear that the conference achieved neither a universally accepted binding legal agreement that would have assured a dramatic reduction in greenhouse gas emissions (and perhaps have denoted a return of the Age of Miracles), nor a complete breakdown. …
Major fights lie ahead about whether the measures agreed to will succeed in meeting the developed countries’ goal of keeping future increases in global temperature to 2 degrees Celsius; whether achieving even that goal will be sufficient to prevent catastrophic damage in some of the most vulnerable countries; and many other issues. It is also highly uncertain whether the conference’s results will make the U.S. Senate more or less likely to approve U.S. legislation.
Today’s a special day in the annals of Antarctic exploration: it’s been 100 years to the day since Ernest Shackleton’s ship Endurance was crushed by ice and finally sank after 307 days beset in the pack ice of the Weddell Sea. The disaster ended Shackleton’s hopes of leading the first team to cross the Antarctic continent, but set the stage for one of the most audacious maritime adventures of the era. You can read more about that in Frank Worsley’s excellent book Endurance, or in Shackleton’s own book South. Or you can take the easy way out and read the Wikipedia article here. To mark the occasion the Royal Geographical Society has released a new set of digitized images from the expedition. The images were digitized by scanning the photographic plates directly, and the resulting resolution is extraordinary.
There are, not surprisingly, a lot of Antarctic history nerds in Antarctica, so we had a small celebration in honor of the Endurance today. It’s also a good day to reflect on modern Antarctic science and travel. Things have evolved a bit since 1915; the only open small boat journeys that we get to take are to our designated sample sites, and we don’t get to take them in anything approaching exciting conditions. We also have these actual research stations to operate from; for US researchers those are Palmer Station (where I’m writing from), McMurdo Station (less a research station than a logistics hub), and the Amundsen-Scott South Pole Station (which I have not been to). You might be asking exactly who operates these stations and how. Where, for example, does the trash go? What about sewage? There are some key differences between the stations but they all follow the same operational logic (that’s a nice way of saying the operation isn’t always logical). By request here’s a quick look at the inner workings of Palmer Station.
First, the basics. Palmer Station was built by the Navy Seabees over a three-year period starting in 1965. It was purpose-built for science and, unlike McMurdo Station, was never a military station*. Today the station is operated by something called the Antarctic Support Contract (ASC) on behalf of the National Science Foundation. The ASC is an interesting construct, and the relationship between scientists (the end-users of the stations), ASC itself, the individual ASC personnel on-site, and the National Science Foundation resembles a particularly intricate four-party dance that no one has mastered. A lot of toes get stepped on but, in the end, a lot of science gets done. The ASC operates as a subsidiary of a much larger logistical company and is subject to periodic rebidding. Currently the ASC is held by Lockheed Martin; before that it was held by Raytheon. The parent company changes but the internal structure and personnel of the ASC stay more or less the same.
The maximum capacity of Palmer Station is around 46 people, though a typical summertime population is probably closer to 40. Most of these are ASC personnel. At this exact moment there are 34 people at station, 24 of whom work for ASC. Debating the merits of more or fewer ASC personnel supporting fewer or more scientists would take a much longer blog. Suffice it to say that toe’s a little bruised. One issue is that the station is old and it takes quite a few people to keep it running (and the personnel here do a great job of that). Another issue is that the station is set up for science groups to come in and out with a minimal time commitment. That’s convenient for scientists, but discourages coordination among science groups or long-term investment in the system by any one group (the Palmer LTER study is a major exception to this). Because of this, two ASC personnel have full-time jobs just supporting us in basic tasks: allocating space, procuring chemicals and supplies, fixing equipment, etc.
McMurdo Station has the feel of a South Dakota boom town (although I think all of those are de-booming at the moment) with a peak population around 1,200. As a result of the potential environmental impact of 1,200 people in a somewhat-pristine coastal environment, there has been some investment in environmental protection at McMurdo. Sewage, for example, is treated in a top-of-the-line sewage treatment facility that is no different from what you’d find in any small municipality. Unfortunately no such investment has been made at Palmer Station. Our sewage and food waste gets a quick grind in a macerator before being released into Arthur Harbor. While this probably doesn’t have any catastrophic impact on the local ecosystem it certainly does have some impact. You can quickly identify the location of the sewage outfall from the gulls and penguins that congregate there (there was an elephant seal in there yesterday; Jacuzzi-like, I suppose?). And while it is certainly a bigger problem at McMurdo, the input of artificial hormones and other pharmacological products into the local seawater is a bit disconcerting. This would be a perfect place to test new sewage treatment technology; I’m not sure why that isn’t done (oh right, $$).
Most of the other waste streams at Palmer are treated with a little more care. Food waste that can’t get macerated (e.g. chicken bones) gets burned in a barrel (okay, not much care there); virtually everything else gets transported out by ship. Regular trash gets compressed, bundled, and disposed of in Chile. Laboratory waste, which may contain trace amounts of nasty things, gets transported to Chile, then by cargo vessel to the United States. Actual hazardous lab waste, broken down by type, goes out the same way every two years.
Fortunately, since we end up feeding a lot of it to the wildlife in Arthur Harbor, the quality of food at Palmer Station is very high. There are two chefs on staff and they take it seriously. They succeed in doing this without making it seem excessive; I recall being a bit offended that steak, lobster, and other luxury items are flown – at great expense – into McMurdo Station (yet getting scientific equipment flown in or samples out takes nearly an act of Congress). There’s no air traffic here; everything comes in by ship, and the cuisine leans more toward the good home cooking variety. I enjoy it with minimal guilt.
Station power comes from a surprisingly small diesel generator. This and the backup generator keep the diesel mechanic, who also doubles as a heavy equipment operator, pretty busy. McMurdo Station has experimented with diversifying its power sources with varying degrees of success. It has a small (and I understand underutilized) wind farm, and early on it had a small nuclear power plant. I’m not aware of any similar experiments at Palmer, and really, I’m not sure what else you could do. It’s very cloudy here and it snows a lot, so solar would be a bad choice. The station is far too small to justify nuclear, and that’s pretty unpopular these days anyway. Plenty windy here, but there are a ton of birds, and I hear that wind turbines and birds are a bad mix. So I think we’re stuck with diesel.
There’s one additional quirk that I think is unique to Palmer Station. Everyone, from the station manager to the station doctor to the scientists, pitches in with housework. Once a week you take your turn cleaning up after dinner, and every Saturday afternoon you draw an additional cleaning task out of a hat. It can be a bit annoying when you have to stop doing science to clean, but it’s worth it for the extra sense of community.
*McMurdo Station was originally Naval Air Facility McMurdo, although the purpose of McMurdo Station has always been (mostly) scientific.
This week the Earth Institute’s International Research Institute for Climate and Society convened a two-day workshop reflecting on efforts over the past 20 years to improve responses to climate variability, especially risks associated with El Niño. Concerns that the current El Niño has the potential to exceed in severity the devastating El Niño of 1997-1998 permeated the discussion. At the conference I presented a brief overview of the social, economic and political changes that will have a large effect on human impacts from El Niño. I amplify those remarks here. For more information, thoughts and opportunities to engage on questions of how climate, fragility and risk interact, check the Environment, Peace and Security Certificate Program website.
Much of the discussion about the fear that the current El Niño will turn out to be even worse than the devastating 1997-1998 El Niño neglects a crucial fact. Today’s El Niño is unfolding over a world that is in many ways more vulnerable than the world of 1997-1998. Just as today’s climate continues to generate extremes without historical precedent, we are starting to see elements of social vulnerability also without historical precedent.
That is an alarming combination.
It is relevant because historical experience tells us that El Niño roughly doubles the risk of major breakdowns in political security in countries affected by its weather impacts. So if the year brings together unprecedented weather extremes and unprecedented patterns of fragility, the risks may outstrip our preparations.
Think of a typical pair of office scissors. Their two blades are not especially sharp, yet they can cut very well because of how they interact. In the same way social impacts that arise from extreme weather depend on what kind of underlying vulnerability such weather encounters. We have heard a lot about the meteorological blade of the scissors.
Let us now consider the societal blade.
Global food prices in 1998 were at their long-term average. They have been markedly higher since the shocks of 2008, and even after a period of abating pressure last year remain 25 percent above their long-term average in real terms. As a result, poor communities and vulnerable regions have an elevated baseline risk of food insecurity. Compounding this effect are the unusually high global levels of income inequality, which Thomas Piketty and others have drawn attention to; the poorest of the poor are worse off in many parts of the world.
Changes to the global food system have diminished our ability to respond to food crises since 1998. Global food stocks have shrunk from about 100 days’ worth of consumption to about 60 today. And changes in where those stocks are held make it far more difficult to direct them to humanitarian crises. Finally, government budget deficits in donor countries are far higher than before, making it harder to mobilize large crisis responses.
Politically, the world is showing signs of heightened fragility. Some elements of this fragility were already underway in 1998, and what is alarming is that they have not yet abated. One measure of such fragility is the number of countries experiencing a transitional political state characterized by neither strong democratic institutions nor strong autocratic institutions. Known as anocracies, such countries are not well equipped to absorb exogenous shocks and are highly vulnerable to various forms of instability. Since the late 1990s, they have been at historically unusual highs.
Other elements of political fragility are worse than in 1998. The amount of territory outside of state control has increased to an unexpected and scary degree since 1998, including a number of countries that qualify as failed states (such as Libya) and countries no longer exercising sovereignty over major areas (such as Syria). Such areas pose multiple risks. They provide havens for trafficking, terrorism and other illicit behavior. They trigger population displacement. They augment the risk of epidemics. And the people within them suffer high levels of vulnerability to food insecurity and other impacts from climatic stress.
The trend in heightened political fragility is now clear enough to be counted as a defining risk of our age. It is also one of the saddest surprises of the past decade, following over 25 years of broad progress toward enhanced security and stability, as documented by scholars such as Steven Pinker. In the last 10 years, security breakdowns have increased in number and intensity, and the resulting human tragedies and geopolitical upheavals have secured a permanent foothold in our daily headlines.
When the post-WWII record for global refugees and internally displaced populations was broken last year, topping 50 million for the first time, it came amid so much bad news that it scarcely got the attention it deserved.
These changes take place against a backdrop of rapid population change in the poorest countries of the world, which has the effect of increasing the number of people exposed to the risks of El Niño. There are 1.3 billion more people in the world now than in 1998. Calculations with spatial data carried out by Tom Parris and colleagues at ISciences show that an additional 230 million people now reside within the areas most affected by the 1997-1998 El Niño.
That’s like adding an additional Indonesia (203 million people in 1998) and Malaysia (23 million in 1998) to the El Niño front lines.
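A quick arithmetic check of that comparison, using the 1998 population figures quoted above (an illustration only, not part of the ISciences analysis):

```python
# Populations as quoted in the text, in millions of people
indonesia_1998 = 203
malaysia_1998 = 23
additional_exposed = 230  # extra people now living in the most-affected areas

combined = indonesia_1998 + malaysia_1998
print(combined)                            # 226 million
print(additional_exposed - combined)       # within a few million of the 230 figure
```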
Moreover, in areas where rapid urbanization is not being met with equally fast increases in jobs and political participation, the potential for protests and instability is also rising. Here, too, the trends are not in our favor. In 1998 poor cities were growing at about 3 million people per year. Today the number is 7.5 million.
If you thought things couldn’t get worse, recall that whatever weather shocks emerge from El Niño today will do so in the context of long-term climate change that is manifesting at a more rapid pace than we earlier anticipated. September 2015 was about half a degree Celsius higher than September 1997, for example. For the year as a whole, 2015 is shaping up to be the hottest on record; if trends continue it will be about a quarter of a degree hotter than 1998 and a third of a degree hotter than 1997.
Fractions of degrees may not seem like much at first glance, but when the global climate system entered lesser degrees of this non-analog state earlier in the decade we witnessed such unprecedented disruptive shocks as the heat waves that triggered the global food crisis associated with the Arab Spring, unusually devastating floods in Pakistan, widespread and traumatic wildfires in the western and southwestern U.S., and unusually large storms such as those affecting Myanmar, the Philippines and the United States.
So the fact that the 2015-2016 El Niño will do its damage against an even higher level of baseline climate risk ought to give us serious pause.
Some societal risks we know with some confidence, stemming from analysis of the historical data. Food security problems, population displacement, disease outbreaks and political unrest are among such risks. Others are less well understood. Being less well understood does not make them less significant.
The risks that are relevant when considering how El Niño might interact with the underlying social and political changes underway are less predictable than El Niño itself. Cataclysmic breakdowns in human security are thankfully rare events, shaped by a number of causal forces that are marked by high uncertainty.
We cannot say whether El Niño will definitely trigger specific events culminating in large-scale crises in the coming year, in the same way that we wouldn’t be able to know for sure whether a specific drunk driver will cause an accident.
But as with the drunk driver, we know enough to say the risks are high and scary. We ought to be looking harder at whether we are prepared.
To hear my full talk, visit the IRI El Nino 2015 conference website and watch the beginning of the day two recording.
A snapshot of the changing climate of the West Antarctic Peninsula, where the impact of fast-rising temperatures provides clues about future ecosystem changes elsewhere.
When the ship pulls up at Palmer Station each Antarctic spring, the arriving scientists glance up at the massive glacier that covers most of Anvers Island. It has been retreating about 7 meters per year, and this year is no different.
In this part of Antarctica, on the peninsula that sweeps toward South America, the climate is changing fast.
“Global warming is affecting Antarctica just as it’s affecting everywhere in the world at this point, but it is proceeding faster in both of the polar regions than it is anywhere else on the planet. What happens here is an early warning of what will be happening to ecosystems elsewhere – it’s just happening sooner and faster in Antarctica,” said Hugh Ducklow, the lead principal investigator at Palmer Station and a biogeochemist at Lamont-Doherty Earth Observatory.
Temperatures have been warming on the West Antarctic Peninsula at about 0.5° Celsius per decade since the early 1950s, a rate about four times faster than the global average. While winter sea ice extent in the Southern Ocean as a whole has changed little, the sea ice here begins to advance about 2 months later than it did in the 1980s and retreat about a month earlier. The West Antarctic Peninsula is bathed by relatively warm waters from the Antarctic Circumpolar Current that comes close to the surface near the peninsula, and that current is gaining heat as the oceans warm, studies show.
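To put the quoted warming rate in perspective, here is a rough back-calculation from the figures in the text (the roughly six-and-a-half-decade span and the implied global rate are my own illustration, not from the underlying studies):

```python
# Figures as quoted in the text
rate_per_decade = 0.5            # deg C per decade on the peninsula
decades = (2015 - 1951) / 10     # "since the early 1950s", assuming a ~2015 vantage point

cumulative = rate_per_decade * decades
implied_global = rate_per_decade / 4   # "about four times faster than the global average"

print(f"cumulative peninsula warming: ~{cumulative:.1f} deg C")
print(f"implied global rate: ~{implied_global:.3f} deg C per decade")
```

In other words, the peninsula has warmed on the order of 3 degrees Celsius over the period, against an implied global rate of a bit over a tenth of a degree per decade.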
The changes and their cascading effects are showing in the ice and in the numbers and species of marine wildlife. The population of native Adélie penguins has declined from 15,000 pairs in the area around Palmer Station in the 1980s to fewer than 3,000 today. Penguin species from farther north, the Chinstrap and Gentoo, have started moving in, while Adélie numbers are increasing farther south in a region that hasn’t experienced as much warming. Fur seals and elephant seals, neither native to the area, are also now appearing near the Anvers coast. (Read more about the changing habitat of penguins and seals in this recent post and in the video above.)
Warming temperatures and changes in the sea ice matter for the entire marine food chain in this region where whales feed in the summer and large numbers of sea birds breed.
The ice is an important factor in the strength of the spring phytoplankton bloom and in the growth of ice algae; both are important food sources for krill, which in turn are the main food source for the region’s penguins, whales and seals.
When sea ice covers the coastal water in early spring, it prevents the spring bloom from starting too early, when it could be disrupted by storms, explained Jeff Bowman, a marine biologist from Lamont who is currently working at Palmer Station. As temperatures rise, the sea ice leaves earlier, and climate phenomena that drive weather patterns could impact marine life in different ways. Studies suggest that the Southern Annular Mode (SAM) is more likely to be positive, meaning stronger winds will be more common, likely disrupting phytoplankton growth, and tropical storms could send precipitation across the Southern Ocean that can put penguin eggs and chicks at risk. (Read more about phytoplankton and what Bowman is seeing at Palmer Station in his research blog, Polar Microbes.)
The scientists at Palmer Station see changes like these up close every year, and they have collected data through the Long-Term Ecological Research program for the past two decades to track ecological and environmental changes and how those shifts cascade through the ecosystem. Two years ago, a video team joined them. You can watch the Palmer Station scientists at work and see the surrounding environment in the movie Antarctic Edge: 90 Degrees South.
Learn more about Lamont-Doherty Earth Observatory scientists and their work.
The plan, going into Copenhagen in late 2009, was to broaden and deepen the Kyoto Protocol. This plan failed. The draft agreement prepared in advance of this conference was very long and filled with brackets, indicating that countries could not agree about very much. Once it became clear that an agreement about limiting emissions could not be negotiated, a short agreement was put together on the spot by a subset of countries.
This agreement set a global goal for limiting temperature change and invited all countries to submit pledges for the contributions they intended to make to this global effort. This agreement was not “legally binding,” and an analysis of the pledges submitted after this conference indicated that they fell far short of the emission limits that would need to be made if the global goal were to be achieved.
The Paris conference will build on the foundation laid so hastily in Copenhagen. The intention is to negotiate a “legally binding” agreement that will include arrangements for measuring, reporting, and verifying emissions. Rather than negotiate emission limits, countries are submitting new “Intended Nationally Determined Contributions.”
An analysis by the United Nations Framework Convention on Climate Change Secretariat shows that these intended contributions, if fulfilled, will still allow global emissions to increase through 2030. The claim is that the pledges being made will reduce emissions relative to “business as usual,” but this is a hard claim to substantiate since “business as usual” isn’t observable. Moreover, it isn’t obvious that countries will fulfill their pledges.
All of the pledges made in Paris will be voluntary. It is hoped that, by a process of “pledge and review,” these pledges can be strengthened—and that countries will feel obligated to fulfill them. However, countries have not always fulfilled their pledges in the past, and it isn’t obvious that this agreement is going to cause countries to behave very differently in the future.
Scott Barrett is Lenfest-Earth Institute Professor of Natural Resource Economics, Columbia University. His website is www.globalpublicgoods.com. He and Carlo Carraro and Jaime de Melo have written a new e-book, out this month, that you can download: “Towards a Workable and Effective Climate Regime.”
This post is one in a series reflecting on what has changed since the climate talks of 2009 in Copenhagen. Barrett was among those writing for State of the Planet about those talks back in 2009. Here is an excerpt from back then (the full text is here and here):
The three pages of text that emerged after years of preparation and two weeks of intense negotiation in Copenhagen signally fail to address what the document correctly calls “one of the greatest challenges of our time”—global climate change. To many, the Copenhagen Accord will seem a setback; actually, it is a continuation of a long history of failure. The essential problem lies with the strategy of addressing this complex issue by means of a single agreement. Breaking this colossal problem into smaller pieces would allow us to achieve more.
… Climate change is the greatest collective action problem in human history, so we should not be surprised that it has been difficult to address. But our approach has made it harder than necessary. A better way to negotiate would be to break this colossal problem up into smaller pieces, addressing each piece using the best means appropriate.
The lines of data are slowly creeping across our Ross Ice Shelf GIS map, and with each new line comes an improved understanding of the Ross Ice Shelf. What can you learn from a ‘snapshot’ of data? The radar image above tells a nice story. Ice thickness is shown on the y-axis of the annotated radar image; the ice shelf is approximately 300 meters thick. For scale, this means you could stack three Statues of Liberty one on top of another and still have 21 meters of ice layered above them. The top layer of the ice shelf is snow that has accumulated on the surface, layered almost flat as it fell on a level ice surface. Below it you can see the ice that has flowed in from the Antarctic ice sheet, carrying the rumpling and roughness it collected as it moved over the rougher terrain of the bed topography. Below that you can see the faint outline of the bottom of the ice shelf. This is where the radar stops, unable to image through the ocean water.
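The scale comparison above is easy to check. A quick sketch of the arithmetic, assuming the Statue of Liberty stands about 93 meters from the ground to the torch, pedestal included:

```python
# Sanity check for the radar snapshot's scale comparison:
# ~300 m of ice shelf vs. three stacked Statues of Liberty.
ice_thickness_m = 300    # approximate shelf thickness from the radar
statue_m = 93            # ground to torch, pedestal included (assumed)
leftover_m = ice_thickness_m - 3 * statue_m
print(leftover_m)        # meters of ice above the stacked statues
```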
The radar and gravity instruments work together to create a complete image of the Ross Ice Shelf and the bed below. Radar provides information on the ice layers but stops at the ice/ocean interface, which is exactly where gravity excels. With two gravimeters strapped down side by side in the LC-130 and humming away as they collect data, the project is well covered.
A schedule of two flights a day can be exhausting. However, the limited time in an Antarctic field season led to the plan to fly with two crews so data can be collected day and night; after all, radar, gravity and magnetics don’t need daylight to collect data. The personnel have been broken into teams so there is a constant rotation of working, sleeping and data review. The small science team has been cross-training on the various instruments to provide more flexibility in flight planning.
Being on the ice can be an intense and compressed time… but it can also be filled with unexpected problems and delays. Weather has cancelled several flights, as have the priority needs of the Guard in handling emergencies and support missions. The cold, dry, static-prone environment can be hard on equipment: a couple of the laptops used to manage the data have stopped working, adding stress to the workload, as has illness. Although the team has been challenged by the cold and the weather, they have managed six flights, with more on the upcoming docket.
The team has taken advantage of down days to share presentations about the ROSETTA project with the other residents of McMurdo and Scott Bases, both located on Ross Island at the edge of the Ross Ice Shelf. McMurdo is home to the U.S. research teams, housing about 1,000 residents during the austral summer season; Scott Base is home to the New Zealand teams, with close to 100. Sharing science is one of the perks of polar fieldwork.
Check out the newest lines on the GIS map and stop back for more. As we write, the team is gearing up for another flight… and more lines of data with more stories.
For more about this NSF- and Moore Foundation-funded project, check our project website: ROSETTA.
Margie Turrin is blogging for the IcePod team while they are in the field.
Smoke from forest fires has been choking cities across Southeast Asia for months. The hazy, yellow blanket poses serious public health and economic risks for the communities it envelops. Indonesian authorities have been working hard to put out the fires, but have had trouble preventing new ones, which are intentionally set to clear land for agriculture. The government has resorted to using warships to evacuate its citizens from fire-affected areas. However, few of the other inhabitants of Indonesia’s forests, like the endangered orangutans, have made it to safety.
The fires are more than a local menace—they pose a global threat as well. They are huge—visible from space—and have begun to garner worldwide attention. Burning peat releases immense stores of carbon dioxide to the atmosphere, strongly contributing to global warming. Since September, daily CO2 emissions from the fires alone have routinely surpassed the average daily emissions from the entire U.S. economy.
Why is so much carbon released from these fires?
The forests now burning had been growing on, and were contributing to, vast stores of peat—the partially decayed plant material that accumulates where growth is faster than decay. Peatlands (places where peat accumulates) are an important part of the Earth’s climate system—they are the primary locations where carbon from the atmosphere is sequestered on land. While they cover only 3% of the land surface, they hold 30% of soil organic carbon.
Since the end of the last ice age, approximately 600-700 gigatonnes of carbon—that’s 600-700 billion tonnes—have been sequestered from the atmosphere as peat, roughly equivalent to the total amount of carbon that was in the atmosphere before industrial times. As tropical peat forests are drained and burned, carbon that took thousands of years to accumulate is rapidly released back into the atmosphere.
Guido van der Werf, a forest scientist at the University of Amsterdam, estimated the total emissions from the fires this year to be 1.62 billion tonnes of CO2, or about 0.44 gigatonnes of carbon (GtC) so far this season.
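The conversion from mass of CO2 to mass of carbon is simply the ratio of molar masses. A minimal sketch of the arithmetic:

```python
# Converting mass of CO2 to mass of carbon: multiply by the
# molar-mass ratio M(C) / M(CO2) = 12.011 / 44.009, about 0.273.
M_C = 12.011     # g/mol, carbon
M_CO2 = 44.009   # g/mol, carbon dioxide
co2_gt = 1.62    # van der Werf's estimate, Gt CO2
carbon_gt = co2_gt * M_C / M_CO2
print(round(carbon_gt, 2))   # gigatonnes of carbon
```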
Let’s put this number into some context.
Of the 600-700 GtC that have been sequestered in peatlands globally since the last ice age, about 100 GtC are in the tropics; the rest are in boreal and Arctic regions. That means that the emissions from the fires this season alone are nearly half a percent of the total carbon accumulated in all tropical peatlands since the last ice age ended a little more than 11,000 years ago, or 0.15% of all the carbon emitted by human activity since the industrial age began.
Indonesian peatlands accumulate carbon at a rate of about 55 grams of carbon per square meter per year, higher than most tropical peatlands elsewhere. If we generously extrapolate this rate to the roughly 400,000 km² of tropical peatlands worldwide, their combined accumulation is only about 0.02 GtC annually. At 0.44 GtC, the Indonesian peat fires this season have emitted what took all of Earth’s tropical peatlands at least 22 years to accumulate.
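This back-of-the-envelope extrapolation can be reproduced in a few lines; the "at least 22 years" figure follows from rounding the global rate down to 0.02 GtC per year:

```python
# Extrapolate the Indonesian accumulation rate to all tropical peatlands.
rate_g_m2_yr = 55.0              # g C per m^2 per year (Indonesian rate)
area_m2 = 400_000 * 1e6          # ~400,000 km^2 of tropical peatlands
annual_gtc = rate_g_m2_yr * area_m2 / 1e15   # grams -> gigatonnes
fire_gtc = 0.44                  # this season's fire emissions, GtC
years_to_reaccumulate = fire_gtc / annual_gtc
print(annual_gtc, years_to_reaccumulate)     # ~0.022 GtC/yr, ~20 years
```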
Forest fires are set each year, often illegally, for agriculture. The most common purpose is to produce palm oil, an economically important commodity, which is an ingredient in thousands of foods and cosmetics. Indonesia produced about 31 million tonnes of palm oil last year, and is projected to produce 31.5 million tonnes this year. The U.S. Energy Information Administration reports that burning a gallon of gasoline releases 2.5 kilograms of carbon to the atmosphere. This year, a gallon of palm oil from Indonesia will have released almost 10 times that, 24 kilograms of carbon, from burning peat forest.
Palm oil can be produced sustainably, but there is great economic pressure on small farmers to produce palm oil as quickly and cheaply as possible. Enforcement of laws that prohibit burning of peat forest is left to local authorities who are more easily influenced by their constituents and the palm oil companies that support them than by the Indonesian national government. Consumer pressure may be the only way to reduce these unsustainable agricultural practices.
What makes this year so bad?
El Niño conditions in the Pacific Ocean have made Indonesia and the rest of Southeast Asia particularly dry, lowering the water tables in the peat forests and making much more of the peat available to burn. Normally, peat does not burn below the water table, where it is saturated with water. Further, many places where land is cleared by burning are also drained, artificially lowering water tables even further.
During the last major El Niño, in 1997-98, about 0.95 GtC were released from Indonesian peat forest fires. This year only about 0.44 GtC have burned away so far, but the season is not yet over.
It’s been a busy few days as we wrap up ice sampling and make the transition to sampling by boat at the regular Palmer LTER stations. This afternoon we’ll break down the ice removal experiment we started over a week ago. On Monday we went out for the final sampling at our ice station – though if the ice sticks around for a couple more weeks we’ll try to go out one final time to see how the spring ice algal bloom is developing. The heavy snow cover on the sea ice has delayed the start of the bloom, but things are starting to happen. During our last sampling effort we lowered a GoPro camera underneath the ice to take a look.
You’ll notice a couple of interesting things about the underside of the ice. First, it’s extremely rough. Landfast sea ice often looks like this; the ice forms from many small floes being compressed together against the shoreline during the fall. As a result there is a lot of “rafting” of small ice floes atop one another. This can present some real challenges when selecting a sampling spot. The first couple of holes that we tried to drill exceeded what we knew to be the mean thickness of the sea ice. It took a few tries to find a representative spot.
You’ll also notice that the ice has a distinct green color, concentrated on the lower (or higher, in the video) rafts. That’s the start of the ice algal bloom. If the ice were snow-free, the bloom would have developed by now into a thick carpet. You can contrast the video above with the image at right of sea ice sampled from McMurdo Sound roughly three weeks earlier in the season (in 2011). Although much thicker, that ice was covered by only a few centimeters of snow. If the Arthur Harbor ice sticks around for a couple more weeks it will develop some good growth (unless the krill come along and graze the algae down). You might be wondering why, if the algae are limited by the availability of light, they are concentrated on the deeper rafts, farther from the light. I’m not entirely sure, but I have a hypothesis. I’ve been searching for a literature reference for this and haven’t located one yet, but I recall hearing a talk from an expert on the optical physics of sea ice describing how the sunlight that manages to penetrate sea ice reaches a maximum some distance below the ice. This might seem counter-intuitive, but it makes sense if you consider the geometry of the floes that coalesced to make the ice sheet.
As you can observe in the video the light is largely penetrating the ice around the edges of these floes. The rays of light enter the water at an angle, and intersect at some distance below the ice determined by the mean size of the floes and (I’m guessing) the angle of the sun. The depth where this intersection happens is the depth of greatest light availability. Above this depth the water is “shaded” by the ice floes themselves. In our case I think this depth corresponds with the depth of those deeper floes. Unfortunately our crude hand-deployed light meter and infrequent sampling schedule are insufficient to actually test this hypothesis. We’d need a much higher-resolution instrument that could take measurements throughout the day. Something to think about for the future.
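The geometry of this hypothesis can be sketched with a toy calculation. This is only an illustration, with the floe size, sun elevation and index of refraction all assumed values:

```python
import math

# Toy model: opaque floes of diameter d separated by narrow leads.
# Light entering a lead refracts to an in-water angle theta_w (Snell's
# law); rays from neighboring leads cross beneath a floe's center at
# roughly z = (d / 2) / tan(theta_w), the depth of greatest light.
n_water = 1.33            # index of refraction of seawater (assumed)
sun_elevation_deg = 30.0  # sun elevation above the horizon (assumed)
floe_diameter_m = 10.0    # mean floe size (assumed)

incidence = math.radians(90.0 - sun_elevation_deg)   # angle from vertical
theta_w = math.asin(math.sin(incidence) / n_water)   # Snell's law
z_max_light_m = (floe_diameter_m / 2) / math.tan(theta_w)
print(round(z_max_light_m, 1))   # depth of maximum light, meters
```

With these numbers the brightest depth falls a few meters below the ice, consistent with the idea that the deeper rafts sit near the depth of greatest light availability.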
In the meantime, Rutgers University undergraduate Ashley Goncalves, spending her junior year with the Palmer LTER project at Palmer Station, made this short video describing the process of collecting water for our experiment from below the ice in Arthur Harbor. Let the boating begin!
There is a lot more reason for optimism about the Paris climate talks than there was before Copenhagen in 2009. In particular, this time around, President Obama has taken clear steps to reduce U.S. emissions of greenhouse gases, mainly via executive action in the face of an intransigent Congress. Perhaps as important was the president’s recent cancellation of the Keystone pipeline, which was a largely symbolic move, but drew public attention to the issue of “stranded assets.” Why should fossil fuel companies continue to explore for more oil and gas, and produce fuel from increasingly low-grade reservoirs, using technically difficult methods, when identified reserves are already larger than any safe limit on total emissions?
This said, the commitments made by various nations in advance of the Paris meeting are far too small to effectively curb increasing atmospheric CO2, as articulated by Steve Koonin in a New York Times op-ed a week ago, for example. There are two ways of looking at this. The first is Koonin’s: “The flood is coming, start building your ark.” No doubt he is right, to some degree. It seems probable that growth of fossil fuel emissions will continue, and atmospheric CO2 will exceed 600 ppm by mid-century. At that point, many expensive adaptations to climate change will already be underway. Also, the negative consequences of greenhouse gas accumulation may be clear enough to warrant implementation of a palette of methods for carbon-dioxide removal from air. (Interested readers should refer to the National Research Council report on this topic). Alternatively, we will be stuck with high greenhouse gas concentrations and continued warming for centuries to come.
Another perspective is that in Paris the international community will commit itself to taking effective action designed to curb emissions and avoid warming beyond 2° C. Having made this commitment, together with some first steps, countries may return in future years to amend their specific regulations in order to meet their agreed goal. This is essentially what happened with the regulation of CFC (chlorofluorocarbon) emissions to preserve the Earth’s stratospheric ozone layer. The first international treaty, the Montreal Protocol, signed in 1987, committed nations to effective action, but its specific regulations were insufficient for success. Because a commitment had been made, however, subsequent revisions were relatively easy to implement, and ultimately success was—more or less—achieved.
Finally, it is just possible that Paris is less important than it seems. The dramatic fall in price for electrical generation using solar photovoltaic (PV) technology has been accompanied by 35 percent annual growth of PV capacity in this century, with wind power capacity growing in parallel at more than 25 percent per year. This is, in part, a success story for tax breaks and other incentives that have driven rapid growth, which in turn reduced the unit costs.
Conventional wisdom is that this rapid pace will soon diminish. When wind and solar electrical generation exceed a few tens of percent of the total, energy storage will become essential to accommodate the temporally intermittent, spatially dispersed nature of renewable power generation. Meanwhile, energy storage technology is improving very slowly, so most people think that a storage bottleneck will persist past 2050.
However, perhaps this conventional wisdom is wrong. If the international community were to fully understand the threat of climate change, and the likely cost of mitigation and adaptation, perhaps we would commit to continued tax breaks and incentives, and propel the renewable energy transition toward completion. In the long run, I am sure this would be less expensive than coping with the consequences of growth in greenhouse gas emissions through 2050. The energy industry can be incredibly nimble, as recently exemplified by 35 percent annual growth in fracked oil production in North Dakota. If solar PV and wind, including storage, were to become truly cost-competitive at the utility scale, there is no doubt that renewable capacity would continue to double every few years.
Finally, one more thought. The energy transition, and/or mitigation of CO2 emissions, will surely be expensive. However, it is not clear to me why this is considered to be a drain on the economy. We already spend plenty of money taking care of waste products, via sewage treatment and garbage disposal. It seems to me that large-scale replacement of energy infrastructure, or carbon dioxide removal from air, like the recent replacement of communications and data storage infrastructure, will serve to create jobs and economic growth. Of course, such processes will involve a net transfer of resources to some groups, away from others. Perhaps that is the main impediment to progress.
This post is one in a series reflecting on what has changed since the climate talks of 2009 in Copenhagen. Kelemen was among those writing for State of the Planet about the Copenhagen talks in 2009. Here is an excerpt from back then (the full text is here):
… Somehow, some people have come to believe that if a single study suggesting human-induced climate change is incorrect, the entire scientific basis for the hypothesis is invalidated. A corollary, implicitly adopted by some “believers” and “skeptics” alike, is that predictions of warming due to human CO2 emissions must be almost certain in order to justify major efforts to reduce CO2 output.
… Nevertheless, everyone involved needs to embrace the idea that all scientists are skeptics; that all scientific theories are open to doubt; and in particular that future projections of climate change are subject to considerable uncertainty. Furthermore, the economic and environmental impacts of warming are also uncertain, as are the costs of CO2 mitigation. When scientists hide these uncertainties, or simply don’t discuss them, they lose credibility.
… Does this mean that no political action should be taken until scientific uncertainties are resolved? Of course not. … atmospheric CO2 concentration continues to rise, more rapidly and to higher values than recorded in gas trapped in glacial ice over the past 500,000 years. This is mainly due to use of fossil fuels, and it is pushing us further and further into uncharted territory. Though there are many other factors that influence global climate, there is no doubt that CO2 is a greenhouse gas. And, in addition to the threat of climate change, there are ample reasons to conserve energy and reduce our dependence on fossil fuels. The longer we delay, the higher will be the cost of limiting CO2 in the atmosphere. The cost may be high now, but it will only get higher in the future.
The Science, Revisited
The climate is changing. We’re causing it. It’s going to affect our lives and our livelihood, if it isn’t already. It’s going to be expensive. But we can do something about it.
That’s how a group of young scientists at a conference in 2013 summed it up. This video, shot by climate scientist William D’Andrea of the Lamont-Doherty Earth Observatory, explains in the simplest terms possible what we ought to know about climate change, and why we should care.
This is one in a series of posts looking back at some key State of the Planet stories about climate science. The original post about D’Andrea’s project, with a second video, is here.