If you had asked scientists a few years ago whether a specific hurricane was caused by climate change, most would have told you that, while climate change raises the risks, no single weather event could yet be attributed to it.
That’s starting to change.
New research and more powerful computer models are advancing scientists’ ability to tease apart the forces that can worsen extreme weather. Last summer, in an early example of this kind of work, Columbia University scientists were able to make the first estimates of the role climate change played in the recent California drought.
In a new report released March 11, a committee of the National Academy of Sciences assesses the young field of attribution studies for several types of extreme events. It recommends future research and guidance to help the field advance and contribute to understanding of the risks ahead.
“I think the most important message from this report is that attribution can be done. Science has reached a point where we can begin to see the human influence on climate in some individual events,” said Adam Sobel, a member of the National Academy of Sciences committee and a research scientist at Columbia University’s Lamont-Doherty Earth Observatory and Fu Foundation School of Engineering and Applied Sciences.
No weather event has a single cause, but rising temperatures may intensify a drought, for example, or make heat waves more frequent. Extreme events that are most closely connected to temperature – heat waves in particular – tend to have the greatest certainty in climate attribution studies, followed by drought and extreme precipitation, the committee writes. Tornadoes, which occur on a small scale, and wildfires, which are influenced by other human actions such as brush clearing, are the most challenging, but advances in the science could make attribution possible even for these events.
Understanding the Forces Behind Extreme Weather
Attribution studies generally try to determine how the intensity or frequency of an event was altered by climate change or other factors, such as weather patterns like El Niño. Often, they take the details of an event – extreme rainfall, for example – and put them into a computer model, then use the model to show what would have happened without certain variables, such as rising temperatures caused by increased greenhouse gases. Some use models to estimate the probability of an event occurring with or without those changes, then use the probabilities to estimate the fraction of attributable risk.
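The fraction-of-attributable-risk calculation mentioned above can be sketched in a few lines. This is an illustrative toy example, not any group's actual method: the probabilities below are hypothetical placeholders, not outputs of a real climate model.

```python
# Illustrative sketch of the fraction of attributable risk (FAR):
# FAR = 1 - P0/P1, where P1 is the estimated probability of the event
# in the world as observed, and P0 is its probability in a counterfactual
# world without the forcing (e.g., without greenhouse-gas-driven warming).

def fraction_attributable_risk(p_with: float, p_without: float) -> float:
    """Return the fraction of the event's risk attributable to the forcing."""
    if p_with <= 0:
        raise ValueError("event probability in the actual world must be > 0")
    return 1.0 - (p_without / p_with)

# Hypothetical numbers: a heat wave with a 10% annual chance in today's
# climate versus a 2% chance in model runs without human-caused warming.
far = fraction_attributable_risk(p_with=0.10, p_without=0.02)
print(f"Fraction of attributable risk: {far:.2f}")  # prints 0.80
```

In this toy case, 80 percent of the event's risk would be attributed to the warming; real studies derive the two probabilities from large ensembles of model simulations with and without human influence.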
“Attribution depends not just on how good our models are at simulating each event type, but also how well we understand the basic physics of each one and how good our historical observations are,” Sobel said.
Fundamental research into understanding the forces behind extreme weather has been underway across Lamont and the School of Engineering for many years, particularly for use in risk assessment.
Scientists at the two schools have been developing increasingly sophisticated models, both to understand the physics of how extreme weather behaves and, from a practical standpoint, to assess its future risk to communities. Others have been reconstructing the history of temperature swings and droughts in regions around the world using tree rings, ocean sediment and other natural records.
Some of the newest work in severe weather risk assessment is being done on tornadoes and hail as increasing computing power makes high-resolution modeling possible. Sobel and Michael Tippett have been developing environmental indices for tornadoes and applying them to El Niño and other weather variations to build seasonal forecasts. The underlying science for the El Niño work, which correctly predicted increasing tornadoes across the U.S. South last fall, is closely related to what would be used for attribution, Sobel said.
Connecting Attribution and Forecasting
In the National Academy of Sciences report, the committee writes that attribution studies are strongest for events that have a long-term historical record of observations and can be simulated in climate models. The committee also suggests guidance, including using multiple lines of evidence and clearly communicating how the attribution question is framed.
Looking ahead, the committee suggests connecting attribution analysis with operational weather forecasting to provide valuable warnings of the likelihood of extreme events occurring days or even months in advance. If a community knows it faces a growing risk of extreme events, it can make better choices. “For example, in the wake of a devastating event, communities may need to make a decision about whether to rebuild or relocate. Such a decision could hinge on whether the occurrence of an event is expected to become more likely or severe in the future – and if so, by how much,” the authors write.
“People pay attention to extreme events in the present in a way that they never will to any future climate projection,” said Sobel, who also leads the Columbia Initiative on Extreme Weather and Climate. “Such events become teachable moments – occasions for people to think about what climate change really means. In those moments, it is natural for people to want to understand how climate change actually influenced the event that just happened. Attribution tries to answer that.”