Author: Aaron Dubrow

Bay Area storms get wetter in a warming world

Source(s): The University of Texas at Austin

The December 2014 North American Storm Complex was a powerful winter storm, referred to by some as California's "Storm of the Decade." Fueled by an atmospheric river originating over the tropical waters of the Pacific Ocean, the storm dropped 8 inches of rain in 24 hours, brought wind gusts of 139 miles per hour, and left 150,000 households without power across the San Francisco Bay Area.

Writing in Weather and Climate Extremes this week, researchers described the potential impacts of climate change on extreme storms in the San Francisco Bay Area, including the December 2014 North American Storm Complex.

Re-simulating five of the most powerful storms that have hit the area, they determined that under future conditions some of these extreme events would deliver 26-37% more rain, even more than is predicted simply by accounting for air's ability to carry more water in warmer conditions.
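The thermodynamic baseline referenced here is the Clausius-Clapeyron relation, under which the atmosphere's capacity to hold water vapor rises by roughly 7% for each degree Celsius of warming. As a rough, illustrative calculation (the warming level used here is not taken from the paper), about 3 degrees Celsius of warming would give

    1.07^3 ≈ 1.23, or roughly 23% more available moisture,

noticeably less than the 26-37% precipitation increases the simulations produced for some of the storms.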

However, they found these increases would not occur with every storm, only in those that combine an atmospheric river with an extratropical cyclone.

The research — funded by the City and County of San Francisco and in partnership with agencies including the San Francisco Public Utilities Commission, Port of San Francisco, and San Francisco International Airport — will help the region plan its future infrastructure with mitigation and sustainability in mind.

"Having this level of detail is a game changer," said Dennis Herrera, General Manager of the San Francisco Public Utilities Commission, which was the lead City agency on the study. "This groundbreaking data will help us develop tools to allow our port, airport, utilities, and the City as a whole to adapt to our changing climate and increasingly extreme storms."

These first-of-their-kind forecasts for the city were made possible by the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC) and the Cori system at the National Energy Research Scientific Computing Center (NERSC) — two of the most powerful supercomputers in the world, supported by the National Science Foundation and Department of Energy respectively.

Hindcasting with the Future in Mind

Certain facets of our future climate are well established – higher temperatures, rising seas, species loss. But how will greater greenhouse gas concentrations and warmer air and oceans affect extreme weather, like hurricanes, tornadoes, and heavy rainfall? And where precisely will these changes be greatest, and under what conditions?

Forecasting the natural hazards of the future is the mission of Christina Patricola, Assistant Professor of Geological and Atmospheric Sciences at Iowa State University and lead author on the Weather and Climate Extremes paper. Her research helps quantify and understand the risks we face from natural hazards in the future.

Using supercomputers allowed Patricola to model the region with 3 kilometer resolution. Scientists believe this level of detail is needed to capture the dynamics of storm systems like hurricanes and atmospheric rivers, and to predict their impact on an urban area.

For each of the historical storms, Patricola and her collaborators ran 10-member ensembles (independent, slightly different simulations) at 3 kilometer resolution, a process called "hindcasting" (as opposed to forecasting). They then adjusted the greenhouse gas concentrations and sea-surface temperatures to predict how these historical storms would look in the projected future climates of 2050 and 2100.

Patricola calls these "storyline" experiments: computer models that are meant to be instructive for thinking about how historically-impactful storm events could look in a warmer world. Focusing on events that were known to be impactful to city operations provides a useful context for understanding the potential impacts of events if they occurred under future climate conditions.
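The published study does not include code, but the general "pseudo-global-warming" recipe behind such storyline hindcasts can be sketched in a few lines. Everything below is an illustrative toy: the function names, the uniform +2.5 K warming, and the future CO2 value are placeholders standing in for model-derived projections, and a real experiment would drive a full 3 km regional model rather than simple arrays.

import numpy as np

def perturb_boundary_conditions(sst_hist, sst_delta, co2_future_ppm):
    # Pseudo-global-warming step: add a projected sea-surface temperature
    # change to the observed SSTs and swap in a future CO2 concentration.
    return sst_hist + sst_delta, co2_future_ppm

def make_ensemble(base_state, n_members=10, seed=0):
    # Ten slightly perturbed copies of the initial state stand in for the
    # ensemble members; in the real experiments each member initializes
    # an independent 3 km regional simulation of the storm.
    rng = np.random.default_rng(seed)
    return [base_state + rng.normal(scale=0.01, size=base_state.shape)
            for _ in range(n_members)]

# Toy "observed" SST field for the December 2014 storm period (kelvin).
sst_hist = np.full((100, 100), 288.0)

# Historical hindcast: boundary conditions as observed.
members_hist = make_ensemble(sst_hist)

# End-of-century storyline: same storm, warmer ocean, more CO2 (values illustrative).
sst_future, co2_future = perturb_boundary_conditions(sst_hist, sst_delta=2.5,
                                                     co2_future_ppm=900.0)
members_future = make_ensemble(sst_future)

The scientific content lies in comparing the historical and future ensembles storm by storm, which is what allows the precipitation changes to be attributed to the imposed warming.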

To identify the type of storm event, the researchers examined sea-level pressure, precipitable water, and 10-meter winds throughout the evolution of each storm. [Figure: these variables at the peak of the event, and storm total precipitation (mm) from (a) gridMET observations and (b) the ensemble mean of the historical simulations on the 3 km resolution domain, for the December 1-7, 2014 storm. Patricola et al. 2022]

The study doesn't address changes in the frequency of extreme storms in the future, so it can't say how overall precipitation will change (another pressing question for California planners), she said. But its results can help decision-makers understand trends in the intensity of worst-case-scenario storms and make informed choices.

On the West Coast, much of the precipitation that falls is associated with atmospheric rivers (ARs), which transport a substantial amount of moisture in a narrow band, Patricola explained. Some of the storms they looked at featured ARs alone. Others had ARs at the same time as low-pressure systems known as extratropical cyclones (ETCs).
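Atmospheric rivers are commonly quantified with the integrated vapor transport (IVT) diagnostic, which sums the moisture flux through the depth of the atmosphere. The study's exact detection criteria are not reproduced here, but a minimal sketch of the standard calculation, with made-up example values, looks like this:

import numpy as np

G = 9.81  # gravitational acceleration (m s^-2)

def integrated_vapor_transport(q, u, v, p):
    # IVT magnitude (kg m^-1 s^-1) for one atmospheric column.
    # q: specific humidity (kg/kg); u, v: winds (m/s); p: pressure (Pa),
    # ordered from the surface upward, so pressure decreases along the array.
    qu = -np.trapz(q * u, p) / G  # zonal moisture flux (sign flip: p decreases)
    qv = -np.trapz(q * v, p) / G  # meridional moisture flux
    return np.hypot(qu, qv)

# Made-up profile resembling the moist, fast low-level flow in an AR core.
p = np.array([1000, 925, 850, 700, 500, 300], dtype=float) * 100.0  # Pa
q = np.array([0.012, 0.011, 0.009, 0.005, 0.002, 0.0004])           # kg/kg
u = np.array([15.0, 20.0, 25.0, 25.0, 20.0, 15.0])                  # m/s
v = np.array([10.0, 15.0, 20.0, 20.0, 15.0, 10.0])                  # m/s

print(integrated_vapor_transport(q, u, v, p))  # well above the ~250 kg m^-1 s^-1 often used to flag ARs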

"We found something very interesting," she said. "Precipitation increased substantially for events with an atmospheric river and a cyclone together, whereas precipitation changes were weak or negative when there was only an atmospheric river."

The difference, she believes, lies in the lifting mechanism. In general, heavy precipitation requires moist air to ascend. While the AR-only storms showed a future increase in atmospheric moisture, the storms with an AR and ETC showed a future increase in atmospheric moisture and rising air. Additional investigations will explore this relationship.

High Performance Climate Science

Patricola has used TACC supercomputers for climate and weather modeling since 2010, when she was a graduate student at Cornell University working with leading climate scientist Kerry Cook (now at The University of Texas at Austin). She recalls that her first models had a horizontal resolution of 90 km, 30 times coarser than today's, and were considered state-of-the-art at the time.

"It was a very big help to have the resource from TACC and NERSC for these simulations," she said. "We're interested in extreme precipitation totals and hourly rainfall rates. We had to go to a high resolution of 3 km to make these predictions. And as we increase resolution, the computational expense goes up."

Patricola has used the methodology she developed to understand other phenomena, like how tropical cyclones may change in the future. She and collaborator Michael Wehner reported on these changes in a 2018 Nature paper. "If a hurricane like Katrina happened at the end of the 21st century, what could it be like? More rainfall, higher winds? Our method can be used for any type of weather system that can be hindcasted."

In the next phase of the San Francisco project, Patricola and her collaborators will work with city staff to understand what the projected weather changes mean for city operations.

"This project is relatively unique and one of the initial projects like this, working in very close collaboration between city agencies and climate scientists," she said. "It can serve as a good example of what climate science can do to provide the best possible information to cities as they prepare for the future."
