California: Planning for a shorter rainy season and more frequent extreme storms

By Claire Kouba and J. Pablo Ortiz Partida

California’s hydrologic future is muddled by a fundamental uncertainty: will the state get wetter or drier? Climate models disagree on this question, but provide insights on other important water management questions.

The wetter-or-drier question has been studied often in government reports (DWR CCTAG, 2015; U.S. Bureau of Reclamation, 2016) and a variety of academic studies (Connell-Buck et al., 2011; Dogan et al., 2019; Medellín-Azuara et al., 2008). Forecasts of California's mean annual precipitation commonly range from roughly 20% wetter to 20% drier than the historical average.

This focus on the uncertainty of future mean annual precipitation has unnecessarily deterred investment in adaptive management of water resources (Persad et al., 2020). While there is little model agreement on change in mean annual precipitation, there is much more model agreement on other hydroclimate metrics relevant to water resources management, including:  

  • snowpack declines
  • increased fraction of precipitation on extreme rainfall days
  • a shorter, sharper rainy season
  • increased evapotranspiration (ET)
  • higher frequency of extremely wet and extremely dry years, and
  • higher incidence of “whiplash” years where an extreme dry year follows an extreme wet year or vice versa.

Future shifts in these metrics were estimated in a recent study (Persad et al., 2020) using 10 statistically downscaled global climate models and two common emissions scenarios. These predicted shifts were used to alter the climate inputs to a regional hydrologic model for the Scott Valley in the Klamath Basin (Case Study 1), and to assess changes in Oroville reservoir storage based on outputs from Knowles et al. (2018) (Case Study 2).

Case Study 1 – Greater extremes threaten regional groundwater sustainability

In the first case, a more extreme rainfall regime was simulated in Scott Valley with primarily agricultural land cover using the Scott Valley Integrated Hydrologic Model (Foglia et al., 2018; Tolley et al., 2019). The objective was to estimate effects of increased average storm intensity on sustainable groundwater management.

To simulate the upper range of scenarios for this region under RCP 8.5 emissions, daily precipitation in the historical record was redistributed within each water year: the 5% of highest-rainfall days received 7% more water, and the remaining rainy days were reduced proportionally so that each water year's total precipitation stayed at its historical value. All other model inputs (reference ET, stream inflows, etc.) remained at their estimated historical values.
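The redistribution described above can be sketched in a few lines. The function below is an illustrative reconstruction, not the study's actual code; only the 5% and 7% figures come from the text, and everything else (function name, inputs) is assumed for demonstration:

```python
import numpy as np

def concentrate_rainfall(daily_precip, top_frac=0.05, boost=0.07):
    """Redistribute daily precipitation within one water year so that the
    highest-rainfall days receive more water while the annual total is
    preserved (illustrative sketch of the perturbation in the text)."""
    p = np.asarray(daily_precip, dtype=float)
    n_top = max(1, int(round(top_frac * len(p))))

    # Flag the wettest days of the water year.
    top_mask = np.zeros(len(p), dtype=bool)
    top_mask[np.argsort(p)[-n_top:]] = True

    out = p.copy()
    out[top_mask] *= 1.0 + boost
    added = out[top_mask].sum() - p[top_mask].sum()

    # Scale the remaining rainy days down proportionally so the
    # water-year total matches its historical value.
    rest_mask = (~top_mask) & (p > 0)
    rest_total = p[rest_mask].sum()
    if rest_total > 0:
        out[rest_mask] *= 1.0 - added / rest_total
    return out
```

Applied to each water year of the historical record, this preserves annual totals while sharpening the extremes, which is the property the case study isolates.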

The results suggest that a temporal concentration of rainfall increases both recharge through soils to the aquifer and crop irrigation demand. This is because there are fewer days when rainfall is sufficient for crop water needs (increasing the number of days when irrigation is needed), while the number of days when rainfall exceeds soil field capacity increases (increasing the volume of infiltration that becomes groundwater recharge rather than crop transpiration). Because this case does not incorporate other predicted phenomena, such as increased average ET, this is a conservative estimate of the increase in irrigation demand.
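This partitioning logic can be made concrete with a toy daily water budget. The crop demand and field capacity values below are hypothetical, chosen only to show why concentrating the same total rainfall raises both irrigation demand and recharge:

```python
def daily_budget(rain, crop_demand=4.0, field_capacity=25.0):
    """Toy daily partitioning in mm/day (hypothetical parameter values).
    Rain first meets crop water demand; any shortfall must be irrigated.
    Rain in excess of field capacity percolates past the root zone and
    becomes groundwater recharge rather than crop transpiration."""
    irrigation = max(0.0, crop_demand - rain)
    recharge = max(0.0, rain - field_capacity)
    return irrigation, recharge

# Same 30 mm of rain, spread evenly vs. concentrated on one day:
even = [daily_budget(r) for r in (15.0, 15.0)]  # no irrigation, no recharge
conc = [daily_budget(r) for r in (30.0, 0.0)]   # 4 mm irrigation, 5 mm recharge
```

With evenly spread rain, every day meets crop demand and none exceeds field capacity; concentrating it creates both a dry day that needs irrigation and a wet day that generates recharge.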

Notably, though the increases in recharge and irrigation demand are small (2% or less) over the whole model period, they can more substantially affect water use behavior and the sustainability of groundwater budgets in individual years, such as 2010.

Case Study 2 – Shifting inflows reduce reservoir storage

Continuing the current rates of heat-trapping gas emissions would likely further concentrate reservoir inflows into the already wet winter months (November-March), as shown in many climate change studies for California since the late 1980s (Cayan et al., 2008; Gleick, 1989; Hayhoe et al., 2004). The second case study assessed how changing seasonality of inflows, driven by predicted shifts in the timing and type of winter precipitation, would affect Lake Oroville, California's second largest reservoir.

Ironically, even though average reservoir inflows may be greater with severe climate change, the timing shift means the extra water would come when current operation rules require releasing excess water to protect against floods (Knowles et al., 2018). Because the extra outflows would occur in the wet winter months, when downstream agricultural water users don’t need it, such releases reduce average water storage in the reservoir and ultimately reduce water availability for the dry season. The data show that stored water declines by roughly 17 percent annually and by more than 35 percent during September and October, when reservoir storage is already at its lowest.
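The storage penalty from winter-shifted inflows can be illustrated with a toy monthly reservoir balance. All volumes, the demand, and the flood-control rule below are hypothetical, not Oroville's actual operating rules:

```python
def simulate_storage(inflows, winter, capacity=100.0, flood_space=30.0,
                     demand=8.0, storage0=60.0):
    """Toy monthly reservoir trace (hypothetical volumes).
    In months flagged as winter, storage above (capacity - flood_space)
    must be released for flood protection; water released then is
    unavailable to meet dry-season demand later."""
    s, spilled, trace = storage0, 0.0, []
    for q, is_winter in zip(inflows, winter):
        s = min(s + q - demand, capacity)
        limit = capacity - flood_space if is_winter else capacity
        if s > limit:                 # flood-control release
            spilled += s - limit
            s = limit
        trace.append(s)
    return trace, spilled

# Nov-Oct, same 100 units of annual inflow, historical vs. winter-shifted:
winter = [True] * 5 + [False] * 7     # Nov-Mar flood rule in effect
historical = [10, 12, 14, 14, 10, 12, 10, 6, 4, 2, 2, 4]
shifted    = [16, 18, 18, 16, 12, 6, 4, 2, 2, 2, 2, 2]
trace_h, spill_h = simulate_storage(historical, winter)
trace_s, spill_s = simulate_storage(shifted, winter)
# The shifted regime spills more in winter and ends autumn with less storage.
```

Even with identical annual inflow, the shifted regime forces larger flood-control releases in winter and leaves less carryover storage in September and October, the qualitative pattern the case study reports.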

Moving forward

To capture the variability of potential future climate, operational models driven by daily or subdaily inputs are needed (e.g., Willis et al., 2011). Most current regional water system models use monthly inputs, making it more difficult to evaluate changes that might result from a higher frequency of extreme storms.

Overall, uncertainty in future mean annual precipitation does not force us to rely on assumptions of stationarity in hydroclimate forecasts. These findings suggest that researchers and agencies can begin incorporating these less-discussed but better-agreed-upon hydroclimate shifts into water planning efforts.
