The Dam-biguity

Catastrophe modeling of floods is not just a problem of stochastic rainfall, run-off, and channel flow. It also requires anticipating human actions, for flood is as much a man-made peril as it is a natural one.

Passive human interventions, such as permanent flood defenses, are less of a challenge to model than active interventions. Will the portable flood defenses be installed in time?

Perhaps they have already been borrowed by a community upstream, as happened to one town along the River Severn in the U.K. during the summer 2007 floods.

In “active flood management,” land and properties upstream are deliberately sacrificed to protect a downstream concentration of value. In modeling, one can assume these decisions are rational and involve carefully calculated trade-offs. The same cannot always be said for actual human behavior.

In 1927 the grandees of New Orleans, concerned that the city was about to be inundated by the Mississippi River, blew up the levees 13 miles downstream of the city with 39 tons of dynamite, with the idea of speeding up the flow of water past the city. The action proved completely unnecessary.

The Great Mississippi Flood of 1927 in Natchez, Mississippi, showing a submerged train with boats brought in for rescue (Courtesy of NOAA’s National Weather Service Collection from the family of Captain Jack Sammons, Coast and Geodetic Survey)

One of the biggest challenges in flood modeling is how to factor in the role of dams.

  • What is the water level in the dam likely to be when the flood wave arrives?
  • How are the operators of the dam likely to have behaved ahead of the flood?

In many low-latitude countries, the problem is that dam operators often have two irreconcilable objectives.

Objective 1: The dam operator has to hold onto as much water as possible through the rainy period so that water remains available to all agricultural, industrial and domestic users through the dry season. The reservoir should be completely full the day the rain stops.

Objective 2: The dam operator is expected to hold back a large proportion of a flood wave, releasing the water after the wave has passed. To be effective the operator needs to have as little water as possible in the reservoir before the flood arrives.

Simply because dry years tend to happen more often than extreme floods, most operators work to the first objective.
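
To make the trade-off concrete, here is a minimal sketch of the arithmetic, using an entirely hypothetical reservoir (the capacity figure and storage fractions are invented for illustration). The volume a dam can clip from a flood wave is simply whatever space is still empty when the wave arrives:

```python
# Hypothetical illustration of the storage trade-off; the capacity and
# storage fractions below are invented, not taken from any real dam.

CAPACITY_GL = 1_000.0  # total reservoir capacity in gigalitres (assumed)

def retention_capacity(storage_fraction: float) -> float:
    """Empty volume available to absorb a flood wave, in GL."""
    return CAPACITY_GL * (1.0 - storage_fraction)

for fraction in (0.2, 0.5, 0.8, 0.95):
    print(f"reservoir {fraction:.0%} full -> "
          f"{retention_capacity(fraction):6.0f} GL of flood wave can be retained")
```

A reservoir run at 95% full under Objective 1 leaves almost nothing in hand when the wave arrives.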

In Thailand, after a drought in 2010, the dam operators were accused of not holding enough water in reserve. They therefore kept their reservoirs topped up at the start of 2011, and had little capacity left to manage the flood waves of the ensuing summer and autumn.

Earlier that same year, much the same situation had played out in Brisbane, Australia. After catastrophic floods in 1974, a main branch of the Brisbane River was dammed to create Lake Wivenhoe. Over the years the dam came to be used increasingly for water retention. When the intense rains arrived in early January 2011, the dam operators soon ran out of storage capacity. In March 2011 the Insurance Council of Australia claimed that “release from Wivenhoe Dam raised water levels in the Brisbane River by up to 10 meters” – and that the January flood event could be classed as a “dam release flood.”

“Dammed if you do and dammed if you don’t.”

Being a dam operator can be a very stressful job! Ideally, dam operators would have optimization software to assist with these decisions, taking in long-range rainfall forecasts along with the costs of downstream flooding and the expected price of water in a period of low rainfall, and returning the optimum release strategy.
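
To give a flavor of what such software might do, here is a minimal sketch that picks a target storage level by minimizing expected cost across the two objectives. Every probability, volume, and cost below is invented for illustration; a real system would work from forecast ensembles and hydraulic models rather than two fixed scenarios:

```python
# A toy expected-cost optimization over the pre-season storage target.
# All probabilities, volumes, and costs are hypothetical placeholders.

P_EXTREME_FLOOD = 0.05   # seasonal probability of a damaging flood wave (assumed)
P_DRY_YEAR = 0.20        # seasonal probability of a dry year (assumed)
FLOOD_WAVE_GL = 600.0    # volume of the design flood wave, GL (assumed)
CAPACITY_GL = 1_000.0    # reservoir capacity, GL (assumed)
DAMAGE_PER_GL = 5.0      # damage per GL of unretained flood water, $M (assumed)
SHORTAGE_PER_GL = 1.0    # cost per GL of dry-season shortfall, $M (assumed)

def expected_cost(storage_fraction: float) -> float:
    stored = CAPACITY_GL * storage_fraction
    spare = CAPACITY_GL - stored
    unretained = max(FLOOD_WAVE_GL - spare, 0.0)  # flood water the dam cannot clip
    shortfall = CAPACITY_GL - stored              # water missing in a dry year
    return (P_EXTREME_FLOOD * unretained * DAMAGE_PER_GL
            + P_DRY_YEAR * shortfall * SHORTAGE_PER_GL)

# Brute-force search over target storage levels, in 1% steps.
best_cost, best_fraction = min((expected_cost(f / 100), f / 100)
                               for f in range(0, 101))
print(f"lowest expected cost {best_cost:.1f} $M at {best_fraction:.0%} storage")
```

With these invented numbers the optimum keeps just enough space free to absorb the design flood wave; tilt the probabilities further toward dry years and the optimum climbs toward a full reservoir, which is exactly the behavior described above.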

For now, when developing a river flood catastrophe loss model, it is safest to assume that the dam will not be operating at the optimum for flood wave reduction. In episodes of prolonged heavy rainfall the reservoir will cease to have any capacity for water retention, so that the flooding downstream will be as if the dam did not exist.
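
A toy level-pool routing step makes this modeling assumption concrete. The reservoir, release rule, and inflow hydrograph below are all hypothetical:

```python
# Toy daily level-pool routing: a full reservoir passes the flood wave through.
# Capacity, release rate, and the inflow series are invented for illustration.

CAPACITY_GL = 1_000.0          # reservoir capacity, GL (assumed)
MAX_RELEASE_GL_PER_DAY = 50.0  # normal controlled release (assumed)

def route(inflows, storage):
    """Route a daily inflow series (GL/day) through the reservoir."""
    outflows = []
    for q_in in inflows:
        storage += q_in
        q_out = min(storage, MAX_RELEASE_GL_PER_DAY)
        storage -= q_out
        if storage > CAPACITY_GL:          # reservoir full: excess spills at once
            q_out += storage - CAPACITY_GL
            storage = CAPACITY_GL
        outflows.append(q_out)
    return outflows

flood_wave = [20, 60, 150, 300, 150, 60, 20]  # inflow in GL/day (assumed)
print("near empty:", route(flood_wave, storage=100.0))
print("near full: ", route(flood_wave, storage=980.0))
```

Started near empty, the dam clips the wave entirely; started near full, the 300 GL/day peak passes downstream at full strength, which is the “as if the dam did not exist” assumption in code.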

