Monthly Archives: July 2013

Diving into Flood Risk

Central Europe is still recovering from the massive flooding that followed one of the wettest Mays in recorded history for the region. In late May and early June, a period of intense rainfall caused major river systems, including the Danube and Elbe, which were already flowing above normal levels, to swell rapidly and burst their banks. Further showers and thunderstorms raised river levels higher still, bringing localized flash floods.

Flooding near the source of precipitation occurred rapidly, but destruction spread as the flood wave propagated downstream over the following week, impacting numerous cities over a vast area. Dike overtopping and breaching were common during this event, with significant on-floodplain flooding. Large areas of Germany, Austria, and the Czech Republic were seriously affected, while Switzerland, Poland, Slovakia, Hungary, Croatia, and Serbia were also impacted to a lesser extent.


June 10, 2013: flooding and overflow of the banks of the Elbe along its lower course at Havelberg in Saxony-Anhalt. (Source: CC-BY-SA-3.0)

According to Aon Benfield, the Central Europe floods were the costliest economic disaster during the first half of 2013—and the costliest to insurers, with expected payouts of US$5.3 billion (£3.5 billion) or more, with Germany experiencing the greatest loss.

This was a very different type of event from the 2012 U.K. floods, where the sheer persistence of rainfall throughout the year saturated the soil and raised groundwater levels. Consequently, numerous small-scale, off-floodplain floods were observed, with surface water (pluvial) flooding composing a significant component of the total insured loss.

Although most individual event losses were not notable, 2012’s estimated accumulated loss reached US$1.8 billion (£1.2 billion), the second-largest U.K. flood-related insured loss after the 2007 floods.

The contrasting nature of the past two years of attritional and catastrophic European floods raises the question: How can the insurance industry effectively manage flood risk in the face of such a widespread and variable peril? And, what role should the government play in mitigating the risk?

In part, government spending on flood defenses helps to manage the hazard, as demonstrated in Prague this year, where new defenses successfully protected vulnerable locations. However, budgets for such schemes are finite, meaning some regions will remain vulnerable. Moreover, defense schemes are often targeted toward protecting against flooding on the floodplain or at the coast, which does not protect against the type of inland flooding observed in the U.K. in 2012.

For properties that remain in vulnerable locations, affordable insurance is their only means of protection, but the frequency and severity of the hazard at such locations make it challenging for the insurance industry to offer affordable cover.

There are positive initiatives on the horizon, such as the proposed Flood Re scheme in the U.K. (a government- and industry-backed flood pool), which is planned to replace the existing Statement of Principles (a voluntary commitment by the U.K. insurance industry). However, making such schemes work requires a comprehensive evaluation of the hazard; a piecemeal approach will not suffice.

As demonstrated in the U.K. in 2012, it is necessary to consider all sources of flooding, on and off the floodplain, including surface water flooding. The sheer scale of the 2013 Central European floods also showed that this peril cannot be viewed within the constraints of geographical boundaries; rather, large-scale catchments must be assessed.

In 2015 RMS will release a new and expanded pan-European inland flood model on RMS(one). The model will cover 13 at-risk countries, including those affected in 2012 and 2013, to comprehensively and consistently evaluate the risk from all sources of inland flooding, considering all underlying aspects of the hazard, at a high resolution across Europe. The scale and resolution of this model will, for the first time, enable the (re)insurance industry to assess European flood risk in a coherent manner, with the level of detail required to manage this dynamic and extreme peril.

Saving Lives with a Cat Model

On July 18 and 19, I was part of a two-day workshop in New York organized by UN agencies, and involving international NGOs and government aid agencies, to promote the goal that reductions in disaster casualties become part of the 2015 revision of the Millennium Development Goals. We might just be on the edge of a whole new era and market for catastrophe modeling.

The UN Millennium Development Goals (MDGs), introduced in 2000, comprised eight powerful and simple goals around worldwide poverty reduction, all designed to be measurable and achievable by 2015. One goal is to reduce maternal mortality by three quarters; another, to reduce deaths of children under five by two thirds. Using the mortality statistics from each country, it would be clear within a few years whether that country was on track to meet its target.

However, the original Millennium Development Goals included nothing around disasters, even though disasters had the potential to cause large numbers of casualties and push whole communities back into poverty.

A key problem with disasters concerns what could be measured. Between 1900 and 2010, fewer than ten people died from earthquakes in Haiti. Then, in January 2010, more than 200,000 were killed. You can’t simply measure casualties over a five-year period and expect to sample the “true” risk.

The Millennium Development Goals are coming up for renewal in 2015—and a number of governments, including those from the UK, Netherlands, and Japan, are lobbying hard to get a goal around catastrophe mortality included in the next generation of MDGs.

And this is where catastrophe models can have an important role. Catastrophe models were developed to solve the same problem in insurance: you can’t simply wait for catastrophes to happen to discover the technical price for catastrophe insurance, because the big losses are too infrequent and variable. The model creates a virtual population of ten thousand or a hundred thousand years of catastrophes, from which the annual average loss cost can be calculated.
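As a rough illustrative sketch of this idea (not any vendor’s actual methodology), the following simulates a long virtual catalog of event years and averages the losses. All frequencies and loss amounts are invented for illustration:

```python
import random

random.seed(0)

# Hypothetical illustration: simulate 100,000 years of catastrophe losses.
# Most years have no event; rare years have a large loss. All numbers invented.
N_YEARS = 100_000
EVENT_PROB = 0.01     # assumed 1-in-100-year event frequency
MEAN_LOSS = 5_000.0   # assumed mean loss given an event, in $millions

total_loss = 0.0
for _ in range(N_YEARS):
    if random.random() < EVENT_PROB:
        # Draw a loss from an exponential distribution with the assumed mean.
        total_loss += random.expovariate(1.0 / MEAN_LOSS)

# Annual average loss (AAL): the baseline for a technical catastrophe price.
aal = total_loss / N_YEARS
print(f"Annual average loss: ${aal:.1f}M")
```

With enough simulated years, the estimate converges toward the product of event frequency and mean severity, which is exactly what a short observed record cannot reveal.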

The insurance catastrophe model has to be reconfigured to measure casualties. For example, most earthquake casualties occur when buildings collapse, so the building inventory and vulnerability data need to be focused around this outcome. Then the distribution of “human exposure” within the buildings must be calculated for at least two key times of day, because it makes a big difference whether people are at work or at home. An earthquake at 2 a.m. may have only 10% of the casualties of an earthquake at around 2 p.m.
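The time-of-day effect can be made concrete with a small sketch. All occupancy fractions and casualty rates below are invented for illustration; they are not calibrated figures:

```python
# Hypothetical sketch: expected earthquake casualties depend on where people
# are when the shaking occurs. All rates and occupancy mixes are invented.

# Assumed fraction of the population in each setting at two key times of day.
occupancy = {
    "02:00": {"residential": 0.95, "office": 0.02, "outdoors": 0.03},
    "14:00": {"residential": 0.25, "office": 0.60, "outdoors": 0.15},
}

# Assumed casualty rate given collapse-level shaking, by setting.
casualty_rate = {"residential": 0.002, "office": 0.02, "outdoors": 0.0001}

population = 1_000_000

for time_of_day, mix in occupancy.items():
    expected = population * sum(
        frac * casualty_rate[setting] for setting, frac in mix.items()
    )
    print(f"{time_of_day}: {expected:,.0f} expected casualties")
```

Even with these made-up numbers, the daytime estimate comes out several times higher than the night-time one, purely because of where people are when the buildings fail.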

None of this procedure is new. At RMS we have used this basic methodology for estimating the workers compensation claims that will result from a California earthquake, as well as the total number of expected fatalities.

For humanitarian applications, such as the Millennium Development Goals, the casualty catastrophe model ideally needs to be run at the individual building level, because then we can use the model to explore how to meet measurable targets. We could rank the buildings—such as schools or hospitals—that present the greatest danger to loss of life in a city (measured in terms of expected annual fatalities)—and use that list to prioritize where to focus rebuilding or strengthening, so as to achieve the biggest reduction in casualties for the efforts involved.
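The prioritization step above can be sketched as a simple ranking by expected annual fatalities (EAF): annual collapse probability times occupants times an assumed fatality rate given collapse. The building names, probabilities, and rates below are all hypothetical:

```python
# Hypothetical sketch: rank buildings by expected annual fatalities (EAF)
# to prioritize strengthening or rebuilding. All figures are invented.

buildings = [
    # (name, annual collapse probability, typical occupants)
    ("Primary school", 0.0005, 400),
    ("District hospital", 0.0002, 900),
    ("Office tower", 0.00005, 2000),
]
FATALITY_RATE_GIVEN_COLLAPSE = 0.1  # assumed

ranked = sorted(
    (
        (name, p_collapse * occupants * FATALITY_RATE_GIVEN_COLLAPSE)
        for name, p_collapse, occupants in buildings
    ),
    key=lambda item: item[1],
    reverse=True,
)

for name, eaf in ranked:
    print(f"{name}: {eaf:.3f} expected annual fatalities")
```

A list like this makes the target measurable: strengthening the buildings at the top of the ranking buys the largest reduction in expected casualties per unit of effort.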

A new Millennium Development Goal with the target to achieve a decadal 50 percent reduction in disaster fatalities would be very powerful. However, such a goal can only be set and measured through a catastrophe model. It took a few years for insurers to accept that catastrophe models were required to establish average catastrophe costs.

How long will it take the disaster community to accept that models provide the only way to measure average annual disaster fatalities? Fortunately, the UN’s own efforts on quantifying disaster impacts in the UNISDR 2013 Global Assessment Report now fully endorse probabilistic methods.