Author Archives: Robert Muir-Wood

About Robert Muir-Wood

Chief Research Officer, RMS
Robert Muir-Wood works to enhance approaches to natural catastrophe modeling, identify models for new areas of risk, and explore expanded applications for catastrophe modeling. Recently, he has been focusing on identifying the potential locations and consequences of magnitude 9 earthquakes worldwide. In 2012, as part of Mexico's presidency of the G20, he helped promote government usage of catastrophe models for managing national disaster risks. Robert has more than 20 years of experience developing probabilistic catastrophe models. He was a lead author for the 2007 IPCC 4th Assessment Report and the 2011 IPCC Special Report on Extremes, is a member of the Climate Risk and Insurance Working Group for the Geneva Association, and is vice-chair of the OECD panel on the Financial Consequences of Large Scale Catastrophes. He is the author of six books, as well as numerous papers and articles in scientific and industry publications. He holds a degree in natural sciences and a PhD in Earth sciences, both from Cambridge University.

The Impact of Insurance on Claiming

The term “observer effect” in physics refers to how the act of making an observation changes the state of the system. To measure the pressure in a tire you have to let out some air. Measure the spin of an electron and it will change its state.

There is something similar about the “insurer effect” in catastrophe loss. If insurance is in place, the loss will be higher than if there is no insurer. We see this effect in many areas of insurance, but the “insurer effect” is now becoming an increasingly important contributor to disaster losses. In the U.S., trends in claiming behavior are having a bigger impact on catastrophe insurance losses than climate change.

So, the question is: if you take away insurance, what would happen to the costs of the damage from a hurricane, flood, or earthquake? The problem with answering this question is that in the absence of an insurance company writing the checks, no one consistently adds up all these costs, so there is little relevant data.

Beware of the Shallow Flood

Since the 1980s, the Flood Hazard Research Centre at Middlesex University, based in northwest London, has focused on costing residential and commercial flood damages. In 1990, they collected information on component losses in properties for a range of flood depths and durations – but without any reference to what had been paid out by insurance. In 2005 they revisited these cost estimates, now applying the principles that had been established around compensation for insurance claims. After bringing their 1990 values up to 2005 prices, they could compare the two approaches. For short duration flooding of less than 12 hours, the property costs of a shallow 10-centimeter (4-inch) flood had increased sevenfold, reducing to 5.2 times greater at 30 cm (one foot) and 3.5 times greater for 1.2-meter (four-foot) floods.

Photo by Patsy Lynch/FEMA

For longer duration floods of more than 12 hours, the comparative ratios were around 60 percent of these. For household goods, the increases were even steeper: a tenfold increase at 10 cm flood depth, 6.2 times greater at 30 cm, and 4.1 times greater at 1.2 meters. The highest multiple of all was for the shallowest 5 cm “short duration” flood, where the loss for contents was 15.4 times greater.
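To make the comparison concrete, here is a minimal sketch that encodes the reported 2005-versus-1990 cost ratios as a lookup table. The depths and ratios come from the figures above; the interpolation choice and function names are illustrative only, not the Middlesex methodology.

```python
import numpy as np

# 2005-vs-1990 cost ratios reported above for short-duration (<12h) floods.
# Keys are flood depths in centimeters.
property_ratio = {10: 7.0, 30: 5.2, 120: 3.5}
contents_ratio = {5: 15.4, 10: 10.0, 30: 6.2, 120: 4.1}

def ratio_at_depth(table, depth_cm):
    """Log-linear interpolation between the reported depths (illustrative)."""
    depths = np.array(sorted(table))
    ratios = np.array([table[d] for d in depths])
    return float(np.interp(np.log(depth_cm), np.log(depths), ratios))

print(ratio_at_depth(property_ratio, 20))  # ~5.9, between the 10 and 30 cm points
```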

Factoring in the Insurance Factor

The study revealed the highest “inflation” was for the shallowest floods. Some of the differences reflected changes in practice, but there may also be more expensive and vulnerable materials covering floors, as well as placed upon them. However, the “insurance factor” is also buried in these multiples. Examining and itemizing what is included in this factor shows that “new for old” replacement has become standard, even though the average product is halfway through its depreciation path. New-for-old policies discourage action by the policyholder to move furniture and other contents out of reach of a flood. Salvage seems to have become out of the question, along with any acceptance of spoilage, however superficial. Redecoration is not just of the affected wall but of the whole room, or even the whole house.

Yet many elements of flood recovery cannot easily be costed. What is the price of the homeowner working evenings to make repairs, borrowing power tools from a neighbor, or spending a few nights sleeping at a friend’s house while the damage is being fixed? How do we cost the community coming together to provide items such as toys and clothes after everything has been swept away in a flood?

We know another feature of the insurance factor concerns urgency in making the repairs. In early December 1703, the most intense windstorm of the past 500 years hit London. So many roofs were stripped of their tiles that the price of tiles went up as much as 400 percent. As described by Daniel Defoe in his book “The Storm”, most people simply covered their roofs with deal boards and branches, and the roofs stayed in that state for more than a year until tiles had returned to their original prices. There was no storm insurance at the time. Today an insurer would not have the luxury of waiting for prices to come down; it would be expected to make the repairs expeditiously.

And then there is the way in which loosely worded insurance terms become exploited. Consider the beachfront hotel in Grand Cayman after Hurricane Ivan in 2004, the New Orleans restaurant after the 2005 flooding, or the business within Christchurch’s central business district after the 2011 earthquake: all had no incentive to re-open because there were no tourists or customers, and therefore kept their business interruption (BI) policies spinning.

The Contractors After the Storm

And that is before we get into issues of “soft fraud”, the deliberate exaggeration of the damages and the costs of repairs. One area particularly susceptible to soft fraud is roofing damage. By all accounts, in states such as Texas or Oklahoma, freelance contractors turn up within hours of a storm, even before the insured has thought to file a claim, and ask for permission to get up on the roof. They then inform the homeowner that the whole roof will need to be replaced, and the work needs to be signed off immediately as the roof is in danger of collapse. The insurance company only hears about the claim when presented with a bill from the contractor. In surveying the cost of individual hailstorm claims from 1998 to 2012, RMS found that claims increased by an average of 9.3 percent each year over those fifteen years. It seems roof repair after a hailstorm has a particular attraction for loss inflation.
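Compounded over the period (assuming, as a simplification, that the 9.3 percent applies as a constant annual growth rate), that trend implies the average hailstorm claim nearly quadrupled in cost:

$$
1.093^{15} \approx 3.8
$$

far outpacing general inflation over the same fifteen years.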

In the U.S., the Coalition Against Insurance Fraud estimates that fraud constitutes about 10 percent of property-casualty insurance losses, or about $32 billion annually, while one third of insurers believe that fraud constitutes at least 20 percent of their claims costs. But it is often not as simple as designating an increased cost as “fraud”. It is in the nature of the insurance product to make generous repayments and keep the client onside, so that they continue to pay their increased premiums for many years to come.

The challenge for catastrophe modeling is that the costs output from the model need to be as close as possible to what will be paid out by insurers. Ultimately, we must treat the world as it is, not how it might have been in some component-costed utopia. We must therefore provide the ability to represent how claims are expected to be settled, and explore the trends in this claiming process, so as to best capture current risk costs.

The way that insurance alters claims costs is not a topic studied by university research departments or international agencies as they attempt to develop their vulnerability functions. It is something you can only learn by immersing yourself in the insurance world.

Has That Oilfield Caused My Earthquake?

“Some six months have passed since the magnitude (Mw) 6.7 earthquake struck Los Angeles County, with an epicenter close to the coast in Long Beach. Total economic loss estimates are more than $30 billion. Among the affected homeowners, earthquake insurance take-up rates were pitifully low – around 14 percent. And even then, the punitive deductibles contained in their policies mean that homeowners may only recover 20 percent of their repair bills. So there is a lot of uninsured loss looking for compensation. Now there are billboards with pictures of smiling lawyers inviting disgruntled homeowners to become part of class action lawsuits directed at several oilfield operators located close to the fault. For there is enough of an argument to suggest that this earthquake was triggered by human activities.”

This is not a wild hypothesis with little chance of establishing liability, or the lawyers would not be investing in the opportunity. There are currently three thousand active oil wells in Los Angeles County. There is even an oil derrick in the grounds of Beverly Hills High School. Los Angeles County is second only to its northerly neighbor Kern County in terms of current levels of oil production in California. In 2013, the U.S. Geological Survey (USGS) estimated there were 900 million barrels of oil still to be extracted from the coastal Wilmington Field, which extends for around six miles (10 km) through the Long Beach area, from Carson to Belmont Shore.

Beverly Hills High School. Picture credit: Sarah Craig for Faces of Fracking / FLICKR

However, the Los Angeles oil boom was back in the 1920s, when most of the large fields were first discovered. Two seismologists at the USGS have now searched back through the records of earthquakes and oil field production – and arrived at a startling conclusion: many of the earthquakes during this period appear to have been triggered by neighboring oilfield operations.

The Mw4.9 earthquake of June 22, 1920 had a shallow source that caused significant damage in a small area just a mile to the west of Inglewood. Local exploration wells releasing oil and gas pressures had been drilled at this location in the months before the earthquake.

A Mw4.3 earthquake in July 1929 at Whittier, southeast of downtown Los Angeles, had a source close to the Santa Fe Springs oil field some four miles (6 km) to the southwest: one of the top producers through the 1920s, which had been drilled deeper and had a production boom in the months leading up to the earthquake.

A Mw5 earthquake occurred close to Santa Monica on August 31, 1930, in the vicinity of the Playa del Rey oilfield at Venice, California, a field first identified in December 1929, with production ramping up to four million barrels over the second half of 1930.

The epicenter of the Mw6.4 1933 Long Beach earthquake, on the Newport-Inglewood Fault, was in the footprint of the Huntington Beach oilfield at the southern end of this 47-mile-long (75 km) fault.

As for a mechanism: the Groningen gas field in the Netherlands shows how earthquakes can be triggered simply by the extraction of oil and gas, as reductions in load and compaction cause faults to break.

More Deep Waste Water Disposal Wells in California than Oklahoma

Today many of the Los Angeles oilfields are being managed through secondary recovery – pumping water into the reservoir to flush out the oil. This provides an additional potential mechanism for generating earthquakes – raising deep fluid pressures – as currently experienced in Oklahoma. And Oklahoma is not even the number one U.S. state for deep waste water disposal: between 2010 and 2013 there were 9,900 active deep waste water disposal wells in California compared with 8,600 in Oklahoma. And the California wells tend to be deeper.

More than 75 percent of the state’s oil production and more than 80 percent of all injection wells are in Kern County, central California, which happens to lie close to the source of the largest earthquake in the region over the past century: the Mw7.3 event on the White Wolf Fault in 1952. In 2005, there was an abrupt increase in the rates of waste water injection close to the White Wolf Fault, which was followed by an unprecedented swarm of four earthquakes over magnitude 4 on the same day in September 2005. The injection and the seismicity were linked in a research paper by Caltech and University of Southern California seismologists published in 2016. One neighboring well, delivering 57,000 cubic meters of waste water each month, had been started just five months before the earthquake swarm broke out. The seismologists found a smoking gun: a pattern of smaller shocks migrating from the site of the well to the location of the earthquake cluster.

To summarize: we know that raising fluid pressures at depth can cause earthquakes, as is the case in Oklahoma and also in Kern County, CA. We know there is circumstantial evidence for a connection between specific damaging earthquakes and oil extraction in southern California in the 1920s and 1930s. Wherever the next major earthquake strikes in southern or central California, there is a reasonable probability that an actively managed oilfield or waste water disposal well will be in the vicinity.

Whoever is holding the liability cover for that operator may need some deep pockets.

Billions in Liabilities: Man-Made Earthquakes at Europe’s Biggest Gas Field

The Groningen gas field, discovered in 1959, is the largest in Europe and produces up to 15 percent of the natural gas consumed across the continent. With original reserves of more than 100 trillion cubic feet, over the decades the field has been an extraordinary cash cow for the Dutch government and the two global energy giants, Shell and ExxonMobil, which partner in managing the field. In 2014 alone, state proceeds from Groningen were approximately €9.4 billion ($9.8 billion).

But now, costs to the Dutch government are mounting as the courts have ordered that compensation be paid to nearby property owners for damage caused by the earthquakes induced by extracting the gas. Insurers who were covering liabilities at the field now find that the claims have the potential to extend beyond the direct shaking damage to include the reduction in property values caused by this ongoing seismic crisis. And the potential for future earthquakes and their related damages has not disappeared – a situation which again illustrates the importance of modeling the risk costs of liability coverages, a new capability on which RMS is partnering with its sister company Praedicat.

The Groningen gas reservoir covers 700 square miles and, uniquely among giant gas fields worldwide, it is located beneath a well-populated and developed region. The buildings in this region, in which half a million people live and work, are not earthquake resistant: 90% of properties are made from unreinforced masonry (URM).

The ground above the gas field has been subsiding as the gas has vented out of the porous sandstone reservoir, 10,000 feet deep, and the formation has compacted. This compaction helps squeeze the gas out of the reservoir, but has also led to movement on pre-existing faults that are present throughout the sandstone layer, a small number of which are more regional in extent. These sudden fault movements radiate earthquake vibrations.

How A Shake Became a Seismic Crisis

The first earthquake recorded at the field was in December 1991 with a magnitude of 2.4. The largest to date was in August 2012 with a magnitude of 3.6. In most parts of the world, such an earthquake would not have significant consequences, but on account of the shallow depth of the quake, thick soils and poor quality building construction in the Groningen area, there were more than 30,000 claims for property damage, dwarfing the total number from the previous two decades.

Since the start of 2014 the government has limited gas production in an attempt to manage the earthquakes, with some success. But the ongoing seismicity has had a catastrophic effect on the property market, which has been compounded by a class-action lawsuit in 2015. It was filed on behalf of 900 homeowners and 12 housing co-operatives who had seen the value of their properties plummet. The judge ruled that owners of the real estate should be compensated for loss of their property’s market value, even when the property was not up for sale. The case is still rumbling on through the appeal courts but if the earlier ruling stands, then the estimates of the future liabilities for damage and loss of property value range from €6.5 billion to €30 billion.

Calculating the Risk

While earthquakes associated with gas and oil extraction are known from other fields worldwide, the massive financial risk at Groningen reflects the intersection of a moderate level of seismicity with a huge concentration of exposed value and very weak buildings. And although limiting production since 2014 has reduced the seismicity, there still remains the potential for further highly damaging earthquakes.

Calculating these risk costs requires a fully probabilistic assessment of the expected seismicity, across the full range of potential magnitudes and their annual probabilities. Each event in the simulation can be modeled using locally-calibrated ground motion data as well as expected property vulnerabilities, based on previous experience from the 2012 earthquake.
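As an illustration of how such a probabilistic event-loss calculation can be structured, here is a minimal Monte Carlo sketch. The magnitude-frequency rates, vulnerability function, and loss scatter below are all hypothetical placeholders, not Groningen calibration values.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical annual rates of induced events by magnitude bin
# (illustrative numbers only, not a calibrated Groningen model).
magnitudes = np.array([3.0, 3.5, 4.0, 4.5])
annual_rates = np.array([0.5, 0.15, 0.04, 0.01])

def mean_loss_given_magnitude(m):
    """Toy vulnerability: losses (EUR millions) grow exponentially with M."""
    return 5.0 * np.exp(2.0 * (m - 3.0))

n_years = 100_000
total_losses = np.zeros(n_years)
for m, rate in zip(magnitudes, annual_rates):
    counts = rng.poisson(rate, n_years)  # events per simulated year
    for year in np.nonzero(counts)[0]:
        # Lognormal scatter around the mean loss stands in for ground-motion
        # and vulnerability uncertainty.
        losses = rng.lognormal(np.log(mean_loss_given_magnitude(m)), 0.8,
                               counts[year])
        total_losses[year] += losses.sum()

print(f"Average annual loss: {total_losses.mean():.1f}M EUR")
print(f"1-in-100-year loss:  {np.quantile(total_losses, 0.99):.1f}M EUR")
```

A production model would replace each placeholder with locally calibrated seismicity rates, ground motions, and the URM vulnerabilities observed after the 2012 earthquake.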

There is also the question of how far beyond actual physical damage the liabilities have the potential to extend, and of how widely future earthquakes can affect house values. The situation at Groningen, where it took almost thirty years of production before the earthquakes began, highlights the need for detailed risk analysis of all energy liability insurance covers for gas and oil extraction.

Friday 13th and the Long-Term Cost of False Alarms

If the prospect of flooding along the East Coast of England earlier this month was hard to forecast, the newspaper headlines the next day were predictable enough:

Floods? What floods? Families’ fury at evacuation order over storm surge … that never happened (Daily Mail)

East coast residents have derided the severe storm warnings as a ‘load of rubbish’ (The Guardian)

Villagers shrug off storm danger (The Times)

The police had attempted to evacuate some communities and the army was on standby, because of warnings of a ‘catastrophic’ North Sea storm surge on January 13, for which the UK Environment Agency applied its highest level of flood warning along parts of the East Coast: ‘severe’, which represents a danger to life. And yet the flooding did not materialize.

Environment Agency flood warnings: January 13 2017

Water levels were 1.2m lower along the Lincolnshire coast than those experienced in the last big storm surge flood in December 2013, and 0.9m lower around the East Anglian towns of Great Yarmouth and Lowestoft. Predicting the future in such complex situations, even very near-term, always has the potential to make fools of the experts. But there’s a pressure on public agencies, knowing the political fallout of missing a catastrophe, to adopt the precautionary principle and take action. Imagine the set of headlines, and ministerial responses, if there had been no warnings followed by loss of life.

Interestingly, most of those who had been told to evacuate as this storm approached chose to stay in their homes. One police force in Essex knocked on 2,000 doors, yet only 140 of those people registered at an evacuation centre. Why did the others ignore the warnings and stay put? Media reports suggest that many felt this was another false alarm.

The precautionary principle might seem prudent, but a false-alarm forecast can encourage people to ignore future warnings. Recent years offer numerous examples of the consequences.

The Lessons of History

Following a 2006 Mw8.3 earthquake offshore from the Kurile Islands, tsunami evacuation warnings were issued all along the Pacific coast of northern Japan, where the tsunami that did arrive was harmless. For many people that experience weakened the imperative to evacuate after feeling the three-minute shaking of the March 2011 Mw9 earthquake, following which 20,000 people were drowned by the tsunami. Based on the fear of what happened in 2004 and 2011, today tsunami warnings are being ‘over-issued’ in many countries around the Pacific and Indian Oceans.

For the inhabitants of New Orleans, the evacuation order issued in advance of Hurricane Ivan in September 2004 (when one third of the city’s population moved out, while the storm veered away) left many sceptical about the mandatory evacuation issued in advance of Hurricane Katrina in August 2005 (after which around 1,500 drowned).

Agencies whose job it is to forecast disaster know only too well what happens if they don’t issue a warning as a risk looms. However, the long-term consequences of false alarms are perhaps not made explicit enough. While risk models to calculate these consequences are not yet available, a simple hypothetical calculation illustrates the basic principles of how such a model might work:

  • the chance of a dangerous storm surge in the next 20 years is 10 percent, for a given community;
  • if this happens, then let’s say 5,000 people would be at grave risk;
  • because of a recent ‘false’ alarm, one percent of those residents will ignore evacuation orders;
  • thus the potential loss of life attributed to the false alarm is five people.

Now repeat with real data.
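The same logic is easy to express as a calculation. Here is a minimal sketch using the hypothetical numbers from the bullet points above; every input is illustrative, and a real model would replace each with community-specific data.

```python
# Hypothetical inputs from the bullet points above (illustrative only).
p_surge_20yr = 0.10        # chance of a dangerous storm surge in the next 20 years
people_at_risk = 5_000     # residents at grave risk if the surge occurs
p_ignore_after_false_alarm = 0.01  # extra fraction ignoring evacuation orders

# Expected lives lost attributable to one false alarm.
expected_deaths = p_surge_20yr * people_at_risk * p_ignore_after_false_alarm
print(expected_deaths)  # 5.0
```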

Forecasting agencies need a false-alarm risk model to help balance their decisions about when to issue severe warnings. There is an understandable instinct to be over-cautious in the short term, but when measured in terms of future lives lost, disaster warnings need to be carefully rationed. And that rationing requires political support, as well as public education.

[Note: RMS models storm surge in the U.K. where the risk is highest along England’s East Coast – the area affected by flood warnings on January 13. Surge risk is complex, and the RMS Europe Windstorm Model™ calculates surge losses caused by extra-tropical cyclones considering factors such as tidal state, coastal defenses, and saltwater contamination.]

The Cost of Shaking in Oklahoma: Earthquakes Caused by Wastewater Disposal

It was back in 2009 that the inhabitants of northern Oklahoma first noticed the vibrations. Initially only once or twice a year, but then every month, and even every week. It was disconcerting rather than damaging until November 2011, when a magnitude 5.6 earthquake broke beneath the city of Prague, Okla., causing widespread damage to chimneys and brick veneer walls, but fortunately no casualties.

The U.S. Geological Survey had been tracking this extraordinary outburst of seismicity. Before 2008, across the central and eastern U.S., there was an average of 21 earthquakes of magnitude three or higher each year. Between 2009 and 2013 that annual average increased to 99 earthquakes in Oklahoma alone, rising to 659 in 2014 and more than 800 in 2015.

During the same period the oil industry in Oklahoma embarked on a dramatic expansion of fracking and conventional oil extraction. Both activities were generating a lot of waste water. The cheapest way of disposing of the brine was to inject it deep down boreholes into the 500-million-year-old Arbuckle Sedimentary Formation. The volume being pumped there increased from 20 million barrels in 1997 to 400 million barrels in 2013. Today there are some 3,500 disposal wells in Oklahoma, down which more than a million barrels of saline water are pumped every day.

It became clear that this chatter of Oklahoma earthquakes was linked with the injection wells. The way that raising deep fluid pressures can generate earthquakes has been well understood for decades: the fluid pressure reduces the friction on faults that are already poised to fail, in effect ‘lubricating’ them.

But induced seismicity is an issue for energy companies worldwide, not just in the South Central states of the U.S. And it presents a challenge for insurers, as earthquakes don’t neatly label themselves ‘induced’ or ‘natural’. Their losses will be picked up by property insurers writing earthquake extensions to standard coverages, as well as potentially by the insurers covering the liabilities of the deep disposal operators.

Investigating the Risk

Working with Praedicat, which specializes in understanding liability risks, RMS set out to develop a solution by focusing first on Oklahoma, framing two important questions regarding the potential consequences for the operators of the deep disposal wells:

  • What is the annual risk cost of all the earthquakes with the potential to be induced by a specific injection well?
  • In the aftermath of a destructive earthquake, how could the damage costs be allocated back to the nearby well operators most equitably?

In Oklahoma, detailed records have been kept on all fluid injection activities: well locations, depths, and rates of injection. There is also data on the timing and location of every earthquake in the state. By linking these two datasets the RMS team was able to explore what connects fluid disposal with seismicity. We found, for example, that both the depth of a well and the volume of fluid disposed of increased the tendency to generate seismic activity.
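As an illustration of the kind of data linkage involved, here is a minimal sketch that counts the M3+ earthquakes within a given radius of each well. The file names, column names, and thresholds are hypothetical, and a real analysis would also consider timing relative to injection rather than the purely spatial association shown here.

```python
import numpy as np
import pandas as pd

# Hypothetical inputs: wells(id, lat, lon, depth_m, monthly_volume_bbl)
# and quakes(time, lat, lon, magnitude). Column names are illustrative.
wells = pd.read_csv("wells.csv")
quakes = pd.read_csv("quakes.csv", parse_dates=["time"])
quakes = quakes[quakes.magnitude >= 3.0]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * np.arcsin(np.sqrt(a))

RADIUS_KM = 15.0  # association radius (illustrative choice)

counts = []
for w in wells.itertuples():
    d = haversine_km(w.lat, w.lon, quakes.lat.values, quakes.lon.values)
    counts.append(int((d <= RADIUS_KM).sum()))
wells["nearby_m3_quakes"] = counts

# Crude check of the depth and volume relationships described above.
print(wells[["depth_m", "monthly_volume_bbl", "nearby_m3_quakes"]]
      .corr()["nearby_m3_quakes"])
```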

Not all earthquakes in the central U.S. are shallow and human-induced, however. The notorious New Madrid, Mo. earthquakes of 1811-1812 demonstrated the enormous capacity for ‘natural’ seismicity in the central U.S., which can, albeit infrequently, produce earthquakes with magnitudes in excess of M7. There remains the question of the maximum magnitude of an induced earthquake in Oklahoma; based on worldwide experience the upper limit is generally assumed to be magnitude M6 to M6.5.

Who Pays – and How Much?

From our studies of the induced seismicity in the region, RMS can now calculate the expected total economic loss from potential earthquakes using the RMS North America Earthquake Model. To do so we run a series of shocks, at quarter-magnitude intervals, located at the site of each injection well. Having assessed the impact at a range of different locations, we’ve found dramatic differences in the risk costs for a disposal well in a rural area in contrast to a well near the principal cities of central Oklahoma. Reversing this procedure, we have also identified a rational and equitable process which could help allocate the costs of a damaging earthquake back to all the nearby well operators. In this, distance will be a critical factor, as sketched below.
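The allocation step could work along the following lines. This is a minimal sketch assuming a simple inverse-distance weighting scheme; the actual RMS procedure is not published here, so the weighting function, distance exponent, and cutoff are all illustrative.

```python
import numpy as np

def allocate_costs(total_loss, well_distances_km, exponent=2.0, cutoff_km=25.0):
    """Split a damage cost among nearby wells by inverse-distance weighting.

    well_distances_km: distance of each operator's well from the epicenter.
    Wells beyond cutoff_km receive no allocation; the exponent controls how
    strongly nearer wells are weighted (both are illustrative choices).
    """
    d = np.asarray(well_distances_km, dtype=float)
    weights = np.where(d <= cutoff_km, 1.0 / np.maximum(d, 1.0) ** exponent, 0.0)
    if weights.sum() == 0:
        return np.zeros_like(d)  # no well close enough to be implicated
    return total_loss * weights / weights.sum()

# Example: a $100M loss with three wells at 5, 10, and 40 km.
print(allocate_costs(100e6, [5, 10, 40]))  # ~[$80M, $20M, $0]
```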

Modeling Advances for Manmade Earthquakes

For carriers writing U.S. earthquake coverage for homeowners and businesses there is also concern about the potential losses from this phenomenon. Hence, the updated RMS North America Earthquake Model, to be released in spring 2017, will include a tool for calculating property risk from induced seismicity in affected states: not just Oklahoma but also Kansas, Ohio, Arkansas, Texas, Colorado, New Mexico, and Alabama. The scientific understanding of induced seismicity and its consequences is rapidly evolving, and RMS scientists are closely following these developments.

As for Oklahoma, the situation is becoming critical as the seismic activity shows no signs of stopping: a swarm of induced earthquakes has erupted beneath the largest U.S. inland oil storage depot at Cushing, and in September 2016 a moment magnitude 5.8 earthquake struck eight miles from the town of Pawnee, causing serious damage to buildings. Were a magnitude 6+ earthquake to hit near Edmond (outside Oklahoma City), our modeling shows it could cause billions of dollars of damage.

The risk of seismicity triggered by the energy industry is a global challenge, with implications far beyond Oklahoma. For example Europe’s largest gas field, in the Netherlands, is currently the site of damaging seismicity. And in my next blog, I’ll be looking at the consequences.

[For a wider discussion of the issues surrounding induced seismicity please see these Reactions articles, for which Robert Muir-Wood was interviewed.]

Shrugging Off a Hurricane: A Three Hundred Year Old Culture of Disaster Resilience

If a global prize were to be awarded to the city or country that achieves the peak of disaster resilience, Bermuda might be a fitting first winner.

This October’s Hurricane Nicole made direct landfall on the island. The eyewall tracked over Bermuda with maximum measured windspeeds close to 120 mph. Nonetheless there were no casualties, and the damage tally was principally fallen trees, roadway debris, some smashed boats, and many downed utility poles. The airport reopened within 24 hours, with the island’s ferries operating the following day.

Bermuda’s performance through Nicole was exemplary. What’s behind that?

Since its foundation in 1609, when 150 colonists and crew were shipwrecked on the island, Bermuda has grown used to its situation at the heart of hurricane alley. Comprising 21 square miles of reef and lithified dunes, sitting out in the Atlantic 650 miles east of Cape Hatteras, the island is hit by a hurricane on average once every six or seven years. Mostly these are glancing blows, but once or twice a century Bermuda sustains direct hits at Category 3 or 4 intensity. Hurricane Fabian in 2003 was the worst of the recent storms, causing $300 million of damage (estimated to be worth $650 million today, accounting for higher prices and greater property exposure). The cost of the damage from Hurricane Gonzalo in 2014 was about half this amount.

How did Bermuda’s indigenous building style come to adopt such a high standard of wind resistance? It seems to go back to a run of four hurricanes at the beginning of the 18th century. First, in September 1712, a hurricane persisted for eight hours, destroying the majority of wooden buildings. Then twice in 1713, and again more strongly in 1715, hurricane winds ruined the newly rebuilt churches. One hurricane can seem like an exception; four becomes a trend. In response, houses were constructed with walls of massive reef limestone blocks, covered by roofs tiled with thick slabs of coral stone: traditional house styles that have been sustained ever since.

The frequency of hurricanes has helped stress-test the building stock and keep the traditional construction styles in use. More recently, a robust and well-policed building code has ensured adequate wind resistance for all new construction on the island.

Yet resilience is more than strong buildings. It also requires hardened infrastructure, and that is where Bermuda has some room for improvement. The island is still dependent on overhead power lines, and 90 percent of its 27,000 houses lost power in Hurricane Nicole – although half of these had been reconnected by the following morning and the remainder through that day. Mobile phone and cable networks were also back in operation over a similar timescale. Experience of recent hurricanes has ensured an adequate stockpile of cable and poles.

Expert Eyes on the Island

It helps that there is an international reinsurance industry on the island, with many specialists in the science of hurricanes and the physics and engineering of building performance on hand to scrutinize the application of improved resilience. Almost every building is insured, giving underwriters oversight of building standards. Most importantly, the very functioning of global reinsurance depends on uninterrupted connection with the rest of the world, as well as on ensuring that on-island staff are not distracted by having to attend to their families’ welfare.

Bermuda’s experience during Nicole would merit the platinum standard of resilience adopted by the best businesses: that all functions can be restored within 72 hours of a disaster. The Bermuda Business Development Agency and the Association of Bermuda Insurers and Reinsurers were fulsome in their praise for how the island had withstood the hurricane. The strong and widely owned culture of preparedness reflects the experience of recent storms like Gonzalo and Fabian.

Stephen Weinstein, general counsel at RenaissanceRe, commented: “It’s remarkable that one day after a major hurricane strike, Bermuda is open for business, helping finance disaster risk worldwide, and poised to welcome back business visitors and vacationers alike.”

In early 2017, RMS will issue an update to Bermuda wind vulnerability in the version 17 software release as part of a broader update to the 33 islands and territories covered by the North Atlantic Hurricane Models. Updates to Bermuda vulnerability will consider past hurricane observations and the latest building code research.

New Zealand Earthquake – Early Perspectives

On Monday 14 November 2016, Dr Robert Muir-Wood, RMS chief research officer, an earthquake expert and specialist in catastrophe risk management, made the following observations about the earthquake near Amberley:

SCALE
“The November 13 earthquake was assigned a magnitude 7.8 by the United States Geological Survey. That makes it more than fifty times bigger than the February 2011 earthquake which occurred directly beneath Christchurch. However, it was still around forty times smaller than the Great Tohoku earthquake off the northeast coast of Japan in March 2011.”
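As a rough guide to these “times bigger” comparisons: seismic moment, the standard physical measure of earthquake size, scales with moment magnitude as

$$
\frac{M_{0,1}}{M_{0,2}} = 10^{\,1.5\,(M_{w,1} - M_{w,2})}
$$

so each whole unit of magnitude corresponds to roughly a 32-fold increase in moment, and the ratios quoted above should be read as order-of-magnitude comparisons on this scale.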

CASUALTIES, PROPERTY DAMAGE & BUSINESS INTERRUPTION
“Although it was significantly bigger than the Christchurch earthquake, its source was further from major exposure concentrations. The northeast coast of South Island has a very low population and the earthquake occurred in the middle of the night, when there was little traffic on the coast road. Characteristically for such an earthquake in steep mountainous terrain, there have been thousands of landslides, some of which have blocked streams and rivers – there is now a risk of flooding downstream when these “dams” break.

In the capital city, Wellington, liquefaction and slumping on man-made ground around the port has damaged some quays and made it impossible for the ferry that runs between North and South Island to dock. The most spectacular damage has come from massive landslides blocking the main coast road, Highway 1, which is the overland connection from the ferry port opposite Wellington down to Christchurch. This will take months or even years to repair. Therefore it appears the biggest consequences of the earthquake will be logistical, with particular implications for any commercial activity in Christchurch that is dependent on overland supplies from the north. As long as the main highway remains closed, ferries may have to ship supplies down to Lyttelton, the main port of Christchurch.”

SEISMOLOGY
“The earthquake appears to have occurred principally along the complex fault system in the north-eastern part of the South Island, where the plate tectonic motion between the Pacific and Australian plates transfers from subduction along the Hikurangi Subduction Zone to strike-slip along the Alpine Fault System. Faults in this area strike predominantly northeast-southwest and show a combination of thrust and strike-slip motion. From its epicenter, which lies some 200 km from the capital city Wellington, the rupture unzipped towards the northeast for about 100-140 km.”

WHAT NOW?
“Given the way the rupture spread to the northeast there is some potential for a follow-on major earthquake on one of the faults running beneath Wellington. The chances of a follow-on major earthquake are highest in the first few days after a big earthquake, and tail off exponentially. Aftershocks are expected to continue to be felt for months.”

MODELING
“These events occurred on multiple fault segments in close proximity to one another. The technology to model this type of complex rupture is now available in the latest RMS high-definition New Zealand Earthquake Model (2016), in which fault segments may interconnect under certain conditions.”

India’s Need for Disaster Risk Reduction: Can it Turn a Plan into Action?

This was the first time I’d ever heard a Prime Minister praising the benefits of “risk mapping.” Mid-morning on Thursday November 3 in a vast tent in the heart of New Delhi, the Indian Prime Minister, Narendra Modi, was delivering an introductory address to welcome four thousand delegates to the 2016 Asian Ministerial Conference on Disaster Risk Reduction.

Modi mentioned his own personal experience of disaster recovery after the 2001 Gujarat earthquake, in which more than 12,000 people died, before presenting a ten-point plan of action in response to the 2015 Sendai Framework for Disaster Risk Reduction. There were no guarantees of new regulations or changes in policy, but three of his ten points were particularly substantive.

First there was a call for appropriate protections against the relevant hazards at each location to be applied to all government-sponsored construction of infrastructure or housing. Second, he called for “work towards” achieving universal “coverage” (insurance, if not by name) against disasters – from the poorest villager to big industries and state governments. Third, he called for standardized hazard and risk mapping to be developed not only for earthquake but for other perils: chemical hazards, cyclones, all varieties of floods, and forest fires.

More Economic Development Means More Exposure to Risk

India is at a development threshold, comparable to that reached by Japan at the end of the 1950s and China in the 1990s. Rapid economic growth has led to a dramatic expansion of building and value in harm’s way, and there now needs to be a significant compensatory focus on measures to reduce risk and expand protections, whether through insurance systems or flood walls. Development in India has been moving too fast to hope that adequate building standards are being consistently followed – there are not enough engineers or inspectors.

The Chennai floods at the end of 2015 highlight this disaster-prone landscape. Heavy end-of-year monsoonal downpours fell onto saturated ground after weeks of rainfall; the runoff was then ponded by choked drainage channels and illegal development, swamping hundreds of thousands of buildings along with roads and even the main airport. The city was cut off, economic losses totaled billions of U.S. dollars, and more than 1.8 million people were displaced.

Sorting out Chennai will take co-ordinated government action and money: to implement new drainage systems, relocate or raise those at highest risk, and apply flood zonations. Chennai provides a test of whether Disaster Risk Reduction really is a priority, as Mr. Modi’s speech suggested. The response will inevitably encounter opposition from those who cannot see why they should be forced to relocate or pay more in taxes to construct flood defenses.

The one community notably missing from Prime Minister Modi’s call to action was the private sector, even though a pre-conference session the day before, organized by the Federation of Indian Chambers of Commerce and Industry (FICCI), had identified that 80% of construction was likely to be privately financed.

I gave two talks at the conference – one in the private sector session – on how modelers like RMS have taken a lead in developing those risk maps and models for India, including high resolution flood models that will help extend insurance. Yet armed with information by which to differentiate risk and identify the hot spots, the government may need to step in and provide its own coverages for those deemed too high risk by private insurers.

Auditing Disaster Risk Reduction with Cat Models

In a side meeting at the main conference I presented on the need for independent risk audits of states and cities, to measure progress in achieving their disaster risk reduction goals, in particular when it comes to earthquake mortality. Experience from the last few decades gives no perspective on the true risk of potentially large and destructive future earthquakes in India; this is where probabilistic catastrophe models are invaluable. The Nepal earthquake of 2015 highlighted the significant vulnerability of ordinary brick and concrete buildings in the region.

I came away seeing the extraordinary opportunity to reduce and insure risk in India, if ten-point lists can truly be converted into co-ordinated action.

Meanwhile, as a test of the government’s resolve, in the days leading up to the conference Delhi was shrouded in its worst-ever smog: a toxic concoction of traffic fumes, coal smoke, and Diwali fireworks, enriched to extremely dangerous levels in micro-particles. The smog was so thick and pervasive that it seeped inside buildings, and several attendees asked why it was not itself being classified and treated as a true “manmade disaster.”

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3,000 people lived in the vicinity of the strongest shaking, but nearly 1 in 10 of them died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made. Manufactured by what we build and where we build it; or in more subtle ways – by failing to anticipate what will one day inevitably happen.

Italy has some of the richest and best-researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact same earthquake happened in 1639. If you were identifying the highest-risk locations in Italy, these villages would be on your shortlist. So in 2016, for 300 people to die in a well-anticipated, moderate-sized earthquake, in a rich and highly developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24 earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years before, €700,000 had been spent to make the school “earthquake proof.” An investigation is now underway to see why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as at mobilizing disaster response: some 7,000 emergency responders arrived after the earthquake – more than twice the number of people living in the affected villages.

The unnatural disaster

When we look back through history and investigate “natural disasters” closely, we find that many of them were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the “little Paris of the Caribbean.” In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear the neighbouring volcano of Mont Pelée was heading towards an eruption. The island’s governor convened a panel of experts, who concluded Saint-Pierre was at no risk because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the Governor brought his family to Saint-Pierre to show the city was safe; they, along with likely all but one of the city’s inhabitants, died when the eruption blasted sideways out of the volcano. There are some parallels here with the story of those 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not bother to escape while they still had time. We should distrust simple notions of where is safe when they are based only on some untested theory.

Sometimes the disaster reflects the unforeseen consequence of some manmade intervention. In the spring of 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet (“Mr Go”) – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funnelled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later Hurricane Katrina performed the same trick on New Orleans, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.

RiskTech   

My new book, “The Cure for Catastrophe,” challenges us to think differently about disasters: to understand how risk is generated before the disaster happens, and to learn from countries like Holland, which over the centuries mastered its ever-threatening flood catastrophes by fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools that measure human casualties and lost livelihoods as well as monetary losses. Based on these measurements we can identify where to focus our investments in disaster reduction.

In 2015 the Tokyo City government was the first to announce it aims to halve its earthquake casualties and measure progress by using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see if they are making progress in saving future lives before they suffer their next inevitable earthquake.

 

The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

Fire Weather

Fires can start at all times and places, but how a fire spreads is principally down to the weather.

This week, 350 years ago, the fire at Thomas Farriner’s bakery on Pudding Lane, a small alleyway running down to the river from the City of London, broke out at the quietest time of the week: around 1 am on Sunday morning, September 2, 1666. London had been experiencing a drought and the thatched roofs of the houses were tinder dry. At 4 am the Lord Mayor, roused from his sleep, decided the blaze was easily manageable. It was already too late, however. By 7 am the roofs of some 300 houses were burning and, fanned by strong easterly winds, the fire was spreading fast towards the west. Within three days the fire had consumed 13,000 houses and left 70,000 homeless.

In the city’s reconstruction only brick and tile houses were permitted, severely reducing the potential for repeat conflagrations. Within a few years the first fire insurers appeared, growing their business as fear outran the risk.

Yet big city fires had by no means gone away, and the wooden cities of northern Europe were primed to burn. The 1728 Copenhagen fire destroyed 28% of the city, while the 1795 fire left 6,000 homeless. A quarter of the city of Helsinki burned down in November 1808. The 1842 fire that destroyed Hamburg left 20,000 homeless. The center of the city of Bergen, Norway burnt down in 1855 and then again in January 1916.

Wind and fire

By the start of the 20th century, improvements in fire-fighting had reduced the chance that a great city fire took hold – but not if there were strong winds. The 1916 Bergen fire broke out in the middle of an intense windstorm with hurricane-force gusts. In February 1941 the fire that burnt out the historic center of Santander on the coast of northern Spain was driven by an intense windstorm, equivalent to the October 1987 storm in the U.K. And then there is the firestorm that destroyed Yokohama and Tokyo after the 1923 earthquake, driven by 50 mile-per-hour winds on the outer edge of a typhoon, in which, over a few hours, an estimated 140,000 died.

Wind and fire in the wooden city are a deadly combination. Above a certain wind speed, a fire becomes an uncontrollable firestorm. The 1991 Oakland Hills fire flared up late morning, also on a Sunday, and surged out of the mountains into the city, driven by hot, dry, 60 mile-per-hour Diablo winds from the east, jumping an eight-lane highway and overwhelming the ability of the fire crews to hold the line, until the wind eventually turned and the fire blew back over its own embers. The fire consumed 2,800 houses, spreading so fast that 25 people died. On February 7, 2009 a strong northwesterly wind drew baking air out of Australia’s interior and fires took off across the state of Victoria. Fallen power cables sparked a fire whose embers, blown by 60 mile-per-hour winds, flashed from one woodland to another, overwhelming several small towns so fast that 173 died before they could escape.

Most recently we have seen firestorms in Canada. Again there is nothing new about the phenomenon: the Matheson fire of 1916 destroyed 49 Ontario townships and killed 244 people along a fire front that extended 60 km wide. It was a firestorm fanned by gale-force winds that destroyed one third of the town of Slave Lake, Alberta, in 2011, and it is fortunate that the roads were broad and straight enough to allow people to escape the fires that raged into Fort McMurray in May 2016.

There is no remedy for a firestorm blown on gale-force winds, and wooden properties close to drought-ridden forests are at very high risk: from South Lake Tahoe to Berkeley in California, and from Canberra in Australia to Christchurch in New Zealand. This is why urban fire needs to stay on the agenda of catastrophe risk management. A wind-driven conflagration can blow deep into any timber city, and insurers need to manage their exposure concentrations.