Author Archives: Robert Muir-Wood

About Robert Muir-Wood

Chief Research Officer, RMS
Robert Muir-Wood works to enhance approaches to natural catastrophe modeling, identify models for new areas of risk, and explore expanded applications for catastrophe modeling. Recently, he has been focusing on identifying the potential locations and consequences of magnitude 9 earthquakes worldwide. In 2012, as part of Mexico's presidency of the G20, he helped promote government usage of catastrophe models for managing national disaster risks. Robert has more than 20 years of experience developing probabilistic catastrophe models. He was lead author for the 2007 IPCC 4th Assessment Report and 2011 IPCC Special Report on Extremes, is a member of the Climate Risk and Insurance Working Group for the Geneva Association, and is vice-chair of the OECD panel on the Financial Consequences of Large Scale Catastrophes. He is the author of six books, as well as numerous papers and articles in scientific and industry publications. He holds a degree in natural sciences and a PhD in Earth sciences, both from Cambridge University.

Friday 13th and the Long-Term Cost of False Alarms

If the prospect of flooding along the East Coast of England earlier this month was hard to forecast, the newspaper headlines the next day were predictable enough:

Floods? What floods? Families’ fury at evacuation order over storm surge … that never happened (Daily Mail)

East coast residents have derided the severe storm warnings as a ‘load of rubbish’ (The Guardian)

Villagers shrug off storm danger (The Times)

The police had attempted an evacuation of some communities and the army was on standby, because of warnings of a ‘catastrophic’ North Sea storm surge on January 13, for which the UK Environment Agency issued its highest level of flood warning along parts of the East Coast: ‘severe’, indicating a danger to life. And yet the flooding did not materialize.

Water levels were 1.2m lower along the Lincolnshire coast than those experienced in the last big storm surge flood in December 2013, and 0.9m lower around the Norfolk towns of Great Yarmouth and Lowestoft. Predicting the future in such complex situations, even very near-term, always has the potential to make fools of the experts. But there’s a pressure on public agencies, knowing the political fallout of missing a catastrophe, to adopt the precautionary principle and take action. Imagine the set of headlines, and ministerial responses, if there had been no warnings followed by loss of life.

Interestingly, most of those who had been told to evacuate as this storm approached chose to stay in their homes. One police force in Essex knocked on 2,000 doors, yet only 140 of those residents registered at an evacuation centre. Why did the others ignore the warnings and stay put? Media reports suggest that many felt this was another false alarm.

The precautionary principle might seem prudent, but a false-alarm forecast can encourage people to ignore future warnings. Recent years offer numerous examples of the consequences.

The Lessons of History

Following a 2006 Mw8.3 earthquake offshore from the Kurile Islands, tsunami evacuation warnings were issued all along the Pacific coast of northern Japan, where the tsunami that did arrive was harmless. For many people that experience weakened the imperative to evacuate after feeling the three-minute shaking of the March 2011 Mw9 earthquake, following which 20,000 people were drowned by the tsunami. Based on the fear of what happened in 2004 and 2011, today tsunami warnings are being ‘over-issued’ in many countries around the Pacific and Indian Oceans.

For the inhabitants of New Orleans, the evacuation order issued in advance of Hurricane Ivan in September 2004 (when one third of the city’s population moved out, while the storm veered away) left many sceptical about the mandatory evacuation issued in advance of Hurricane Katrina in August 2005 (after which around 1,500 drowned).

Agencies whose job it is to forecast disasters know only too well what happens if they fail to issue a warning when a risk looms. However, the long-term consequences of false alarms are perhaps not made explicit enough. While risk models to calculate those consequences are not yet available, a simple hypothetical calculation illustrates the basic principles of how such a model might work:

  • the chance of a dangerous storm surge in the next 20 years is 10 percent, for a given community;
  • if this happens, then let’s say 5,000 people would be at grave risk;
  • because of a recent ‘false’ alarm, one percent of those residents will ignore evacuation orders;
  • thus the potential loss of life attributed to the false alarm is five people.

Now repeat with real data.
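As a minimal sketch of how such a calculation might be coded, using only the hypothetical numbers from the list above (nothing here is an actual RMS model or real data):

```python
# Hypothetical "false alarm" risk calculation, mirroring the bullet list above.
# All inputs are illustrative placeholders, not real data.

def lives_at_risk_from_false_alarm(p_surge_20yr: float,
                                   people_at_grave_risk: int,
                                   share_ignoring_warnings: float) -> float:
    """Expected additional lives at risk attributable to one false alarm."""
    return p_surge_20yr * people_at_grave_risk * share_ignoring_warnings

# 10% chance of a dangerous surge in 20 years, 5,000 people at grave risk,
# 1% of residents now ignoring evacuation orders because of the false alarm:
print(lives_at_risk_from_false_alarm(0.10, 5_000, 0.01))  # -> 5.0
```

Repeating this with real exposure data, and weighing it against the lives saved by a justified warning, is the balance such a model would need to strike.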

Forecasting agencies need a false-alarm risk model to help balance their decisions about when to issue severe warnings. There is an understandable instinct to be over-cautious in the short term, but when measured in terms of future lives lost, disaster warnings need to be carefully rationed. And that rationing requires political support, as well as public education.

[Note: RMS models storm surge in the U.K. where the risk is highest along England’s East Coast – the area affected by flood warnings on January 13. Surge risk is complex, and the RMS Europe Windstorm Model™ calculates surge losses caused by extra-tropical cyclones considering factors such as tidal state, coastal defenses, and saltwater contamination.]

The Cost of Shaking in Oklahoma: Earthquakes Caused by Wastewater Disposal

It was back in 2009 that the inhabitants of northern Oklahoma first noticed the vibrations. Initially only once or twice a year, but then every month, and even every week. It was disconcerting rather than damaging until November 2011, when a magnitude 5.6 earthquake broke beneath the city of Prague, Okla., causing widespread damage to chimneys and brick veneer walls, but fortunately no casualties.

The U.S. Geological Survey had been tracking this extraordinary outburst of seismicity. Before 2008, across the central and eastern U.S., there were an average of 21 earthquakes of magnitude three or higher each year. Between 2009 and 2013 that annual average increased to 99 earthquakes in Oklahoma alone, rising to 659 in 2014 and more than 800 in 2015.


During the same period the oil industry in Oklahoma embarked on a dramatic expansion of fracking and conventional oil extraction. Both activities were generating a lot of waste water. The cheapest way of disposing of the brine was to inject it deep down boreholes into the 500-million-year-old Arbuckle sedimentary formation. The volume being pumped there increased from 20 million barrels in 1997 to 400 million barrels in 2013. Today there are some 3,500 disposal wells in the state of Oklahoma, down which more than a million barrels of saline water are pumped every day.

It became clear that the chatter of Oklahoma earthquakes was linked with these injection wells. The way that raising deep fluid pressures can generate earthquakes has been well-understood for decades: the fluid ‘lubricates’ faults that are already poised to fail.

But induced seismicity is an issue for energy companies worldwide, not just in the South Central states of the U.S. And it presents a challenge for insurers, as earthquakes don’t neatly label themselves ‘induced’ or ‘natural.’ So losses from induced events will also be picked up by property insurers writing earthquake extensions to standard coverages, as well as, potentially, by the insurers covering the liabilities of the deep disposal operators.

Investigating the Risk

Working with Praedicat, which specializes in understanding liability risks, RMS set out to develop a solution by focusing first on Oklahoma, framing two important questions regarding the potential consequences for the operators of the deep disposal wells:

  • What is the annual risk cost of all the earthquakes with the potential to be induced by a specific injection well?
  • In the aftermath of a destructive earthquake how could the damage costs be allocated back to the nearby well operators most equitably?

In Oklahoma detailed records have been kept on all fluid injection activities: well locations, depths, rates of injection. There is also data on the timing and location of every earthquake in the state. By linking these two datasets the RMS team was able to explore what connects fluid disposal with seismicity. We found, for example, that both the depth of a well and the volume of fluid disposed increased the tendency to generate seismic activity.
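As an illustration of how such a dataset linkage might be set up, here is a minimal sketch (the file names, column names, and the 15 km association radius are assumptions for the example, not the actual RMS workflow):

```python
# Sketch: associate each earthquake with nearby disposal wells to explore
# how injection depth and volume relate to seismicity.
# File names, columns and the 15 km radius are illustrative assumptions.
import numpy as np
import pandas as pd

wells = pd.read_csv("ok_disposal_wells.csv")   # well_id, lat, lon, depth_m, monthly_bbl
quakes = pd.read_csv("ok_earthquakes.csv")     # event_id, lat, lon, mag, time

def km_between(lat1, lon1, lat2, lon2):
    """Approximate great-circle (haversine) distance in kilometres."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

RADIUS_KM = 15.0  # association radius (assumption)
records = []
for q in quakes.itertuples():
    d = km_between(q.lat, q.lon, wells["lat"].values, wells["lon"].values)
    nearby = wells[d <= RADIUS_KM]
    records.append({
        "event_id": q.event_id,
        "mag": q.mag,
        "n_nearby_wells": len(nearby),
        "total_monthly_bbl": nearby["monthly_bbl"].sum(),
        "mean_well_depth_m": nearby["depth_m"].mean(),
    })

linked = pd.DataFrame(records)
# A first, crude look at the relationship described in the text:
print(linked[["mag", "total_monthly_bbl", "mean_well_depth_m"]].corr())
```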

Not all earthquakes in the central U.S. are shallow, human-induced events. The notorious New Madrid, Mo. earthquakes of 1811-1812 demonstrated the enormous capacity for ‘natural’ seismicity in the central U.S., which can, albeit infrequently, produce earthquakes with magnitudes in excess of M7. However, there remains the question of the maximum magnitude of an induced earthquake in Oklahoma. Based on worldwide experience, the upper limit is generally assumed to be around M6 to 6.5.

Who Pays – and How Much?

From our studies of the induced seismicity in the region, RMS can now calculate the expected total economic loss from potential earthquakes using the RMS North America Earthquake Model. To do so we run a series of shocks, at quarter magnitude intervals, located at the site of each injection well. Having assessed the impact at a range of different locations, we’ve found dramatic differences in the risk costs for a disposal well in a rural area in contrast to a well near the principal cities of central Oklahoma. Reversing this procedure we have also identified a rational and equitable process which could help allocate the costs of a damaging earthquake back to all the nearby well operators. In this, distance will be a critical factor.
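One way to picture such an allocation, purely as an illustration: weight each nearby well by a decreasing function of its distance from the earthquake. The inverse-square weighting and the 20 km cut-off below are assumptions for the sketch, not the process RMS has developed.

```python
# Illustrative distance-based allocation of a damaging earthquake's cost
# across nearby disposal wells. Weighting scheme and cut-off are assumptions.
from typing import Dict

def allocate_cost(total_cost: float,
                  well_distances_km: Dict[str, float],
                  max_distance_km: float = 20.0) -> Dict[str, float]:
    """Split total_cost across wells within max_distance_km, weighting each
    well by 1 / distance^2 so that closer wells bear a larger share."""
    weights = {
        well: 1.0 / max(dist, 1.0) ** 2   # floor at 1 km to avoid a blow-up at zero
        for well, dist in well_distances_km.items()
        if dist <= max_distance_km
    }
    total_weight = sum(weights.values())
    return {well: total_cost * w / total_weight for well, w in weights.items()}

# Hypothetical example: a $50 million loss and three wells at 3, 8 and 15 km.
print(allocate_cost(50e6, {"well_A": 3.0, "well_B": 8.0, "well_C": 15.0}))
```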

Modeling Advances for Manmade Earthquakes

For carriers writing U.S. earthquake coverage for homeowners and businesses there is also concern about potential losses from this phenomenon. Hence the updated RMS North America Earthquake Model, to be released in spring 2017, will include a tool for calculating property risk from induced seismicity in affected states: not just Oklahoma but also Kansas, Ohio, Arkansas, Texas, Colorado, New Mexico, and Alabama. The scientific understanding of induced seismicity and its consequences is rapidly evolving, and RMS scientists are closely following these developments.

As for Oklahoma, the situation is becoming critical as the seismic activity shows no signs of stopping: a swarm of induced earthquakes has erupted beneath the largest U.S. inland oil storage depot at Cushing and in September 2016 there was a moment magnitude 5.8 earthquake located eight miles from the town of Pawnee – which caused serious damage to buildings. Were a magnitude 6+ earthquake to hit near Edmond (outside Oklahoma City) our modeling shows it could cause billions of dollars of damage.

The risk of seismicity triggered by the energy industry is a global challenge, with implications far beyond Oklahoma. For example Europe’s largest gas field, in the Netherlands, is currently the site of damaging seismicity. And in my next blog, I’ll be looking at the consequences.

[For a wider discussion of the issues surrounding induced seismicity please see these Reactions articles, for which Robert Muir-Wood was interviewed.]

Shrugging Off a Hurricane: A Three Hundred Year Old Culture of Disaster Resilience

If a global prize was to be awarded to the city or country that achieves the peak of disaster resilience, Bermuda might be a fitting first winner.

This October’s Hurricane Nicole made direct landfall on the island. The eyewall tracked over Bermuda with maximum measured windspeeds close to 120 mph. Nonetheless there were no casualties. The damage was principally fallen trees, roadway debris, some smashed boats and many downed utility poles. The airport reopened within 24 hours, with the island’s ferries operating the following day.

Bermuda’s performance through Nicole was exemplary. What’s behind that?

Since its foundation in 1609, when 150 colonists and crew were shipwrecked on the island, Bermuda has got used to its situation at the heart of hurricane alley. The island comprises 21 square miles of reef and lithified dunes, sitting out in the Atlantic some 650 miles east of Cape Hatteras, and a hurricane hits it on average once every six or seven years. Mostly these are glancing blows, but once or twice a century Bermuda sustains a direct hit at Category 3 or 4 intensity. Hurricane Fabian in 2003 was the worst of the recent storms, causing $300 million of damage (estimated to be worth $650 million, accounting for today’s higher prices and greater property exposure). The cost of the damage from Hurricane Gonzalo in 2014 was about half this amount.

How did Bermuda’s indigenous building style come to adopt such a high standard of wind resistance? It seems to go back to a run of four hurricanes at the beginning of the 18th century. First, in September 1712, a hurricane persisted for eight hours, destroying the majority of the island’s wooden buildings. Then twice in 1713, and again more strongly in 1715, hurricane winds ruined the newly rebuilt churches. One hurricane can seem like an exception; four become a trend. In response, houses were constructed with walls of massive reef limestone blocks, covered by roofs tiled with thick slabs of coral stone: a traditional style that has been sustained ever since.

The frequency of hurricanes has helped stress test the building stock, and ensure the traditional construction styles have been sustained. More recently there has been a robust and well-policed building code to ensure adequate wind resistance for all new construction on the island.

Yet resilience is more than strong buildings. It also requires hardened infrastructure, and that is where Bermuda has some room for improvement. With the island still dependent on overhead power lines, 90 percent of its 27,000 houses lost power in Hurricane Nicole – although half of these had been reconnected by the following morning and the remainder through that day. Mobile phone and cable networks were also back in operation over a similar timescale. Experience of recent hurricanes has ensured an adequate stockpile of cable and poles.

Expert Eyes on the Island

It helps that there is an international reinsurance industry on the island, with many specialists in the science of hurricanes and the physics and engineering of building performance on hand to scrutinize the application of improved resilience. Almost every building is insured, giving underwriters oversight of building standards. Most importantly, the very functioning of global reinsurance depends on uninterrupted connection with the rest of the world, as well as ensuring that on-island staff are not distracted by having to attend to their family’s welfare.

Bermuda’s experience during Nicole would merit the platinum standard of resilience adopted by the best businesses: that all functions can be restored within 72 hours of a disaster. The Bermuda Business Development Agency and the Association of Bermuda Insurers and Reinsurers were fulsome in their praise for how the island had withstood the hurricane. This strong and widely owned culture of preparedness reflects the experience of recent storms like Gonzalo and Fabian.

Stephen Weinstein, general counsel at RenaissanceRe, commented “It’s remarkable that one day after a major hurricane strike, Bermuda is open for business, helping finance disaster risk worldwide, and poised to welcome back business visitors and vacationers alike.”

In early 2017, RMS will issue an update to Bermuda wind vulnerability in the version 17 software release as part of a broader update to the 33 islands and territories covered by the North Atlantic Hurricane Models. Updates to Bermuda vulnerability will consider past hurricane observations and the latest building code research.

New Zealand Earthquake – Early Perspectives

On Monday 14 November 2016, Dr Robert Muir-Wood, RMS chief research officer, an earthquake expert and specialist in catastrophe risk management, made the following observations about the earthquake near Amberley:

SCALE
“The November 13 earthquake was assigned a magnitude 7.8 by the United States Geological Survey. That makes it more than fifty times bigger than the February 2011 earthquake which occurred directly beneath Christchurch. However, it was still around forty times smaller than the Great Tohoku earthquake off the northeast coast of Japan in March 2011.”

CASUALTIES, PROPERTY DAMAGE & BUSINESS INTERRUPTION
“Although it was significantly bigger than the Christchurch earthquake, the source of the earthquake was further from major exposure concentrations. The northeast coast of South Island has a very low population and the earthquake occurred in the middle of the night when there was little traffic on the coast road. Characteristic of such an earthquake in steep mountainous terrain, there have been thousands of landslides, some of which have blocked streams and rivers – there is now a risk of flooding downstream when these “dams” break.

In the capital city, Wellington, liquefaction and slumping on man-made ground around the port has damaged some quays and made it impossible for the ferry that runs between North and South Island to dock. The most spectacular damage has come from massive landslides blocking the main coast road Highway 1 that is the overland connection from the ferryport opposite Wellington down to Christchurch. This will take months or even years to repair. Therefore it appears the biggest consequences of the earthquake can be expected to be logistical, with particular implications for any commercial activity in Christchurch that is dependent on overland supplies from the north. As long as the main highway remains closed, ferries may have to ship supplies down to Lyttelton, the main port of Christchurch.”

SEISMOLOGY
“The earthquake appears to have occurred principally along the complex fault system in the north-eastern part of the South Island, where the plate tectonic motion between the Pacific and Australian plates transfers from subduction along the Hikurangi Subduction Zone to strike-slip along the Alpine Fault System. Faults in this area strike predominantly northeast-southwest and show a combination of thrust and strike-slip motion. From its epicenter the rupture unzipped towards the northeast for about 100-140 km, propagating towards the capital city Wellington, some 200 km away.”

WHAT NOW?
“Given the way the rupture spread to the northeast there is some potential for a follow-on major earthquake on one of the faults running beneath Wellington. The chances of a follow-on major earthquake are highest in the first few days after a big earthquake, and tail off exponentially. Aftershocks are expected to continue to be felt for months.”

MODELING
“These events occurred on multiple fault segments in close proximity to one another. The technology to model this type of complex rupture is now available in the latest RMS high-definition New Zealand Earthquake Model (2016), in which fault segments may interconnect under certain conditions.”

India’s Need for Disaster Risk Reduction: Can it Turn a Plan into Action?

This was the first time I’d ever heard a Prime Minister praising the benefits of “risk mapping.” Mid-morning on Thursday November 3 in a vast tent in the heart of New Delhi, the Indian Prime Minister, Narendra Modi, was delivering an introductory address to welcome four thousand delegates to the 2016 Asian Ministerial Conference on Disaster Risk Reduction.

Modi mentioned his own personal experience of disaster recovery after the 2001 Gujarat earthquake in which more than 12,000 people died, before presenting a ten-point plan of action in response to the 2015 Sendai Framework for disaster risk reduction. There were no guarantees of new regulations or changes in policy, but three of his ten points were particularly substantive.

First, there was a call for appropriate protections against the relevant hazards at each location to be applied to all government-sponsored construction of infrastructure or housing. Second, he called for “work towards” achieving universal “coverage” (insurance, if not by name) against disasters – from the poorest villager to big industries and state governments. Third, he called for standardized hazard and risk mapping to be developed not only for earthquake but also for other perils: chemical hazards, cyclones, all varieties of flood, and forest fires.

More Economic Development Means More Exposure to Risk

India is at a development threshold, comparable to that reached by Japan at the end of the 1950s and China in the 1990s. Rapid economic growth has led to a dramatic expansion of buildings and value in harm’s way, and there now needs to be a significant compensatory focus on measures to reduce risk and expand protections, whether through insurance systems or flood walls. Development in India has been moving too fast to hope that adequate building standards are being consistently followed – there are not enough engineers or inspectors.

The Chennai floods at the end of 2015 highlighted this disaster-prone landscape. Heavy end-of-year monsoonal downpours fell onto ground already saturated by weeks of rainfall; the runoff was then ponded by choked drainage channels and illegal development, swamping hundreds of thousands of buildings along with roads and even the main airport. The city was cut off, economic losses totaled billions of U.S. dollars, and more than 1.8 million people were displaced.

Sorting out Chennai will take co-ordinated government action and money: to implement new drainage systems, relocate or raise those at highest risk, and apply flood zonations. Chennai provides a test of whether disaster risk reduction really is a priority, as Mr. Modi’s speech suggested. The response will inevitably encounter opposition from those who cannot see why they should be forced to relocate or pay more in taxes to construct flood defenses.

The one community notably missing from Prime Minister Modi’s call to action was the private sector, even though a pre-conference session the day before, organized by the Federation of Indian Chambers of Commerce and Industry (FICCI), had identified that 80% of construction was likely to be privately financed.

I gave two talks at the conference – one in the private sector session – on how modelers like RMS have taken a lead in developing those risk maps and models for India, including high resolution flood models that will help extend insurance. Yet armed with information by which to differentiate risk and identify the hot spots, the government may need to step in and provide its own coverages for those deemed too high risk by private insurers.

Auditing Disaster Risk Reduction with Cat Models

In a side meeting at the main conference I presented on the need for independent risk audits of states and cities, to measure progress towards their disaster risk reduction goals, in particular for earthquake mortality. Experience from the last few decades gives no perspective on the true risk of potentially large and destructive future earthquakes in India; this is where probabilistic catastrophe models are invaluable. The Nepal earthquake of 2015 highlighted the significant vulnerability of ordinary brick and concrete buildings in the region.

I came away seeing the extraordinary opportunity to reduce and insure risk in India, if ten-point lists can truly be converted into co-ordinated action.

Meanwhile, as if to test the government’s resolve, in the days leading up to the conference Delhi was shrouded in its worst-ever smog: a toxic concoction of traffic fumes, coal smoke, and Diwali fireworks, enriched to extremely dangerous levels of micro-particles. The smog was so thick and pervasive that it seeped inside buildings, prompting several attendees to ask why it was not itself being classified and treated as a true “manmade disaster.”

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3,000 people lived in the vicinity of the strongest shaking, but nearly 1 in 10 of them died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made. Manufactured by what we build and where we build it; or in more subtle ways – by failing to anticipate what will one day inevitably happen.

Italy has some of the richest and best-researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact same earthquake happened in 1639. If you were identifying the highest-risk locations in Italy, these villages would be on your shortlist. So in the year 2016, 300 people dying in a well-anticipated, moderate-sized earthquake, in a rich and highly developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24 earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years earlier, €700,000 had been spent to make the school “earthquake proof.” An investigation is now underway into why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as at mobilizing disaster response: some 7,000 emergency responders arrived after the earthquake – more than twice as many as the people living in the affected villages.

The unnatural disaster

When we look back through history and investigate disasters closely, we find that many other “natural disasters” were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the “little Paris of the Caribbean.” In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear the neighbouring volcano of Mont Pelée was heading towards an eruption. The island’s governor convened a panel of experts who concluded Saint-Pierre was at no risk, because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the Governor brought his family to Saint-Pierre to show the city was safe; they died, along with likely all but one of the city’s inhabitants, when the eruption blasted sideways out of the volcano. There are some parallels here with the story of those 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not bother to escape while they still had time. We should distrust simple notions of which places are safe when they rest only on untested theory.

Sometimes the disaster reflects the unforeseen consequence of some manmade intervention. In spring 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet (“Mr Go”) – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funnelled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later Hurricane Katrina performed the same trick on New Orleans, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.

RiskTech   

My new book, “The Cure for Catastrophe,” challenges us to think differently about disasters. To understand how risk is generated before the disaster happens. To learn from countries like Holland, which over the centuries mastered their ever-threatening flood catastrophes by fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools to model metrics on human casualties or livelihoods as well as monetary losses. And based on these measurements we can identify where to focus our investments in disaster reduction.

In 2015 the Tokyo City government was the first to announce it aims to halve its earthquake casualties and measure progress by using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see if they are making progress in saving future lives before they suffer their next inevitable earthquake.


The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

Fire Weather

Fires can start at all times and places, but how a fire spreads is principally down to the weather.

This week, 350 years ago, the fire at Thomas Farriner’s bakery on Pudding Lane, a small alleyway running down to the river from the City of London, broke out at the quietest time of the week: around 1 am on the Sunday morning of September 2, 1666. London had been experiencing a drought and the thatched roofs of the houses were tinder dry. At 4 am the Lord Mayor, roused from his sleep, decided the blaze was easily manageable. It was already too late, however. By 7 am the roofs of some 300 houses were burning and, fanned by strong easterly winds, the fire was spreading fast towards the west. Within three days the fire had consumed 13,000 houses and left 70,000 homeless.

In the city’s reconstruction only brick and tile houses were permitted, severely reducing the potential for repeat conflagrations. Within a few years the first fire insurers had appeared, growing their business as fear outran the risk.

Yet big city fires had by no means gone away, and the wooden cities of northern Europe were primed to burn. The 1728 Copenhagen fire destroyed 28% of the city, while the 1795 fire left 6,000 homeless. A quarter of the city of Helsinki burned down in November 1808. The 1842 fire that destroyed Hamburg left 20,000 homeless. The center of Bergen, Norway, burnt down in 1855 and then again in January 1916.

Wind and fire

By the start of the 20th century, improvements in fire-fighting had reduced the chance of a great city fire taking hold, but not when there were strong winds, as in the 1916 Bergen fire, which broke out in the middle of an intense windstorm with hurricane-force gusts. In February 1941 the fire that burnt out the historic center of Santander, on the coast of northern Spain, was driven by an intense windstorm equivalent to the October 1987 storm in the U.K. And then there is the firestorm that destroyed Yokohama and Tokyo after the 1923 earthquake, driven by 50 miles per hour winds on the outer edge of a typhoon, in which, over a few hours, an estimated 140,000 died.

Wind and fire in the wooden city are a deadly combination. Above a certain wind speed, the fire becomes an uncontrollable firestorm. The 1991 Oakland Hills fire flared up late one Sunday morning and surged out of the mountains into the city, driven by hot, dry, 60 miles per hour Diablo winds from the east, jumping an eight-lane highway and overwhelming the ability of the fire crews to hold the line, until the wind eventually turned and the fire blew back over its own embers. The fire consumed 2,800 houses, spreading so fast that 25 people died. On February 7, 2009 a strong northwesterly wind drew baking air out of Australia’s interior and fires took off across the state of Victoria. Fallen power cables sparked a fire whose embers, blown by 60 miles per hour winds, flashed from one woodland to another, overwhelming several small towns so fast that 173 people died before they could escape.

Most recently we have seen firestorms in Canada. Again, there is nothing new about the phenomenon: the Matheson fire of 1916 destroyed 49 Ontario towns and killed 244 people along a fire front some 60 km wide. It was a firestorm fanned by gale-force winds that destroyed one third of the town of Slave Lake, Alberta, in 2011, and it is fortunate that the roads were broad and straight enough to allow people to escape the fires that raged into Fort McMurray in May 2016.

There is no remedy for a firestorm blown on gale-force winds. And wooden properties close to drought-ridden forests are at very high risk, from South Lake Tahoe to Berkeley in California, and from Canberra in Australia to Christchurch in New Zealand. That is why urban fire needs to stay on the agenda of catastrophe risk management: a wind-driven conflagration can blow deep into any timber city, and insurers need to manage their exposure concentrations.

Using Insurance Claims Data to Drive Resilience

When disaster strikes for homeowners and businesses the insurance industry is a source of funds to pick up the pieces and carry on. In that way the industry provides an immediate benefit to society. But can insurers play an extended role in helping to reduce the risks for which they provide cover, to make society more resilient to the next disaster?

Insurers collect far more detailed and precise information on property damage than any other public sector or private organisation. Such claims data can provide deep insights into what determines damage – whether it’s the vulnerability of a particular building type or the fine scale structure of flood hazard.

While the data derived from claims experience helps insurers to price and manage their risk, it has not been possible to apply this data to reduce the potential for damage itself – but that is changing.

At a recent Organisation for Economic Co-operation and Development meeting in Paris on flood risk insurance we discussed new initiatives in Norway, France and Australia that harness and apply insurers’ claims experience to inform urban resilience strategies.

Norway Claims Data Reduces Flood Risk

In Norway the costs of catastrophes are pooled across private insurance companies, making it the norm for insurers to share their claims data with the Natural Perils Pool. Norwegian insurers have collaborated to make the sharing process more efficient, agreeing in 2008 on a standardized approach to address-level exposure and claims classifications covering all private, commercial, and public buildings. Once the classifications were consistent, it became clear that almost 70% of flood claims were driven by urban flooding from heavy rainfall.

Starting with a pilot of ten municipalities, including the capital Oslo, a group funded by the Norwegian finance and insurance sector took this address-level data to the city authorities to show exactly where losses were concentrated, so that city engineers could identify and implement remedial actions, whether larger storm drains or flood walls. As a result, flood claims are being reduced.

French Observatory Applies Lessons Learned from Claims Data

Another example comes from France, where natural catastrophe losses are refunded through the national ‘Cat Nat’ system. Property insureds pay an extra 12% premium to be covered. All the claims data generated in this process now gets passed to the national Observatory of Natural Risks, set up after Storm Xynthia in 2010. This unit uses the data to perform forensic investigations into what can be learnt from the claims, and then works with municipalities to see how to apply these lessons to reduce future losses. The French claims experience is not as comprehensive as Norway’s, because the data only gets collected when the state declares there has been a ‘Cat Nat’ event – which excludes some of the smaller, local losses that fail to reach the threshold of political attention.

Australian Insurers Forced Council to Act on Their Claims Data

In Australia, sharing claims data with a local council was the result of a provocative action by insurers frustrated by the political pressure to offer universal flood insurance following the major floods of 2011. Roma, a town in Queensland, had been inundated five times in six years – insurers mapped and published the addresses of the properties that had been repeatedly flooded and refused to renew their cover unless action was taken. The insurers’ campaign achieved its goal, pressuring the local council to fund flood alleviation measures across the town.

These examples highlight how insurers can help cities identify where their investments will deliver the most cost-effective risk reduction. All that’s needed is an appetite to find ways to process and deliver claims data in a format that provides the key insights city leaders need, without compromising confidentiality or privacy.

This is another exciting application in the burgeoning new field of resilience analytics.

Calculating the cost of “Loss and Damage”

The idea that rich, industrialized countries should be liable for paying compensation to poorer, developing ones damaged by climate change is one that has been disputed endlessly at recent international climate conferences.

The fear among rich countries is that they would be signing a future blank check. And the legal headaches in working out the amount of compensation don’t bear thinking about when there are likely to be arguments about whether vulnerable states have done enough to protect themselves.

The question of who pays the compensation bill may prove intractable for some years to come. But the scientific models already exist to make the working out of that bill more transparent.

Some context: in the early years of climate negotiations there was a single focus—on mitigating (or limiting) greenhouse gas emissions. Through the 1990s it became clear that atmospheric carbon dioxide was growing just as quickly as before, so a second mission was added: “adaptation” to the effects of climate change.

Now we have a third concept: “Loss and Damage” which recognizes that no amount of mitigation or adaptation will fully protect us from damages that can’t be stopped and losses that can’t be recovered.

Sufficient self-protection?

The Loss and Damage concept was originally developed by the Association of Small Island States, which saw themselves in the frontline of potential impacts from climate change, in particular around sea-level rise. By some projections at least four of the small island countries (Kiribati, Tuvalu, the Marshall Islands, and the Maldives) will be submerged by the end of this century.

Countries in such a predicament seeking compensation for their loss and damage will have to answer a difficult question: did they do enough to adapt to rising temperatures before asking other countries to help cover the costs? Rich countries will not look kindly on countries they deem to have done too little.

If money were no object, then adaptation strategies might seem limitless and nothing in the loss and damage world need be inevitable. Take sea level rise, for example. Even now in the South China Sea we see the Chinese government, armed with strategic will and giant dredgers, pumping millions of tons of sand so that submerged reefs can be turned into garrison town islands. New Orleans—a city that is 90% below sea level—is protected by a $14 billion flood wall.

But, clearly, adaptation is expensive and so the most effective strategies may be beyond the reach of poorer countries.

Calculating the cost with models

Through successive international conferences on climate change the legal and financial implications of loss and damage have seen diplomatic wrangling as richer and poorer nations argue about who’s going to foot the bill.

But we can conceptualize a scientific mechanism for tallying what that bill should be. It would need a combination of models to discriminate between costs that would have happened anyway and those that are attributable to climate change.

Firstly, we could use “attribution climate models” which run two versions of future climate change: one model is based on the atmosphere as it actually is in 2016 while the other “re-writes history” and supposes there’s been no increase in greenhouse gases since 1950.

By running these two models for thousands of simulation years, we can see the difference in the number of times a particular climate extreme occurs; that difference suggests how much of the extreme is down to greenhouse gas emissions. After this we would need to model how much adaptation could have reduced the loss and damage. An illustration:

  • A future extreme weather event might cause $100 billion damage.
  • Attribution studies show that the event has become twice as likely because of climate change.
  • Catastrophe models show the cost of the damage could have been halved with proper adaptation.
  • So, with half the damage attributable to emissions and half of that avoidable through adaptation, the official loss and damage could be declared as $25 billion (see the sketch below).
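A minimal sketch of that arithmetic, using the hypothetical figures above (the attribution step uses the standard 'fraction of attributable risk', 1 - 1/probability ratio; none of this is an agreed accounting method):

```python
# Illustrative loss-and-damage calculation for the hypothetical event above.
# Inputs and the formula are a sketch, not an agreed accounting standard.

def loss_and_damage(event_damage_usd: float,
                    probability_ratio: float,
                    adaptation_reduction: float) -> float:
    """probability_ratio: how many times more likely the event became because
    of greenhouse gas emissions (from attribution studies).
    adaptation_reduction: fraction of the damage that proper adaptation could
    have avoided (from catastrophe models)."""
    attributable_fraction = 1.0 - 1.0 / probability_ratio   # fraction of attributable risk
    attributable_damage = event_damage_usd * attributable_fraction
    return attributable_damage * (1.0 - adaptation_reduction)

# $100bn of damage, twice as likely due to emissions, halvable by adaptation:
print(loss_and_damage(100e9, 2.0, 0.5))   # -> 2.5e10, i.e. $25 billion
```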

While hardly a straightforward accounting device, it’s clear that this is a mechanism—albeit an impressively sophisticated one—that could be developed to calculate the bill for loss and damage due to climate change.

Leaving only the rather thorny question of who pays for it.

Mangroves and Marshes: A Shield Against Catastrophe?

“We believe that natural ecosystems protect against catastrophic coastal flood losses, but how can we prove it?”

This question was the start of a conversation in 2014 which has led to some interesting results. And it set us thinking: can RMS’ models, like the one which estimates the risk of surge caused by hurricanes, capture the protective effect of those natural ecosystems?

The conversation took place at a meeting on Coastal Defenses organized by the Science for Nature and People Partnership. RMS had been invited by one of our leading clients, Guy Carpenter, to join them. The partnership is organized by The Nature Conservancy, the Wildlife Conservation Society, and the National Center for Ecological Analysis and Synthesis.

We were confident we could help. Not only did we think our models would show how biological systems can limit flood impacts, we reckoned we could measure this and then quantify those benefits for people who calculate risk costs, and set insurance prices.

RMS’ modeling methodology uses a time-stepping simulation, relying on a specialist ocean-atmosphere model, which allows us to evaluate at fine resolution how the coastal landscape can actually reduce the storm surge—and in particular lower the height of waves. For many buildings the real weakness proves to be vulnerability to wave action rather than the damage done by water inundation alone.

The first phase of RMS’ work with The Nature Conservancy focuses on coastal marshes, as part of a project supported by a Lloyd’s Tercentenary Research Foundation grant to TNC and UC Santa Cruz. Under the supervision of Paul Wilson in the RMS model development team, and working with Mike Beck, lead marine scientist for The Nature Conservancy, the project concentrates on the coastlines that were worst impacted by the surge from Superstorm Sandy. The irregular terrain of the marsh, and the resulting frictional effects, reduce the surge height from the storm. Our work is showing that coastal marshes can reduce the flood risk costs of properties lying inland of the marshes by something in the range of 10-25%.
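As a toy illustration of how a marsh-driven reduction in surge height feeds through to a lower expected loss, here is a sketch (the depth-damage curve, surge depth, property value, and the 0.3 m marsh effect are all assumptions, not RMS model parameters):

```python
# Toy sketch: a marsh that shaves height off the storm surge reduces the loss
# implied by a depth-damage curve. All numbers are illustrative assumptions.

def damage_ratio(flood_depth_m: float) -> float:
    """Toy depth-damage curve: share of property value lost at a given depth."""
    return min(1.0, max(0.0, 0.25 * flood_depth_m))

def event_loss(surge_depth_m: float, property_value: float,
               marsh_surge_reduction_m: float = 0.0) -> float:
    """Loss for a single surge event, with the marsh reducing the effective depth."""
    effective_depth = max(0.0, surge_depth_m - marsh_surge_reduction_m)
    return property_value * damage_ratio(effective_depth)

# A 2.0 m surge against a $500,000 property, with a marsh assumed to
# knock 0.3 m off the surge at this location:
without_marsh = event_loss(2.0, 500_000)
with_marsh = event_loss(2.0, 500_000, marsh_surge_reduction_m=0.3)
print(1 - with_marsh / without_marsh)   # -> 0.15, i.e. a ~15% reduction in loss
```

Repeated across thousands of simulated events, and adding the wave-height effects described above, this is broadly the kind of calculation behind the 10-25% range quoted above.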

Tropical Defenses

So, that’s the effect of coastal marshes. But what about other biological defenses such as mangrove forests and offshore reefs (whether coral or oyster reefs)? Further research is planned in 2016 using RMS models to measure those likely benefits too.

But here’s a rather intriguing (if unscientific) thought: is there a curious Gaia-like principle of self-protection operating here, in that the most effective natural coastal protections—mangroves and coral reefs—are themselves restricted to the tropics and subtropics, the very regions where tropical cyclone storm surges pose the greatest threat? Mangroves cannot withstand frosts, and in their natural habitat they therefore extend only as far north along the Florida peninsula as Cape Canaveral. And yet in our shortsightedness we humans have removed those very natural features which could help protect us.

Paradise Lost?

Between 1943 and 1970 half a million acres of Florida mangroves were cleared to make way for smooth beaches—those beautiful and inviting stretches of pristine sand which have for decades attracted developers to build beachfront properties. Yet, paradoxically, that photogenic “nakedness” of sand and sea may be one of the things which leaves those properties most exposed to the elements.

With the backing of The Nature Conservancy, it seems mangroves are making a comeback: in Miami-Dade County a planting program is being examined to protect a large water treatment facility. Of course, biological systems can only reduce part of the flood risk. They can weaken the destructive storm surge, but the water still gets inland. Managing this might require designing buildings with water-resistant walls and floors, or could involve a hybrid of grey (manmade) and green defenses. And if we can reduce the destructive wave action, that might allow us to build earth embankments protected with turf in place of expensive and ugly, but wave-resistant, concrete flood walls.

On March 28, 2015 The Nature Conservancy organized a conference and press briefing in Miami at which they announced their collaboration with RMS to measure the benefits of natural coastal defenses. The coastline of Miami-Dade, already experiencing the effects of rising sea levels at high tide, presents real opportunities to test ways of combatting hurricane hazards and stronger storms through biological systems. Our continued work with The Nature Conservancy is intended to develop metrics that are widely trusted and can eventually be adopted for setting flood insurance prices in the National Flood Insurance Program.