The second Mexico multi-cat securitization was launched at the end of 2012 to provide reinsurance for the Fund for Natural Disasters (FONDEN) – established by the Mexican federal government as a disaster fund for the poor, which also finances disaster-damaged infrastructure. This three-year bond had a series of tranches covering earthquake or hurricane, each to be triggered by parametric “cat-in-a-box” structures. For one hurricane tranche, this was based on the central pressure of a storm passing into a large “box” drawn around the Pacific coasts of Mexico and Baja California, with the length of the box spanning well over a thousand miles. With a “U.S. National Hurricane Center (NHC) ratified” hurricane central pressure in the box of 920 millibars (mb) or lower, there would be a 100 percent payout of $100 million; for a central pressure between 932 mb and 920 mb, a 50 percent payout of $50 million.
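The appeal of a parametric trigger is that the payout logic reduces to a few lines of arithmetic on a single reported measurement. A minimal sketch, using the tiered thresholds quoted above (the function name and structure are illustrative, not the actual bond documentation):

```python
def fonden_hurricane_payout(central_pressure_mb, limit=100_000_000):
    """Illustrative sketch of the tiered "cat-in-a-box" payout
    described above. Thresholds come from the bond terms quoted
    in the text; everything else is a hypothetical simplification."""
    if central_pressure_mb <= 920:     # 920 mb or lower: full payout
        return limit                   # $100 million
    if central_pressure_mb <= 932:     # between 932 mb and 920 mb: half
        return 0.5 * limit             # $50 million
    return 0.0                         # storm too weak inside the box: no payout
```

A lower central pressure means a stronger storm, so the payout steps up as the measured pressure falls through each threshold; there is no loss adjustment, only the NHC-ratified number.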
For Texas, it all began with Hurricane Ike in 2008. In the run-up to and during the global financial crisis, between September 2006 and the last quarter of 2008, 780,000 jobs disappeared in the U.S. construction sector. Following Ike, with its generally modest levels of damage across a wide area, “two men in a pickup” teams turned up to get homeowners to sign “Assignment of Benefits” (AOB) forms. A signed AOB form gave repairers responsibility for dealing with the insurer over the claim, and the right to pass the case on to a lawyer assembling a class action lawsuit.
The original idea of contractors sourcing AOBs may have emerged after the 2004-2005 hurricanes in Florida, when lawyer Harvey V. Cohen of Cohen Grossman met with restoration contractors to encourage them to employ “Assignment of Benefits” forms so that the contractors would deal directly with the insurance company for their payments.
It would be hard to find a simpler example of a catastrophe. Details are emerging that the overall plan for managing fire risk in the 24-storey Grenfell Tower in North Kensington, London, centered on the assumption that as each of the 120 apartments in the tower block had a fire door, any fire would be contained long enough for the fire service to arrive. Meanwhile all those living in the unaffected apartments could conduct an orderly evacuation from the building. As a concrete building, with concrete floors and walls, it would be hard for a fire to spread.
The term “observer effect” in physics refers to how the act of making an observation changes the state of the system. To measure the pressure in a tire you have to let out some air. Measure the spin of an electron and it will change its state.
There is something similar about the “insurer effect” in catastrophe loss. If insurance is in place, the loss will be higher than if there is no insurer. We see this effect in many areas of insurance, but the “insurer effect” is now becoming an increasingly large contributor to disaster losses. In the U.S., trends in claiming behavior are having a bigger impact on catastrophe insurance losses than climate change.
“Some six months have passed since the magnitude (Mw) 6.7 earthquake struck Los Angeles County, with an epicenter close to the coast in Long Beach. Total economic loss estimates are more than $30 billion. Among the affected homeowners, the earthquake insurance take-up rates were pitifully low – around 14 percent. And even then, the punitive deductibles contained in their policies mean that homeowners may only recover 20 percent of their repair bills. So, there is a lot of uninsured loss looking for compensation. Now there are billboards with pictures of smiling lawyers inviting disgruntled homeowners to become part of class action lawsuits, directed at several oilfield operators located close to the fault. For there is enough of an argument to suggest that this earthquake was triggered by human activities.”
This is not a wild hypothesis with little chance of establishing liability, or the lawyers would not be investing in the opportunity. There are currently three thousand active oil wells in Los Angeles County. There is even an oil derrick on the grounds of Beverly Hills High School. Los Angeles County is second only to its northerly neighbor Kern County in terms of current levels of oil production in California. In 2013, the U.S. Geological Survey (USGS) estimated there were 900 million barrels of oil still to be extracted from the coastal Wilmington Field, which extends for some six miles (10 km) around Long Beach, from Carson to Belmont Shore.
However, the Los Angeles oil boom was back in the 1920s when most of the large fields were first discovered. Two seismologists at the USGS have now searched back through the records of earthquakes and oil field production – and arrived at a startling conclusion. Many of the earthquakes during this period appear to have been triggered by neighboring oil field production.
The Mw4.9 earthquake of June 22, 1920 had a shallow source that caused significant damage in a small area just a mile to the west of Inglewood. Local exploration wells releasing oil and gas pressures had been drilled at this location in the months before the earthquake.
A Mw4.3 earthquake in July 1929 at Whittier, some four miles (6 km) southwest of downtown Los Angeles, had a source close to the Santa Fe Springs oil field; one of the top producers through the 1920s, a field which had been drilled deeper and had a production boom in the months leading up to the earthquake.
A Mw5 earthquake occurred close to Santa Monica on August 31, 1930, in the vicinity of the Playa del Rey oilfield at Venice, California, a field first identified in December 1929 with production ramping up to four million barrels over the second half of 1930.
The epicenter of the Mw6.4 1933 Long Beach earthquake, on the Newport-Inglewood Fault, was in the footprint of the Huntington Beach oilfield at the southern end of this 47-mile-long (75 km) fault.
As for a mechanism – the Groningen gas field in the Netherlands shows how earthquakes can be triggered simply by the extraction of oil and gas, as reductions in load and compaction cause faults to break.
More Deep Waste Water Disposal Wells in California than Oklahoma
Today many of the Los Angeles oilfields are being managed through secondary recovery – pumping water into the reservoir to flush out the oil. This provides an additional potential mechanism for generating earthquakes – raising deep fluid pressures – as currently experienced in Oklahoma. And Oklahoma is not even the number one U.S. state for deep waste water disposal. Between 2010 and 2013 there were 9,900 active deep waste water disposal wells in California compared with 8,600 in Oklahoma. And the California wells tend to be deeper.
More than 75 percent of the state’s oil production and more than 80 percent of all injection wells are in Kern County, central California, which happens to be close to the largest earthquake in the region over the past century, on the White Wolf Fault: Mw7.3 in 1952. In 2005, there was an abrupt increase in the rates of waste water injection close to the White Wolf Fault, which was followed by an unprecedented swarm of four earthquakes over magnitude 4 on the same day in September 2005. The injection and the seismicity were linked in a research paper by Caltech and University of Southern California seismologists published in 2016. One neighboring well, delivering 57,000 cubic meters of waste water each month, was started just five months before the earthquake swarm broke out. The seismologists found a smoking gun: a pattern of smaller shocks migrating from the site of the well to the location of the earthquake cluster.
To summarize – we know that raising fluid pressures at depth can cause earthquakes, as is the case in Oklahoma, and also in Kern County, CA. We know there is circumstantial evidence for a connection between specific damaging earthquakes and oil extraction in southern California in the 1920s and 1930s. Wherever the next major earthquake strikes in southern or central California, there is a reasonable probability that an actively managed oilfield or waste water well will be in the vicinity.
Whoever is holding the liability cover for that operator may need some deep pockets.
The Groningen gas field, discovered in 1959, is the largest in Europe and produces up to 15 percent of the natural gas consumed across the continent. With original reserves of more than 100 trillion cubic feet, over the decades the field has been an extraordinary cash cow for the Dutch government and the two global energy giants, Shell and ExxonMobil, which partner in managing the field. In 2014 alone, state proceeds from Groningen were approximately €9.4 billion ($9.8 billion).
But now, costs to the Dutch government are mounting as the courts have ordered that compensation be paid to nearby property owners for damage caused by the earthquakes induced by extracting the gas. Insurers who were covering liabilities at the field now find that the claims have the potential to extend beyond the direct shaking damage to include the reduction in property values caused by this ongoing seismic crisis. And the potential for future earthquakes and their related damages has not disappeared – a situation which again illustrates the importance of modeling the risk costs of liability coverages, a new capability on which RMS is partnering with its sister company Praedicat.
The Groningen gas reservoir covers 700 square miles and, uniquely among giant gas fields worldwide, it is located beneath a well-populated and developed region. The buildings in this region, in which half a million people live and work, are not earthquake resistant: 90 percent of properties are made from unreinforced masonry (URM).
The ground above the gas field has been subsiding as the gas has vented out from the 10,000-foot-deep porous sandstone reservoir and the formation has compacted. This compaction helps squeeze the gas out of the reservoir, but has also led to movement on pre-existing faults that are present throughout the sandstone layer, a small number of which are more regional in extent. These sudden fault movements radiate earthquake vibrations.
How A Shake Became a Seismic Crisis
The first earthquake recorded at the field was in December 1991 with a magnitude of 2.4. The largest to date was in August 2012 with a magnitude of 3.6. In most parts of the world, such an earthquake would not have significant consequences, but on account of the shallow depth of the quake, thick soils and poor quality building construction in the Groningen area, there were more than 30,000 claims for property damage, dwarfing the total number from the previous two decades.
Since the start of 2014 the government has limited gas production in an attempt to manage the earthquakes, with some success. But the ongoing seismicity has had a catastrophic effect on the property market, compounded by a class-action lawsuit filed in 2015 on behalf of 900 homeowners and 12 housing co-operatives who had seen the value of their properties plummet. The judge ruled that owners of the real estate should be compensated for loss of their property’s market value, even when the property was not up for sale. The case is still rumbling on through the appeal courts, but if the earlier ruling stands, then estimates of the future liabilities for damage and loss of property value range from €6.5 billion to €30 billion.
Calculating the Risk
While earthquakes associated with gas and oil extraction are known from other fields worldwide, the massive financial risk at Groningen reflects the intersection of a moderate level of seismicity with a huge concentration of exposed value and very weak buildings. And although limiting production since 2014 has reduced the seismicity, there still remains the potential for further highly damaging earthquakes.
Calculating these risk costs requires a fully probabilistic assessment of the expected seismicity, across the full range of potential magnitudes and their annual probabilities. Each event in the simulation can be modeled using locally-calibrated ground motion data as well as expected property vulnerabilities, based on previous experience from the 2012 earthquake.
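The probabilistic calculation described above can be sketched in miniature: simulate many years, draw event occurrences from their annual rates, and apply a damage ratio to the exposed value. The event set and damage ratios below are made-up placeholders standing in for real ground-motion and vulnerability modeling, not RMS model output:

```python
import random

def expected_annual_loss(event_set, exposure_value, n_years=200_000, seed=1):
    """Monte Carlo sketch of an average annual loss calculation.
    `event_set` is a list of (annual_rate, mean_damage_ratio) pairs,
    a toy stand-in for a stochastic event set in which each entry
    would carry its own ground-motion and vulnerability modeling."""
    rng = random.Random(seed)
    total_loss = 0.0
    for _ in range(n_years):
        for annual_rate, damage_ratio in event_set:
            # Approximate a Poisson occurrence with a Bernoulli draw,
            # acceptable for the small annual rates typical of earthquakes.
            if rng.random() < annual_rate:
                total_loss += damage_ratio * exposure_value
    return total_loss / n_years

# Toy event set: a frequent minor quake and a rare damaging one.
toy_events = [(0.10, 0.01), (0.005, 0.30)]
aal = expected_annual_loss(toy_events, exposure_value=1_000_000)
```

Analytically this toy portfolio has an average annual loss of 0.10 × 0.01 × 1,000,000 plus 0.005 × 0.30 × 1,000,000, i.e. 2,500, which the simulation approaches as the number of simulated years grows.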
There is also the question of how far beyond actual physical damage the liabilities have the potential to extend, and of where future earthquakes could affect house values. The situation at Groningen, where it took almost thirty years of production before the earthquakes began, highlights the need for detailed risk analysis of all energy liability insurance covers for gas and oil extraction.
If the prospect of flooding along the East Coast of England earlier this month was hard to forecast, the newspaper headlines the next day were predictable enough:
Villagers shrug off storm danger (The Times)
The police had attempted an evacuation of some communities and the army was on standby. This was because of warnings of a ‘catastrophic’ North Sea storm surge on January 13, for which the UK Environment Agency applied its highest level of flood warning along parts of the East Coast: ‘severe’, which represents a danger to life. And yet the flooding did not materialize.
Water levels were 1.2m lower along the Lincolnshire coast than those experienced in the last big storm surge flood in December 2013, and 0.9m lower around the Norfolk towns of Great Yarmouth and Lowestoft. Predicting the future in such complex situations, even very near-term, always has the potential to make fools of the experts. But there’s a pressure on public agencies, knowing the political fallout of missing a catastrophe, to adopt the precautionary principle and take action. Imagine the set of headlines, and ministerial responses, if there had been no warnings followed by loss of life.
Interestingly, most of those who had been told to evacuate as this storm approached chose to stay in their homes. One police force in Essex knocked on 2,000 doors, yet only 140 of those people registered at an evacuation centre. Why did the others ignore the warnings and stay put? Media reports suggest that many felt this was another false alarm.
The precautionary principle might seem prudent, but a false alarm forecast can encourage people to ignore future warnings. Recent years offer numerous examples of the consequences.
The Lessons of History
Following a 2006 Mw8.3 earthquake offshore from the Kurile Islands, tsunami evacuation warnings were issued all along the Pacific coast of northern Japan, where the tsunami that did arrive was harmless. For many people that experience weakened the imperative to evacuate after feeling the three-minute shaking of the March 2011 Mw9 earthquake, following which 20,000 people were drowned by the tsunami. Based on the fear of what happened in 2004 and 2011, today tsunami warnings are being ‘over-issued’ in many countries around the Pacific and Indian Oceans.
For the inhabitants of New Orleans, the evacuation order issued in advance of Hurricane Ivan in September 2004 (when one third of the city’s population moved out, while the storm veered away) left many sceptical about the mandatory evacuation issued in advance of Hurricane Katrina in August 2005 (after which around 1,500 drowned).
Agencies whose job it is to forecast disaster know only too well what happens if they don’t issue a warning as any risk looms. However, the long-term consequences from false alarms are perhaps not made explicit enough. While risk models to calculate the consequence are not yet available, a simple hypothetical calculation illustrates the basic principles of how such a model might work:
- the chance of a dangerous storm surge in the next 20 years is 10 percent, for a given community;
- if this happens, then let’s say 5,000 people would be at grave risk;
- because of a recent ‘false’ alarm, one percent of those residents will ignore evacuation orders;
- thus the potential loss of life attributed to the false alarm is five people.
Now repeat with real data.
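The back-of-envelope model above reduces to a single multiplication. As a sketch, using the same hypothetical numbers from the bullet points:

```python
def false_alarm_life_cost(p_event, people_at_risk, ignore_fraction):
    """Expected lives lost attributable to one false alarm: the chance
    the real event occurs, times the people at grave risk, times the
    fraction who now ignore the evacuation order. A deliberately
    simple model of the principle described in the text."""
    return p_event * people_at_risk * ignore_fraction

# The hypothetical inputs from the text: a 10 percent chance of a
# dangerous surge over 20 years, 5,000 people at grave risk, and
# 1 percent of residents ignoring the next warning.
lives = false_alarm_life_cost(0.10, 5000, 0.01)
```

With real hazard probabilities, population data, and surveyed compliance rates per community, the same three-factor structure could start to quantify the cost of over-issuing warnings.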
Forecasting agencies need a false alarm forecast risk model to be able to help balance their decisions about when to issue severe warnings. There is an understandable instinct to be over cautious in the short-term, but when measured in terms of future lives lost, disaster warnings need to be carefully rationed. And that rationing requires political support, as well as public education.
[Note: RMS models storm surge in the U.K. where the risk is highest along England’s East Coast – the area affected by flood warnings on January 13. Surge risk is complex, and the RMS Europe Windstorm Model™ calculates surge losses caused by extra-tropical cyclones considering factors such as tidal state, coastal defenses, and saltwater contamination.]
It was back in 2009 that the inhabitants of northern Oklahoma first noticed the vibrations. Initially only once or twice a year, but then every month, and even every week. It was disconcerting rather than damaging until November 2011, when a magnitude 5.6 earthquake broke beneath the city of Prague, Okla., causing widespread damage to chimneys and brick veneer walls, but fortunately no casualties.
The U.S. Geological Survey had been tracking this extraordinary outburst of seismicity. Before 2008, across the central and eastern U.S., there was an average of 21 earthquakes of magnitude three or higher each year. Between 2009 and 2013 that annual average increased to 99 earthquakes in Oklahoma alone, rising to 659 in 2014 and more than 800 in 2015.
During the same period the oil industry in Oklahoma embarked on a dramatic expansion of fracking and conventional oil extraction. Both activities were generating a lot of waste water. The cheapest way of disposing of the brine was to inject it deep down boreholes into the 500-million-year-old Arbuckle Sedimentary Formation. The volume being pumped there increased from 20 million barrels in 1997 to 400 million barrels in 2013. Today there are some 3,500 disposal wells in the state of Oklahoma, down which more than a million barrels of saline water are pumped every day.
It became clear that the chatter of Oklahoma earthquakes was linked with these injection wells. The way that raising deep fluid pressures can generate earthquakes has been well-understood for decades: the fluid ‘lubricates’ faults that are already poised to fail.
But induced seismicity is an issue for energy companies worldwide, not just in the South Central states of the U.S. And it presents a challenge for insurers, as earthquakes don’t neatly label themselves ‘induced’ or ‘natural.’ So their losses will also be picked up by property insurers writing earthquake extensions to standard coverages, as well as potentially by the insurers covering the liabilities of the deep disposal operators.
Investigating the Risk
Working with Praedicat, which specializes in understanding liability risks, RMS set out to develop a solution by focusing first on Oklahoma, framing two important questions regarding the potential consequences for the operators of the deep disposal wells:
- What is the annual risk cost of all the earthquakes with the potential to be induced by a specific injection well?
- In the aftermath of a destructive earthquake how could the damage costs be allocated back to the nearby well operators most equitably?
In Oklahoma detailed records have been kept on all fluid injection activities: well locations, depths, rates of injection. There is also data on the timing and location of every earthquake in the state. By linking these two datasets the RMS team was able to explore what connects fluid disposal with seismicity. We found, for example, that both the depth of a well and the volume of fluid disposed increased the tendency to generate seismic activity.
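One simple way to begin linking the two datasets, sketched below with made-up records and field names (the actual RMS analysis is more sophisticated), is to count catalog earthquakes within some radius of each well and compare that count against the well's injection volume and depth:

```python
import math

def quake_count_near_well(well, quakes, radius_km=15.0):
    """Count catalog earthquakes within radius_km of a disposal well,
    using a flat-earth distance approximation that is adequate at
    these distances in mid-latitudes. Illustrative only."""
    def dist_km(a, b):
        dlat = (b["lat"] - a["lat"]) * 111.0
        dlon = (b["lon"] - a["lon"]) * 111.0 * math.cos(math.radians(a["lat"]))
        return math.hypot(dlat, dlon)
    return sum(1 for q in quakes if dist_km(well, q) <= radius_km)

# Hypothetical records, not real wells or events.
well = {"lat": 35.8, "lon": -96.7, "depth_ft": 5200, "monthly_bbl": 300_000}
catalog = [
    {"lat": 35.81, "lon": -96.72, "mag": 3.1},  # roughly 2 km away
    {"lat": 35.85, "lon": -96.60, "mag": 2.8},  # roughly 11 km away
    {"lat": 36.40, "lon": -96.70, "mag": 4.0},  # roughly 67 km away
]
```

Repeating such a count across every well and plotting it against injected volume and well depth is one way the tendencies the text describes (deeper wells and larger volumes producing more seismicity) could show up in the data.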
Earthquakes in the central U.S. are not only shallow and/or human-induced. The notorious New Madrid, Mo. earthquakes of 1811-1812 demonstrated the enormous capacity for ‘natural’ seismicity in the central U.S., which can, albeit infrequently, produce earthquakes with magnitudes in excess of M7. However, there remains the question of the maximum magnitude of an induced earthquake in Oklahoma. Based on worldwide experience the upper limit is generally assumed to be around M6 to M6.5.
Who Pays – and How Much?
From our studies of the induced seismicity in the region, RMS can now calculate the expected total economic loss from potential earthquakes using the RMS North America Earthquake Model. To do so we run a series of shocks, at quarter magnitude intervals, located at the site of each injection well. Having assessed the impact at a range of different locations, we’ve found dramatic differences in the risk costs for a disposal well in a rural area in contrast to a well near the principal cities of central Oklahoma. Reversing this procedure we have also identified a rational and equitable process which could help allocate the costs of a damaging earthquake back to all the nearby well operators. In this, distance will be a critical factor.
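The allocation step could, for example, weight each nearby operator by inverse distance to the epicenter. The scheme below is a hypothetical sketch of that principle (the operator names, distances, and weighting rule are all assumptions; the actual RMS procedure is not described in detail here):

```python
def allocate_damage_costs(total_loss, operator_distances_km, min_dist_km=1.0):
    """Split an earthquake's damage cost among nearby well operators
    in proportion to inverse distance from the epicenter, reflecting
    the point in the text that distance is the critical factor.
    Purely illustrative, not the actual allocation procedure."""
    weights = {op: 1.0 / max(dist, min_dist_km)
               for op, dist in operator_distances_km.items()}
    total_weight = sum(weights.values())
    return {op: total_loss * w / total_weight for op, w in weights.items()}

# Two hypothetical operators, one twice as far from the epicenter,
# sharing a $300 million loss.
shares = allocate_damage_costs(300e6, {"OperatorA": 2.0, "OperatorB": 4.0})
```

Under inverse-distance weighting the closer operator carries twice the share of the farther one, and the shares always sum to the total loss, which is the equity property such a scheme needs.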
Modeling Advances for Manmade Earthquakes
For carriers writing U.S. earthquake coverage for homeowners and businesses there is also a concern about the potential liabilities from this phenomenon. Hence, the updated RMS North America Earthquake Model, to be released in spring 2017, will include a tool for calculating property risk from induced seismicity in affected states: not just Oklahoma but also Kansas, Ohio, Arkansas, Texas, Colorado, New Mexico, and Alabama. The scientific understanding of induced seismicity and its consequences is rapidly evolving, and RMS scientists are closely following these developments.
As for Oklahoma, the situation is becoming critical as the seismic activity shows no signs of stopping: a swarm of induced earthquakes has erupted beneath the largest U.S. inland oil storage depot at Cushing and in September 2016 there was a moment magnitude 5.8 earthquake located eight miles from the town of Pawnee – which caused serious damage to buildings. Were a magnitude 6+ earthquake to hit near Edmond (outside Oklahoma City) our modeling shows it could cause billions of dollars of damage.
The risk of seismicity triggered by the energy industry is a global challenge, with implications far beyond Oklahoma. For example Europe’s largest gas field, in the Netherlands, is currently the site of damaging seismicity. And in my next blog, I’ll be looking at the consequences.
[For a wider discussion of the issues surrounding induced seismicity please see these Reactions articles, for which Robert Muir-Wood was interviewed.]
If a global prize was to be awarded to the city or country that achieves the peak of disaster resilience, Bermuda might be a fitting first winner.
This October’s Hurricane Nicole made direct landfall on the island. The eyewall tracked over Bermuda with maximum measured windspeeds close to 120 mph. Nonetheless there were no casualties. The damage consisted principally of fallen trees, roadway debris, some smashed boats and many downed utility poles. The airport reopened within 24 hours, with the island’s ferries operating the following day.
Bermuda’s performance through Nicole was exemplary. What’s behind that?
Since its foundation in 1609, when 150 colonists and crew were shipwrecked on the island, Bermuda has grown used to its situation at the heart of hurricane alley. Comprising 21 square miles of reef and lithified dunes, sitting out in the Atlantic some 650 miles east of Cape Hatteras, the island is hit by a hurricane on average once every six or seven years. Mostly these are glancing blows, but once or twice a century Bermuda sustains direct hits at Category 3 or 4 intensity. Hurricane Fabian in 2003 was the worst of the recent storms, causing $300 million of damage (estimated to be worth $650 million, accounting for today’s higher prices and greater property exposure). The cost of the damage from Hurricane Gonzalo in 2014 was about half this amount.
How did Bermuda’s indigenous building style come to adopt such a high standard of wind resistance? It seems to go back to a run of four hurricanes at the beginning of the 18th century. First, in September 1712, a hurricane persisted for eight hours, destroying the majority of wooden buildings. Then twice in 1713, and again more strongly in 1715, hurricane winds ruined the newly rebuilt churches. One hurricane can seem like an exception; four becomes a trend. In response, houses were constructed with walls of massive reef limestone blocks, covered by roofs tiled with thick slabs of coral stone: traditional house styles that have been sustained ever since.
The frequency of hurricanes has helped stress test the building stock, and ensure the traditional construction styles have been sustained. More recently there has been a robust and well-policed building code to ensure adequate wind resistance for all new construction on the island.
Yet resilience is more than strong buildings. It also requires hardened infrastructure, and that is where Bermuda has some room for improvement. Still dependent on overhead power lines, 90 percent of the island’s 27,000 houses lost power in Hurricane Nicole – although half of these had been reconnected by the following morning and the remainder through that day. Mobile phone and cable networks were also back in operation over a similar timescale. Experience of recent hurricanes has ensured an adequate stockpile of cable and poles.
Expert Eyes on the Island
It helps that there is an international reinsurance industry on the island, with many specialists in the science of hurricanes and the physics and engineering of building performance on hand to scrutinize the application of improved resilience. Almost every building is insured, giving underwriters oversight of building standards. Most importantly, the very functioning of global reinsurance depends on uninterrupted connection with the rest of the world, as well as on ensuring that on-island staff are not distracted by having to attend to their families’ welfare.
Bermuda’s experience during Nicole would merit the platinum standard of resilience adopted by the best businesses: that all functions can be restored within 72 hours of a disaster. The Bermuda Business Development Agency and the Association of Bermuda Insurers and Reinsurers were effusive in their praise for how the island had withstood the hurricane. The strong and widely-owned culture of preparedness reflects the experience of recent storms like Gonzalo and Fabian.
Stephen Weinstein, general counsel at RenaissanceRe, commented “It’s remarkable that one day after a major hurricane strike, Bermuda is open for business, helping finance disaster risk worldwide, and poised to welcome back business visitors and vacationers alike.”
In early 2017, RMS will issue an update to Bermuda wind vulnerability in the version 17 software release as part of a broader update to the 33 islands and territories covered by the North Atlantic Hurricane Models. Updates to Bermuda vulnerability will consider past hurricane observations and the latest building code research.
On Monday 14 November 2016 Dr Robert Muir-Wood, RMS chief research officer who is an earthquake expert and specialist in catastrophe risk management, made the following observations about the earthquake in Amberley:
“The November 13 earthquake was assigned a magnitude 7.8 by the United States Geological Survey. That makes it more than fifty times bigger than the February 2011 earthquake which occurred directly beneath Christchurch. However, it was still around forty times smaller than the Great Tohoku earthquake off the northeast coast of Japan in March 2011.”
CASUALTIES, PROPERTY DAMAGE & BUSINESS INTERRUPTION
“Although it was significantly bigger than the Christchurch earthquake, the source of the earthquake was further from major exposure concentrations. The northeast coast of South Island has a very low population and the earthquake occurred in the middle of the night when there was little traffic on the coast road. Characteristic of such an earthquake in steep mountainous terrain, there have been thousands of landslides, some of which have blocked streams and rivers – there is now a risk of flooding downstream when these “dams” break.
In the capital city, Wellington, liquefaction and slumping on man-made ground around the port has damaged some quays and made it impossible for the ferry that runs between North and South Island to dock. The most spectacular damage has come from massive landslides blocking the main coast road Highway 1 that is the overland connection from the ferryport opposite Wellington down to Christchurch. This will take months or even years to repair. Therefore it appears the biggest consequences of the earthquake can be expected to be logistical, with particular implications for any commercial activity in Christchurch that is dependent on overland supplies from the north. As long as the main highway remains closed, ferries may have to ship supplies down to Lyttelton, the main port of Christchurch.”
“The earthquake appears to have occurred principally along the complex fault system in the north-eastern part of the South Island, where the plate tectonic motion between the Pacific and Australian plates transfers from subduction along the Hikurangi Subduction Zone to strike-slip along the Alpine Fault System. Faults in this area strike predominantly northeast-southwest and show a combination of thrust and strike-slip motion. From its epicenter, some 200 km from the capital city Wellington, the rupture unzipped towards the northeast for about 100-140 km.”
“Given the way the rupture spread to the northeast there is some potential for a follow-on major earthquake on one of the faults running beneath Wellington. The chances of a follow-on major earthquake are highest in the first few days after a big earthquake, and tail off exponentially. Aftershocks are expected to continue to be felt for months.”
“These events occurred on multiple fault segments in close proximity to one another. The technology to model this type of complex rupture is now available in the latest RMS high-definition New Zealand Earthquake Model (2016), in which fault segments may interconnect under certain conditions.”