Category Archives: Natural Catastrophe Risk

Nine Years After Darfield: When an Earthquake Drives a New Model – Part Two

The Liquefaction Model

The 2010 M7.1 Darfield earthquake in New Zealand started a sequence of events – the Canterbury Earthquake Sequence (CES) – that propagated eastward across the Canterbury region over several years. Since the City of Christchurch is built on alluvial sediments where the water table is very shallow, several of the larger events created widespread liquefaction within the city and surrounding areas. These ground deformations caused a significant number of buildings with shallow foundations to settle, tilt and deform.

Prior to these New Zealand earthquakes, liquefaction had been observed, but never on this scale in a built-up area of a developed country. As with earlier well-studied liquefaction events (e.g., the 1964 Niigata earthquake), this was a rare opportunity to examine liquefaction severity and building response. Christchurch was referred to as a “liquefaction laboratory”, with the multiple events causing different levels of shaking across the city. However, we had not previously seen whole suburbs of insured buildings damaged by liquefaction.

Continue reading

Nine Years After Darfield: When an Earthquake Drives a New Model – Part One

The Source Model

The 2010 M7.1 Darfield earthquake in New Zealand started a sequence of events that propagated eastward in the Canterbury region over several years and included upward of 15 individual loss-causing events for the insurance industry. The Insurance Council of New Zealand states that the total insured loss was more than NZ$31 billion (US$19.4 billion).

With such a significant sequence of events, much had to be learned and reflected in earthquake risk modeling, both to remain scientifically robust and to answer new regulatory needs. GNS Science – a New Zealand Crown Research Institute – had issued its National Seismic Hazard Model (NSHM) in 2010, before the Canterbury Earthquake Sequence (CES) and before Tōhoku. The model release was a major project, and at the time, in response to the CES, GNS only had the bandwidth for a mini-update to the 2010 model: allowing M9 events on the Hikurangi Subduction Interface, New Zealand’s largest plate boundary fault, and getting a working group started on Canterbury earthquake rates.

But given the high penetration rate of earthquake insurance in New Zealand and the magnitude of the damage in the Canterbury region, the (re)insurance and regulatory position was in transition. Rather than wait for a new NSHM update (which is still not available), RMS joined the national effort, collaborating with GNS Science and conducting its own research, to build a model that would help during this difficult time, when many rebuild decisions had to be made. The RMS® New Zealand Earthquake High Definition (HD) model was released in mid-2016.

Continue reading

How to Deliver Sea Level Data

Global sea levels are rising. After two thousand years of stability, the transition to continuous coastal change will be jarring (although this is what our shoreline ancestors experienced more than 6,000 years ago). By the end of this century, millions of people will need to relocate. An estimated US$2 trillion of assets lies within the first meter above extreme high tide.

“Future sea levels” is one of seven “Grand Challenges” of the World Climate Research Programme (WCRP). During the week of November 11, leading experts from around the world met in Orléans at the headquarters of the Bureau de Recherches Géologiques et Minières (BRGM), the French geological survey.

Continue reading

Venice in Peril

The first time I noticed the coincidence I assumed there had been a mistake. The costliest flooding in modern Italian history inundated the city of Florence on November 4, 1966. The Arno river burst its banks and flooded the low-lying heart of the city, with six meters (19.7 feet) of water in some riverine streets. A hundred people died, and three to four million priceless medieval books and manuscripts, along with precious artworks stored in basements, were damaged or destroyed. It was the worst flood in the city for at least 400 years.

Flood marker on a Venice street. Image credit: Wikimedia

But then I found that the highest measured storm surge flood – “acqua alta” – in Venice reached 1.94 meters above the sea level datum on that very same day: November 4, 1966.

Continue reading

Did Ridgecrest Increase the Chances of a Large Seismic Event?

Interest in the 160-mile-long Garlock Fault, the second-longest fault in California, has been piqued recently by a Los Angeles Times article about deformation on the Garlock Fault caused by the Ridgecrest sequence of events in July 2019. Since the publication of this article, RMS has received information requests focused on two main points.

First, does RMS believe that Ridgecrest impacted the Garlock Fault (and possibly others), and has therefore increased the probability of a rupture there? Second, does RMS support the U.S. Geological Survey (USGS) assessment that the Ridgecrest quakes most likely will not trigger a larger earthquake, but have raised the chances of an earthquake of magnitude 7.5 or more on the nearby Garlock, Owens Valley, Blackwater and Panamint Valley faults over the next year? And how would RMS recommend that clients model and capture this increased risk?

Continue reading

U.S. Wildfire: Mitigation Really Matters

It has been a year since the deadliest and most destructive wildfire in California’s history. The Camp Fire burnt 153,336 acres, starting at Camp Creek Road, two miles from the small community of Concow in Butte County, Northern California. The fire was reported at 6:33 a.m. local time on Thursday, November 8, 2018; it spread to Concow within 30 minutes, and by 8 a.m. it had moved quickly west to the town of Paradise (pop. ~26,800).

The town was devastated within hours, as embers driven by 50-mile-per-hour winds created an urban conflagration that destroyed 80 to 90 percent of the town. The fire took 15 days to fully contain. Overall, it destroyed 18,804 structures and killed 85 people.

The insurance industry is still reeling after last year’s and the previous year’s record-breaking California wildfire seasons, which together caused US$23 billion in insured losses. All eyes are on current events, as a quiet early season has morphed into an active late one: the Kincade Fire in Sonoma County, which started on October 23, burnt 77,758 acres and destroyed 374 structures, according to CAL FIRE. The Kincade Fire is now the largest wildfire in Sonoma County’s history.

Continue reading

SiteIQ: More Power and Control for Your Underwriters

If on-the-ground underwriters can get risk insight instantly – making a quick check simply by entering a location, rather than waiting for a risk analyst or trying to gather public data themselves – underwriting performance can improve radically. We are seeing this change begin to happen with SiteIQ, a recently launched application that utilizes the RMS open platform, Risk Intelligence™.

SiteIQ uses our trusted risk model data – the same data used across a client’s organization – to deliver hazard risk scores instantly for a location, helping underwriters make better decisions on whether to reject, accept or refer a risk for further analysis. Using the same risk data throughout means that new risks reflect a business’s acceptance criteria, bringing harmony to the book of business.

Because SiteIQ is quick and simple to use, underwriters see it as a useful tool in their armory, knowing they can get valuable, modeled risk insight whenever they need it. The breadth of the instant insight adds to its usefulness, covering many available perils, with outputs including risk scores and loss costs, all presented in a highly visual, intuitive app.
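To illustrate the kind of location-level triage this supports, here is a minimal sketch of score-based screening logic. Everything in it – the 1-to-10 score scale, the thresholds, and the function and field names – is a hypothetical illustration, not the SiteIQ API or actual RMS model output.

```python
# Hypothetical sketch of score-based triage an underwriting tool might apply.
# The 1-10 score scale, thresholds, names, and data are illustrative
# assumptions, not the SiteIQ API or RMS model output.

from dataclasses import dataclass
from typing import Dict

@dataclass
class LocationScores:
    address: str
    peril_scores: Dict[str, float]  # peril name -> hazard score (assumed 1-10)

def triage(scores: LocationScores,
           accept_below: float = 4.0,
           refer_below: float = 7.0) -> str:
    """Map the worst peril score at a location to an underwriting action."""
    worst = max(scores.peril_scores.values())
    if worst < accept_below:
        return "accept"   # low hazard across all perils: bind directly
    if worst < refer_below:
        return "refer"    # borderline: pass to a risk analyst for full modeling
    return "decline"      # outside the business's stated risk appetite

# A desk underwriter checks one location and gets an instant recommendation.
loc = LocationScores(
    address="123 Example St, Sacramento, CA",
    peril_scores={"earthquake": 3.2, "wildfire": 6.8, "inland_flood": 2.1},
)
print(triage(loc))  # -> "refer" (the wildfire score falls in the referral band)
```

The point of the pattern is consistency: because the thresholds encode the business’s acceptance criteria, every underwriter screening the same location reaches the same accept, refer or decline recommendation.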

We keep going back to users to find out how they are using SiteIQ and what developments they would like to see. Thanks to this client feedback, in its first few months since launch RMS has already released the third iteration – SiteIQ version 1.3.

Continue reading

MyShake: New App Unveiled for California Earthquake Early Warning

As my colleague Mohsen Rahnama reminded us in his recent blog, the last destructive earthquake to strike Northern California was on October 17, 1989. Loma Prieta was a magnitude 6.9 earthquake which resulted in 63 deaths and about four thousand injuries. The epicenter was about ten miles northeast of Santa Cruz, and seismic waves took about 30 seconds to reach San Francisco. But there was no way of communicating any earthquake early warning to residents of the Marina district of San Francisco, which suffered some of the worst damage from shaking and fire outbreak.
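A back-of-the-envelope check shows where that 30-second figure comes from, assuming an epicentral distance of roughly 100 kilometers and a typical crustal shear-wave speed of about 3.5 kilometers per second (both assumed round values, not figures from the original earthquake report):

```python
# Back-of-the-envelope check, using assumed values: damaging S-waves travel
# through the crust at roughly 3.5 km/s, and the Loma Prieta epicenter lay
# about 100 km from San Francisco.
epicentral_distance_km = 100.0  # approximate distance (assumption)
s_wave_speed_km_s = 3.5         # typical crustal shear-wave speed (assumption)

travel_time_s = epicentral_distance_km / s_wave_speed_km_s
print(f"S-wave travel time: ~{travel_time_s:.0f} seconds")  # ~29 seconds
```

Those tens of seconds between rupture and strong shaking are exactly the window an earthquake early warning system is designed to exploit.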

On October 17, 2019, the thirtieth anniversary of this earthquake, the California Governor’s Office of Emergency Services unveiled a smartphone app from the University of California, Berkeley Seismological Lab that will give all Californians the opportunity to receive earthquake early warnings.

Governor Gavin Newsom, who happened to be in the Marina district at the time of the 1989 earthquake, has urged people to download the MyShake app. This app (myshake.berkeley.edu) is available on the Apple App Store and Google Play, and relies on the ShakeAlert earthquake early warning system, developed by the U.S. Geological Survey (USGS).

Continue reading

Loma Prieta and 30 Years of Bay Area Growth

Thirty years ago, the Mw6.9 Loma Prieta Earthquake struck the San Francisco Bay Area. When looking back at disasters, it is always instructive to understand the moment in time at which they struck. The Loma Prieta Earthquake hit on Tuesday, October 17, 1989 at 5:04 p.m. local time, but it was no ordinary Tuesday afternoon. Game Three of the 1989 Major League Baseball World Series was due to start at 5:35 p.m. between the two Bay Area teams: the Oakland Athletics and the San Francisco Giants.

Typically, 5:04 p.m. would represent the height of rush hour in the Bay Area, but because of the game a significant portion of the workforce had left work early or stayed late to watch it. While 63 lives were lost, this toll was much lower than it might have been, given the damage to highways across the region – including the partial collapses of the Nimitz Freeway and the San Francisco–Oakland Bay Bridge.

I was studying in the Terman Engineering Building at Stanford University when the earthquake struck. The Stanford campus, made up of numerous historic buildings, saw substantial damage. In all, more than 200 structures were impacted. The restoration took more than a decade and cost Stanford more than US$160 million. Classes were canceled for more than a week. Students were locked out of damaged buildings, which meant they could not access their research samples, data and equipment. Adding to the stress were the innumerable aftershocks. For those of us studying engineering, it really brought home the importance of our work.

Continue reading

Is It Time to Break Up With Your Deficient Risk Scoring Analytics?

Risk scoring is a fundamental part of the property and casualty underwriting process, allowing underwriters to sort and rank the quality of submissions. This process culminates in critical business decisions on quoting, declination, referral, and pricing, which, taken together, can make the difference between an insurer’s survival and its failure. The best insurers make these decisions in a manner that is disciplined, consistent, and data-driven. Those who fail to do this fall prey to adverse selection, pay high reinsurance costs, and suffer at the hands of disapproving rating agencies.
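To make “disciplined, consistent, and data-driven” concrete, here is a minimal sketch of ranking a batch of submissions by the loss ratio implied by a catastrophe model, rather than by a coarse hazard score. The insured names, premiums, modeled losses, and the 60 percent target are illustrative assumptions, not an RMS product interface.

```python
# Hypothetical sketch: ranking submissions by the loss ratio implied by a
# catastrophe model, rather than by a coarse 1-10 score. All names, premiums,
# modeled losses, and the 60 percent target are illustrative assumptions,
# not an RMS product interface.

from typing import NamedTuple

class Submission(NamedTuple):
    insured: str
    premium: float       # annual premium offered, USD
    modeled_aal: float   # modeled average annual loss, USD

def loss_ratio(sub: Submission) -> float:
    """Expected loss ratio implied by the catastrophe model."""
    return sub.modeled_aal / sub.premium

submissions = [
    Submission("Alpha Warehousing", premium=120_000, modeled_aal=54_000),
    Submission("Beta Retail Group", premium=80_000, modeled_aal=66_000),
    Submission("Gamma Logistics", premium=200_000, modeled_aal=48_000),
]

# Rank best to worst by expected loss ratio; flag anything above a 60% target.
for sub in sorted(submissions, key=loss_ratio):
    action = "REFER" if loss_ratio(sub) > 0.60 else "QUOTE"
    print(f"{sub.insured:20s} loss ratio {loss_ratio(sub):6.1%} -> {action}")
```

However the numbers are sourced, the principle is the same: every submission is ranked against the same modeled benchmark, so quoting and referral decisions stay consistent across the book.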

Given this high-stakes game, why does the industry continue to rely on oversimplified, unproven, and outdated risk scores for natural catastrophe underwriting?

Continue reading