Tag Archives: risk modeling

Friday 13th and the Long-Term Cost of False Alarms

If the prospect of flooding along the East Coast of England earlier this month was hard to forecast, the newspaper headlines the next day were predictable enough:

Floods? What floods? Families’ fury at evacuation order over storm surge … that never happened (Daily Mail)

East coast residents have derided the severe storm warnings as a ‘load of rubbish’ (The Guardian)

Villagers shrug off storm danger (The Times)

The police had attempted an evacuation of some communities and the army was on standby. This was because of warnings of a ‘catastrophic’ North Sea storm surge on January 13, for which the UK Environment Agency issued its highest level of flood warning – ‘severe’, representing a danger to life – along parts of the East Coast. And yet the flooding did not materialize.

Water levels were 1.2m lower along the Lincolnshire coast than those experienced in the last big storm surge flood in December 2013, and 0.9m lower around the Norfolk towns of Great Yarmouth and Lowestoft. Predicting the future in such complex situations, even very near-term, always has the potential to make fools of the experts. But there’s a pressure on public agencies, knowing the political fallout of missing a catastrophe, to adopt the precautionary principle and take action. Imagine the set of headlines, and ministerial responses, if there had been no warnings followed by loss of life.

Interestingly, most of those who had been told to evacuate as this storm approached chose to stay in their homes. One police force in Essex knocked on 2,000 doors, yet only 140 of those residents registered at an evacuation centre. Why did the others ignore the warnings and stay put? Media reports suggest that many felt this was another false alarm.

The precautionary principle might seem prudent, but a false-alarm forecast can encourage people to ignore future warnings. Recent years offer numerous examples of the consequences.

The Lessons of History

Following the 2006 Mw8.3 earthquake offshore of the Kurile Islands, tsunami evacuation warnings were issued all along the Pacific coast of northern Japan, where the tsunami that did arrive was harmless. For many people that experience weakened the imperative to evacuate after feeling the three minutes of shaking in the March 2011 Mw9 earthquake, whose tsunami drowned some 20,000 people. Mindful of what happened in 2004 and 2011, many countries around the Pacific and Indian Oceans now ‘over-issue’ tsunami warnings.

For the inhabitants of New Orleans, the evacuation order issued in advance of Hurricane Ivan in September 2004 (when a third of the city’s population moved out, while the storm veered away) left many sceptical about the mandatory evacuation ordered in advance of Hurricane Katrina in August 2005 (after which around 1,500 people drowned).

Agencies whose job it is to forecast disasters know only too well what happens if they fail to issue a warning as a risk looms. However, the long-term consequences of false alarms are perhaps not made explicit enough. While risk models to calculate those consequences are not yet available, a simple hypothetical calculation illustrates the basic principles of how such a model might work:

  • the chance of a dangerous storm surge in the next 20 years is 10 percent, for a given community;
  • if this happens, then let’s say 5,000 people would be at grave risk;
  • because of a recent ‘false’ alarm, one percent of those residents will ignore evacuation orders;
  • thus the potential loss of life attributed to the false alarm is five people.

Now repeat with real data.
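The bullet-point arithmetic above can be sketched as a tiny model. The figures are the hypothetical ones from the text, not real data, and the function name is mine:

```python
def lives_attributable_to_false_alarm(p_event, people_at_risk, ignore_fraction):
    """Expected lives lost because a prior false alarm erodes compliance.

    p_event: probability of the dangerous event over the period considered
    people_at_risk: number of people in grave danger if the event occurs
    ignore_fraction: share of those people who, after a false alarm,
                     will ignore a future evacuation order
    """
    return p_event * people_at_risk * ignore_fraction

# Hypothetical figures from the text: 10% chance over 20 years,
# 5,000 people at risk, 1% of them desensitized by the false alarm.
loss = lives_attributable_to_false_alarm(0.10, 5_000, 0.01)
print(round(loss, 1))  # → 5.0
```

A real model would of course need event probabilities, exposed populations, and compliance data per community, but the structure – probability times exposure times behavioural response – is the same.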

Forecasting agencies need a false-alarm risk model to help balance their decisions about when to issue severe warnings. There is an understandable instinct to be over-cautious in the short term, but when measured in terms of future lives lost, disaster warnings need to be carefully rationed. And that rationing requires political support, as well as public education.

[Note: RMS models storm surge in the U.K. where the risk is highest along England’s East Coast – the area affected by flood warnings on January 13. Surge risk is complex, and the RMS Europe Windstorm Model™ calculates surge losses caused by extra-tropical cyclones considering factors such as tidal state, coastal defenses, and saltwater contamination.]

The Cost of Shaking in Oklahoma: Earthquakes Caused by Wastewater Disposal

It was back in 2009 that the inhabitants of northern Oklahoma first noticed the vibrations. Initially only once or twice a year, but then every month, and even every week. It was disconcerting rather than damaging until November 2011, when a magnitude 5.6 earthquake broke beneath the city of Prague, Okla., causing widespread damage to chimneys and brick veneer walls, but fortunately no casualties.

The U.S. Geological Survey had been tracking this extraordinary outburst of seismicity. Before 2008, across the central and eastern U.S., there was an average of 21 earthquakes of magnitude three or higher each year. From 2009 to 2013 that annual average rose to 99 earthquakes in Oklahoma alone, rising to 659 in 2014 and more than 800 in 2015.


During the same period the oil industry in Oklahoma embarked on a dramatic expansion of fracking and conventional oil extraction. Both activities generate large volumes of wastewater. The cheapest way of disposing of the brine was to inject it deep down boreholes into the 500-million-year-old Arbuckle Sedimentary Formation. The volume pumped there increased from 20 million barrels in 1997 to 400 million barrels in 2013. Today there are some 3,500 disposal wells in the state of Oklahoma, down which more than a million barrels of saline water are pumped every day.

It became clear that the chatter of Oklahoma earthquakes was linked with these injection wells. The way that raising deep fluid pressures can generate earthquakes has been well-understood for decades: the fluid ‘lubricates’ faults that are already poised to fail.

But induced seismicity is an issue for energy companies worldwide, not just in the South Central states of the U.S. And it presents a challenge for insurers, as earthquakes don’t neatly label themselves ‘induced’ or ‘natural.’ Losses from induced events will therefore also be picked up by property insurers writing earthquake extensions to standard coverages, as well as potentially by the insurers covering the liabilities of the deep disposal operators.

Investigating the Risk

Working with Praedicat, which specializes in understanding liability risks, RMS set out to develop a solution by focusing first on Oklahoma, framing two important questions regarding the potential consequences for the operators of the deep disposal wells:

  • What is the annual risk cost of all the earthquakes with the potential to be induced by a specific injection well?
  • In the aftermath of a destructive earthquake how could the damage costs be allocated back to the nearby well operators most equitably?

In Oklahoma detailed records have been kept on all fluid injection activities: well locations, depths, rates of injection. There is also data on the timing and location of every earthquake in the state. By linking these two datasets the RMS team was able to explore what connects fluid disposal with seismicity. We found, for example, that both the depth of a well and the volume of fluid disposed increased the tendency to generate seismic activity.
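A toy version of that dataset linkage might look like the sketch below. The well and earthquake records are invented for illustration (the actual RMS analysis is far richer); the spatial link here is simply counting events within a fixed radius of each well:

```python
from math import hypot

# Invented records for illustration only (x, y in km on a local grid).
wells = [
    {"id": "A", "x": 0.0,  "y": 0.0, "depth_ft": 9000, "bbl_per_day": 40_000},
    {"id": "B", "x": 60.0, "y": 5.0, "depth_ft": 4000, "bbl_per_day": 3_000},
]
quakes = [
    {"x": 2.0,  "y": 1.0,  "mag": 3.1},
    {"x": 4.0,  "y": -3.0, "mag": 3.4},
    {"x": 61.0, "y": 6.0,  "mag": 3.0},
]

def quakes_within(well, quakes, radius_km=15.0):
    """Earthquakes within a fixed radius of a well -- a crude spatial join."""
    return [q for q in quakes
            if hypot(q["x"] - well["x"], q["y"] - well["y"]) <= radius_km]

# Tabulate event counts against depth and injected volume. With real data,
# this is the kind of association described in the text: deeper wells and
# larger disposal volumes tending to accompany more seismicity.
for w in wells:
    print(w["id"], w["depth_ft"], w["bbl_per_day"], len(quakes_within(w, quakes)))
```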

Not all earthquakes in the central U.S. are shallow or human-induced. The notorious New Madrid, Mo. earthquakes of 1811-1812 demonstrated the enormous capacity for ‘natural’ seismicity in the central U.S., which can, albeit infrequently, produce earthquakes with magnitudes in excess of M7. There remains, however, the question of the maximum magnitude of an induced earthquake in Oklahoma. Based on worldwide experience the upper limit is generally assumed to be M6 to M6.5.

Who Pays – and How Much?

From our studies of the induced seismicity in the region, RMS can now calculate the expected total economic loss from potential earthquakes using the RMS North America Earthquake Model. To do so we run a series of shocks, at quarter magnitude intervals, located at the site of each injection well. Having assessed the impact at a range of different locations, we’ve found dramatic differences in the risk costs for a disposal well in a rural area in contrast to a well near the principal cities of central Oklahoma. Reversing this procedure we have also identified a rational and equitable process which could help allocate the costs of a damaging earthquake back to all the nearby well operators. In this, distance will be a critical factor.
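As a sketch of what a distance-based allocation could look like: the text says only that distance will be a critical factor, so the inverse-distance weighting below is my assumption, and the wells, epicenter, and loss figure are hypothetical:

```python
from math import hypot

def allocate_loss(total_loss, wells, epicenter, power=2.0, floor_km=1.0):
    """Split a damage cost among nearby well operators by inverse distance.

    Closer wells receive a larger share; floor_km stops the weight from
    blowing up for a well essentially at the epicenter.
    """
    ex, ey = epicenter
    weights = {
        w["id"]: 1.0 / max(hypot(w["x"] - ex, w["y"] - ey), floor_km) ** power
        for w in wells
    }
    total_w = sum(weights.values())
    return {wid: total_loss * wt / total_w for wid, wt in weights.items()}

# Hypothetical wells 5 km and 20 km from a hypothetical epicenter,
# sharing a $1m loss: the nearer well carries most of the cost.
wells = [{"id": "near", "x": 5.0, "y": 0.0}, {"id": "far", "x": 20.0, "y": 0.0}]
shares = allocate_loss(1_000_000.0, wells, epicenter=(0.0, 0.0))
```

An equitable scheme would also weight by injection volume and timing, but distance alone already illustrates how sharply the shares diverge.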

Modeling Advances for Manmade Earthquakes

For carriers writing U.S. earthquake coverage for homeowners and businesses there is also concern about the potential liabilities from this phenomenon. Hence the updated RMS North America Earthquake Model, to be released in spring 2017, will include a tool for calculating property risk from induced seismicity in affected states: not just Oklahoma but also Kansas, Ohio, Arkansas, Texas, Colorado, New Mexico, and Alabama. The scientific understanding of induced seismicity and its consequences is rapidly evolving, and RMS scientists are closely following these developments.

As for Oklahoma, the situation is becoming critical as the seismic activity shows no signs of stopping: a swarm of induced earthquakes has erupted beneath the largest U.S. inland oil storage depot at Cushing and in September 2016 there was a moment magnitude 5.8 earthquake located eight miles from the town of Pawnee – which caused serious damage to buildings. Were a magnitude 6+ earthquake to hit near Edmond (outside Oklahoma City) our modeling shows it could cause billions of dollars of damage.

The risk of seismicity triggered by the energy industry is a global challenge, with implications far beyond Oklahoma. For example Europe’s largest gas field, in the Netherlands, is currently the site of damaging seismicity. And in my next blog, I’ll be looking at the consequences.

[For a wider discussion of the issues surrounding induced seismicity please see these Reactions articles, for which Robert Muir-Wood was interviewed.]

Exceedance 2017 Is Coming to New Orleans!

Welcome to the first in a series of blogs leading up to Exceedance 2017, March 20-23.

We’re looking forward to the event, which will be held at the Hyatt Regency New Orleans. Situated less than a mile from the historic French Quarter, and about a mile-and-a-half from Jackson Square, it’s a great location in the heart of the ‘Big Easy.’

This year’s theme, ‘Create Resilience,’ reflects the strength and spirit of New Orleans, including the tremendous progress made in the decade and more since the devastation caused by Hurricane Katrina. Exceedance 2017 will emphasize how innovation, analytics, and ingenuity can create more resilience in our global catastrophe risk management practices.

Hands-On Training for Risk Modeler on the RMS(one) Platform and Version 17

With the release of Risk Modeler on the RMS(one)® platform and Version 17 upcoming in April, this year’s Exceedance schedule is focused on training and enablement. It’s the only place to get key insights into these new RMS releases – and be trained to assess risk more effectively.

Exceedance will feature over 22 speakers and provide many opportunities to dive deep into more than 20 new models, including North America Earthquake, North Atlantic Hurricane, and major advances in science, software, and HD-simulation models.

The agenda is designed to provide attendees with all the information they need on our new solutions developed for a rapidly changing market – solutions that will increase operational effectiveness, agility, resilience, and business growth.

Take Some Time to Have Some Fun

Along with experiencing all there is to see and learn at Exceedance, there are plenty of opportunities to relax and have some fun with the following pre-conference activities:

Golf at TPC Louisiana: Enjoy a round at TPC Louisiana, rated one of Golfweek’s “Best Courses You Can Play.” It’s a great place for you and your colleagues to experience a one-of-a-kind day on a championship golf course.

Tour the Lower 9th Ward: Join the Make It Right Foundation for a walking tour of the Lower 9th Ward. You’ll experience first-hand how innovative partnerships and community-led design sessions are transforming the neighborhood that was most devastated by Hurricane Katrina.

Horse-Drawn Carriage Ride and Cooking Class: Journey through the French Quarter by carriage, where you’ll pass through the city’s eighteenth- and nineteenth-century French and Spanish architecture. Then, satisfy your appetite with chef extraordinaire Amy Sins who will guide you through an interactive culinary experience that ends with a delectable meal.

Spirits and Spirits – an Evening Tour: Take a guided evening stroll through the spooky side of the old French Quarter. You’ll hear tales from the city’s storied history, and perhaps even encounter a ghost or two. Then enjoy local cocktail favorites at one of New Orleans’s oldest restaurants, a former Spanish armory.

To learn more about these events, visit the Exceedance website. If you’re ready to register, fill out the registration form.

Exceedance will be here soon, so look for our next blog in two weeks. It will include the latest information on the session tracks and content, as well as details of the keynote speakers.

Customers Adopt Solutions on the RMS(one) Platform

Exposure Manager, the first solution on the RMS(one)® platform, launched in July and has created great momentum in the market.

Insurance and reinsurance firms using Exposure Manager gain a clearer view of risk accumulations. With insight into their diverse set of exposures in both modeled and non-modeled regions, they are able to better manage exposure concentrations, and can help avoid private catastrophes.

Today, we announced that Mitsui Sumitomo Marine Management has chosen Exposure Manager to strengthen its risk accumulation management, taking a “big data” approach for dynamic exposure management. They are among the first in a wave of companies to adopt Exposure Manager to minimize blind spots in their risk portfolios.

Since July, we’ve been previewing some of the other solutions due out on the RMS(one) platform – from one-on-one meetings with clients leading all the way up to the main stage at Exceedance 2017 (registration is now open!). Our customers are in various stages of evaluating and adopting RMS(one) solutions and are excited to capitalize on the advantages that these solutions will bring.

As one of the many ways we are helping customers along their adoption journeys, we recently held our first Hack Event, Powered by RMS(one). Our customers in the London market attended a full-day session to understand their options, choose the right solution for their business goals, and map out adoption strategies. Due to the event’s success, we will be holding additional Hack Events in the coming months as we march toward releases for a full suite of solutions on the RMS(one) platform.

We will share more as customers continue to implement solutions on the RMS(one) platform and realize business benefits as we all work toward a common goal of building a more resilient global society.

Learn how to build portfolio intuition faster and access metrics that matter with Exposure Manager.

Extreme Wind Speeds Over the Ocean – an International Workshop of Experts

It’s one thing being invited to speak at an industry event in front of dozens of the leading scientists in your field. It’s another to find, with a certain astonishment, that virtually all of them use RMS HWind to validate their scientific work.

Last month, at a U.K. Met Office-hosted workshop, I spoke about the RMS HWind hurricane modeling solutions to a group of high-wind remote-sensing scientists from academic and government agencies from around the world, including:

  • European Space Agency (ESA)
  • National Aeronautics and Space Administration (NASA)
  • French Research Institute for Exploitation of the Sea (IFREMER)
  • Royal Netherlands Meteorological Institute (KNMI)
  • Met Office (U.K.)
  • European Centre for Medium-Range Weather Forecasts (ECMWF)
  • National Space Science Center, Chinese Academy of Sciences
  • Institute of Applied Physics of the Russian Academy of Sciences
  • National Oceanic and Atmospheric Administration (U.S.)

All of the above agencies are researching how satellite-mounted remote wind sensors can be used, most effectively, to inform on hurricanes and typhoons developing over the ocean.

During the workshops I was delighted to learn that every major remote sensing agency had used the RMS HWind archive of historical storms to validate and calibrate their sensor programs for detecting high winds from space. RMS HWind also provides real-time analysis of hurricanes as they happen with observational data from instruments in the air, in the sea and on land – including aircraft reconnaissance, GPS dropsonde instruments, sea buoys and satellites.

By citing HWind products and research in their peer-reviewed publications, these agencies provide independent endorsements that enhance the scientific credibility of the HWind archives and services, while also giving us a chance to evaluate cutting-edge technology before it becomes operationally available.

There is tremendous value in scientific collaboration and, as such, RMS facilitates the science community’s understanding of hurricanes by providing our academic partners free access to HWind products for their scientific investigations.

Sensors in Space

None of the satellites we discussed at the Met Office workshop actually measures wind directly – rather, they measure a signal that is influenced by the wind. So, for example, the new NASA CYGNSS system measures the reflection of GPS signals off the sea’s surface, much like the reflection of the moon on the surface of a lake. Winds disturb the surface, and this scatters the signal.

Another satellite, the Canadian RADARSAT-2, has been in orbit for several years and can capture images of the fine-scale roughness of the ocean surface. But collecting these images and converting them to a wind speed reading requires a lot of advance planning, followed by lengthy processing.

Which is where RMS HWind comes in. Our 1 km gridded HWind Snapshots make it easy for scientists to overlay their satellite instrument measurements (typically a microwave signal reflected from the sea surface) over our wind analyses. They can do this for several storms of various sizes and intensities to convert the measured signal to wind speed over a range of meteorological and oceanic conditions.
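In essence, that conversion amounts to fitting a calibration curve from co-located samples. A minimal sketch, assuming a simple linear relation between the measured signal and the HWind-analyzed wind speed (real retrieval algorithms are far more sophisticated, and the sample values below are invented):

```python
def fit_linear(signals, winds):
    """Ordinary least-squares fit: winds ≈ a + b * signals."""
    n = len(signals)
    mean_s = sum(signals) / n
    mean_w = sum(winds) / n
    b = (sum((s - mean_s) * (w - mean_w) for s, w in zip(signals, winds))
         / sum((s - mean_s) ** 2 for s in signals))
    a = mean_w - b * mean_s
    return a, b

# Invented co-located samples: satellite signal strength vs. the
# HWind-analyzed wind speed (m/s) at the same grid points.
signals = [0.2, 0.4, 0.6, 0.8]
winds = [12.0, 24.0, 36.0, 48.0]
a, b = fit_linear(signals, winds)
# New signal measurements can now be mapped to wind speed: a + b * signal.
```

Repeating the fit across storms of different sizes and intensities, as the text describes, is what gives the calibration validity over a range of meteorological and oceanic conditions.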

Due for release this winter, the HWind Enhanced Archive of wind hazard metrics will provide a high-resolution library of tropical cyclone wind fields for the North Atlantic, the Caribbean, the Gulf of Mexico, and the east and central Pacific. In the coming years we hope to see tropical cyclone wind field monitoring expand globally.

This expansion could hugely benefit areas of the world with insurance protection gaps. Increased insurance penetration in the Asian and Australian markets, together with new risk transfer products using parametric triggers, could help improve financial resilience to catastrophic tropical cyclones in whole new regions of the world.

Understanding Risk Accumulations in Taiwan’s Science Parks

“The 6.4 magnitude Tainan earthquake in February 2016 resulted in a sizeable insured loss from the high-tech industrial risks and reminded the insurance industry of the potential threat from the risk accumulated in science parks.” (A.M. Best Special Report, Sept 2016)

Reading the sentence above you might be forgiven for wondering why science parks would give insurers and reinsurers any particular cause for concern. But consider this statistic: although Taiwan’s three major science and industrial parks occupy only 0.1% of the island’s total land mass, they represent 16% of Taiwan’s overall manufacturing – they are hugely significant, both economically and with regards to the insured exposure in Taiwan.

For example, the Hsinchu Science Park (HSP), known for semiconductor production, employs more than 150,000 people and contributes over $32 billion in revenues – approximately 6% of national GDP. By one estimate, HSP represents over $319 billion in total insured values. In addition, some of the latest high-tech areas within HSP, such as advanced “clean rooms,” present additional challenges due to their vulnerability to ground shaking or power interruption. The importance of this risk was evident in February’s Tainan earthquake, where significant losses to high-tech industrial risks were caused by equipment damage and the related business interruption from power outages.

Improving data quality for advanced and detailed modeling is an important way to manage these risks, concludes the A.M. Best report quoted above, so that the potential loss impact on insurers’ books can be assessed accurately. RMS has been analysing earthquake risk in Taiwan for 12 years – long before this year’s Mw 6.4 event – and in that time our view of seismic risk in Taiwan has not changed, since our model benefits from spectral response-based hazard and damage functions that even include local liquefaction and landslide susceptibilities.

The 1999 Chi-Chi Earthquake (known in Taiwan as the 921 Earthquake) was the key event in building the RMS® Taiwan Earthquake Model in terms of the quake’s seismicity, ground motion, soil secondary effects and building response. Since then there have been no significant events to justify a re-calibration of the components of the model. In fact, the damages observed in this year’s event were broadly in line with RMS’ expectations and validated the robustness of the current model.

But although A.M. Best views the Taiwan insurance industry as prudently managed with relatively high catastrophe management capability, there are still lessons to be learnt from the 2016 event, and RMS has solutions which offer additional insight into understanding the risk posed by these business parks in Taiwan.

Concentration of Exposure into Science Parks

The RMS® Asia Industrial Clusters Catalogs were released in 2014 to identify hotspots of exposure, and profile their risk. The locations and geographic extent of the science parks within Taiwan are detailed to help understand risk accumulations for industrial lines and develop more robust risk management strategies.


Example of industrial cluster captured in the RMS Taiwan Industrial Clusters Catalog. The red outline illustrates the digitized boundaries of the Formosa Petrochemical Co. Plant in Yunlin Hsien.

High Fragility of the Semiconductor Industry

For coding of Industrial Plants, the RMS® Industrial Facilities Model (IFM) captures the unique nature of different industrial risks, as a high percentage of property value is often associated with machinery and equipment (M&E) and stock. This advanced vulnerability model supports the earthquake model to define the damageability of a comprehensive set of industrial facilities more accurately, and calculate the financial risk to these specific types of facilities, including building, contents, and business interruption (BI) loss estimates. The IFM differentiates the risks for different types of business within the science parks, and highlights the higher fragility of semiconductor plants compared to other industrial units, as shown below.


Lessons Learnt?

The huge damage from the 1999 Chi-Chi earthquake has not halted the rapid development of Taiwan’s science parks in this seismically active area – indeed the island’s third biggest science park has since been built there. But this year’s comparatively small Mw 6.4 event further highlighted the substantial exposures concentrated within this sector, reminding the industry of the potential for significant losses without sound accumulation management practices, informed by the best modeling insights.

“Italy is Stronger than any Earthquake”

Those were the words of the then Italian Prime Minister, Matteo Renzi, in the aftermath of two earthquakes on the same day, October 26, 2016. As a statement of indomitable defiance at a scene of devastation it suited the political and public mood well. But the simple fact is there is work to do, because Italy is not as strong as it could be in its resilience to earthquakes.

There’s a long history of powerful seismic activity in the central Apennines: recent years alone have seen L’Aquila (2009, Mw6.3), Amatrice (August 2016, Mw6.0), the two earthquakes near Visso (October 2016, Mw5.4 and 5.9) and Norcia (October 2016, Mw6.5). These have resulted in hundreds of fatalities, mainly attributed to the widespread collapse of old buildings – emphasizing that earthquakes don’t kill people, buildings do. Whilst Italy’s Civil Protection Department provides emergency management and support after earthquakes, there is too little insurance in place to support the financial resiliency of the communities most affected by all these events. While the oft-repeated call for compulsory earthquake insurance remains politically unattainable, effective modeling offers one way to spread coverage more widely. RMS expertise can help here, allowing the market to better understand the risk and so build resilience.

Examining High Building Fragility

The two most significant factors for earthquake risk in Italy are (i) construction materials and (ii) the age of the buildings. The majority of the damaged and destroyed buildings were made from unreinforced masonry and built prior to the introduction of the most recent seismic design and building codes, making them particularly susceptible. With the RMS® Europe Earthquake model capturing both the variations in construction type and age, as well as other vulnerability factors, (re)insurers can accurately reflect the response of different structures to earthquakes. This allows the models to be used to evaluate the cost benefits of retrofitting buildings. RMS has worked with the Italian National Institute for Geophysics and Volcanology (INGV) to see how such analyses could be used to optimize the allocation of public funds for strengthening older buildings, thereby reducing future damage and costs.

Seismic Risk Assessment

The high-risk zone of the central Apennines is described well by probabilistic seismic hazard assessment (PSHA) maps, which show the highest risks in that region resulting from the movement of tectonic blocks that produce the extensional, ‘normal’ faulting observed. The maps also show earthquake risk throughout the rest of Italy. RMS worked with researchers from INGV to develop our view of risk in 2007, based on the latest available databases at that time, including active faults and earthquake catalogs. The resulting hazard model produces a countrywide view of seismic hazard that has not been outdated by newer studies, such as the 2009 INGV Seismic Hazard Map and the 2013 European Seismic Hazard Map published by the SHARE consortium, as shown below:


The Route to Increased Resiliency

Increasing earthquake resiliency in Italy should also involve further development of the private insurance market. The seismic risk in Italy is relatively high for western Europe, whilst the insurance penetration is low, even outside the central Apennines. For example, in 2012, there were two large earthquakes in the Emilia-Romagna region of the Po valley, where there are higher concentrations of industrial and commercial risks. Although the type of faults and risks vary by region, such as the potential impact of liquefaction, the RMS model captures such variations in risk and can be used for the development of risk-based pricing and products for the expansion of the insurance market throughout the country.

Whilst Italy’s seismic events in October caused casualties on a lesser scale than might have been, the extent of the damage highlights once again the prevalence of earthquake risk. It is only a matter of time before the next disaster strikes, either in the Central Apennines or elsewhere. When that happens, the same questions will be asked about how Italy could be made more resilient. But if, by then, the country’s building stock is being made less susceptible and the private insurance market is growing markedly, then Italy will be able to say, with justification, it is becoming stronger than any earthquake.

See How Quickly and Easily You Can Access the Exposure Metrics That Matter

Exposure Manager is a risk management solution that provides executives, underwriters, risk analysts, and other decision-makers with the exposure analytics needed to offer a comprehensive view of risk and understand loss potential.

As the first solution released on the RMS(one) platform, Exposure Manager was developed based on the understanding that organizations not only need quick and reliable assessments of exposure concentrations, but also the right tools to ensure they can access key metrics and insights.

The videos below illustrate two of the important capabilities that enhance users’ ability to build portfolio intuition faster and quickly access the metrics that are most important.

Build Portfolio Intuition Faster provides insights into how Exposure Manager enables customers to quickly and efficiently derive deeper portfolio insights using an intuitive and user-friendly interface.

With a customizable interface that conveys the information that’s most important to the user, Exposure Manager’s analytics, enabled by an intuitive best-in-class user experience, can be configured without knowledge of SQL or support from IT.

This enhances the ability for customers to create quick insights into their portfolio or perform a deep dive into their book to make quick assessments.

Access Metrics That Matter shows how Exposure Manager leverages the RMS financial model to provide an exposed limit metric. This offers a consistent view of loss potential to enable precise identification of loss drivers.

The flexible interface provides users with precise control to quickly make informed decisions about their book and help identify threats and opportunities in the portfolio.

All of these benefits allow customers to become more incisive about their portfolio.

Earthquake Hazard: What Has New Zealand’s Kaikoura Earthquake Taught Us So Far?

The northeastern end of the South Island is a tectonically complex region, with the plate motion primarily accommodated through a series of crustal faults. On November 14, as the Kaikoura earthquake shaking began, multiple faults ruptured at the same time, culminating in an Mw 7.8 event (as reported by GNS Science).

The last two weeks have been busy for earthquake modelers. The paradox of our trade is that while we exist to help avoid the damage this natural phenomenon causes, the only way we can fully understand this hazard is to see it in action so that we can refine our understanding and check that our science provides the best view of risk. Since November 14 we have been looking at what Kaikoura tells us about our latest, high-definition New Zealand Earthquake model, which was designed to handle such complex events.

Multiple-Segment Ruptures

With the Kaikoura earthquake’s epicenter at the southern end of the faults identified, the rupture process moved from south to north along this series of interlinked faults (see graphic below). Multi-fault rupture is not unique to this event as the same process occurred during the 2010 Mw 7.2 Darfield Earthquake. Such ruptures are important to consider in risk modeling as they produce events of larger magnitude, and therefore affect a larger area, than individual faults would on their own.

Map showing the faults identified by GNS Sciences as experiencing surface fault rupture in the Kaikoura Earthquake.
Source: http://info.geonet.org.nz/display/quake/2016/11/16/Ruptured+land%3A+observations+from+the+air

In keeping with the latest scientific thinking, the New Zealand Earthquake HD Model provides an expanded suite of events that represent complex ruptures along multiple faults. For now, these are included only for areas of high slip fault segments in regions with exposure concentrations, but their addition increases the robustness of the tail of the Exceedance Probability curve, meaning clients get a better view of the risk of the most damaging, but lower probability events.
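The "tail of the Exceedance Probability curve" mentioned above can be made concrete with a small sketch: given a set of simulated annual losses, the exceedance probability of a threshold is simply the fraction of simulated years at or above it, and return-period losses fall out of the sorted losses. The losses below are synthetic, for illustration only, and the lognormal parameters are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual portfolio losses: many modest years plus a heavy
# tail, standing in for rare, large multi-fault rupture events.
annual_losses = rng.lognormal(mean=12.0, sigma=1.5, size=10_000)

def exceedance_probability(losses, threshold):
    """Fraction of simulated years whose loss meets or exceeds threshold."""
    return float(np.mean(losses >= threshold))

# Return-period losses: the k-th largest of N simulated years has
# exceedance probability k / N, i.e. a return period of N / k years.
sorted_losses = np.sort(annual_losses)[::-1]
n = len(sorted_losses)
for rp in (10, 100, 250):
    k = n // rp  # rank of the loss exceeded once every `rp` years
    print(f"{rp}-year return-period loss: {sorted_losses[k - 1]:,.0f}")
```

Adding credible multi-fault events to the stochastic set raises the losses in the far tail of this curve, which is exactly where the lower-probability, higher-severity risk lives.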

Landsliding and Liquefaction

While most property damage has been caused directly by shaking, infrastructure has been heavily impacted by landsliding and, to a lesser extent, liquefaction. Landslides and slumps have occurred across the region, most notably over Highway 1, an arterial route. The infrastructure impacts of the Kaikoura earthquake are a likely dress rehearsal for the expected event on the Alpine Fault. This major fault runs 600 km along the western coast of the South Island and is expected to produce an Mw 8+ event with a probability of 30% in the next 50 years, according to GNS Science.
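The GNS Science figure quoted above (a 30% chance of an Alpine Fault Mw 8+ event in 50 years) can be converted to an implied annual rate under a Poisson occurrence assumption, a common simplification in hazard modeling (the conversion here is my illustration, not GNS Science's method):

```python
import math

# P(at least one event in t years) = 1 - exp(-rate * t)
# => rate = -ln(1 - p) / t
p, t = 0.30, 50.0
annual_rate = -math.log(1.0 - p) / t
print(f"Implied annual rate: {annual_rate:.5f}")               # ~0.00713 events/year
print(f"Mean recurrence interval: {1 / annual_rate:.0f} years")  # ~140 years
```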

As many as 80,000 – 100,000 landslides have been reported in the upper South Island, with some creating temporary dams across rivers and, in some cases, temporary lakes (see below). These dams can fail catastrophically, sending a sudden surge of water downstream.

Examples of rivers blocked by landslides photographed by GNS Science researchers.

Source: http://info.geonet.org.nz/display/quake/2016/11/18/Landslides+and+Landslide+dams+caused+by+the+Kaikoura+Earthquake

Liquefaction occurred in discrete areas across the region impacted by the Kaikoura earthquake. The Port of Wellington experienced both lateral and vertical deformation likely due to liquefaction processes in reclaimed land. There have been reports of liquefaction near the upper South Island towns (Blenheim, Seddon, Ward), but liquefaction will not be a driver of loss in the Kaikoura event to the extent it was in the Christchurch earthquake sequence.

RMS’ New Zealand Earthquake HD Model includes a new liquefaction component that was derived using the immense amount of new borehole data collected after the Canterbury Earthquake Sequence in 2010-2011. This new methodology considers additional parameters, such as depth to the groundwater table and soil-strength characteristics, that lead to better estimates of lateral and vertical displacement. The HD model is the first probabilistic model with a landslide susceptibility component for New Zealand.


Tsunami

The Kaikoura earthquake generated tsunami waves observed at 2.5m in Kaikoura, 1m in Christchurch, and 0.5m in Wellington. The waves arrived in Kaikoura significantly earlier than in Christchurch and Wellington, indicating that the tsunami was generated near Kaikoura. They were most likely generated by offshore faulting, but may also be associated with submarine landsliding. Fortunately, the tsunami waves were not large enough to produce significant damage. RMS' latest New Zealand Earthquake HD Model captures tsunami risk due to local ocean-bottom deformation caused by fault rupture and, being built from a fully hydrodynamic model, is the first model in the New Zealand market to do so.

Next Generation Earthquake Modeling at RMS

Thankfully the Kaikoura earthquake seems to have produced damage that is lower than we might have seen had it hit a more heavily populated area of New Zealand with greater exposures – for detail on damage please see my other blog on this event.

But what Kaikoura has told us is that our latest HD model offers an advanced view of risk. Released only in September 2016, it was designed to handle such a complex event as the Kaikoura earthquake, featuring multiple-segment ruptures, a new liquefaction model at very high resolution, and the first landslide susceptibility model for New Zealand.

Prudential Regulation Authority on the Challenges Facing Cyber Insurers

Most firms lack clear strategies and appetites for managing cyber risk, with a shortage of cyber domain knowledge noted as a key area of concern. So said the Prudential Regulation Authority, the arm of the Bank of England which oversees the insurance industry, in a letter to CEOs last week.

This letter followed a lengthy consultation with a range of stakeholders, including RMS, and identified several key areas where insurance firms could and should improve their cyber risk management practices. It focussed on the two distinct types of cyber risk: affirmative and silent.

Affirmative cover is explicit cyber coverage, either offered as a stand-alone policy or as an endorsement to more traditional lines of business. Silent risk is where cover is provided "inadvertently" through a policy that was typically never designed for it. But this isn't the only source of silent risk: it can also leak into policies where existing exclusions are not completely exhaustive. A good example is policies with NMA 2914 applied, which excludes cyber losses except where a cyber-attack causes physical damage (e.g. by fire or explosion).

The proliferation of this silent risk across the market is highlighted as one of the key areas of concern by the PRA. It believes this risk is not only material, but it is likely to increase over time and has the potential to cause losses across a wide range of classes, a sentiment we at RMS would certainly echo.

The PRA intervention shines a welcome spotlight on these issues and adds to the growing pressure on firms to improve their cyber risk management practices. These challenges have faced the market for some time, but how do we help the industry address them?

The PRA suggests firms with cyber exposure should have a clearly defined strategy and risk appetite owned by the board and risk management practices that include quantitative and qualitative elements.

At RMS our cyber modeling has focussed on providing precisely this insight, helping many of the largest cyber writers to quantify both their silent and affirmative cyber risk, thus allowing them to focus on growing cyber premiums.

If you would like to know more about the RMS Cyber Accumulation Management System (released February 2016), please contact cyberrisk@rms.com.