Safety from volcanic eruptions is heavily influenced by economic factors. Those who earn their livelihood from farming around a volcano may be reluctant to evacuate, and those who operate tourist excursions may be reluctant to suspend them. This may have been the case with White Island Tours which holds an exclusive license to land tourists on the privately-owned island in New Zealand’s Bay of Plenty, named White Island by Captain James Cook. Tourists averse to sea sickness have also been able to arrive on the island via helicopter through Volcanic Air.
With around 10,000 customers per year paying up to several hundred New Zealand dollars (NZ$1 ≈ US$0.66) for a tour, White Island Tours has been a substantial business, but one whose financial viability would have required that as few trips as possible were cancelled because of volcano risk.
After 22 years of business operation, the volcano erupted on December 9, 2019. There were 47 tourists on White Island; most were killed or seriously injured. Thirty-eight of the tourists were from the cruise liner Ovation of the Seas, operated by Royal Caribbean Cruises Ltd., which denies any responsibility for these excursions, even though they were advertised with the statement that White Island is one of the most active volcanoes in the world. The terms and conditions of cruise tickets require that any lawsuit be filed in Miami, so the U.S. courts will decide what liability Royal Caribbean Cruises Ltd. had in vetting White Island Tours.
Over the past 15 years, we have witnessed some of the largest earthquakes ever recorded, with catastrophic impacts around the globe. But looking back 30 years to 1989, we saw two smaller, but still significant, earthquakes. The first was the M6.9 Loma Prieta event that hit the San Francisco Bay Area in October, an earthquake familiar to many due to its proximity to the city and its level of destruction. However, fewer are aware of the other notable earthquake that year. December 28, 1989, is a memorable date for many Australians, as it marks the country’s most damaging earthquake in recorded history, which still remains one of Australia’s costliest natural catastrophes to date.
Despite its moderate magnitude, the M5.4 Newcastle earthquake caused widespread ground shaking, with insured losses of just under AU$1 billion (US$690 million) at the time of the event (ICA, 2012). RMS estimates that a repeat of the earthquake today would cost over AU$5 billion.
Twenty years ago, while the planet was preparing for the transition to the year 2000 and trying to solve the Y2K bug, the (re)insurance industry in Europe was caught by surprise by windstorm Lothar. Even today, 1999 remains a historic windstorm year, with catastrophic storms Anatol (December 3), Lothar (December 26) and Martin (December 28) all occurring within less than a month.
Lothar tracked across northern France, southern Belgium and central Germany and into Poland; Martin tracked through southern Europe, affecting France, Spain, Switzerland and Italy. Between them, Lothar and Martin killed 140 people and caused over €14.2 billion in economic losses, approximately €7.7 billion of which was insured. If the three events happened today, they would cost the (re)insurance industry approximately €20 billion (US$23.3 billion).
At the time, I was still living in Geneva with my parents. I remember waking up the day after Christmas and seeing fallen trees in our garden and our telephone line was cut. It was very dramatic and since then, no other windstorm has caused that kind of damage in this region.
In commemoration of the twentieth anniversary of windstorms Anatol, Lothar and Martin, I have asked my colleagues at RMS to share their experience of the storms.
Cautious optimism surrounds the January 1, 2020 reinsurance renewals, with expectations that the anticipated hardening of rates might be realized – to a modest degree at least.
Reinsurance underwriters who harness technology to overcome long-standing risk assessment challenges – including robust marginal impact analytics – and who create space for innovation can build customer relationships that are resilient to future market rate oscillations.
The capital influx to reinsurance markets, triggered by low market returns globally, has led to increased limits and more generous terms being offered without commensurate increases in rates. This trend can only last for so long before having dire effects on reinsurer profitability.
Profitability in the primary insurance markets has been helped by innovation, with new product offerings linked to enhanced risk assessment techniques like telematics. But while the insurtech wave has spawned hundreds of companies and ideas focused on primary insurers, progress in “reinsure-tech” has been limited, due primarily to the current soft market. These market conditions have constrained the resources available for speculative investments and limited reinsurers’ ability to pursue potential upside in the fast-moving tech space.
Almost ironically, in response to the market conditions, companies have instituted cautious underwriting approaches still rooted in low-fidelity risk assessment techniques, which haven’t evolved to capitalize on the technological advances made since the market softened at the start of the decade.
A new article, The Science of Cyber Risk: A Research Agenda has just been published in Science. A free, non-paywall version of this paper is available here. Written by a diverse team of 19 authors, including myself, it presents a concise argument for interdisciplinary research, to establish a scientific basis for risk analysis and management in the cyber security domain.
As a leading provider of cyber risk models for the (re)insurance industry, RMS is committed to advancing the state-of-the-art in the science of cyber risk. The proposed six category research agenda is of keen interest to RMS and we recommend this Science journal article to anyone who shares our interest in solving the hard problems.
In this, the first of three blog posts, I’ll explore why we need a “science” of cyber risk and what difference it will make. The next two posts will feature case studies in interdisciplinary collaboration, including lessons from past successes and failures.
The picture below on the left shows the extensive flooding at industrial parks north of Bangkok, Thailand. Western Digital produced 60 percent of its total hard drive output in the country, and the floods disrupted production facilities at multiple sites, dramatically affecting a major global supply chain. The picture on the right shows flooding on the New York subway from Hurricane Sandy, which caused widespread disruption and nearly US$70 billion of losses across the northeastern U.S.
In both examples, the analysis of risk should not only help with physical protection measures such as stronger buildings through improved building codes or better defenses, but also the protection available through financial recovery. Providing financial protection is the job of the financial services and insurance industries. Improving our understanding of and practices in risk analytics as a field is one of the most interesting problems in big data these days, given the increasing set of risks we have to watch for.
How Does Risk Analytics Work?
Obviously, the risk landscape is vast. It stretches from “natural” events, such as severe hurricanes, typhoons, and earthquakes, to “human-generated” disasters, such as cyberattacks and terrorism.
The initial steps of risk analytics start with understanding the exposure – that is, the risks to which a given asset or individual is exposed. Understanding exposure means detailing the events that could lead to damage and the losses that could result from those events. Formulas get more complicated from here. There is a busy highway of data surrounding this field: data engineers, data scientists, and others involved in risk analytics work to predict, model, select, and price risk to calculate how to provide effective protection.
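As a toy illustration of that exposure-to-loss chain (every event name, rate, and value below is hypothetical, not from any real model), the simplest metric is an average annual loss: each event’s annual rate times its damage ratio times the exposed value, summed over the event set.

```python
# Toy sketch of the exposure -> event -> loss chain (all values hypothetical).
# Each event carries an annual occurrence rate and a mean damage ratio
# for the exposed asset.
events = [
    {"name": "minor_windstorm", "annual_rate": 0.20, "damage_ratio": 0.02},
    {"name": "major_windstorm", "annual_rate": 0.02, "damage_ratio": 0.30},
    {"name": "severe_flood",    "annual_rate": 0.01, "damage_ratio": 0.60},
]

building_value = 1_000_000  # insured value of the exposed asset

# Average annual loss (AAL): sum of rate * damage ratio * value over all events.
aal = sum(e["annual_rate"] * e["damage_ratio"] * building_value for e in events)
print(f"Average annual loss: {aal:,.0f}")
```

Real catastrophe models replace these point estimates with full event sets, uncertainty distributions, and policy terms, but the shape of the calculation is the same.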
Data Engineering for Risk Analytics
Let’s look at property-focused risks. In this instance, risk analytics starts with an understanding of how a property – such as a commercial or residential building – is exposed to risk. The kinds of events that could pose a risk, and the associated losses that could result from those events, depend on many variables.
The problem is that in today’s enterprise, if you want to work with exposure data, you have to work with multiple siloed systems, each with its own data formats and representations. These systems do not speak the same language, so to get a complete picture, a user has to go across them, constantly translating and transforming data. As a data engineer, how do you provide a unified view of data across all systems? For instance, how can you enable a risk analyst to understand all kinds of perils – from a hurricane or a hailstorm to storm surge – and then roll this all up so you can guarantee the coverage on these losses?
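To make the translation burden concrete, here is a minimal sketch of that kind of adapter layer. The two record layouts, field names, and values are invented for illustration; they stand in for whatever formats the siloed systems actually use.

```python
# Hypothetical sketch: two siloed systems describe buildings in different
# formats; small adapters map each into one common, unified view.
system_a_record = {"bldg_id": "A-17", "occ": "COM", "tiv": 2_500_000}
system_b_record = {"id": "B-99", "occupancy": "commercial", "value_usd": 2_500_000}

def from_system_a(rec):
    # System A uses coded occupancy values; translate them to the common vocabulary.
    occupancy_map = {"COM": "commercial", "RES": "residential"}
    return {"id": rec["bldg_id"],
            "occupancy": occupancy_map[rec["occ"]],
            "value": rec["tiv"]}

def from_system_b(rec):
    # System B is closer to the common schema; only field names need renaming.
    return {"id": rec["id"], "occupancy": rec["occupancy"], "value": rec["value_usd"]}

unified = [from_system_a(system_a_record), from_system_b(system_b_record)]
total_exposed_value = sum(r["value"] for r in unified)
print(total_exposed_value)  # prints 5000000
```

Every new source system means another adapter like these, which is exactly the maintenance cost a shared open standard is meant to remove.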
There are also a number of standards used by the insurance industry to integrate, transfer, and exchange this type of information. The most popular of these formats is the Exposure Data Model (EDM). However, EDMs and some of their less popular counterparts (Catastrophe Exposure Database Exchange – CEDE, and Open Exposure Data – OED) have not aged well and have not kept up with industry needs:
- These older standards are property-centric; risk analytics now requires accommodating and understanding new risks, such as cyberattacks, liability risks, and supply chain risk.
- These older standards are proprietary, designed for single systems without taking into account the needs of other systems; for example, they cannot support new predictive risk models.
- These standards lack the containment needed for high-fidelity data portability: the exposure formats do not usually carry the losses, reference data, and settings used to produce the loss information, which is what allows data integrity to be maintained.
- These standards are not extensible, and versioning and dependencies on specific product formats (such as database formats tied to a specific version of SQL Server) constantly make data portability harder.
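The last two shortcomings suggest what a portable record should look like. As an illustration only (this is not the actual schema of any standard, and every field name here is invented), a self-describing record can carry its own schema version plus the loss results and analysis settings that produced them, so it survives a move between systems intact:

```python
# Illustrative only (not any real standard's schema): a portable record that
# bundles exposure, loss results, and the settings used to produce them,
# with explicit versioning instead of a dependency on one database product.
import json

record = {
    "schema_version": "1.0",  # explicit versioning, no product lock-in
    "exposure": {"id": "BLDG-001", "peril": "windstorm", "value": 1_000_000},
    "loss_results": [{"event_id": "EV-42", "gross_loss": 120_000}],
    "analysis_settings": {"model": "example-model", "run_date": "2020-01-01"},
}

portable = json.dumps(record)          # plain-text interchange any system can parse
assert json.loads(portable) == record  # round-trips with full fidelity
```

Because the losses travel with the exposure and the settings that generated them, a receiving system can verify integrity instead of trusting a detached spreadsheet.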
This creates a huge data engineering challenge. If you can’t exchange information with high fidelity, forget getting reliable insights. As anyone dealing with data will say: garbage in, garbage out!
For any data engineer dealing with risk analytics, there is great news: a new open standard, several years in the making, is designed to remove the shortcomings of the EDM and other similar formats. The Risk Data Open Standard (RDOS) is designed to simplify data engineering by making it easier to integrate data between systems that deal with exposure and loss data. And RMS is not inventing and validating this standard in isolation: a steering committee of thought leaders from influential companies is working on validating the Risk Data OS.
The Risk Data OS will allow us to work on risk analytics much more effectively. This is how we can better understand the type of protection we need to create to help mitigate the effects of climate change and other natural or human-made disasters. You can find details on the Risk Data OS here. If you are interested in the Risk Data OS, have feedback, or would like to help us define the standard, you can email the Risk Data OS steering committee by clicking here.
The 2010 M7.1 Darfield earthquake in New Zealand started a sequence of events – the Canterbury Earthquake Sequence (CES) – that propagated eastward in the Canterbury region over several years. Since the city of Christchurch is built on alluvial sediments where the water table is very shallow, several of the larger events created widespread liquefaction within the city and surrounding areas. Such ground deformations caused a significant number of buildings with shallow foundations to settle and tilt.
Prior to these New Zealand earthquakes, liquefaction had been observed, but never on this scale in a built-up area of a developed country. As with previous well-studied liquefaction events (e.g., Niigata in 1964), this was a unique opportunity to examine liquefaction severity and building responses. Christchurch was referred to as a “liquefaction laboratory,” with the multiple events causing different levels of shaking across the city. However, we had not previously seen suburbs of insured buildings damaged by liquefaction.
The 2010 M7.1 Darfield earthquake in New Zealand started a sequence of events that propagated eastward in the Canterbury region over several years, collectively causing upward of 15 individual loss-causing events for the insurance industry. The Insurance Council of New Zealand states that the total insured loss was more than NZ$31 billion (US$19.4 billion).
With such a significant sequence of events, much had to be learned and reflected in earthquake risk modeling, both to remain scientifically robust and to answer new regulatory needs. GNS Science – the New Zealand Crown Research Institute – had issued its National Seismic Hazard Model (NSHM) in 2010, before the Canterbury Earthquake Sequence (CES) and before Tōhoku. The model release was a major project, and at the time, in response to the CES, GNS only had the bandwidth for a mini-update to the 2010 model: allowing M9 events on the Hikurangi Subduction Interface, New Zealand’s largest plate boundary fault, and getting a working group started on Canterbury earthquake rates.
But given the high penetration rate of earthquake insurance in New Zealand and the magnitude of the damage in the Canterbury region, the (re)insurance and regulatory position was in transition. Rather than wait for a new NSHM update (which is still not available), RMS joined the national effort, starting a collaboration with GNS Science alongside our own research, to build a model that would help during this difficult time, when many rebuild decisions had to be made. The RMS® New Zealand Earthquake High Definition (HD) model was released in mid-2016.
A few hundred yards from where Stephen Hawking first explored black holes from his wheelchair, is the Institute of Criminology at the University of Cambridge. Hawking never shied away from really hard problems; nor do the Cambridge criminologists. There is no Nobel Prize for finding viable solutions to rehabilitating prisoners, but the Cambridge Learning Together program has forged new communal pathways for addressing this major societal challenge. The program seeks to bring together people in criminal justice and higher education institutions to study alongside each other in inclusive and transformative learning communities.
The Learning Together program began at the University of Cambridge in 2014, in partnership with HMP Grendon, a small prison in the village of Grendon Underwood, outside London. The program recognizes that collaboration underpins both the growth of learning opportunities for students in prison and the development of pathways towards non-offending futures.
Five years on, an alumni celebration event was organized for Black Friday, November 29. It took place in the City of London, at Fishmongers’ Hall, off London Bridge, close to the Monument, where the RMS London office is situated.