Category Archives: Risk Modeling

How Did the Global Risk Report Become Existential?

Mid-January saw the publication of the annual World Economic Forum (WEF) “Global Risks Report”, timed to set the agenda during this week’s WEF Annual Meeting in Davos.

With each new edition – and this year’s is the fifteenth – one inevitably turns first to the opening page of the report to discover the Top Five Global Risks for 2020, in terms of their “likelihood” and “impact”. What is trending up, and what has slipped down the chart?

Continue reading

Risk Modeling for the Future

The World Economic Forum (WEF) has celebrated its fiftieth year at its annual meeting in Davos. Increasingly, the business/political nexus has become the one articulated in WEF founder Klaus Schwab’s Davos Manifesto: that corporations “… must assume the role of a trustee of the material universe for future generations.”

In 2020, “Action on climate change” has become the number one risk in terms of impact in the World Economic Forum’s Global Risks Report. The work at RMS on quantifying risk and exploring how risk is expected to shift under climate change has never been more important or timely.

Continue reading

2018 USGS Hazard Map Updates: Is RMS Revising the U.S. Earthquake Model?

In March 2018, RMS hosted the U.S. Geological Survey (USGS) workshop at our Newark headquarters in California to discuss the interim updates planned for the 2018 USGS National Seismic Hazard Map Project (NSHMP). Details can be found in my previous blog: Are You Ready for an Interim USGS NSHM Update?

The USGS informed the public and technical community about this interim update, which comes ahead of their regular six-year cycle of updates anticipated after 2020. The main purpose was to incorporate new ground motion modeling advances for the Central and Eastern U.S. from Project 17, which has significant value for the national building code (details can be found here).

Towards the end of 2018, the USGS published the draft document and national hazard maps to receive scientific peer review and public feedback from the user community (Petersen et al. 2018). Since then, the USGS has been very busy incorporating the updates and finalizing the models. In December 2019, they published the official 2018 USGS NSHMP document in the Earthquake Spectra journal.

Continue reading

Risk Maturity Benchmarking Example: IRB Brasil Re

In our previous blog post, we reviewed how RMS has developed Risk Maturity Benchmarking, a tool to help clients understand their current processes and maturity and create a blueprint for improvement tied to their business strategy.

In 2017, RMS conducted a Risk Maturity Benchmarking (RMB) study for IRB Brasil Re (click here to read the full case study) to assist IRB with the implementation of its three-year transformation plan.

The IRB Transformation Plan objectives were closely aligned to the company’s primary strategic drivers. These included:

  • To grow IRB’s international presence as a “best in class” global reinsurer
  • To achieve greater capital efficiency across all business lines
  • To develop a market-advancing Enterprise Risk Management capability
  • To maintain a focus on innovation as a key differentiator
  • To achieve a competitive advantage by advancing modeling and analytical capabilities
Continue reading

Risk Maturity Benchmarking: Riding the Wave of Change

Helping clients through the evolution of catastrophe modeling is a core mission for RMS Consulting. To assist in the process, we have developed a tool called Risk Maturity Benchmarking, which we will introduce below. We will then review an example where we applied this framework with a client to create their own target operating model for catastrophe risk.

I don’t believe we would have achieved what we have if we had not first undertaken the RMB study

Luis Brito, head of catastrophe modeling, IRB Brasil Re

The (re)insurance industry faces both challenges and opportunities as its pace of change accelerates. Challenges include increased M&A activity, the entry of alternative capital, and continued rate pressure, coupled with the catastrophe losses of 2017 and 2018. These headwinds are contrasted by opportunities: an expanding protection gap that the market is not filling quickly enough, and technology – from data analytics to automation – frequently touted as the Holy Grail. All these factors have forced the industry to look at itself and reexamine how and where to compete in this brave new world.

Continue reading

Australia Bushfires: A New Normal?

The sheer scale of the Australian bushfires is hard to comprehend, as what has already been a long bushfire season continues apace. Australia’s most populous state, New South Wales (NSW), has been the worst affected, with 12.1 million acres (4.9 million hectares) burned over the current bushfire season. According to the New South Wales Rural Fire Service, damage has recently escalated, with 672 homes destroyed since January 1, during a season which has seen 1,870 homes destroyed and 653 damaged.

There have also been reports of significant damage in neighboring states, including Victoria to the south and Queensland to the north of NSW. Overall, across southeast Australia, 15.6 million acres (6.3 million hectares) have burned, and 25 people have been killed as of January 7. According to the Insurance Council of Australia (ICA), as of January 10, a total of 10,550 claims have been filed since November 8, amounting to around AU$939 million (US$645 million) in insured losses. The ICA notes that it expects more claims to be filed in the coming weeks.

Australian insurers are under the spotlight but are holding up well – insurer IAG has publicly stated it was “… on track to blow its perils allowance for the six months to December by AU$80 million” but had strong reinsurance in place. The article in the Financial Review commented that there may be a modest effect on earnings for the industry overall, and that premiums may have to rise.

Continue reading

The Storm Surge and the Tsunami

The core idea behind catastrophe modeling is that the architecture of risk quantification is the same whatever the peril. While a hurricane is not an earthquake, building a hurricane catastrophe model has elements in common with building an earthquake catastrophe model. Stochastic event occurrence, the hazard footprint, the damage mechanism, clustering, and post-event loss amplification are all shared concepts.
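To make that shared architecture concrete, here is a minimal, peril-agnostic sketch of a loss calculation: stochastic events with occurrence rates, a hazard footprint evaluated at each location, and a damage function that converts intensity into loss. All class, function, and field names below are illustrative assumptions, not RMS model internals.

```python
# Minimal, peril-agnostic sketch of the shared catastrophe-model loop.
# Names are illustrative; a production model adds clustering, uncertainty,
# post-event loss amplification, and financial terms.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class StochasticEvent:
    event_id: int
    annual_rate: float                          # expected annual frequency
    hazard_at: Callable[[float, float], float]  # footprint: (lat, lon) -> intensity

@dataclass
class Location:
    lat: float
    lon: float
    insured_value: float

def expected_annual_loss(events: List[StochasticEvent],
                         locations: List[Location],
                         damage_ratio: Callable[[float], float]) -> float:
    """Rate-weighted sum of event losses; the same loop applies whether the
    intensity is wind speed, ground shaking, or flood depth."""
    eal = 0.0
    for event in events:
        event_loss = sum(
            damage_ratio(event.hazard_at(loc.lat, loc.lon)) * loc.insured_value
            for loc in locations
        )
        eal += event.annual_rate * event_loss
    return eal

# Toy usage: one synthetic windstorm event and two locations.
gust = lambda lat, lon: 45.0 + 5.0 * (lon > 0)   # made-up footprint (m/s)
events = [StochasticEvent(1, annual_rate=0.02, hazard_at=gust)]
sites = [Location(51.5, -0.1, 1_000_000), Location(52.0, 0.5, 2_000_000)]
print(expected_annual_loss(events, sites, damage_ratio=lambda i: min(1.0, i / 100)))
```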

While on the university campus disciplines may retain their nineteenth-century segregations, in catastrophe modeling we are “ecumenical” about the driver of loss: whether it is wind, hail, vibration, flood, cyber, a virus, or a terrorist attack. The track of a hurricane and the track of a fault rupture; the contagion of influenza and the contagion of NotPetya malware; the topographic controls of flooding and the topographic controls of wildfire. Exploring the parallels can be illuminating.

Which is why it is interesting to discover historical figures who, like catastrophe modelers, have looked sideways across the catastrophe disciplines. One such figure is the Anglo-Greek Lafcadio Hearn (unless you are from Japan, where he is known as Koizumi Yakumo).

Continue reading

Newcastle: Thirtieth Anniversary of Australia’s Largest Earthquake Loss. But What If…?

Over the past 15 years, we have witnessed some of the largest earthquakes ever recorded, with catastrophic impacts around the globe. But looking back 30 years to 1989, there were two smaller, yet still significant, earthquakes. The first was the M6.9 Loma Prieta event that hit the San Francisco Bay Area in October, an earthquake familiar to many due to its proximity to the city and its level of destruction. However, fewer are aware of the other notable earthquake that year. December 28, 1989, is a memorable date for many Australians, as it marks the country’s most damaging earthquake in recorded history, and it remains one of Australia’s costliest natural catastrophes to date.

Despite its moderate magnitude, the M5.4 Newcastle earthquake caused widespread ground shaking, with insured losses of just under AU$1 billion (US$690 million) at the time of the event (ICA, 2012) – a loss which, if the earthquake were repeated today, RMS estimates would exceed AU$5 billion.

Continue reading

Technology: The Springboard to Innovative Treaty Underwriting

Cautious optimism surrounds the January 1, 2020 reinsurance renewals, with expectations that the anticipated hardening of rates might be realized – to a modest degree at least.

Reinsurance underwriters who can harness technology to conquer historic risk assessment challenges – including robust marginal impact analytics – and create the space for innovation can build customer relationships that are resilient to future market rate oscillations.
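As one illustration of what such analytics can look like, the sketch below measures the marginal impact of a candidate treaty as the change in a portfolio tail metric across the same simulated years. The function names, the lognormal test data, and the 1-in-200 (99.5th percentile) choice are assumptions for illustration, not a prescribed method.

```python
# Hedged sketch of marginal impact analytics: the change in a portfolio
# tail metric when a candidate treaty's simulated losses are added.
import numpy as np

def tail_var(losses: np.ndarray, percentile: float = 99.5) -> float:
    """Value at risk: the loss level exceeded in (100 - percentile)% of simulated years."""
    return float(np.percentile(losses, percentile))

def marginal_impact(portfolio_losses: np.ndarray,
                    treaty_losses: np.ndarray,
                    percentile: float = 99.5) -> float:
    """Tail risk added by the treaty, evaluated on the same simulation years."""
    return tail_var(portfolio_losses + treaty_losses, percentile) - \
           tail_var(portfolio_losses, percentile)

# Example: 10,000 simulated annual losses for the in-force portfolio and a candidate treaty.
rng = np.random.default_rng(0)
portfolio = rng.lognormal(mean=15, sigma=1.0, size=10_000)
candidate = rng.lognormal(mean=12, sigma=1.5, size=10_000)
print(f"Marginal 1-in-200 impact: {marginal_impact(portfolio, candidate):,.0f}")
```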

The capital influx to reinsurance markets, triggered by low market returns globally, has led to increased limits and more generous terms being offered without commensurate increases in rates. This trend can only last for so long before having dire effects on reinsurer profitability. 

Profitability in the primary insurance markets has been helped by innovation, with new product offerings linked to enhanced risk assessment techniques such as telematics. But while the insurtech wave has spawned hundreds of companies and ideas focused on primary insurers, progress in “reinsure-tech” has been limited, due primarily to the current soft market. These market conditions have constrained the resources available for speculative investments and have limited reinsurers’ ability to pursue potential upside in the fast-moving tech space.

Almost ironically, in response to the market conditions, companies have instituted cautious underwriting approaches still rooted in low-fidelity risk assessment techniques, which haven’t evolved to capitalize on the technological advances made since the market softened at the start of the decade.

Continue reading

Data Engineering for Risk Analytics with Risk Data Open Standard

This article was originally published by DZone

What Is Risk Analytics?

The picture below on the left shows the extensive flooding at industrial parks north of Bangkok, Thailand. Western Digital had 60 percent of its total hard drive production coming from the country – the floods disrupted production facilities at multiple sites and dramatically affected a major global supply chain. The picture on the right shows flooding on the New York subway from Hurricane Sandy, which caused widespread disruption and nearly US$70 billion in losses across the northeastern U.S.

In both examples, the analysis of risk should help not only with physical protection measures – such as stronger buildings through improved building codes, or better defenses – but also with the protection available through financial recovery. Providing financial protection is the job of the financial services and insurance industries. Improving our understanding of, and practices in, risk analytics is one of the most interesting problems in big data today, given the increasing set of risks we have to watch for.

Flooding at industrial parks north of Bangkok, Thailand in 2011 (left) and flooded subway stations in New York after Hurricane Sandy in 2012 (right) Image credit: Wikimedia/Flickr

How Does Risk Analytics Work?

Obviously, the risk landscape is vast. It stretches from “natural” events – such as severe hurricanes, typhoons, and earthquakes – to “human-generated” disasters, such as cyberattacks, terrorism, and so on.

The initial steps of risk analytics start with understanding the exposure – that is, the risks to which a given asset, individual, or organization is exposed. Understanding exposure means detailing the events that could lead to damage and the losses that could result from those events. The formulas get more complicated from here. There is a busy highway of data surrounding this field: data engineers, data scientists, and others involved in risk analytics work to predict, model, select, and price risk to calculate how to provide effective protection.
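As a minimal illustration of where those formulas start, the sketch below takes a tiny, made-up event loss table (each event’s annual rate and the loss it would cause to the exposure) and turns it into an annual exceedance-probability curve, assuming Poisson event occurrence. The numbers and names are hypothetical.

```python
# Sketch: from an event loss table to an exceedance-probability curve.
import math

event_loss_table = [
    # (annual_rate, loss_to_exposure)
    (0.002, 50_000_000),
    (0.010, 12_000_000),
    (0.050,  1_500_000),
]

def exceedance_probability(threshold: float) -> float:
    """Annual probability of at least one event causing a loss >= threshold,
    assuming events occur independently as Poisson processes."""
    rate = sum(r for r, loss in event_loss_table if loss >= threshold)
    return 1.0 - math.exp(-rate)

for threshold in (1_000_000, 10_000_000, 40_000_000):
    print(f"P(annual loss >= {threshold:,}): {exceedance_probability(threshold):.3%}")
```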

Data Engineering for Risk Analytics

Let’s look at property-focused risks. In this instance, risk analytics starts with an understanding of how a property – such as a commercial or residential building – is exposed to risk. The kinds of events that could pose a risk, and the associated losses that could result from those events, depend on many variables.

The problem is that in today’s enterprise, if you want to work with exposure data, you have to work with multiple siloed systems, each with its own data formats and representations. These systems do not speak the same language. For a user to get a complete picture, they need to go across these systems and constantly translate and transform data between them. As a data engineer, how do you provide a unified view of data across all systems? For instance, how can you enable a risk analyst to understand all kinds of perils – from hurricane and hailstorm to storm surge – and then roll this all up so you can guarantee the coverage on these losses?
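Here is a hedged sketch of that translation layer: two hypothetical siloed systems describe the same kind of property in different formats, and a small mapping step brings both into one unified exposure record. The field names below are invented for illustration and are not the actual EDM, CEDE, or OED schemas.

```python
# Sketch: translating two hypothetical source formats into one unified view.

def from_system_a(record: dict) -> dict:
    """System A stores values in thousands and uses peril codes like 'WS' (windstorm)."""
    return {
        "location_id": record["LocId"],
        "insured_value": record["TIV_thousands"] * 1_000,
        "peril": {"WS": "wind", "EQ": "earthquake"}.get(record["PerilCode"], "unknown"),
    }

def from_system_b(record: dict) -> dict:
    """System B stores full values and spells perils out."""
    return {
        "location_id": record["location"],
        "insured_value": record["total_insured_value"],
        "peril": record["peril"].lower(),
    }

unified = [
    from_system_a({"LocId": "A-001", "TIV_thousands": 750, "PerilCode": "WS"}),
    from_system_b({"location": "B-042", "total_insured_value": 1_200_000, "peril": "Earthquake"}),
]
print(unified)  # one consistent schema a risk analyst can roll up across perils
```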

There are also a number of standards used by the insurance industry to integrate, transfer, and exchange this type of information. The most popular of these formats is the Exposure Data Model (EDM). However, EDMs and some of their less popular counterparts (Catastrophe Exposure Database Exchange – CEDE, and Open Exposure Data – OED) have not aged well and have not kept up with the industry needs:

  • These older standards are property-centric; risk analytics requires accommodating and understanding new risks, such as cyberattacks, liability risks, and supply chain risk.
  • These older standards are proprietary, designed for single systems, and do not take into account the needs of other systems – for example, they can’t support new predictive risk models.
  • These standards don’t provide the right containers for high-fidelity data portability – the exposure data formats do not usually carry the losses, reference data, and settings used to produce the loss information, which is what allows for data integrity.
  • These standards are not extensible. Versioning and dependencies on specific product formats (such as database formats specific to version X of SQL Server) constantly make data portability harder.

This creates a huge data engineering challenge. If you can’t exchange information with high fidelity, forget getting reliable insights. As anyone dealing with data will say: garbage in, garbage out!

For any data engineer dealing with risk analytics, there is great news. A new open standard, several years in the making, is designed to remove the shortcomings of the EDM and other similar formats: the Risk Data Open Standard (RDOS). The RDOS is designed to simplify data engineering – in particular, integrating data between systems that deal with exposure and loss data. And it isn’t just RMS working to invent and validate this standard in isolation: a steering committee of thought leaders from influential companies is working on validating the Risk Data OS.
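To illustrate the design goal – exposure, the resulting losses, and the settings used to produce them traveling together so integrity is preserved – here is a hypothetical container sketch. It is not the actual Risk Data Open Standard schema; refer to the RDOS documentation for the real definitions.

```python
# Hypothetical illustration of the RDOS design goal, NOT the real schema:
# exposure, losses, and the analysis settings that produced them in one container.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ExposureRecord:
    location_id: str
    peril: str
    insured_value: float

@dataclass
class LossRecord:
    location_id: str
    event_id: int
    loss: float

@dataclass
class RiskDataPackage:
    exposures: List[ExposureRecord]
    losses: List[LossRecord]
    analysis_settings: Dict[str, str] = field(default_factory=dict)  # model version, currency, etc.

package = RiskDataPackage(
    exposures=[ExposureRecord("LOC-1", "flood", 2_500_000.0)],
    losses=[LossRecord("LOC-1", event_id=101, loss=180_000.0)],
    analysis_settings={"model_version": "example-1.0", "currency": "USD"},
)
```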

The Risk Data OS will allow us to work on risk analytics much more effectively. This is how we can better understand the type of protection we need to create to help mitigate the impacts of climate change and other natural or human-made disasters. You can find details on the Risk Data OS here. If you are interested in the Risk Data OS, have feedback, or would like to help us define this standard, you can email the Risk Data OS steering committee by clicking here.