
What Can You Learn From Exposure?

Many RMS team members are active bloggers and speakers, and contribute to articles, but to capture even more of our expertise and insight, and to reflect the breadth of our activities, we have created a new magazine called Exposure, which is ready for you to download.

The aim of Exposure magazine is to bring together topics of special interest to catastrophe and risk management professionals, and to recognize the vast ground that individuals involved in risk management must cover today. There is also a theme that we believe unites the risk community: the belief that “risk is opportunity.” The articles within Exposure reflect a market seeking to avoid surprises, improve business performance, and innovate to create new opportunities for growth.

In the foreword to Exposure, Hemant Shah, CEO of RMS, reflects on an “inflection point” in the industry, driven by a mix of globalization, changing market structures, technology, data, and analytics, and offering a chance for the industry to innovate and increase its relevance. Exposure mixes articles examining perils and regions with pieces on industry issues and on what’s coming up for our industry.

Within perils and regions, Exposure looks at opportunities for U.S. and European flood, the effect extra-tropical transitioning has on typhoons in Japan, the impact of secondary hazards such as liquefaction, and the earthquake sequence that hit the low-seismicity area of Canterbury, New Zealand, in 2010 and 2011.

The magazine also tackles issues around Solvency II, the emergence of the “grey swan” event, why reinsurers are opting to buy or build their own insurance-linked securities fund management capabilities, and wraps up with Robert Muir-Wood, chief research officer for RMS, explaining how insurers can help drive the resilience analytics revolution.

Please download your copy now.

Fire Weather

Fires can start at all times and places, but how a fire spreads is principally down to the weather.

This week, 350 years ago, the fire at Thomas Farriner’s bakery on Pudding Lane, a small alleyway running down to the river from the City of London, broke out at the quietest time of the week, around 1 a.m. on Sunday morning, September 2, 1666. London had been experiencing a drought and the thatched roofs of the houses were tinder dry. At 4 a.m. the Lord Mayor, roused from his sleep, decided the blaze was easily manageable. It was already too late, however. By 7 a.m. the roofs of some 300 houses were burning and, fanned by strong easterly winds, the fire was spreading fast towards the west. Within three days the fire had consumed 13,000 houses and left 70,000 homeless.

In the city’s reconstruction only brick and tile houses were permitted, severely reducing the potential for repeat conflagrations. Within a few years the first fire insurers appeared, growing their business as fear outran the risk.

Yet big city fires had by no means gone away, and the wooden cities of northern Europe were primed to burn. The 1728 Copenhagen fire destroyed 28% of the city, while the 1795 fire left 6,000 homeless. A quarter of the city of Helsinki burned down in November 1808. The 1842 fire that destroyed Hamburg left 20,000 homeless. The center of Bergen, Norway, burned down in 1855 and then again in January 1916.

Wind and fire

By the start of the 20th century, improvements in fire-fighting had reduced the chance that a great city fire would take hold, but not when there were strong winds, as in the 1916 Bergen fire, which broke out in the middle of an intense windstorm with hurricane-force gusts. In February 1941 the fire that burnt out the historic center of Santander on the coast of northern Spain was driven by an intense windstorm, equivalent to the October 1987 storm in the U.K. And then there is the firestorm that destroyed Yokohama and Tokyo after the 1923 earthquake, driven by 50-mile-per-hour winds on the outer edge of a typhoon, in which, over a few hours, an estimated 140,000 died.

Wind and fire in the wooden city are a deadly combination. Above a certain wind speed, the fire becomes an uncontrollable firestorm. The 1991 Oakland Hills fire flared up late on a Sunday morning and surged out of the hills into the city, driven by hot, dry, 60-mile-per-hour Diablo winds from the east, jumping an eight-lane highway and overwhelming the ability of the fire crews to hold the line, until the wind eventually turned and the fire blew back over its own embers. The fire consumed 2,800 houses, spreading so fast that 25 people died. On February 7, 2009, a strong northwesterly wind drew baking air out of Australia’s interior and fires took off across the state of Victoria. Fallen power cables sparked a fire whose embers, blown by 60-mile-per-hour winds, flashed from one woodland to another, overwhelming several small towns so fast that 173 people died before they could escape.

Most recently we have seen firestorms in Canada. Again, there is nothing new about the phenomenon: the Matheson fires in 1919 destroyed 49 Ontario towns and killed 244 people along a fire front that extended 60 km wide. It was a firestorm fanned by gale-force winds that destroyed one third of the town of Slave Lake, Alberta, in 2011, and it is fortunate that the roads were broad and straight enough to allow people to escape the fires that raged into Fort McMurray in summer 2016.

There is no remedy for a firestorm blown on gale-force winds. And wooden properties close to drought-ridden forests are at very high risk, whether in California, from South Lake Tahoe to Berkeley, or in Australia and New Zealand, from Canberra to Christchurch. Which is why urban fire needs to stay on the agenda of catastrophe risk management. A wind-driven conflagration can blow deep into any timber city, and insurers need to manage their exposure concentrations.

Exposure Data: The Undervalued Competitive Edge

High-quality catastrophe exposure data is key to a resilient and competitive insurance business. It can improve a wide range of risk management decisions, from basic geographical risk diversification to more advanced deterministic and probabilistic modeling.

The need to capture and use high-quality exposure data is not new to insurance veterans. It is often referred to as the “garbage-in, garbage-out” principle, highlighting the dependency of a catastrophe model’s output on reliable, high-quality exposure data.

The underlying logic of this principle is echoed in the EU’s Solvency II directive, which requires firms to have a quantitative understanding of the uncertainties in their catastrophe models, including a thorough understanding of the uncertainties propagated by the data that feeds the models.

The competitive advantage of better exposure data

The implementation of Solvency II will lead to a better understanding of risk, increasing the resilience and competitiveness of insurance companies.

Firms recognize this, and many insurers are no longer passively reacting to the changes brought about by Solvency II. Increasingly, they see the changes as an opportunity to proactively implement measures that improve exposure data quality and exposure data management.

And there is good reason for doing so: The majority of reinsurers polled recently by EY (formerly known as Ernst & Young) said quality of exposure data was their biggest concern. As a result, many reinsurers apply significant surcharges to cedants that are perceived to have low-quality exposure data and exposure management standards. Conversely, reinsurers are more likely to provide premium credits of 5 to 10 percent or offer additional capacity to cedants that submit high-quality exposure data.

Rating agencies and investors also expect more stringent exposure management processes and higher exposure data standards. Sound exposure data practices are, therefore, increasingly a priority for senior management, and changes are driven with the mindset of benefiting from the competitive advantage that high-quality exposure data offers.

However, managing the quality of exposure data over time can be a challenge: during its life cycle, exposure data degrades as it is repeatedly reformatted and re-entered while being passed between different insurance entities along the insurance chain.

To fight this decline in data quality, insurers spend considerable time and resources re-formatting and re-entering exposure data as it is passed along the insurance chain (and between departments within each individual touch point on the chain). However, because of the different systems, data standards, and contract definitions in place, much of this work remains manual and repetitive, inviting human error.

In this context, RMS’ new data standards, exposure management systems, and contract definition languages will be of interest to many insurers, not only because they will help tackle the data quality issue, but also because they bring considerable savings through reduced overhead expenditure, enabling clients to focus on their core insurance business.
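
To give a flavor of the checks this kind of tooling automates, here is a minimal sketch of an exposure data quality test in Python. It is an illustration only: the field names, accepted construction codes, and thresholds are hypothetical and do not represent any RMS data standard.

```python
# A minimal, hypothetical sketch of an automated exposure data quality check.
# Field names, accepted codes, and thresholds are illustrative only and do not
# reflect any particular data standard.

REQUIRED_FIELDS = ["location_id", "latitude", "longitude", "construction", "tiv"]
VALID_CONSTRUCTION = {"wood", "masonry", "steel", "concrete"}  # hypothetical code list

def check_location(record):
    """Return a list of data-quality issues found in a single exposure record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is not None and not -90 <= lat <= 90:
        issues.append("latitude out of range")
    if lon is not None and not -180 <= lon <= 180:
        issues.append("longitude out of range")
    if record.get("construction") not in VALID_CONSTRUCTION:
        issues.append("unrecognized construction code")
    if (record.get("tiv") or 0) <= 0:
        issues.append("non-positive total insured value")
    return issues

# Two hypothetical location records, one clean and one deliberately dirty.
portfolio = [
    {"location_id": "L1", "latitude": 51.5, "longitude": -0.1,
     "construction": "masonry", "tiv": 2_500_000},
    {"location_id": "L2", "latitude": 95.0, "longitude": -0.1,
     "construction": "thatch", "tiv": 0},
]

for rec in portfolio:
    problems = check_location(rec)
    if problems:
        print(rec["location_id"], "->", ", ".join(problems))
```

In practice, rules like these live in shared data standards and exposure management systems rather than in ad-hoc scripts passed between departments, which is precisely the manual overhead such standards aim to remove.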

What is Catastrophe Modeling?

Anyone who works in a field as esoteric as catastrophe risk management knows the feeling of being at a cocktail party and having to explain what you do.

So what is catastrophe modeling anyway?

Catastrophe modeling allows insurers and reinsurers, financial institutions, corporations, and public agencies to evaluate and manage catastrophe risk from perils ranging from earthquakes and hurricanes to terrorism and pandemics.

Just because an event hasn’t occurred in the past doesn’t mean it can’t or won’t. A combination of science, technology, engineering knowledge, and statistical data is used to simulate the impacts of natural and manmade perils in terms of damage and loss. Through catastrophe modeling, RMS uses computing power to fill the gaps left in historical experience.

Models operate in two ways: probabilistically, to estimate the range of potential catastrophes and their corresponding losses, and deterministically, to estimate the losses from a single hypothetical or historical catastrophe.
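
As a rough illustration of that distinction, the sketch below queries a made-up event loss table both ways: deterministically, by pricing a single scenario, and probabilistically, by weighting every event in the stochastic set by its annual rate. The event IDs, rates, and losses are invented for illustration.

```python
# Hypothetical event loss table: each stochastic event has an annual rate of
# occurrence and a modeled loss. All numbers are invented for illustration.
event_loss_table = [
    {"event": "EQ-0001", "annual_rate": 0.002, "loss": 120_000_000},
    {"event": "EQ-0002", "annual_rate": 0.010, "loss": 15_000_000},
    {"event": "HU-0001", "annual_rate": 0.050, "loss": 4_000_000},
]

# Deterministic: the loss from one hypothetical or historical scenario.
scenario_loss = next(e["loss"] for e in event_loss_table if e["event"] == "EQ-0001")

# Probabilistic: the whole event set, weighted by how often each event occurs,
# yields an expected annual loss (and, with more work, a full loss distribution).
expected_annual_loss = sum(e["annual_rate"] * e["loss"] for e in event_loss_table)

print(f"Scenario loss:        {scenario_loss:>15,.0f}")
print(f"Expected annual loss: {expected_annual_loss:>15,.0f}")
```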

Catastrophe Modeling: Four Modules

The basic framework for a catastrophe model consists of four components:

  • The Event Module incorporates data to generate thousands of stochastic, or representative, catastrophic events. Each kind of catastrophe has a method for calculating potential damages, taking into account history, geography, geology, and, in cases such as terrorism, psychology.
  • The Hazard Module determines the level of physical hazard the simulated events would cause to a specific geographical area-at-risk, which affects the strength of the damage.
  • The Vulnerability Module assesses the degree to which structures, their contents, and other insured properties are likely to be damaged by the hazard. Because of the inherent uncertainty in how buildings respond to hazards, damage is described as an average. The vulnerability module offers unique damage curves for different areas, accounting for local architectural styles and building codes.
  • The Financial Module translates the expected physical damage into monetary loss; it takes the damage to a building and its contents and estimates who is responsible for paying. The results of that determination are then interpreted by the model user and applied to business decisions.
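
Strung together, the four modules form a pipeline from event to monetary loss. The sketch below is a deliberately simplified, hypothetical version of that pipeline: the distance-decay hazard, the saturating damage curve, and the policy terms are placeholders standing in for the far richer models described above.

```python
# A deliberately simplified, hypothetical sketch of the four-module pipeline.
# The formulas, parameters, and policy terms are placeholders, not a real model.
import math

def hazard_at_site(event, site):
    """Hazard module: a crude intensity that decays with distance from the event."""
    # Roughly convert degrees to kilometres (~111 km per degree; crude but sufficient here).
    distance_km = 111.0 * math.dist((event["lat"], event["lon"]), (site["lat"], site["lon"]))
    return event["magnitude"] / (1.0 + distance_km / 50.0)

def damage_ratio(intensity, curve):
    """Vulnerability module: mean damage ratio for a given hazard intensity."""
    # A simple saturating curve; real models use peril- and building-specific curves.
    return min(1.0, curve["scale"] * intensity ** curve["power"])

def insured_loss(ground_up, deductible, limit):
    """Financial module: apply a deductible and limit to the ground-up loss."""
    return max(0.0, min(ground_up - deductible, limit))

# Event module output: one stochastic event (a real event set holds thousands).
event = {"id": "EQ-0001", "magnitude": 7.0, "lat": 37.8, "lon": -122.3}

# One exposed site with its (hypothetical) vulnerability curve and policy terms.
site = {"lat": 37.9, "lon": -122.1, "value": 1_000_000,
        "curve": {"scale": 0.02, "power": 2.0},
        "deductible": 50_000, "limit": 750_000}

intensity = hazard_at_site(event, site)
ground_up = damage_ratio(intensity, site["curve"]) * site["value"]
loss = insured_loss(ground_up, site["deductible"], site["limit"])
print(f"Intensity {intensity:.2f} -> ground-up loss {ground_up:,.0f} -> insured loss {loss:,.0f}")
```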

Analyzing the Data

Loss data, the output of the models, can then be queried to arrive at a wide variety of metrics, including:

  • Exceedance Probability (EP): EP is the probability that a loss will exceed a certain amount in a year. It is displayed as a curve, to illustrate the probability of exceeding a range of losses, with the losses (often in millions) running along the X-axis, and the exceedance probability running along the Y-axis.
  • Return Period Loss: Return periods provide another way to express exceedance probability. Rather than describing the probability of exceeding a given amount in a single year, return periods describe how many years might pass between times when such an amount might be exceeded. For example, a 0.4% probability of exceeding a loss amount in a year corresponds to a probability of exceeding that loss once every 250 years, or “a 250-year return period loss.”
  • Annual Average Loss (AAL): AAL is the average loss of all modeled events, weighted by their probability of annual occurrence. In an EP curve, AAL corresponds to the area underneath the curve, that is, the loss expected in an average year, so the AALs implied by two EP curves can be compared visually. AAL is additive, so it can be calculated based on a single damage curve, a group of damage curves, or the entire event set for a sub-peril or peril. It also provides a useful, normalized metric for comparing the risks of two or more perils, even though peril hazards are quantified using different metrics.
  • Coefficient of Variation (CV): The CV measures the degree of variation in each set of damage outcomes estimated in the vulnerability module. This matters because damage estimates with high variation, and therefore a high CV, are more volatile than estimates with a low CV. A property is more likely to “behave” unexpectedly in the face of a given peril if its characteristics were modeled with high-volatility data rather than with a data set showing more predictable variation. Mathematically, the CV is the ratio of the standard deviation of the losses (the “breadth” of variation in a set of possible damage outcomes) to their mean (or average).
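
To make these metrics concrete, here is a minimal sketch that computes each of them from a hypothetical year loss table (one simulated total loss per year). The 10,000-year simulation length and the loss distribution are invented purely for illustration, and the CV is applied here to the simulated annual losses rather than to a single vulnerability-module damage distribution.

```python
# A minimal sketch: computing EP, return period loss, AAL, and CV from a
# hypothetical year loss table (one simulated total annual loss per year).
import random
import statistics

random.seed(42)
N_YEARS = 10_000  # hypothetical number of simulated years

# Invent a year loss table: small attritional losses plus rare, severe events.
year_losses = []
for _ in range(N_YEARS):
    loss = random.expovariate(1 / 5_000_000)
    if random.random() < 0.01:
        loss += random.expovariate(1 / 200_000_000)
    year_losses.append(loss)

sorted_losses = sorted(year_losses, reverse=True)

def exceedance_probability(threshold):
    """EP: fraction of simulated years whose loss exceeds the threshold."""
    return sum(1 for x in year_losses if x > threshold) / N_YEARS

def return_period_loss(return_period_years):
    """Loss exceeded on average once every `return_period_years` years."""
    rank = int(N_YEARS / return_period_years)  # the 250-year loss is the 40th largest of 10,000
    return sorted_losses[rank - 1]

aal = statistics.mean(year_losses)        # Annual Average Loss
# CV applied to the simulated annual losses purely for illustration; in the models
# it is reported for each damage distribution in the vulnerability module.
cv = statistics.stdev(year_losses) / aal  # Coefficient of Variation = std dev / mean

print(f"P(loss > 100m): {exceedance_probability(100_000_000):.2%}")
print(f"250-year loss:  {return_period_loss(250):,.0f}")
print(f"AAL:            {aal:,.0f}")
print(f"CV:             {cv:.2f}")
```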

Catastrophe modeling is just one important component of a risk management strategy. Analysts use a blend of information to get the most complete picture possible so that insurance companies can determine how much loss they could sustain over a period of time, how to price products to balance market needs and potential costs, and how much risk they should transfer to reinsurance companies.

Catastrophe modeling allows the world to predict and mitigate damage resulting from catastrophic events. As models improve, so, hopefully, will our ability to face these catastrophes and minimize their negative effects in an efficient and less costly way.