Tag Archives: (re)insurance

Reflections from Rendezvous: Innovation to Drive Growth in the Global (Re)Insurance Industry

Each year, the (re)insurance industry meets at the Rendezvous in Monte Carlo to discuss pressing issues facing the market. This year, my colleagues and I had lively discussions about the future of our industry, explored what’s top of mind for our clients and partners, and shared our own perspectives.


Source: The Fairmont Monte Carlo

Over the course of the week, a number of themes emerged.

The industry is at an inflection point, poised for growth

The (re)insurance industry is at an inflection point. While the existing market remains soft, there was a growing recognition at the Rendezvous that the real issue is innovation for growth. We heard time and again that too much of the world’s risk is uninsured, and that (re)insurers need strategies to expand coverage of catastrophic events, not only in the developing world but also in established markets such as the U.S. and Europe.

Flood risk was a particular focus of discussion at the event. Against the backdrop of a changing climate and a growing concentration of exposures, flood losses nearly doubled in the decade from 2000 to 2009 compared with the decade prior. With better data and models, (re)insurers are growing confident they can underwrite, structure, and manage flood risks and provide solutions to meet growing global demand.

In many conversations we shared our thesis that the world’s exposures are evolving from assets@risk to systems@risk. Economic growth and activity are vulnerable to disruption in systems, and innovation, supported by models, data, and analytics, is needed to provide new forms of coverage. Take cyber, for example. Insurers see significant opportunities for new forms of cyber risk coverage, but there are fundamental gaps in the industry’s understanding of the risk. When the market is better able to understand cyber risks and to model and manage accumulations, cyber could really take off.

Alternative capital is no longer alternative

Amidst a general sense of stability—in part due to greater acceptance of the “new normal” after falling prices and a number of mergers and acquisitions, and in part due to a very benign period for catastrophe losses—there is a shifting dynamic between insurance-linked securities (ILS) and reinsurance. Alternative capital is now mainstream. In fact, one equity analyst called the use of third-party capital a “fiduciary duty.”

Risk is opportunity

I was motivated by how many industry leaders see their market as primed for innovation-driven growth. This is not to overlook present day challenges, but to recognize that the industry can combine capital and know-how, increasingly informed by data analytics, to develop new solutions to expand coverage to an increasingly risky and interconnected world. As I recently wrote, risk is opportunity.

Exposure Data: The Undervalued Competitive Edge

High-quality catastrophe exposure data is key to an insurer’s resilience and competitiveness. It can improve a wide range of risk management decisions, from basic geographical risk diversification to more advanced deterministic and probabilistic modeling.

The need to capture and use high-quality exposure data is not new to insurance veterans. It is often referred to as the “garbage-in-garbage-out” principle, highlighting the dependency of a catastrophe model’s output on reliable, high-quality exposure data.

The underlying logic of this principle is echoed in the EU’s Solvency II directive, which requires firms to have a quantitative understanding of the uncertainties in their catastrophe models, including a thorough understanding of the uncertainties propagated by the data that feeds the models.

The competitive advantage of better exposure data

The implementation of Solvency II will lead to a better understanding of risk, increasing the resilience and competitiveness of insurance companies.

Firms see this, and insurers are no longer passively reacting to the changes brought about by Solvency II. Increasingly, they see the changes as an opportunity to proactively implement measures that improve exposure data quality and exposure data management.

And there is good reason for doing so: The majority of reinsurers polled recently by EY (formerly known as Ernst & Young) said quality of exposure data was their biggest concern. As a result, many reinsurers apply significant surcharges to cedants that are perceived to have low-quality exposure data and exposure management standards. Conversely, reinsurers are more likely to provide premium credits of 5 to 10 percent or offer additional capacity to cedants that submit high-quality exposure data.

Rating agencies and investors also expect more stringent exposure management processes and higher exposure data standards. Sound exposure data practices are, therefore, increasingly a priority for senior management, and changes are driven with the mindset of benefiting from the competitive advantage that high-quality exposure data offers.

However, managing the quality of exposure data over time can be a challenge: during its life cycle, exposure data degrades as it is repeatedly reformatted and re-entered while being passed between the different entities along the insurance chain.

To fight this decline in data quality, insurers spend considerable time and resources re-formatting and re-entering exposure data as it is passed along the insurance chain (and between departments within each touch point on the chain). However, because of the different systems, data standards, and contract definitions in place, much of this work remains manual and repetitive, inviting human error.
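As a purely hypothetical illustration of the kind of automated check that can catch such issues before exposure data reaches a model, the sketch below validates a couple of invented exposure records against a made-up required-field list and construction code list; none of the field names or rules correspond to any particular data standard.

```python
# Hypothetical exposure records as they might arrive from a cedant; field names are invented.
records = [
    {"location_id": "A1", "latitude": 10.78, "longitude": 106.70,
     "construction": "reinforced_concrete", "occupancy": "industrial", "sum_insured": 5_000_000},
    {"location_id": "A2", "latitude": None, "longitude": None,        # missing geocode
     "construction": "unknown", "occupancy": "commercial", "sum_insured": 2_500_000},
]

REQUIRED_FIELDS = ("latitude", "longitude", "construction", "occupancy", "sum_insured")
KNOWN_CONSTRUCTION = {"reinforced_concrete", "steel_frame", "unreinforced_masonry", "timber"}

def quality_report(records):
    """Flag incomplete or unrecognized records and summarize overall completeness."""
    issues = {}
    for rec in records:
        problems = [f for f in REQUIRED_FIELDS if rec.get(f) in (None, "", "unknown")]
        if rec.get("construction") not in KNOWN_CONSTRUCTION:
            problems.append("construction not in accepted code list")
        if problems:
            issues[rec["location_id"]] = problems
    completeness = 1 - len(issues) / len(records)
    return completeness, issues

score, issues = quality_report(records)
print(f"clean records: {score:.0%}")
for loc, problems in issues.items():
    print(f"  {loc}: {', '.join(problems)}")
```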

In this context, RMS’ new data standards, exposure management systems, and contract definition languages will be of interest to many insurers, not only because they will help tackle the data quality issue, but also because they can bring considerable savings through reduced overhead expenditure, enabling clients to focus on their core insurance business.

Opportunities and Challenges ahead for Vietnam: Lessons Learned from Thailand

Earlier this month I gave a presentation at the 13th Asia Insurance Review conference in Ho Chi Minh City, Vietnam. It was a very worthwhile event that gave good insights into this young insurance market, and it was great to be in Ho Chi Minh City—a place that immediately captured me with its charm.


Bangkok, Thailand during the 2011 floods. Photo by Petty Officer 1st Class Jennifer Villalovos.

Vietnam shares striking similarities with Thailand, both from a peril and an exposure perspective. And, for Vietnam to become more resilient, it could make sense to learn from Thailand’s recent natural catastrophe (NatCat) experiences and understand why some of those events were particularly painful in the absence of good exposure data.

NatCat and Exposure similarities between Thailand and Vietnam 

  • Flood profile: Vietnam shows a similar flood profile to Thailand, with significant flooding every year. Vietnam’s Mekong Delta, responsible for half of the country’s rice production, is especially susceptible to flooding.
  • Coastline: Both coastlines are similar in length[1] and are similarly exposed to storm surge and tsunami.[2]
  • Tsunami and tourism: Thailand and its tourism industry were severely affected by the 2004 Indian Ocean Tsunami. Vietnam’s coastline and its tourism hotspots (e.g., Da Nang) show similar exposure to tsunami, potentially originating from the Manila Arc.[2]
  • GDP growth: Thailand’s rapid GDP growth, and the accompanying exposure growth, in the decade prior to the 2011 floods caught many by surprise. Vietnam has been growing even faster in the last ten years,[3] and exposure data quality (completeness and accuracy) has not necessarily kept up with this development.
  • Industrialization and global supply chain relevance: Many underestimated the role Thailand played in the global supply chain; for example, in 2011 about a quarter of all hard disk drives were produced in Thailand. Vietnam is currently undergoing the same rapid industrialization. For example, Samsung opened yet another multi-billion-dollar industrial facility in Vietnam, propelling the country to the forefront of mobile phone production and increasing its significance to the global supply chain.

Implications for the Insurance Industry

In light of these similarities and the strong impact that global warming will have on Vietnam[4], regulators and (re)insurers are now facing several challenges and opportunities:

Modeling of perils and technical writing of business need to be at the forefront of every executive’s mind for any mid- to long-term business plan. While this is not something that can be implemented overnight, the first steps have been taken, and it’s just a matter of time to get there.

But to get there as quickly and efficiently as possible, another crucial step must be taken: improving exposure data quality in Vietnam. Better exposure insights in Thailand would almost certainly have led to a better understanding of exposure accumulations and could have made a significant difference after the floods, resulting in less financial and reputational damage to many (re)insurers.

As insurance veterans know, it’s not a question of if a large-scale NatCat event will happen in Vietnam, but a question of when. And while it’s not possible to fully eliminate the element of surprise in NatCat events, the severity of these surprises can be reduced by having better exposure data and exposure management in place.

This is where the real opportunity and challenge lies for Vietnam: getting better exposure insights to be able to mitigate risks. Ultimately, any (re)insurer wants to be in a confident position when someone poses this question: “Do you understand your exposures in Vietnam?”

RMS recognizes the importance of improving the quality and management of exposure data: Over the past twelve months, RMS has released exposure data sets for Vietnam and many other territories in the Asia-Pacific. To find out more about the RMS® Asia Exposure data sets, please e-mail asia-exposure@rms.com.  

[1] Source: https://en.wikipedia.org/wiki/List_of_countries_by_length_of_coastline
[2] Please refer to the RMS® Global Tsunami Scenario Catalog and the RMS® report on Coastlines at Risk of Giant Earthquakes & Their Mega-Tsunami, 2015
[3] The World Bank: http://data.worldbank.org/country/vietnam, last accessed: 1 July 2015
[4] Vietnam ranks among the five countries to be most affected by global warming, World Bank Country Profile 2011: http://sdwebx.worldbank.org/climateportalb/doc/GFDRRCountryProfiles/wb_gfdrr_climate_change_country_profile_for_VNM.pdf

What is Catastrophe Modeling?

Anyone who works in a field as esoteric as catastrophe risk management knows the feeling of being at a cocktail party and having to explain what you do.

So what is catastrophe modeling anyway?

Catastrophe modeling allows insurers and reinsurers, financial institutions, corporations, and public agencies to evaluate and manage catastrophe risk from perils ranging from earthquakes and hurricanes to terrorism and pandemics.

Just because an event hasn’t occurred in the past doesn’t mean it can’t or won’t. A combination of science, technology, engineering knowledge, and statistical data is used to simulate the impacts of natural and manmade perils in terms of damage and loss. Through catastrophe modeling, RMS uses computing power to fill the gaps left in historical experience.

Models operate in two ways: probabilistically, to estimate the range of potential catastrophes and their corresponding losses, and deterministically, to estimate the losses from a single hypothetical or historical catastrophe.

Catastrophe Modeling: Four Modules

The basic framework for a catastrophe model consists of four components (a simplified sketch of how they chain together follows the list):

  • The Event Module incorporates data to generate thousands of stochastic, or representative, catastrophic events. Each kind of catastrophe has a method for calculating potential damages taking into account history, geography, geology, and, in cases such as terrorism, psychology.
  • The Hazard Module determines the level of physical hazard the simulated events would cause in a specific geographical area at risk, which drives the severity of the resulting damage.
  • The Vulnerability Module assesses the degree to which structures, their contents, and other insured properties are likely to be damaged by the hazard. Because of the inherent uncertainty in how buildings respond to hazards, damage is described as an average. The vulnerability module offers unique damage curves for different areas, accounting for local architectural styles and building codes.
  • The Financial Module translates the expected physical damage into monetary loss; it takes the damage to a building and its contents and estimates who is responsible for paying. The results of that determination are then interpreted by the model user and applied to business decisions.
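To make the four modules concrete, here is a deliberately simplified sketch of how a single event might flow through them. Every function, hazard measure, damage curve, and policy term below is an invented placeholder for illustration only, not the implementation of any commercial model.

```python
from dataclasses import dataclass

@dataclass
class Event:
    peril: str
    magnitude: float
    location: tuple  # (latitude, longitude)

def hazard_module(event: Event, site: tuple) -> float:
    """Translate the event into a local hazard intensity at the site
    (e.g. ground-shaking intensity). Placeholder decay with distance."""
    distance_km = abs(event.location[0] - site[0]) * 111  # crude, illustrative
    return max(event.magnitude - 0.05 * distance_km, 0.0)

def vulnerability_module(intensity: float, construction: str) -> float:
    """Return a mean damage ratio (0-1) for the building, given local intensity.
    The curves below are made-up illustrations of construction-dependent vulnerability."""
    fragility = {"reinforced_concrete": 0.004, "unreinforced_masonry": 0.012}[construction]
    return min(fragility * intensity ** 2, 1.0)

def financial_module(damage_ratio: float, building_value: float,
                     deductible: float, limit: float) -> float:
    """Convert physical damage into the insurer's monetary loss,
    applying a simple deductible and limit."""
    ground_up_loss = damage_ratio * building_value
    return min(max(ground_up_loss - deductible, 0.0), limit)

# The event module's output would normally be thousands of stochastic events;
# here we take a single hypothetical event and a single insured site.
event = Event(peril="earthquake", magnitude=7.0, location=(28.2, 84.7))
site, construction, value = (28.6, 85.0), "unreinforced_masonry", 2_000_000

intensity = hazard_module(event, site)
damage = vulnerability_module(intensity, construction)
loss = financial_module(damage, value, deductible=50_000, limit=1_500_000)
print(f"intensity={intensity:.1f}, damage ratio={damage:.2%}, insured loss={loss:,.0f}")
```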

Analyzing the Data

Loss data, the output of the models, can then be queried to arrive at a wide variety of metrics, including:

  • Exceedance Probability (EP): EP is the probability that a loss will exceed a certain amount in a year. It is displayed as a curve, to illustrate the probability of exceeding a range of losses, with the losses (often in millions) running along the X-axis, and the exceedance probability running along the Y-axis.
  • Return Period Loss: Return periods provide another way to express exceedance probability. Rather than describing the probability of exceeding a given amount in a single year, return periods describe how many years might pass, on average, between occasions when such an amount is exceeded. For example, a 0.4% probability of exceeding a loss amount in a year corresponds to exceeding that loss on average once every 250 years, or “a 250-year return period loss.”
  • Annual Average Loss (AAL): AAL is the average loss of all modeled events, weighted by their probability of annual occurrence. In an EP curve, AAL corresponds to the area underneath the curve, so the AALs of two EP curves can be compared visually. AAL is additive, so it can be calculated based on a single damage curve, a group of damage curves, or the entire event set for a sub-peril or peril. It also provides a useful, normalized metric for comparing the risks of two or more perils, even though their hazards are quantified using different metrics.
  • Coefficient of Variation (CV): The CV measures the degree of variation in each set of damage outcomes estimated in the vulnerability module. This matters because damage estimates with high variation, and therefore a high CV, are more volatile than estimates with a low CV: a property whose characteristics were modeled with highly volatile data is more likely to “behave” unexpectedly in the face of a given peril than one modeled with more predictable data. Mathematically, the CV is the ratio of the standard deviation of the losses (the “breadth” of variation in a set of possible damage outcomes) to their mean (or average). A minimal computational sketch of these metrics follows this list.
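The sketch below shows, under illustrative assumptions, how these metrics could be computed from a hypothetical set of simulated annual losses; the loss distribution and sample size are made up, and real models derive these quantities from their full event sets.

```python
import numpy as np

# Hypothetical output of a catastrophe model: one aggregate loss per simulated year.
rng = np.random.default_rng(42)
annual_losses = rng.lognormal(mean=15, sigma=1.5, size=50_000)  # illustrative only

# Annual Average Loss (AAL): the mean loss across all simulated years.
aal = annual_losses.mean()

# Coefficient of Variation (CV): standard deviation of losses over their mean.
cv = annual_losses.std() / aal

# Exceedance Probability (EP) curve: sort losses from largest to smallest;
# the i-th largest loss is exceeded with probability roughly i / n.
sorted_losses = np.sort(annual_losses)[::-1]
exceedance_prob = np.arange(1, len(sorted_losses) + 1) / len(sorted_losses)

# Return period loss: e.g. the 250-year loss is the smallest loss whose
# exceedance probability is at most 1/250 (0.4% per year).
def return_period_loss(return_period_years):
    target = 1.0 / return_period_years
    idx = np.searchsorted(exceedance_prob, target, side="right") - 1
    return sorted_losses[max(idx, 0)]

print(f"AAL: {aal:,.0f}  CV: {cv:.2f}  250-yr loss: {return_period_loss(250):,.0f}")
```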

Catastrophe modeling is just one important component of a risk management strategy. Analysts use a blend of information to get the most complete picture possible so that insurance companies can determine how much loss they could sustain over a period of time, how to price products to balance market needs and potential costs, and how much risk they should transfer to reinsurance companies.

Catastrophe modeling allows the world to predict and mitigate damage resulting from catastrophic events. As models improve, so hopefully will our ability to face these catastrophes and to minimize their negative effects efficiently and at lower cost.

An Industry Call to Action: It’s Time for India’s Insurance Community To Embrace Earthquake Modeling

The devastating Nepal earthquake on April 25, 2015 is a somber reminder that other parts of this region are highly vulnerable to earthquakes.

India, in particular, stands to lose much in the event of an earthquake or other natural disaster: the economy is thriving; most of its buildings aren’t equipped to withstand an earthquake; the region is seismically active; and the country is home to 1.2 billion people—a sizeable share of the world’s population.

In contrast to other seismically active countries such as the United States, Chile, Japan and Mexico, there are few (re)insurers in India using earthquake models to manage their risk, possibly due to the country’s nascent non-life insurance industry.

Let’s hope that the Nepal earthquake will prompt India’s insurance community to embrace catastrophe modeling to help understand, evaluate, and manage its own earthquake risk. Consider just a few of the following facts:

  • Exposure Growth: By 2016, India is projected to be the world’s fastest growing economy. In the past decade, the country has experienced tremendous urban expansion and rapid development, particularly in mega-cities like Mumbai and Delhi.
  • Buildings are at Risk: Most buildings in India are old and aren’t seismically reinforced. These buildings aren’t expected to withstand the next major earthquake. While many newer buildings have been built to higher seismic design standards, they are still expected to sustain damage in a large event.
  • Non-Life Insurance Penetration Is Low but Growing: India’s non-life insurance penetration is under one percent but it’s slowly increasing—making it important for (re)insurers to understand the earthquake hazard landscape.

Delhi and Mumbai – Two Vulnerable Cities

India’s two mega cities, Delhi and Mumbai, have enjoyed strong economic activity in recent years, helping to quadruple the country’s GDP between 2001 and 2013.

Both cities are located in moderate to high seismic zones, and have dense commercial centers with very high concentrations of industrial and commercial properties, including a mix of old and new buildings built to varying building standards.

According to AXCO, an insurance information services company, 95 percent of industrial and commercial property policies in India carry earthquake cover. This means that (re)insurers need to have a good understanding of the exposure vulnerability to effectively manage their earthquake portfolio aggregations and write profitable business, particularly in high hazard zones.

For (re)insurers to effectively manage the risk in their portfolio, they require an understanding of how damage can vary depending on the type of construction. One way to do this is by using earthquake models, which take into account the different quality and types of building stock, enabling companies to understand the potential uncertainty associated with varying construction types.

A Picture of India’s Earthquake Risk

India sits in a seismically active region and is prone to some of the world’s most damaging continental earthquakes.

The country is tectonically diverse and broadly characterized by two distinct seismic hazard regions: high hazard along the Himalayan belt as well as along Gujarat near the Pakistan border (inter-plate seismicity), and low-to-moderate hazard in the remaining 70 percent of India’s land area, known as the Stable Continental Region.

The M7.8 Nepal earthquake occurred on the Himalayan belt, where most of India’s earthquakes occur, including four great earthquakes (M > 8). However, since exposure concentrations and insurance penetration in these areas are low, the impact to the insurance industry has so far been negligible.

In contrast, further south on the peninsula, where highly populated cities are located, there have been several low-magnitude earthquakes that caused extensive damage and significant casualties, such as the Koyna (1967), Latur (1993), and Jabalpur (1997) earthquakes.

It is these types of damaging events that will be of significance to (re)insurers, particularly as insurance penetration increases. Earthquake models can help (re)insurers to quantify the impacts of potential events on their portfolios.

Using Catastrophe Models to Manage Earthquake Risk

There are many tools available to India’s insurance community to manage and mitigate earthquake risk.

Catastrophe models are one example.

Our fully probabilistic India Earthquake Model includes 14 historical events, such as the 2001 Gujarat and 2005 Kashmir earthquakes, and a stochastic event set of more than 40,000 earthquake scenarios that have the potential to impact India, providing a comprehensive view of earthquake risk in India.

Since its release in 2006, (re)insurers in India and around the world have been using the RMS model output to manage their earthquake portfolio aggregations, optimizing their underwriting and capital management processes. We also help companies without the infrastructure to use fully probabilistic models to reap the benefits of the model through our consulting services.

What are some of the challenges to embracing modeling in parts of the world like India and Nepal? Feel free to ask questions or comment below. 

Exceedance 2015: In the Books

It’s been quite a week here in Miami – full of palm trees, ocean views…and catastrophe risk management.

Throughout the week, our keynote speakers discussed hot topics in science, catastrophe modeling, and risk management:

  • We kicked off the week with keynotes from Hemant Shah, Paul Wilson, Ben Brookes, and Daniel Stander discussing RMS’ vision for the future and how catastrophe modeling can enable innovation and growth within the (re)insurance industry and beyond.
  • Patricia Grossi shed light on earthquake risk in Latin America, and there were more than a few misty eyes as Laurence Golborne regaled us with tales of risk management from his time as minister of mines and energy in Chile, where he led the rescue of the “Los 33” miners trapped underground for more than two months.
  • Rick Knabb, director of the National Hurricane Center, explained why awareness is central to the mission of the NHC; educating the public about the need to prepare increases the ability to recover.
  • Robert Muir-Wood explained that the biggest concentrations of risk and gradients of risk are coastal, necessitating state-of-the-art modeling of storm surge, tsunami, and liquefaction in order to mitigate this risk.


Hemant and other members of the RMS leadership team answered questions on-stage during an “Ask Us Anything” session. Here are a few highlights:

  • What’s your vision beyond 2020?
    • Eric Yau: We want to create an open platform that unlocks innovation potential for our clients and partners.
    • Matthew Grant: Our goal is to allow clients to underwrite business that isn’t possible today. We will work together to grow the broader (re)insurance market.
  • What can I do to help Nepal? 
    • Paul VanderMarck: We work with Build Change, an organization aligned with our mission of mitigating risk. We recommend them as an organization and are matching employee contributions. Build Change is starting a program in Nepal using the same playbook that has already been successful in areas such as Haiti and Japan.
  • Suppose you were to start from scratch today – would you do anything differently? 
    • Mohsen Rahnama: When we started, we didn’t have any of the tools we have today. We take advantage of and implement technology to approach problems in a systematic way. Technology allows us to build better models.

In addition, we were thankful to have many of our clients and partners not just attend, but present at Exceedance. BMS, JLT Re, Munich Re, Aon Benfield, Risk Frontiers, Holborn Corp, ARA, Willis Re, Guy Carpenter, TigerRisk, SCOR, and Price Forbes all presented during the “Alternative Views of the Market” track, which provided insight from across the industry.

  • Munich Re showed impactful videos of homes under 100 mph winds, emphasizing the difference in performance of structures built to various standards.
  • Willis Re advocated for deterministic modeling and developing alternative views of risk by considering different sizes of events and “what if” analyses.
  • Guy Carpenter explained how to define critical events by aligning the level of loss to specific outcomes such as lost earnings, ratings watches, and ratings downgrades.

And finally, we salsa-ed the night away to the sweet tunes of two-time Grammy-nominated Latin band Palo during EP, the Exceedance Party, at LIV nightclub.

I hope you enjoyed the week and found it insightful and thought-provoking. We hope to see you all back at the Fontainebleau Miami Beach Hotel next year, where Exceedance 2016 will take place from May 16 to 19.

Risk, Models, and Innovations: It’s All Interconnected

A few themes came through loud and clear during this morning’s keynote sessions at Exceedance 2015.

RMS’ commitment to modeling innovation was unmistakable. As RMS co-founder and CEO Hemant Shah highlighted on stage, RMS worked hard and met our commitment to release RiskLink version 15 on March 31, taking extra measures to ensure the quality of the product.

Over the past five years, RMS has released 210 model upgrades and 35 new models. With a 30% increase in model development resources over the last two years and 10 HD models in various stages of research and development, RMS has the most robust model pipeline in its history.

As Paul Wilson explained, HD models are all about providing better clarity into the risk. They are a more precise representation of the way physical damage results in a (re)insurance loss, with a more precise treatment of how uncertainty propagates through the model, and they are designed to handle losses as closely as possible to the way claims occur in real life.

HD models are the cornerstone of the work RMS is doing in model development right now. HD models represent the intersection of RMS research, science and technology. With HD models we are not limited by software – we can approach the challenge of modeling risk in exciting new ways.

And it’s more than just the models – RMS is committed to transparency, engagement, and collaboration.

RMS’ commitment to RMS(one) was also clear. Learning from the lessons of the past year, RMS is developing an open platform that’s not just about enabling RMS to build its own models. It’s an exposure and risk management platform that’s about enabling clients and partners to build models. It’s about analytics, dynamic risk management, and more.

RMS(one) will be released, judiciously and fully matured, in stages over the next 15 months, starting with a model evaluation environment for our first HD Model, Europe Flood, in autumn 2015.

And, Hemant emphasized that starting later this calendar year, RMS will open the platform to its clients and partners with the Platform Development Kit (PDK).

In addition, RMS(one) pricing will be built around three core principles:

  • Simple, predictable packages
  • In most cases, no additional fees for clients who simply want continuity in their RMS modeling relationships
  • Clearly differentiated, high-value packages at compelling prices for those who wish to benefit from RMS(one) beyond its role as a superior modeling utility replacing RiskLink

The overall goal of RMS’ commitment to modeling and technology innovation is to capitalize on a growing and ever-changing global (re)insurance market, ultimately building a more resilient global society. RMS is working with industry clients and partners to do so by understanding emerging risks, identifying new opportunities to insure more risk, developing new risk transfer products, and creating new ways of measuring risk.

As Ben Brookes said, we only have to look at the recent events in Nepal to understand that there are huge opportunities – and needs – to improve resilience and the management of risk. RMS’ work on MetroCat, a catastrophe bond designed specifically to protect the New York MTA’s infrastructure against storm surge, showed the huge potential of developing alternative methods of risk transfer to improve resilience.

And during his session, Daniel Stander pointed out that only 1.9% of the global economy is insured. As the world’s means of production shifts from assets to systems, RMS is working to understand systems of risk, starting with marine, supply chain, and cyber risk, tackling tough questions such as:

  • What are the choke points in the global shipping network, and how do they respond under stress?
  • How do various events create ripple effects that impact the global supply chain – for example, why did the Tohoku earthquake and tsunami in Japan cause a shortage of iPads in Australia, halt production at BMW in Germany, and enable a booming manufacturing industry in Guangzhou?
  • How do we measure cyber risk when technology has become so critical that it is systemically important to the global economy?


Leaving the keynotes, a clear theme rang true: as the world becomes more interconnected, it is the intersection of innovation in science and technology that will enable us to scale and solve global problems head on.

The 2015 U.K. Budget and Terrorism Insurance

On 18 March, the Chancellor of the Exchequer, George Osborne, delivered his pre-election budget. Billions of further public spending cuts are needed. Several weeks earlier, Pool Re, the U.K. terrorism insurance pool, announced its first ever purchase of reinsurance in the commercial market.

These two announcements are not unconnected.

Pool Re was set up in 1993, after the IRA bombing of the Baltic Exchange in 1992. Since the pool was established, it has built up quite a substantial surplus; claims have been low thanks to the vigilance of the security and intelligence services. Almost all the major plots since the September 11, 2001 attack have been foiled.

Terrorism insurance is effectively insurance against counter-terrorism failure, and the huge sums spent on blanket indiscriminate surveillance have helped to minimize terrorism insurance losses. The low level of losses is not coincidental, or due to some unpredictable whim of terrorist behavior but readily explainable; too many terrorists spoil the plot. The type of plots capable of causing terrorism insurance losses of a billion pounds or more would involve a sizeable number of operatives.

As the NSA whistleblower Edward Snowden has revealed, the level of surveillance of electronic communications is so intensive that sizeable terrorist social networks end up being tracked by the NSA and GCHQ. Lesser plots, involving lone wolves or just a few operatives, are the most likely to succeed. A string of these has struck the western alliance over the past months in Ottawa, Sydney, Paris, and Copenhagen. Besides causing terror, these attacks have attracted global media publicity, inspiring Jihadi recruitment. But terrorism insurance covers property loss, not the spread of fear or growth in the ranks of Islamic State.

Given the tough security environment that has developed, it is unsurprising that the U.K. Government should be questioning its continuing exposure to terrorism insurance risk. This is an age of austerity. Pool Re’s three-year program provides £1.8bn of reinsurance cover, thereby diminishing this exposure. More cover might have been purchased, but this was the market limit, given that Chemical-Biological-Radiological-Nuclear (CBRN) risks were included.

The idea of separating out extreme CBRN terrorism risks was considered in Washington by the House Financial Services Committee in the discussions last year over the renewal of the Terrorism Risk Insurance Act. The objective was to provide a federal safety net for such extreme risks, whilst encouraging further private sector solutions for conventional terrorist attacks. This idea was considered at some length, but was a step too far for this TRIA renewal. It might be a step for Pool Re.

The modus operandi of the IRA was to avoid killing civilians, which would have alienated its Catholic community support. Bomb warnings, genuine and hoax, were often given. Thus the typical impact of IRA attacks was physical destruction and economic loss. Islamist militants of all persuasions have no such qualms about killing civilians. Indeed, gruesome killings are celebrated. Terrorists follow the path of least resistance in their actions. For Islamic State, this is the path of brutal murder rather than property damage.

The Challenges Around Modeling European Windstorm Clustering for the (Re)insurance Industry

In December I wrote about Lothar and Daria, windstorms that emphasized the significance of ‘location’ when assessing windstorm risk. This month marks the 25th anniversary of the most damaging cluster of European windstorms on record—Daria, Herta, Wiebke, and Vivian.

This cluster of storms highlighted the need to better understand the potential impact of clustering for the insurance industry.

At the time of the events, the industry was poorly prepared to deal with the cluster of four extreme windstorms that struck in rapid succession over a very short timeframe. However, we have not seen clustering of such significance since, so how important is this phenomenon really over the long term?

In recent years there has been plenty of discourse over what makes a cluster of storms significant, how clustering should be defined, and how it should be modeled.

Today the industry accepts the need to consider the impact of clustering on the risk, and assess its importance when making decisions on underwriting and capital management. However, identifying and modeling a simple process to describe cyclone clustering is still proving to be a challenge for the modeling community due to the complexity and variety of mechanisms that govern fronts and cyclones.

What is a cluster of storms?

Broadly, a cluster can be defined as a group of cyclones that occur close in time.

But the insurance industry is mostly concerned with the severity of the storms. So how do we define a severe cluster? Are we talking about severe storms, such as those in 1990 and 1999, which had very extensive and strong wind footprints? Or storms like those in the winter 2013/2014 season, which were not extremely windy but instead very wet and generated flooding in the U.K.? There are actually multiple descriptions of storm clustering, in terms of storm severity or spatial hazard variability.

Without a clearly established precedence among these features, defining a unique modeled view of clustering has been complicated and brings uncertainty into the modeled results. This issue also exists in other aspects of wind catastrophe modeling, but in the case of clustering, the limited amount of calibration data available makes the problem particularly challenging.

Moreover, the frequency of storms is impacted by climate variability and as a result there are different valid assumptions that could be applied for modeling, depending on the activity time frame replicated in the model. For example, the 1980s and 1990s were more active than the most recent decade. A model that is calibrated against an active period will produce higher losses than one calibrated against a period of lower activity.
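To illustrate why these assumptions matter, the sketch below compares an independent (Poisson) view of annual storm counts with an overdispersed (negative binomial) view as a crude stand-in for clustering; the mean rate and dispersion are invented, and this is not how any particular vendor model represents clustering. With the same average frequency, the clustered view puts noticeably more probability on extreme seasons, which is what drives tail losses.

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 100_000
mean_storms = 4.0  # hypothetical average number of damaging storms per season

# Independent (non-clustered) view: Poisson annual counts.
poisson_counts = rng.poisson(mean_storms, n_years)

# Clustered view: negative binomial counts with the same mean but extra
# season-to-season variance (overdispersion), one simple way to mimic clustering.
dispersion = 2.0  # smaller => more overdispersed; purely illustrative
p = dispersion / (dispersion + mean_storms)
clustered_counts = rng.negative_binomial(dispersion, p, n_years)

# Both views share the same mean, but the clustered view assigns more probability
# to extreme seasons with many storms, the part of the distribution that drives tail losses.
for label, counts in [("Poisson", poisson_counts), ("Clustered", clustered_counts)]:
    print(f"{label:9s} mean={counts.mean():.2f}  P(>=8 storms)={np.mean(counts >= 8):.3f}")
```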

Due to the underlying uncertainty in the modeled impact, the industry should be cautious about relying solely on either a clustered or a non-clustered view of risk until future research has demonstrated that one view of clustering is superior to the others.

How does RMS help?

RMS offers clustering as an optional view that reflects well-defined and transparent assumptions. By having different modeled views of risk available to them, users can deepen their understanding of how clustering will impact a particular book of business and explore the impact of the uncertainty around this topic, helping them make more informed decisions.

This transparent approach to modeling is very important in the context of Solvency II and helping (re)insurers better understand their tail risk.

Right now there are still many unknowns surrounding clustering but ongoing investigation, both in academia and industry, will help modelers to better understand the clustering mechanisms and dynamics, and the impacts on model components to further reduce the prevalent uncertainty that surrounds windstorm hazard in Europe.


What to expect this 2014-2015 Europe Winter Windstorm Season

When it rains in Sulawesi, does it blow a gale in Surrey, some 12,000 miles away? While these occurrences may sound distinct and uncorrelated, the wet weather in Indonesia is likely to have played some role in the persistent stormy weather experienced across northern Europe last winter.

Weather events are clearly connected in different parts of the world. The events of last winter are discussed in RMS’ 2013-2014 Winter Storms in Europe report, which provides an in-depth analysis of the main 2013-2014 winter storm events and why it is difficult to predict European windstorm hazard due to many factors, including the influence of distant climate anomalies from across the globe.

Can we predict seasonal windstorm activity during the 2014-2015 Europe winter windstorm season?

As we enter the 2014-2015 Europe winter windstorm season, (re)insurers are wondering what to expect.

Many consider current weather forecasting tools beyond a week to be as useful as the unique “weather forecasting stone” that I came across on a recent vacation.

I am not so cynical; while weather forecasting models may have missed storms in the past and the outputs of long-range forecasts still contain uncertainty, they have progressed significantly in recent years.

In addition, our understanding of climatic drivers that strongly influence our weather, such as the North Atlantic Oscillation (NAO), El Niño Southern Oscillation (ENSO), and the Quasi-Biennial Oscillation (QBO) is constantly improving. As we learn more about these phenomena, forecasts will improve, as will our ability to identify trends and likely outcomes.

What can we expect this season?

The Indian Ocean Dipole is an oscillation in sea surface temperatures between the eastern and western Indian Ocean. It has trended positively since the beginning of the year to a neutral phase and is forecast to remain neutral into 2015. Indonesia is historically wet during a negative phase, so we are unlikely to observe the same pattern that was characteristic of winter 2013-2014.

Current forecasts indicate that we will observe a weak central El Niño this winter. Historically speaking this has led to colder winter temperatures over northern Europe, with a blocking system drawing cooler temperatures from the north and northeast.

The influence of ENSO on the jet stream is less well-defined but potentially indicates that storms will be steered along a more southerly track. Lastly, the QBO is currently in a strong easterly phase, which tends to weaken the polar vortex as well as westerlies over the Atlantic.

Big losses can occur during low-activity seasons

Climatic features like NAO, ENSO, and QBO are indicators of potential trends in activity. While they provide some insight, (re)insurers are unlikely to use them to inform their underwriting strategy.

And, knowing that a season may have low overall winter storm activity does not remove the risk of having a significant windstorm event. For example, Windstorm Klaus occurred during a period of low winter storm activity in 2009 and devastated large parts of southern Europe, causing $3.4 billion in insured losses.

Given this uncertainty around what could occur, catastrophe models remain the best tool available for the (re)insurance industry to evaluate risk and prepare for potential impacts. While they don’t aim to forecast exactly what will happen this winter, they help us understand potential worst-case scenarios, and inform appropriate strategies to manage the exposure.