RMS supports NYC tech talent in Flatiron School partnership

On February 6, 2015, United States Secretary of Commerce Penny Pritzker visited the Flatiron School in New York City to learn more about the program, its students, and the companies leading the way in hiring skilled workers in a data-driven economy. It was an honor to be invited to a closed-door roundtable with Secretary Pritzker.

Despite the standard formalities, Secretary Pritzker walked around as if she were a student herself, asking questions of the founders and speaking to students about their work. At the table sat representatives from Microsoft, The New York Times, New York Tech Meetup, Wiser, DoSomething.org, UniteUS, Alphasights, and me, representing RMS. Secretary Pritzker welcomed us, mentioned the traffic on the West Side Highway, and then jumped right into asking questions. The session was covered “on background” (meaning that we weren’t quoted) by Catherine Rampell of the Washington Post and by Katie Couric.

The forum was casual but direct. Secretary Pritzker’s interests included understanding why companies recruit from schools like Flatiron (RMS has hired two students; welcome Sam Tran and Jimmy Kuruvilla) rather than from traditional university programs; how to get other companies to look outside the traditional schools; how skills-driven programs are helping shrink the unemployment rate; and why an internship or apprenticeship is so valuable.

RMS forged a relationship with the Flatiron School just after it opened in 2012. Since then we have been a partner to the school and to its students. Not only have we participated in the science fairs and, as mentioned above, hired students from the program, but earlier this year we committed to being an NYC Tech Talent Pipeline Industry Partner. New York City Mayor Bill de Blasio is leading this initiative with the hope that the $10 million plan will support the growth of Silicon Alley, New York City’s high-tech industry.

The NYC Web Development Fellowship is designed for 18- to 26-year-olds without college degrees who are looking to gain skills as developers and creative technologists. Students who enter the program receive the $15,000 Flatiron School training free of charge. RMS’ commitment to this program is to hire two interns from the fellowship, and we look forward to welcoming them during the summer of 2015.

When the 45-minute session wrapped, Secretary Pritzker immediately sat down with Katie Couric for a one-on-one interview. They discussed the unemployment report, manufacturing, and educational programs like the Flatiron School.

The honor was mine, not only in meeting Secretary Pritzker but in representing RMS at such an important conversation. I am thrilled that we as a company are committed to hiring such a diverse group of intellectuals, and that education, no matter where it is obtained, remains at the core of who we are as an organization.


Germanwings 9525: Why didn’t this happen before?

On the tenth anniversary of 9/11, I attended a commemorative meeting at the British Academy in London. A professor of international relations recounted how he had watched the horrific scenes of the destruction of the World Trade Center in the company of his five-year-old daughter. She posed this intriguing question: why didn’t this happen before?

Just a few years before she was born, in December 1994, Algerian terrorists attempted to fly a hijacked plane into the Eiffel Tower. Fortunately, French commandos ended the hijacking when the plane stopped to refuel. This was a near miss. The American writer of counterfactual fiction Philip Roth observed in The Plot Against America that “the terror of the unforeseen is what the science of history hides.” The destruction of the Eiffel Tower is not an event in terrorism history; it is just one of numerous ambitious plots that were foiled.

With the current state of historical and scientific knowledge, there are very few unknown hazard events that should take catastrophe risk analysts by surprise. Almost all either happened before in some guise or, taking a counterfactual view, might well have happened before. Take, for example, the great Japanese tsunami and magnitude 9 earthquake of four years ago. It is doubtful that this was even the strongest historical earthquake to have struck Japan: the Sanriku earthquake of 869 may merit that status, based on geological evidence of widespread tsunami deposits.

Disasters are rare, and preparedness depends crucially on knowledge of the past. Aviation is the safest mode of travel; there are very few crashes. However, there are numerous near misses, where one or more of the key flight parameters comes dangerously close to the disaster threshold. There is a valuable learning curve associated with the lessons gained from such operational experience.

The direct action of the co-pilot in the tragic crash of Germanwings Flight 9525 on March 24, 2015 raises again the question: why didn’t this happen before? As recently as November 29, 2013, it did. A Mozambique Airlines plane flying from the Mozambican capital Maputo to Luanda in Angola crashed, killing its 27 passengers and six crew. The pilot locked himself in the cockpit, keeping out the co-pilot, ignored alarm signals, and manually changed altitude.

Quite apart from this and other historical precedents for fatal crashes caused by direct pilot action, there must be many more near misses, where timely intervention has inhibited direct action by pilots suffering from a psychological disorder. The reporting of incidents is a crucial part of aviation safety culture, and it advances the learning curve. Analysis of such data would contribute to accident risk assessment and to subsequent risk mitigation measures.

The 2015 U.K. Budget and Terrorism Insurance

On 18 March, the Chancellor of the Exchequer, George Osborne, delivered his pre-election budget; billions of pounds of further public spending cuts are needed. Several weeks earlier, Pool Re, the U.K. terrorism insurance pool, had announced its first ever purchase of reinsurance in the commercial market.

These two announcements are not unconnected.

Pool Re was set up in 1993, after the IRA bombing of the Baltic Exchange in 1992. Since the pool was established, it has built up quite a substantial surplus; claims have been low thanks to the vigilance of the security and intelligence services. Almost all the major plots since the September 11, 2001 attack have been foiled.

Terrorism insurance is effectively insurance against counter-terrorism failure, and the huge sums spent on blanket, indiscriminate surveillance have helped to minimize terrorism insurance losses. The low level of losses is not coincidental, nor due to some unpredictable whim of terrorist behavior, but readily explainable: too many terrorists spoil the plot. The type of plot capable of causing terrorism insurance losses of a billion pounds or more would involve a sizeable number of operatives.
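This is, at heart, a numbers game. As a minimal sketch, assuming (purely for illustration) that each operative independently has some probability p of coming to the attention of the security services, the chance that a plot stays entirely undetected decays geometrically with the size of the cell:

```python
# "Too many terrorists spoil the plot": if each operative independently has
# probability p of coming to the attention of the security services, the
# chance a plot stays entirely undetected decays geometrically with cell
# size n. The value of p is a purely illustrative assumption.
p = 0.25

for n in (1, 2, 4, 8, 12):
    undetected = (1 - p) ** n
    print(f"cell of {n:2d} operatives: P(plot undetected) = {undetected:.3f}")
```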

As the NSA whistleblower Edward Snowden has revealed, the surveillance of electronic communications is so intensive that sizeable terrorist social networks end up being tracked by the NSA and GCHQ. Lesser plots, involving lone wolves or just a few operatives, are the most likely to succeed. A string of these has struck the Western alliance over the past months in Ottawa, Sydney, Paris, and Copenhagen. Besides causing terror, these attacks have attracted global media publicity, inspiring jihadi recruitment. But terrorism insurance covers property loss, not the spread of fear or growth in the ranks of Islamic State.

Given the tough security environment it has developed, it is unsurprising that the U.K. Government should be questioning its continuing exposure to terrorism insurance risk; this is an age of austerity. Pool Re’s three-year program provides £1.8bn of reinsurance cover, so diminishing that exposure. More cover might have been purchased, but this was the market limit, given that chemical, biological, radiological, and nuclear (CBRN) risks were included.

The idea of separating out extreme CBRN terrorism risks was considered in Washington by the House Financial Services Committee during last year’s discussions over the renewal of the Terrorism Risk Insurance Act (TRIA). The objective was to provide a federal safety net for such extreme risks, whilst encouraging further private sector solutions for conventional terrorist attacks. The idea was considered at some length but was a step too far for this TRIA renewal. It might yet be a step for Pool Re.

The modus operandi of the IRA was to avoid killing civilians, which would have alienated its support in the Catholic community; bomb warnings, genuine and hoax, were often given. Thus the metric of IRA attacks was typically physical destruction and economic loss. Islamist militants of all persuasions have no such qualms about killing civilians; indeed, gruesome killings are celebrated. Terrorists follow the path of least resistance in their actions, and for Islamic State this is the path of brutal murder rather than property damage.

Winter 2015: A Season to Remember (or Forget)

This winter has brought a barrage of storms and Arctic air to more than half of the U.S., notably the New England region, resulting in record amounts of snow, sleet, freezing rain, and bitterly cold temperatures.

Arguably, no other major city has been more directly impacted than Boston, Massachusetts. As of March 9, the city had received 105.7 inches of snow this season, over three times the average seasonal total for the region. It is the second snowiest season on record, behind only 1995-1996, which brought 107.6 inches. Further, February 2015 was both the snowiest month on record (more than 60 inches) and the second coldest February on record.

Damage reports from this season’s snowstorms include roof collapses, building collapses, burst pipes, power outages, and business interruption. The Massachusetts Emergency Management Agency has reported more than 160 buildings collapsed or at risk of collapse since February 9, with damage driven mainly by dense snowpack and strong winds. As of February, Boston had already spent a record $35 million on snow removal, almost double the allotted total of $18.5 million.


U.S. Winterstorm Risk Map. Loss cost per $1000 for Residential lines at the ZIP code level, based on RMS U.S. Winterstorm Model output using the RMS 2011 U.S. Winterstorm IED.

Businesses and supply chains have been interrupted as well. A combination of snowstorms, cold weather, and ice has closed thousands of businesses, resulting in lost wages for hourly workers. These events have disrupted all forms of travel, restricting trucks and air freight from reaching their destinations and leading to increased prices for certain goods.

All in all, these types of impacts can result in significant economic and insured damages. According to a study by IHS Global Insight, a one-day snow-related shutdown would cost some states as much as $300-700 million in economic losses.

Insured loss estimates from the cluster of five February storms that swept through parts of the Ohio Valley, Mid-Atlantic, and Northeast are likely to exceed $1 billion, in line with annual averages. RMS model analysis shows that winter storms, which produce a combination of snow, ice, freezing rain, and frigid temperatures, cause about $2-3 billion in U.S. insured losses in an average year. This is about 5-10% of the overall U.S. annual average from perils including hurricanes, severe convective storms, floods, and winter storms.

Whether for the harsh winters of the last few years or for winters yet to come, it is important for the (re)insurance industry to be adequately prepared so that insured losses are kept to a minimum.

The Journey to Sendai and Beyond

Sendai is a city of a million people two hours north of Tokyo on the Shinkansen bullet train. From March 14-17, 2015, it will attract seven thousand people to the Third UN World Conference on Disaster Risk Reduction (WCDRR). Twelve heads of state (including one king and one emperor), seven prime ministers, and 135 ministers and vice ministers will be present to launch a fifteen-year program of coordinated action on disaster risk reduction.

The conference is being hosted in Sendai because of the city’s recent experience of a mega-catastrophe. Just four years after the great Tohoku earthquake and tsunami of March 2011, the coastal villages adjacent to Sendai still bear the scour marks left where the great tsunami surged inland through the pine forests, sweeping many buildings off their foundations.

The original International Decade for Natural Disaster Reduction ran from 1990 to 1999. The second program, from 2005 to 2015, renewed at Kobe ten years after its devastating 1995 earthquake, was called the Hyogo Framework for Action. The continuation of this international program is designed to last for fifteen years. The fact that the frameworks have been renewed reflects reality: while there have been successes for particular regions and perils, the broader goals of worldwide disaster risk reduction have not been met. For example, the 2011 Tohoku earthquake was not anticipated and, as a result, had grievous consequences in terms of loss of life and damage to the Fukushima nuclear power plants.

RMS will have four people at the Sendai WCDRR. We have obtained a coveted presentation slot on the main IGNITE stage, the equivalent of a “TED talk.” I will also be speaking on two panel sessions: one organized by The Geneva Association and Tokio Marine, “Insurance as contributors to problem solving and impact reduction,” and a second on the launch of the global set of catastrophe models developed by the UNISDR agency, for which RMS has provided high-level input. We have offered to host these worldwide UNISDR catastrophe models on RMS(one), which would open up access to the models for public officials in developing countries.

We have also worked on a couple of papers (for example, “Setting, measuring and monitoring targets for disaster risk reduction: recommendations for post-2015 international policy frameworks”) articulating how to measure progress in disaster risk reduction. At present, international frameworks have shied away from setting numerical commitments. We have argued that only probabilistic methods, which simulate thousands of possible events, can establish baseline levels of risk, show what actions will achieve progress, and determine whether targets have been achieved. We take our cue from Michael Bloomberg’s line in the foreword to the Risky Business report: “if you can’t measure it, you can’t manage it.”

The work by UNISDR on catastrophe modeling highlights the accelerating recognition of the role of modeling in managing and reducing disaster risk, and there is now a real focus on public-private partnerships in achieving disaster risk reduction. With RMS’ rich and deep experience in catastrophe modeling, there is much we can offer to these expanded applications. Users of models in governments, public organizations, and NGOs require models to:

  • explore how to manage a wide range of potential disasters
  • perform cost-benefit analyses of alternative actions to reduce the risk of loss of life or economic impacts
  • explore the potential implications of climate change
  • explore holistically the potential for significant financial shocks to national economies

If you are attending the conference, come and visit us at our booth on the 6th floor of the Sendai International Center, where we will be distributing information about our proposals for disaster risk modeling and articulating our role as leaders in catastrophe risk modeling. It will be a highly publicized event, with 500 journalists and around 300 private sector participants, including several of our key clients. We will also be meeting with other organizations with which we are affiliated, including the UN Principles for Sustainable Insurance and the Rockefeller Foundation’s 100 Resilient Cities initiative.

We look forward to sharing more insight after the event.

Rising Storm Surge Losses in the U.S. Northeast

Co-authored by Anaïs Katz and Oliver Withers, analysts, Capital Market Solutions, RMS

A recent article in Nature Communications, picked up by the BBC, identified a record mean sea-level rise of 5 inches (127 mm) along the coastline north of New York City during 2009-2010. Sea levels fluctuate from year to year; a swing of this size, however, was unprecedented.

This extreme rise in 2009-2010 has been attributed to a downturn of a major ocean current, the Atlantic meridional overturning circulation (AMOC). Because sea levels are sensitive to multiple factors, there is volatility around this increase. The AMOC is one of the ocean dynamics known to have changed greatly over time, and its weakening and variability have been linked to increases in greenhouse gas emissions.

Sea-level rise is one of the most tangible and certain consequences of a warmer climate. Climate models suggest that even if greenhouse gas emissions were reduced, sea levels would continue to rise. A fluctuation as dramatic as that of 2009-2010 highlights the potential for significantly elevated storm surge risk in the region, and raises the question of what impact future long-term sea-level rise will have on storm surge risk.

A study by Kopp et al. has attempted to assign probability bands to future sea-level rise. The figure below shows the distribution of expected sea-level rise at New York City’s Battery Park through the 21st century, with the 50th percentile projection represented as the red line. Also shown are the maximum rises in sea level associated with previous hurricane storm surges.

Based on RMS’ estimate of the impacts of hurricanes on residential and commercial property in the Northeast U.S. (from New Jersey north), storm surge contributed about 10% of hurricane losses as of 2010. Even if hurricane activity does not change, sea-level rise will increase the damage associated with hurricane storm surges: based on Kopp’s estimates, by 2100 surge would contribute about 25% of total hurricane losses.

The largest recent hurricane loss occurred on October 29, 2012, when Superstorm Sandy made landfall near Atlantic City, NJ. Based on the RMS best estimate, Sandy caused insured losses of between $20 and $25 billion, with much of the damage due to storm surge rather than wind.

In terms of a simple extreme value analysis, the combination of storm surge and tide that Superstorm Sandy produced at New York City’s Battery Park was approximately a 1-in-450-year event for that location. Based on sea-level rise alone, the same surge and tide combination would move closer to a 1-in-100-year event by the end of the century. The figure below shows the return periods of a storm surge as high as Sandy’s occurring at New York City’s Battery Park under different sea-level assumptions.
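The arithmetic behind this shortening of return periods can be sketched with a toy extreme value calculation. Assuming, purely for illustration, that annual-maximum still-water levels follow a Gumbel distribution, a rise in mean sea level of slr metres multiplies the exceedance probability of any fixed absolute level by roughly exp(slr / beta); the scale parameter below is a hypothetical choice, not a fitted or RMS-modeled value:

```python
import math

# Gumbel scale parameter (metres) for annual-maximum still-water level.
# Chosen purely for illustration -- not a fitted or RMS-modeled value.
BETA = 0.65
BASE_RETURN_PERIOD = 450.0  # approx. Sandy's surge + tide at the Battery today

def shifted_return_period(base_rp, slr, beta=BETA):
    """Return period of the same absolute water level after `slr` metres of
    mean sea-level rise: exceedance probability grows by ~exp(slr / beta)."""
    return base_rp * math.exp(-slr / beta)

for slr in (0.0, 0.25, 0.5, 1.0):  # metres of mean sea-level rise
    rp = shifted_return_period(BASE_RETURN_PERIOD, slr)
    print(f"{slr:.2f} m of sea-level rise -> ~1-in-{rp:.0f}-year event")
```

Under these illustrative numbers, roughly a metre of rise is enough to turn a 1-in-450-year water level into something close to a 1-in-100-year event, without any change in storm activity.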

A direct result of increasing concentrations of greenhouse gases in the atmosphere will be a rise in sea surface temperatures. While warmer seas are likely to change the activity and intensity of hurricanes, there is no consensus among climate modelers as to the magnitude or even the direction of these changes. For this reason, the figure below does not consider potential changes in hurricane activity but focuses solely on sea-level rise, for which there is far more general agreement.

While the impacts of climate change remain much debated, changes in loss potential will have material effects on the risk carried by insurers. With the significance of climate change coming to the fore, the next decades will pose a research challenge for the insurance industry: how to incorporate evidence of changes in the level of risk.


Anaïs Katz

Analyst, Capital Market Solutions, RMS
As a member of the advisory team within Capital Market Solutions, Anaïs works on capital markets deal commentary and expert risk analysis. Based in Hoboken, she provides transaction characterizations to clients for bonds across the market and supports the deal team in modeling transactions. She has worked on notable deals for clients such as Tradewynd Re and Golden State Re. Anaïs has also helped to model and develop her group’s internal collateralized insurance pricing model, which provides mark-to-market prices for private transactions. Anaïs holds a BA in physics from New York University and an MSc in Theoretical Systems Biology and Bioinformatics from Imperial College London.

Measuring Disaster Risk for Global UN Goals

A dispiriting part of the aftermath of a disaster is hearing about the staggering number of deaths and seemingly insurmountable economic losses. Many of the disaster risk reduction programs that implement disaster prevention and preparedness capabilities are helping to create more resilient communities. These worthwhile programs require ongoing financing, and their success must be measured and evaluated to continue to justify the allocation of limited funds.

There are two global UN frameworks being renewed this year:

  • the Hyogo Framework for Action on disaster risk reduction, whose successor is being agreed at the World Conference in Sendai
  • the Millennium Development Goals, which are being succeeded by the Sustainable Development Goals

Both frameworks will run for 15 years. This is the first time explicit numerical targets have been set around disaster risk, and consequently there is now a more pressing need to measure the progress of disaster risk reduction programs to ensure the goals are being achieved.

The most obvious way to measure the progress of a country’s disaster risk reduction would be to observe the number of deaths and economic losses from disasters.

However, as the insurance industry learned in the early 1990s, this approach presents big problems around data sampling. A few years or even decades of catastrophe experience do not give a clear indication of the level of risk in a country or region, because catastrophes have a huge and volatile range of outcomes. An evaluation based purely on observed deaths or losses can give a misleading impression of success or failure if a country or region happens to be lucky in avoiding (or unlucky in experiencing) severe disaster events during the period measured.

A good example is the 2010 Haiti earthquake, which claimed more than 200,000 lives and cost more than $13 billion. Yet for more than 100 years prior to this devastating event, earthquakes in Haiti had claimed fewer than 10 lives.

Haiti shows that it is simply not possible to determine the true level of risk from 15 years of observations for a single country. Even looking at worldwide data, certain events dominate the disaster mortality record, and progress cannot be measured.
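A quick simulation makes the sampling problem concrete. The sketch below draws thousands of independent 15-year observation windows from a single hypothetical loss process (Poisson frequency, heavy-tailed lognormal severity; every parameter is invented and describes no real country) and shows how widely the “average annual loss” observed in any one window scatters around the true underlying value:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical national catastrophe loss process: Poisson event counts and
# lognormal severities with a heavy tail. All parameters are illustrative.
LAMBDA, MU, SIGMA, WINDOW = 0.3, 0.0, 2.0, 15  # events/yr, severity, years

true_aal = LAMBDA * np.exp(MU + SIGMA**2 / 2)  # analytic average annual loss

# Draw 5,000 independent 15-year observation windows from the same process
# and record the "average annual loss" each window would suggest.
n_windows = 5_000
counts = rng.poisson(LAMBDA, size=(n_windows, WINDOW))
annual_totals = np.array(
    [rng.lognormal(MU, SIGMA, size=c).sum() for c in counts.ravel()]
).reshape(n_windows, WINDOW)
observed_aal = annual_totals.mean(axis=1)

print(f"true AAL: {true_aal:.2f}")
print(f"15-year estimates, 5th-95th percentile: "
      f"{np.percentile(observed_aal, 5):.2f} to "
      f"{np.percentile(observed_aal, 95):.2f}")
```

Two countries with identical underlying risk can thus record wildly different 15-year loss histories, which is exactly why observed losses alone cannot verify progress.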

Global disaster-related mortality rate (per million global population), 1980–2013 (From Setting, measuring and monitoring targets for disaster risk reduction: recommendations for post-2015 international policy frameworks. Source: adapted from www.emdat.be)

A more reliable way to measure the progress of disaster risk reduction programs is to use probabilistic methods, which rely on a far more extensive range of possibilities, simulating tens of thousands of catastrophic events. These can then be combined with data on exposures and vulnerabilities to produce metrics of specific interest for disaster risk reduction, such as houses or lives lost. Such metrics can be used to (a minimal sketch follows the list):

  • Measure disaster risk in a village, city, or country and how it changes over time
  • Analyze the cost-benefit of mitigation measures:
    • For a region: For example, the average annual savings in lives due to a flood defense or earthquake early warning system
    • For a location: For example, choosing which building has the biggest reduction in risk if retrofitted
  • Quantify the impact of climate change and how these risks are expected to vary over time
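As a concrete illustration of the cost-benefit use, the sketch below prices a hypothetical flood defense against a three-event toy event loss table; real models use tens of thousands of simulated events, and every figure here is invented:

```python
# Toy event loss table (ELT) for one region: each row is an event with an
# annual occurrence rate and a modeled loss before and after building a
# flood defense. All figures are invented for illustration.
ELT = [
    # (annual rate, loss today, loss with defense), losses in $m
    (0.100,  120.0,  110.0),   # frequent, modest flood
    (0.020,  800.0,  450.0),   # rarer, severe flood
    (0.004, 3000.0, 2600.0),   # extreme flood overtopping the defense
]

def aal(loss_column):
    """Average annual loss: sum of rate x loss over all modeled events."""
    return sum(rate * row[loss_column] for rate, *row in ELT)

aal_today, aal_defended = aal(0), aal(1)
annual_benefit = aal_today - aal_defended

DEFENSE_ANNUAL_COST = 8.0  # amortized cost in $m/year -- hypothetical
print(f"AAL falls from {aal_today:.1f} to {aal_defended:.1f} $m/year; "
      f"benefit/cost ratio = {annual_benefit / DEFENSE_ANNUAL_COST:.2f}")
```

A benefit/cost ratio above 1 indicates the defense saves more in expected annual losses than it costs, and the same calculation works with lives saved in place of dollars.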

In the long term, probabilistic catastrophe modeling will be an important way to ensure improved measurement and, therefore, management of disaster risk, particularly in countries and regions at greatest risk.

The immediate focus should be on educating government bodies and NGOs in the valuable use of probabilistic methods. For the 15-year frameworks being renewed this year, serious consideration should be given to how to implement a useful and practical probabilistic method of measuring progress in disaster risk reduction, for example by using hazard maps. See here for further recommendations: http://www.preventionweb.net/english/professional/publications/v.php?id=39649

2015 is an important year for measuring disaster risk: let’s get involved.

High Tides a Predictor for Storm Surge Risk

On February 21, 2015, locations along the Bristol Channel experienced their highest tides of the first quarter of the 21st century, predicted to reach as high as 14.6 m at Avonmouth. When high tides are coupled with stormy weather, the risk of devastating storm surge is at its peak.

Storm surge is an abnormal, storm-generated rise of water above the predicted astronomical tide. The U.K. is subject to some of the largest tides in the world, which makes its coastlines very prone to storm surge.


A breach at Erith, U.K. after the 1953 North Sea Flood

The sensitivity of storm surge to extreme tides is an important consideration for managing coastal flood risk. While it is not possible to reliably predict the occurrence or track of windstorms, even a few days before they strike land, it is at least possible to predict years with a higher probability of extreme storm surge well in advance, which can help in planning risk mitigation operations, insurance risk management, and pricing.

Perfect timing is the key to a devastating storm surge. The point at which a storm strikes a coast, relative to the time and magnitude of the highest tide, dictates the severity of the flooding. A strong storm on a neap tide can produce a very large storm surge without producing dangerously high water levels. Conversely, a medium storm on a spring tide may produce a smaller storm surge, but the higher total water level can lead to extensive flooding. The configuration of the coastal geometry, topography, bathymetry, and sea defenses can all have a significant impact on the damage caused and the extent of any coastal flooding.
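To first order, the still-water level is simply the predicted astronomical tide plus the surge (ignoring the nonlinear interaction between the two), which is enough to show why timing dominates. All levels in this sketch are hypothetical:

```python
# First-order sketch: still-water level = predicted astronomical tide + surge
# (this ignores the nonlinear tide-surge interaction). All levels are in
# metres above datum and entirely hypothetical.
SPRING_HIGH_TIDE = 6.5
NEAP_HIGH_TIDE = 4.0
DEFENSE_LEVEL = 7.5   # level at which coastal defenses are overtopped

scenarios = {
    "strong storm on a neap tide":   NEAP_HIGH_TIDE + 3.0,    # big surge
    "medium storm on a spring tide": SPRING_HIGH_TIDE + 1.5,  # smaller surge
}

for name, level in scenarios.items():
    outcome = "overtops defenses" if level > DEFENSE_LEVEL else "no flooding"
    print(f"{name}: {level:.1f} m -> {outcome}")
```

With these numbers the larger surge stays below the defense level while the smaller one, riding a spring tide, overtops it.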

This weekend’s high tides in the U.K. recall the prevailing conditions of the catastrophic 1607 Flood, which also occurred in winter. The tides then reached an estimated 14.3 m at Avonmouth and, combined with stormy conditions at the time, produced a storm surge that caused the largest loss of life in the U.K. from a sudden-onset natural catastrophe. Records suggest that between 500 and 2,000 people drowned in villages and isolated farms on the low-lying coastlines around the Bristol Channel and Severn Estuary. The return period of such an event is probably over 500 years, and potentially much longer.

The catastrophic 1953 Flood is another example of a U.K. storm surge event. It caused unprecedented property damage along the U.K.’s North Sea coast and claimed more than 2,000 lives along northern European coastlines. This flood occurred close to a spring tide, but not on an exceptional one. Water level return periods along the east coast varied, peaking at just over 200 years in Essex and just under 100 years in the Thames. So, while the 1953 event is rightfully a benchmark event for the insurance industry, it was not as “extreme” as the 1607 Flood, which coincided with an exceptionally high astronomical tide.

Thankfully, no strong storms struck the west coast of the U.K. this weekend, so while the high tides may have caused some coastal flooding, the result was not catastrophic.

RMS(one): Tackling a Unique Big Data Problem

I am thrilled to join the team at RMS as CTO, with some sensational prospects for growth ahead of us. I originally came to RMS in a consulting role with CodeFutures Corporation, tapped to advise RMS on the development of RMS(one). In that role, I became fascinated by RMS as a company, by the vision for RMS(one), and by the unique challenges and opportunities it presented. I am delighted to bring my experience and expertise in-house, where my primary focus is continuing the development of the RMS(one) platform and ensuring a seamless transition from our existing core product line.

I tackled many big data problems in my previous roles as CEO and COO of CodeFutures, where we created a big data platform designed to remove the complexity and limitations of current data management approaches. In my more than 20 years of experience with advanced software architectures, I have worked with many of the most innovative and brilliant people in high-performance computing, and I have helped organizations address the challenges of big data performance and scalability, encouraging effective applications of emerging technologies in fields including social networking, mobile applications, gaming, and complex computing systems.

Each big data problem is unique, but RMS’ is particularly intriguing. Part of what attracted me to the CTO role at RMS was the idea of tackling head-on the intense technical challenges of delivering a scalable risk management platform to an international group of the world’s leading insurance companies. Risk management is unique in the type and scale of the data it manages; traditional big data techniques fall far short when tackling this problem. Not only do we need to handle data and processing at tremendous scale, we need to do it at a speed that meets customer expectations. RMS has customers all around the world, and we need to deliver a platform they can all leverage to get the results they need and expect.

The primary purpose of RMS(one) is to enable companies in the insurance, reinsurance, and insurance-linked securities industries to run RMS’ next-generation HD catastrophe models. It will also allow them to implement their own models and give them access to others built by third-party developers in an ever-growing ecosystem. It is designed as an open exposure and risk management platform on which users can define the full gamut of their exposures and contracts, and then implement their own analytics on a highly scalable, purpose-built, cloud-based platform. RMS(one) will offer unprecedented flexibility, as well as truly real-time and dynamic risk management processes that will help generate more resilient and profitable portfolios. Very exciting stuff!

During the development of RMS(one), we have received outstanding support and feedback from key customers and joint development partners; we know the platform is the first of its kind, as a truly integrated and scalable platform for managing risk has never been accomplished before. Through beta testing we obtained hands-on feedback from these customers, which we are folding into our new designs and capabilities. The idea is to give risk managers new means to change how they work, providing better results while expending less effort and time.

I work closely with several teams within the company, including software development, model development, product management, and sales, to deliver on the platform’s objectives. The most engaging part of this work is turning plans into workable designs that our teams can then execute. There is a tremendous group of talented individuals at RMS, and a big part of my job is to coalesce their efforts into a great final product, leveraging the brilliant ideas I encounter across the company. It is totally exciting, and our focus is riveted on delivering against the plan for RMS(one).

The challenges of modeling European windstorm clustering for the (re)insurance industry

In December I wrote about Lothar and Daria, a cluster of windstorms that emphasized the significance of location when assessing windstorm risk. This month marks the 25th anniversary of the most damaging cluster of European windstorms on record: Daria, Herta, Wiebke, and Vivian.

This cluster of storms highlighted the need to better understand the potential impact of clustering on the insurance industry.

At the time of the events, the industry was poorly prepared to deal with a cluster of four extreme windstorms striking in rapid succession over a very short timeframe. We have not seen clustering of such significance since, however, so how important is this phenomenon really over the long term?

In recent years there has been plenty of discourse over what makes a cluster of storms significant, how clustering should be defined, and how it should be modeled.

Today the industry accepts the need to consider the impact of clustering on the risk and to assess its importance when making underwriting and capital management decisions. However, identifying a simple process to describe and model cyclone clustering is still proving a challenge for the modeling community, owing to the complexity and variety of the mechanisms that govern fronts and cyclones.

What is a cluster of storms?

Broadly, a cluster can be defined as a group of cyclones that occur close in time.

But the insurance industry is mostly concerned with the severity of the storms, so how do we define a severe cluster? Are we talking about severe storms, such as those of 1990 and 1999, which had very extensive and strong wind footprints? Or storms like those of the winter 2013/2014 season, which were not extremely windy but were very wet and generated flooding in the U.K.? There are in fact multiple descriptions of storm clustering, in terms of storm severity or spatial hazard variability.

Without a clearly identified precedence among these features, defining a unique modeled view of clustering is complicated and introduces uncertainty into the modeled results. This issue also exists in other aspects of windstorm catastrophe modeling, but in the case of clustering the limited amount of calibration data available makes the problem particularly challenging.

Moreover, the frequency of storms is affected by climate variability, and as a result there are several valid assumptions that could be applied in modeling, depending on the activity time frame the model replicates. For example, the 1980s and 1990s were more active than the most recent decade, and a model calibrated against an active period will produce higher losses than one calibrated against a period of lower activity.
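One simple way to represent clustering, sketched below with purely illustrative parameters, is to let each season’s storm rate itself vary, drawing it from a gamma distribution so that annual counts become negative binomial: the mean number of storms is unchanged, but very stormy seasons like 1990 or 1999 become markedly more likely than under independent Poisson arrivals:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
N_SEASONS = 100_000
MEAN_STORMS = 4.0   # mean severe European windstorms per season -- hypothetical

# Non-clustered view: independent arrivals give Poisson annual counts.
poisson_counts = rng.poisson(MEAN_STORMS, size=N_SEASONS)

# Clustered view: let each season's storminess vary by drawing the seasonal
# rate from a gamma distribution (a gamma-mixed Poisson, i.e. negative
# binomial counts). Same mean, but a fatter tail of very stormy seasons.
DISPERSION = 2.0    # smaller value -> stronger clustering; illustrative
seasonal_rates = rng.gamma(shape=DISPERSION,
                           scale=MEAN_STORMS / DISPERSION,
                           size=N_SEASONS)
clustered_counts = rng.poisson(seasonal_rates)

for label, counts in (("Poisson  ", poisson_counts),
                      ("clustered", clustered_counts)):
    print(f"{label}: mean = {counts.mean():.2f}, "
          f"P(8+ storms in a season) = {(counts >= 8).mean():.3f}")
```

The two views agree on the average season yet disagree sharply on the probability of an extreme one, which is precisely where the choice of calibration period and clustering assumption drives modeled tail losses.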

Given the underlying uncertainty in the modeled impact, the industry should be cautious of assessing only a clustered or only a non-clustered view of risk until future research demonstrates that one view of clustering is superior to the others.

How does RMS help?

RMS offers clustering as an optional view that reflects well-defined and transparent assumptions. By having different views of risk available to them, users can deepen their understanding of how clustering will impact a particular book of business and explore the effect of the uncertainty around this topic, helping them make more informed decisions.

This transparent approach to modeling is very important in the context of Solvency II and helping (re)insurers better understand their tail risk.

Right now there are still many unknowns surrounding clustering, but ongoing investigation, in both academia and industry, will help modelers to better understand clustering mechanisms and dynamics, and their impacts on model components, further reducing the uncertainty that surrounds windstorm hazard in Europe.