Monthly Archives: September 2014

RMS and 100 Resilient Cities at the Clinton Global Initiative

I’ve just returned from the Clinton Global Initiative (CGI) annual meeting in New York. Every September, political, corporate, and non-profit leaders from around the world gather to discuss pressing challenges, form partnerships, and make commitments to action. It was inspiring to see the tremendous work already being done and the new commitments being made to address a wide range of issues: containing the Ebola epidemic, increasing access to education, combating climate change, and helping Haiti develop a self-sustaining economy.

One prevailing theme at the event this year was the importance of cross-sector partnerships in successfully tackling such complex issues. Not surprisingly, data crunched by the CGI team on commitments made over the past 10 years shows that partnerships succeed at a markedly higher rate than go-it-alone approaches.

In this spirit, we announced an RMS commitment last week to partner with the Rockefeller Foundation’s 100 Resilient Cities (100RC) initiative to help increase the resilience of cities around the world. We will be making our RMS(one) platform and our catastrophe models available to cities in the 100RC network so that they can better understand their exposures, assess their risk from catastrophic events and climate change, and prioritize investments in mitigating and managing that risk.

As the saying goes, “if you can measure it, you can manage it.” From our 25 years of experience helping the insurance industry measure and then manage catastrophe risk, we believe there is a largely untapped opportunity for the public sector to similarly leverage exposure management and catastrophe modeling technology to establish more informed risk-management policies and increase resilience in cities throughout the world, in both developed and emerging economies.

It was also clear this week that the conversation in corporate boardrooms is moving beyond a sole focus on the financial bottom line toward also having a positive impact on the world in ways that are strategically aligned with the core mission of the business.

Our partnership with 100RC, along with the partnerships with the UNISDR and the World Bank that we announced this summer, is another step in our own version of this journey. Through our direct philanthropic support of Build Change and its admirable work to improve construction practices in developing countries, and by lending our technology and our colleagues’ expertise to the public sector, we are aligning all of our activities in support of our core mission: to increase the resilience of our society.

Many of our clients have shared with us that they are on similar journeys, building on traditional support for local organizations to implement more strategic programs with broader impact and employee engagement. In particular, the insurance industry is uniquely positioned to understand the value of proactively investing in mitigation and in increasing resilience, instead of waiting until a tragedy has occurred and all that can be done is to support humanitarian response efforts.

With this common frame of reference, we look forward to increasingly partnering with our clients in the coming years not just to help them manage their own risk but to collectively help increase resilience around the world.

Matching Modeled Loss Against Historic Loss in European Windstorm Data

To be Solvency II compliant, re/insurers must validate the models they use, which can include comparisons to historical loss experience. In working towards model validation, companies may find that their experience of European windstorm hazard does not match the modeled loss. However, this seeming discrepancy does not necessarily mean something is wrong with the model or with the company’s loss data. The underlying timelines for each dataset may simply differ, which can have a significant influence on results for a variable peril like European windstorm.

Most re/insurers’ claims records only date back 10 to 20 years, whereas European windstorm models use much longer datasets, generally spanning up to 50 years of hazard data. Over this shorter term, the last 15 years have represented a relative lull in windstorm activity, particularly when compared to the more extreme events that occurred in the very active 1980s and 1990s.

Figure: Netherlands windstorm variability

RMS has updated its European windstorm model specifically to support Solvency II model validation. The enhanced RMS model includes the RMS reference view, which is based on the most up-to-date, long-term historical record, as well as a new shorter historical dataset that is based on the activity of the last 25 years.

By using the shorter-term view, re/insurers gain a deeper understanding of how historical variability can impact modeled losses. Re/insurers can also perform a like-for-like validation of the model against their loss experience, and develop confidence in the model’s core methodology and data. Alternate views of risk also support a deeper understanding of risk uncertainty, which enhances model validation and provides greater confidence in the models that are used for risk selection and portfolio management.
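As a simple illustration of why the two views can diverge, here is a minimal sketch in Python (hypothetical loss figures, not RMS model output) showing how the same annual loss series yields a different average annual loss (AAL) depending on the historical window used:

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical modeled annual windstorm losses (EUR m) over a 50-year
    # record, drawn from a heavy-tailed distribution for illustration only.
    annual_losses = rng.lognormal(mean=3.0, sigma=1.2, size=50)

    reference_view = annual_losses          # full 50-year record
    short_term_view = annual_losses[-25:]   # most recent 25 years only

    print(f"Reference-view AAL: {reference_view.mean():.1f}")
    print(f"25-year-view AAL:   {short_term_view.mean():.1f}")
    print(f"Ratio (short/long): {short_term_view.mean() / reference_view.mean():.2f}")

With a heavy-tailed peril, a 25-year window that happens to miss (or catch) a few extreme seasons can sit well above or below the long-term average, which is exactly the effect re/insurers should expect when validating against their own claims history.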

Beyond Solvency II validation, the model also empowers companies to explore hazard variability, which is vitally important for a variable peril like European windstorm. If a catastrophe model and a company rely on different but equally valid assumptions, the model can offer a complementary perspective, providing a more complete view of the risk.

Serial Clustering Activity around the Baja Peninsula during September 2014

In the past two weeks, two major hurricanes have impacted the Baja Peninsula in Mexico. Hurricane Norbert tracked along a large portion of the west coast of the peninsula from September 5 to 7, and Hurricane Odile made landfall near Cabo San Lucas on September 14 as a Category 3 hurricane on the Saffir-Simpson Wind Scale. A third system, Hurricane Polo, formed on Tuesday, September 16, and is forecast to follow a similar track to Norbert and Odile, making it the third such tropical cyclone to develop in the region since the beginning of the month.

This serial cluster of storms has been driven primarily by steady, favorable conditions for tropical cyclone development and consistent atmospheric patterns over the Eastern Pacific. A serial cluster is defined as a set of storms that form in the same part of a basin and subsequently follow one another in an unbroken sequence over a relatively short period of time. To qualify as a cluster, there must be measurable consistency between the tracks. This is typically a result of steady, predominant atmospheric steering currents, which play a major role in influencing the speed and direction of tropical cyclones. One example of a serial cluster is the four major hurricanes (Charley, Frances, Ivan, and Jeanne) that impacted Florida during a six-week period in 2004.
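To make the definition concrete, here is a minimal sketch in Python of such a cluster test. The genesis positions, dates, and headings are approximate values for illustration (not official best-track data), and the thresholds are illustrative choices rather than a formal standard:

    from dataclasses import dataclass
    from math import hypot

    @dataclass
    class Storm:
        name: str
        genesis_lon: float   # degrees (negative = west)
        genesis_lat: float   # degrees north
        genesis_day: int     # day of year
        heading_deg: float   # mean track heading, degrees from north

    # Approximate, illustrative values for the September 2014 storms.
    storms = [
        Storm("Norbert", -105.0, 14.5, 245, 325.0),
        Storm("Odile",   -102.0, 13.5, 253, 330.0),
        Storm("Polo",    -100.5, 13.0, 259, 320.0),
    ]

    # Serial-cluster test: genesis points close together, formation dates
    # within a short window, and similar mean track headings.
    def is_serial_cluster(storms, max_sep_deg=6.0, max_days=21, max_heading_spread_deg=30.0):
        lons = [s.genesis_lon for s in storms]
        lats = [s.genesis_lat for s in storms]
        days = [s.genesis_day for s in storms]
        headings = [s.heading_deg for s in storms]
        close = hypot(max(lons) - min(lons), max(lats) - min(lats)) <= max_sep_deg
        quick = (max(days) - min(days)) <= max_days
        consistent = (max(headings) - min(headings)) <= max_heading_spread_deg
        return close and quick and consistent

    print(is_serial_cluster(storms))  # True for these illustrative values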

During this recent two-week period, the area off the west coast of Mexico has maintained high sea-surface temperatures near 85 degrees Fahrenheit and limited vertical wind shear, creating an active tropical development region. A mid-level atmospheric ridge over northern Mexico has provided a consistent steering pattern towards the north-northwest, producing the similar observed tracks for Norbert and Odile and the forecast track for Polo.

Devastating amounts of rainfall have accompanied these storms. Hurricane Odile dropped nearly 18 inches of rain in areas around Cabo San Lucas, representing nearly 21 months’ worth of typical rainfall. This cluster, while generating significant wind and flood damage along the Baja Peninsula, has also caused torrential rainfall in the southwestern U.S., including Arizona, southern Nevada, and southern California. Last week, Phoenix, AZ, one of the hardest-hit areas, experienced over 3 inches of rain in a seven-hour span due to the remnants of Hurricane Norbert. This was the city’s heaviest 24-hour rainfall since 1911, estimated by the National Oceanic and Atmospheric Administration to be a 1-in-200-year event. Significant rainfall and inland flooding are forecast to continue as the remnants of Odile and Polo move inland, which may lead to widespread flood losses and the potential for compound post-event loss amplification.
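The rainfall comparison is easy to sanity-check with quick arithmetic. The annual rainfall figure below is an assumption for illustration (roughly typical for the arid tip of the Baja Peninsula), not a value from official records:

    # Back-of-the-envelope check of the "nearly 21 months of typical
    # rainfall" figure. ASSUMPTION: about 10 inches of typical annual
    # rainfall in the Cabo San Lucas area.
    storm_total_in = 18.0
    typical_annual_in = 10.0
    typical_monthly_in = typical_annual_in / 12                 # ~0.83 in/month
    print(f"{storm_total_in / typical_monthly_in:.1f} months")  # ~21.6 months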

Using Network Theory to Understand the Interconnectivity of Financial Risk

For today’s regulators, systemic risk remains a major issue. Tracing the connections between financial institutions and understanding how different mechanisms of financial contagion might flow through the system are complex tasks.

Modern finance is the collective activity of tens of thousands of individual enterprises, all interacting in a “living” system. Today, nobody truly understands this system. It is organic and market-driven, but the fundamental processes that drive it occasionally collapse in a financial crisis that affects us all.

The increasing risk of contagion in the financial industry has triggered a new discipline of research, network theory in financial risk management, which is quickly gathering pace. These valuable studies aim to identify and analyze all possible connections between financial institutions, as well as how their interconnectivity can contribute to crisis propagation.

Later this month, Risk.net will launch the Journal of Network Theory in Finance. This journal will compile the key papers of financial risk studies worldwide to provide industry participants with a balanced view of how network theory in finance can be applied to business.

Papers from the inaugural edition of the new journal will be showcased on September 23 at the Financial Risk & Network Theory conference, which is hosted by the Centre for Risk Studies at the University of Cambridge. I will be presenting a keynote on how catastrophe modeling methodologies can be applied to model financial risk contagion.

Our financial institutions are connected in a multitude of ways: they hold similar portfolios of investments, use common settlement mechanisms, own shares in one another’s companies, and lend to each other.

As the interconnectivity of the world’s financial institutions and markets deepens, financial risk managers and macro-economic planners need to know the likelihood and severity of potential future downturns, particularly the “tail” events of economic catastrophe. Companies must continually understand how they are exposed to the risk of contagion; many were surprised by how fast contagion spread through the financial system during the 2008 credit crunch.

The regulator’s role in limiting the risk of future financial crises includes identifying Systemically Important Financial Institutions (SIFIs) and understanding what aspects of a SIFI’s business to monitor. Regulators have already pioneered network modeling to identify the core banks and to rank their systemic importance, and can now demand much higher standards of risk management from the SIFIs. Increasingly, similar models are being used by risk practitioners and investment managers.
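To give a flavor of this kind of network ranking, here is a minimal sketch in Python using the open-source networkx library on a toy interbank-exposure graph. The banks, the exposures, and the use of a PageRank-style centrality as an importance proxy are all illustrative assumptions, not the models regulators actually use:

    import networkx as nx

    # Toy directed interbank-exposure network: an edge A -> B with weight w
    # means bank A is exposed to bank B for w (arbitrary units). All banks
    # and exposures are hypothetical.
    exposures = [
        ("BankA", "BankB", 50), ("BankA", "BankC", 30),
        ("BankB", "BankC", 80), ("BankC", "BankA", 20),
        ("BankD", "BankB", 60), ("BankD", "BankC", 40),
    ]
    G = nx.DiGraph()
    G.add_weighted_edges_from(exposures)

    # Centrality as a crude proxy for systemic importance: a bank ranks
    # highly if banks carrying large exposures are exposed to it.
    scores = nx.pagerank(G, weight="weight")
    for bank, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{bank}: {score:.3f}")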

Studies of network theory in financial risk management, such as those carried out by the Centre for Risk Studies, give all practitioners involved in managing financial risk a robust scientific foundation from which to understand, model, and, ultimately, manage that risk effectively.

2014 Atlantic Hurricane Season Update: Not Quite 2004

The 2014 Atlantic Hurricane Season is already half over, and with only five named storms in the books and El Niño conditions likely by late fall, all signs are pointing to a below-average season.

Over the last six weeks, organizations such as Colorado State University (CSU) and the National Oceanic and Atmospheric Administration (NOAA) updated their seasonal outlooks with similar or slightly reduced numbers, attributing the reductions to a variety of oceanic and atmospheric conditions acting to suppress activity, including cooler-than-normal sea surface temperatures, higher-than-normal sea level pressures, and stronger-than-normal wind shear.

Interestingly, the suppressed activity is not being attributed nearly as much to El Niño conditions as originally thought. Despite high likelihoods that the equatorial Pacific would warm to El Niño levels by late summer, observed El Niño Southern Oscillation (ENSO) conditions were neutral during the July and August period, according to the International Research Institute for Climate and Society.

Such observations have certainly impacted ENSO forecasts for the remainder of 2014 into 2015. As of September 4, the likelihood of El Niño conditions forming during the period from September to November dropped to 55%, from a convincing 74% probability back in May. Despite this material reduction, most of the ENSO prediction models still forecast the onset of El Niño by early fall, peaking during the Northern Hemisphere winter of 2014-2015 and lasting into the first few months of 2015.

Barring any late-season surge in activity, this year will be a far cry from the busier seasons of the past, most notably the 2004 season. Like this year, 2004 saw weak to neutral El Niño conditions. However, the 2004 season was influenced by a rare type of El Niño known as El Niño Modoki, in which the unfavorable hurricane conditions are produced in the Pacific instead of the Atlantic, resulting in above-average activity in the Atlantic.

The most notable U.S. hurricanes during the 2004 season were Hurricanes Charley, Frances, Ivan, and Jeanne. These four events damaged an estimated 2 million properties in Florida – approximately one in five houses – and caused more than $20 billion in insured losses throughout the U.S.

The strongest system to hit land that season was Hurricane Charley. The storm made landfall on the southwest coast of Florida on August 13 as a Category 4 hurricane, causing nearly $15 billion in economic damage and making it one of the most destructive hurricanes in U.S. history.

Just over three weeks later, Hurricane Frances, a large, slow-moving, but less-intense system made landfall on the east coast of Florida as a Category 2 storm with peak winds of 105 mph.

In early September, Hurricane Ivan developed just south of where Frances formed, intensifying quickly. Moving through warm ocean waters, the storm reached Category 5 strength three separate times before making landfall as a Category 3 hurricane near Gulf Shores, Alabama.

When Hurricane Jeanne made landfall in Stuart, Florida on September 26, it marked the second time in history that one state was impacted by four hurricanes in one season.

At this point 10 years ago, nine named storms had already formed in the basin, with six reaching hurricane status. In total, 2004 saw 15 named storms, nine of which became hurricanes, including six that reached major hurricane status (Category 3+).

While this hurricane season shares some characteristics with 2004, so far 2014 has been relatively quiet, whereas 2004 went on to become the second-costliest Atlantic hurricane season in history.

Managing the Changing Landscape of Terrorism Risk

RMS has released an updated version of its Probabilistic Terrorism Model, reflecting the considerable changes in terrorism risk for Canada, Denmark, Ireland, Italy, and the U.K., including the decreased frequency of large-scale terrorism events in each of the five countries.

To inform the new view of risk, our scientists carried out a comprehensive analysis of global attack and plot data from the past decade. We focused heavily on large-scale attacks – those with the potential to threaten the solvency of an insurer.

The analysis showed that incidents of large-scale attacks have steadily and significantly decreased, corresponding with a rise in the funding and sophistication of major intelligence agencies in the West.

Our approach to terrorism modeling follows three principles, which have been validated by data on intercepted plots, past successful attacks, and recent intelligence leaks:

  • Effective terrorists seek to achieve optimal results relative to their effort
  • Their actions are highly rational
  • They are highly constrained by pervasive counter-terrorism measures

Among the estimated 200,000 documents taken by Edward Snowden, one of the most relevant to validating the RMS model is an N.S.A. presentation explaining the routing of international telecommunications traffic. A very significant proportion of this traffic is routed through the U.S. and Europe, which, coupled with advances in big data analytics and plummeting data storage costs, has made intelligence collection easier and more robust than it has ever been.

Figure: an N.S.A. PRISM presentation explaining the routing of international telecommunications traffic

According to available data on the frequency of plots and attacks, the risk of a large-scale attack has been in decline since 2007, but the risk of smaller-scale attacks perpetrated by lone-wolf operatives and homegrown militants remains high.

However, we have learned over the past decade that terrorism risk levels are fluid and can change quickly. With the rise of the Islamic State in Iraq and reports of its successful recruitment of foreigners, as well as ongoing instability in Afghanistan and Pakistan, the risk outlook can change at any moment.

The RMS Probabilistic Terrorism Model incorporates multiple risk outlooks to provide users with the agility to quickly respond to any changes in terrorism risk. RMS is committed to updating its terrorism model as frequently as necessary to provide the most up-to-date, granular, and accurate view of global terrorism risk.

Understanding Aftershock Risk: The 10th U.S. National Conference on Earthquake Engineering

Recent earthquakes in New Zealand and Japan show that aftershock risk can be significant, even though this risk is typically not considered explicitly in portfolio risk assessment. It is no secret that large-magnitude earthquakes are generally followed by high numbers of smaller-magnitude earthquakes, and the ground motions from these aftershocks sometimes cause substantial damage to buildings. The science of forward prediction of aftershock hazard is still evolving, and the assessment of building vulnerability under mainshock-aftershock sequences is currently an active topic of research.

To address this issue with scientists and engineers, I organized a special session during the U.S. National Conference on Earthquake Engineering with Dr. Nicolas Luco of the USGS and Dr. Matt Gerstenberger of GNS Science. We invited a number of prominent researchers to discuss:

  • aftershock hazard
  • structural fragility/vulnerability before and after the mainshock
  • change in risk due to aftershocks

Aftershock risk is real, and consumers feel the pain through increased insurance premiums, as observed following the Tohoku earthquake. According to Insurance Insight, “local earthquake premiums are up 25 percent to 50 percent, say, when compared with normal circumstances.” This issue needs to be addressed, and our understanding improved, before another large-magnitude event strikes.

Professor Fusakichi Omori first observed in 1894 that aftershock frequency decreases regularly with time. He developed Omori’s law, which is still used for estimating aftershock risk. In a separate conference session, I presented a paper on the estimation of aftershock risk in Japan following the Tohoku earthquake, which was based on Omori’s law; this work is the basis of the RMS® Japan Earthquake Model update. Currently, the USGS is working on developing aftershock hazard based on the Epidemic-Type Aftershock Sequence (ETAS) model.
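For readers unfamiliar with it, Omori’s law in its modified (Omori-Utsu) form expresses the aftershock rate t days after a mainshock as n(t) = K / (t + c)^p, so the expected number of aftershocks in any time window follows by integrating this rate. A minimal sketch in Python, with illustrative parameter values rather than the calibrated Japan-model parameters:

    import numpy as np

    # Modified Omori (Omori-Utsu) aftershock rate: n(t) = K / (t + c)**p,
    # with t in days after the mainshock. K, c, and p below are
    # illustrative values, not calibrated RMS Japan model parameters.
    K, c, p = 100.0, 0.1, 1.1

    def omori_rate(t_days):
        return K / (t_days + c) ** p

    def expected_count(t_start, t_end, n_steps=10_000):
        # Expected aftershock count over [t_start, t_end] days,
        # via trapezoidal integration of the rate.
        t = np.linspace(t_start, t_end, n_steps)
        r = omori_rate(t)
        dt = t[1] - t[0]
        return float(np.sum((r[:-1] + r[1:]) / 2.0) * dt)

    print(f"Expected aftershocks, days 0-30:   {expected_count(0.0, 30.0):.0f}")
    print(f"Expected aftershocks, days 30-365: {expected_count(30.0, 365.0):.0f}")

The slow power-law decay (p close to 1) is why elevated hazard can persist for months to years after a large mainshock.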

Dr. Ned Field of the USGS, one of the speakers in the session, stated that developing the aftershock model is “one of the strategic-action priorities of the USGS in terms of providing effective situational awareness during hazardous events.”

In the meantime, GNS Science has developed a time-dependent hazard model for the continuing Canterbury earthquake sequence. Dr. Gerstenberger of GNS Science reported that GNS has carried out “broadband ground motion simulations” for a suite of moderate-sized aftershocks in order to develop aftershock hazard for the region.

Dr. Luco, the co-convener of the session, proposed probabilistic risk assessment for “post-earthquake mitigation decisions” following the occurrence of a mainshock (or any other earthquake).

The speakers discussed three different approaches to aftershock hazard calculation and two approaches for estimating the increased collapse probability of buildings due to aftershocks. These approaches can ultimately be synthesized to compute the increased risk of building damage or collapse following earthquakes.

RMS will continue to work alongside our industry colleagues to improve understanding of aftershock risk.