New Storms, New Insights: Two Years After Hurricane Sandy

When people think about the power of hurricanes, they imagine strong winds and flying debris. Wind damage will always accompany hurricanes, but Hurricane Sandy highlighted the growing threat of storm surge as sea levels rise.

While Sandy’s hurricane-force winds were not unusual, the storm delivered an unprecedented storm surge to parts of the Mid-Atlantic and Northeast U.S. In total, Sandy caused insured losses of nearly $20 billion in the U.S., 65 percent of which resulted from surge-driven coastal flooding.

Given the nature and severity of the event, Sandy offered the first real opportunity to validate our hydrodynamic storm surge model, which we released in 2011 and embedded in the RMS U.S. Hurricane Model. We verified the model against more than 300 independent wind and flood observations, the Federal Emergency Management Agency’s (FEMA) 100-year flood zones, and FEMA’s best surge inundation footprint for New York City. The model captured the extent and severity of Sandy’s coastal flooding exceptionally well.

We also conducted extensive analysis of claims data from Sandy, which involved reviewing nearly $3 billion in location-level claims and exposure data across seven lines of business, provided by several companies. The purpose of the study was to deepen our understanding of the impacts of flooding on coastal exposures, particularly for commercial and industrial structures.

What struck us was how vulnerable buildings are to below-ground flooding. In many cases, damage to ground- and basement-level property and contents contributed a much higher proportion of the overall losses than expected, particularly for commercial structures in New York’s central business districts.

This insight has prompted us to improve the flexibility of how losses are modeled for contents and business interruption, specifically for basements. Early next year, we will release an update to our flagship North Atlantic Hurricane Models to provide the most up-to-date view of hurricane risk, with new vulnerability modeling capabilities based on insights gained from Sandy.

The model update includes new location-specific content triggers to enable users to make business interruption loss projections dependent on either contents or building damage, rather than on building damage alone. The model also allows users to assess the impact of multiple basement levels in a building, as well as the total value of contents stored within.
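
To make the content-trigger concept concrete, here is a minimal sketch of the logic; the damage ratios and the damage-to-downtime scaling are entirely hypothetical, and this illustrates the idea rather than the RMS implementation:

```python
# Illustrative sketch only -- not the RMS implementation. The damage
# ratios and the 1.5 scaling factor below are hypothetical placeholders.

def bi_downtime_factor(building_damage: float,
                       contents_damage: float,
                       use_contents_trigger: bool) -> float:
    """Return a business-interruption downtime factor in [0, 1].

    With a contents trigger enabled, BI can be driven by whichever is
    worse: structural damage or contents damage (e.g., flooded basement
    equipment halting operations in a structurally intact building).
    """
    if use_contents_trigger:
        driver = max(building_damage, contents_damage)
    else:
        driver = building_damage
    return min(1.0, driver * 1.5)  # hypothetical damage-to-downtime scaling

# Light structural damage but a flooded basement full of critical
# equipment: the contents trigger captures the longer outage.
print(bi_downtime_factor(0.05, 0.60, use_contents_trigger=False))  # 0.075
print(bi_downtime_factor(0.05, 0.60, use_contents_trigger=True))   # 0.90
```

The effect of the trigger is visible in the two calls: with a building-only trigger, a flooded basement barely registers in the downtime estimate; with a contents trigger, it drives it.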

The claims data analysis also highlighted the importance of using high-resolution data to model high-gradient perils such as coastal flooding. Flood losses are extremely sensitive to the precise locations of coastal exposures, as well as to surrounding topographic and bathymetric features. Using high-quality data with location-level specificity across a variety of building characteristics, together with a high-resolution storm surge model that can accurately capture the flow of water around complex coastlines and local terrain, minimizes this uncertainty.

At this time, RMS remains the only catastrophe modeling firm to integrate a hydrodynamic, time-stepping storm surge model into its hurricane models to represent the complex interactions of wind and water throughout a hurricane’s life-cycle, and we continue to implement lessons learned from new storms.

Betting on Mother Nature

When you consider lofty odds like the chance you will win the Mega Millions lottery (1 in 259 million) or the chance you will be struck by lightning (1 in 280,000), it blunts your appreciation of very unlikely, if not statistically improbable, events.

Consider something with 500-to-1 odds that we recently witnessed in nature: two tropical systems struck Bermuda within six days of each other, first Tropical Storm Fay and then Gonzalo, a borderline Category 2/3 hurricane. According to our modeling, the chance of two tropical cyclones hitting what amounts to a tiny dot in the middle of the Atlantic Ocean in the same year was 1-in-500.

Not statistically improbable, but certainly not a normal occurrence.
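
As a rough sanity check on odds of this order, one can treat Bermuda strikes as a Poisson process. The annual strike rate below is an assumption chosen for illustration; the actual 1-in-500 figure comes from RMS stochastic track modeling:

```python
import math

# Hypothetical annual rate of tropical-cyclone strikes on Bermuda; the
# actual figure would come from a full stochastic track model.
lam = 0.065  # roughly one strike every 15 years (assumption)

# Poisson probability of two or more strikes in one season:
# P(N >= 2) = 1 - P(0) - P(1) = 1 - exp(-lam) * (1 + lam)
p_two_or_more = 1 - math.exp(-lam) * (1 + lam)
print(f"P(>=2 strikes in a season): about 1 in {1 / p_two_or_more:,.0f}")
```

Note that the independence assumption behind the Poisson view is itself shaky: persistent steering patterns can cluster storms in time, which is exactly what made this pair noteworthy.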

Even more impressive, this natural fluke comes after a relatively inactive hurricane season: there have been seven named storms in 2014, six of which have reached hurricane status.

The climatological peak of the Atlantic season for hurricanes is mid-September and generally associated with storms that develop as they cross from Africa toward the Caribbean. However, there is a secondary peak in October related to storms developing closer to the U.S. in areas such as the Caribbean Sea and the Gulf of Mexico.

The cumulative intensity of this season’s storms is lower than average. Our research attributes this to cooler-than-normal sea-surface temperatures in the Atlantic reducing available energy, along with higher-than-normal sea-level pressures suppressing thunderstorm activity. It would be interesting to see what the odds of both Gonzalo and Fay hitting Bermuda would have been had we factored in the slow hurricane season.
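
One standard measure of “cumulative intensity” is Accumulated Cyclone Energy (ACE), which sums the squares of a storm’s six-hourly maximum sustained winds. A minimal sketch, using made-up wind values rather than actual advisories:

```python
# Accumulated Cyclone Energy (ACE): sum of squared six-hourly maximum
# sustained winds (in knots, at tropical-storm strength or above) over
# a storm's life, scaled by 1e-4. Wind values below are invented.

def storm_ace(six_hourly_max_winds_kt):
    return sum(v ** 2 for v in six_hourly_max_winds_kt if v >= 35) * 1e-4

gonzalo_like = [40, 55, 70, 95, 110, 125, 120, 100, 75]  # hypothetical track
print(f"ACE contribution: {storm_ace(gonzalo_like):.2f}")
```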



Are fears of a global Ebola pandemic warranted?

Ebola is a hot topic in the media right now, with multiple cases being reported outside of West Africa and much confusion among the general public around the reality of the danger. So, are the fear and sensationalism warranted?

RMS models infectious diseases and recently developed the world’s first probabilistic model for the current West African Ebola outbreak. While Ebola is indeed a frightening and deadly disease, with a case fatality rate between 69 and 73 percent according to the WHO, RMS modeling shows that the outbreak is unlikely to become a significant global threat.

The spread of Ebola in West Africa is due in part to misconceptions and fear surrounding the disease, as well as a lack of basic public health practices. Ebola is transmitted solely via bodily fluids, so the risk of unknowingly contracting the disease is low.

Fear is prevalent among some West African communities that Ebola is a lie or is being used purposefully to wipe out certain ethnic groups, causing them to hide sick family members from healthcare and aid workers. Customary burial practices, in which family members kiss and interact with the dead, also have contributed to Ebola’s spread. Getting the populace in these countries to trust foreigners who are telling them to abandon their customs has been an uphill struggle.

In more developed countries, where health care is more advanced and better understood, the chances of transmission are far smaller because strict containment measures are taken. Controlling the spread of the disease comes down to a question of logistics: if the medical community can manage the existing cases and trace the contacts made with carriers, further spread is much less likely. For example, the case in Texas can be contained as long as every person who came into contact with the patient is tracked.

There is also a (speculative) fear of the virus mutating into an airborne pathogen; the fact is, the chances of the virus changing the way it is transmitted, from fluid contact to airborne passage, are very low and of a similar order of magnitude to the chance of emergence of a different highly virulent novel pathogen.

Vincent Racaniello, a prominent virologist at Columbia University, wrote:

“When it comes to viruses, it is always difficult to predict what they can or cannot do. It is instructive, however, to see what viruses have done in the past, and use that information to guide our thinking. Therefore, we can ask: has any human virus ever changed its mode of transmission? The answer is no. We have been studying viruses for over 100 years, and we’ve never seen a human virus change the way it is transmitted.”

The tipping point in the modeling of a virus like Ebola is the point at which the resources deployed to mitigate the threat outpace the increase in new cases. Getting ahead of the epidemic is a race against a moving target, but as long as people get into treatment centers, progress will be made in containing the illness.
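
A toy branching-process view (a sketch, not the RMS probabilistic model) shows why this tipping point behaves like a switch: once isolation and treatment push the effective reproduction number below one, case counts decay instead of growing:

```python
# Toy branching-process view of the epidemic tipping point -- not the
# RMS model. R0 and the isolation fractions below are illustrative.

def project_cases(initial_cases, r0, isolated_fraction, generations):
    """Project cases per generation; isolated cases transmit nothing."""
    r_eff = r0 * (1 - isolated_fraction)
    cases = [float(initial_cases)]
    for _ in range(generations):
        cases.append(cases[-1] * r_eff)
    return [round(c, 1) for c in cases]

# R0 near 1.5-2 has been estimated for this outbreak; 1.8 is used here
# purely for illustration.
print(project_cases(100, 1.8, isolated_fraction=0.3, generations=5))  # grows
print(project_cases(100, 1.8, isolated_fraction=0.5, generations=5))  # decays
```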

So, while Ebola is a frightening and dangerous illness, it is not something we expect to become a global pandemic. However, although the current outbreak is not expected to spread significantly beyond West Africa, it still has the potential to be the deadliest infectious disease event in a century and could have drastic economic consequences for the communities suffering outbreaks. In fact, the economic impacts are likely to be worse than the direct impacts of the disease, owing to disruptions to trade and inter-community relations.

The key is to contain it where it is, reach the tipping point as quickly as possible, and to promote safety around existing infected persons. Through travel control measures and the development of several new drugs to combat the virus, the danger of epidemic should be drastically reduced in Africa and, as a result, the rest of the world.

Your Excellent Questions On Earthquakes

Today marks the 25th anniversary of the magnitude 6.9 Loma Prieta earthquake which rocked California’s San Francisco Bay Area on October 17, 1989. To commemorate the anniversary and raise awareness about resilience against earthquakes, Dr. Robert Muir-Wood, RMS chief research officer, and Dr. Patricia Grossi, RMS senior director of global earthquake modeling, hosted a Reddit Science AMA (Ask Me Anything).

They discussed a number of topics; participants were curious not only about routine details, such as the best immediate action to take in a quake, but also about which fault lines are at risk and the finer points of earthquake insurance.

Here are just a few of the subjects they tackled in a conversation that generated close to 200 comments by Thursday afternoon – you can also read the entire Reddit thread.

Is the Bay Area better prepared [now] than it was for the Loma Prieta quake? What role have you (or other scientists) played in planning?

Grossi: There’s been a lot of work by PG&E, BART, and other agencies to mitigate earthquake risk – as well as the new span of the Bay Bridge. In addition, the California Earthquake Authority has been encouraging mitigation – and offers mitigation incentives if you retrofit your home to withstand earthquake ground shaking. Scientists can help by creating strategic plans or performing cost-benefit analyses for mitigation and retrofit.

Is there a link between fracking and earthquakes?

Muir-Wood: The term ‘earthquake’ can cover an enormous range of sizes of energy release. Fracking may sometimes trigger small shallow earthquakes or tremors. One day there might be a bigger earthquake nearby and people will argue over whether it was linked to the fracking. The link, however, will remain tenuous.

Am I being overcharged for earthquake insurance? I was charged $1,500 a year with a 15 percent deductible.

Grossi: Premiums associated with the coverage seem high (earthquake coverage generally doubles premiums here in California). However, they reflect risk-based pricing. The coverage is meant to be ‘minimum’ coverage – providing protection for the worst-case scenario.

Is Tokyo due for another big earthquake?

Muir-Wood: The Big One happened beneath Tokyo in 1923, and before that a similar Big One (not quite on the same fault) occurred in 1703. The 1923 earthquake is not so likely to come around again. However, there was a M7 earthquake in 1855 that occurred right under Tokyo and may be the type of damaging earthquake we can expect. It could do a lot of damage.

 

Was there anything we missed that you wanted to discuss? Please let us know in the comments.

The Next Big One: Expert Advice On Planning For The Inevitable

The 25th anniversary of the Loma Prieta earthquake provides an opportunity to remember and reflect on what we lost. It also offers an opportunity to think about how we can better plan and prepare for an inevitable earthquake on the Bay Area’s precarious fault lines.

While we can’t accurately predict when an earthquake will strike, we can say there’s more at risk now than there was 25 years ago: the Bay Area’s population has grown 25 percent, and the value of residential property is now $1.2 trillion. A worst-case, magnitude 7.9 earthquake on the San Andreas Fault could strike an urban center with 32 times the destructive force of Loma Prieta, potentially causing commercial and residential property losses of over $200 billion.
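
The “32 times” figure follows from the standard magnitude-energy relation, under which radiated seismic energy scales as 10^(1.5·M); a quick check:

```python
# Radiated seismic energy scales as 10^(1.5 * M), so the energy ratio
# between two magnitudes is 10^(1.5 * (M1 - M2)).
m_big_one, m_loma_prieta = 7.9, 6.9
ratio = 10 ** (1.5 * (m_big_one - m_loma_prieta))
print(f"Energy ratio: {ratio:.1f}x")  # ~31.6x, i.e. roughly 32 times
```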

As part of our activities around the Loma Prieta anniversary, we gathered experts at a roundtable to discuss how to improve resilience in the Bay Area. Here are some of their lessons and observations:

Patrick Otellini, Chief Resilience Officer, San Francisco
Think about people when crafting public policy:

Preparing for an earthquake is an enormous task. San Francisco is working to retrofit 4,800 buildings during the next seven years. You have to get the right people at the table when crafting policy changes and understand how citizens will be affected. There needs to be a dual focus: protect the public interest while building consensus on changes that protect safety and health.

Dr. Patricia Grossi, Earthquake expert and senior director of product model management, RMS
Don’t shortchange risk modeling:

Risk modeling helps us assess how well we are planning for the next big event, highlights uncertainties, and leads to more thorough preparation. But the analysis shouldn’t just consider dollar signs; it should examine the worst-case scenario and what an earthquake would do to our lives in the immediate days and weeks afterward.

Kristina Freas, Director of Emergency Preparedness, Dignity Health
Retrofit hospitals and prepare to help the most vulnerable:

Hospitals are little cities. The same issues with supplies and logistics affecting metropolitan areas in a disaster would affect hospitals. Hospitals need to have plans to mitigate damages from water and power loss and protect patients.

Danielle Hutchings Mieler, Resilience Program Coordinator for the Association of Bay Area Governments
Bridge the private and public gap in infrastructure repair:

There’s been progress in retrofitting public buildings. But many private facilities – homes, businesses and private schools – are vulnerable. This is problematic because the Bay Area is growing in areas like the shoreline, which are close to fault lines and at greater risk. Work is needed to ensure that all types of buildings – both private and public – are well prepared and sturdy.

Lewis Knight, planning and urban design practice leader, Gensler
Think differently about infrastructure and retrofitting:

Many engineering firms answer to Wall Street and big infrastructure interests. They aren’t truly considering the changes that need to be made to protect communities exposed to both earthquake risk and climate change. There need to be frank discussions about how infrastructure can be part of a defense against natural disasters.

What else is crucial to consider when thinking about the next earthquake?

Infographic: When the "Big One" Hits

The Need for Preparation and Resiliency in the Bay Area

With the August 24, 2014, M6.0 Napa earthquake, the San Francisco Bay Area was reminded of the importance of preparing for the next significant earthquake. The largest earthquake in recent memory in the Bay Area is the 1989 Loma Prieta earthquake, but in the event of a future earthquake, the property and people at risk are greater than ever. Since 1989, the population of the region has grown 25 percent, along with the value of property at risk, and according to the United States Geological Survey, there is a 63 percent chance that a magnitude 6.7 or larger earthquake will hit the Bay Area in the next 30 years.
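
For intuition, that 30-year probability can be converted into an equivalent annual rate under a simple Poisson-occurrence assumption (a sketch, not the USGS methodology):

```python
import math

# USGS: 63% chance of an M6.7+ Bay Area earthquake within 30 years.
# Under a Poisson assumption, P(at least one in T years) = 1 - exp(-lam * T).
p_30yr, horizon = 0.63, 30
lam = -math.log(1 - p_30yr) / horizon
print(f"Equivalent annual rate: {lam:.3f} "
      f"(about a 1-in-{1 / lam:.0f}-year event)")
```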

The next major earthquake could strike anywhere – potentially closer to urban centers than the 1989 Loma Prieta event. As part of the commemoration of the 25th anniversary of the earthquake, RMS has developed a timeline of how events could unfold in a worst-case scenario impacting the entire Bay Area region.

In the “Big One’s” Aftermath


This black swan scenario is extreme and is meant to get the stakeholders in the earthquake risk management arena to consider long-term ramifications of very uncertain outcomes. According to RMS modeling, a likely location of the next big earthquake to impact the San Francisco Bay area is on the Hayward fault, which could reach a magnitude of 7.0. An event of this size could cause hundreds of billions of dollars of damage, with only tens of billions covered by insurance. Without significant earthquake insurance penetration to facilitate rebuilding, the recovery from a major earthquake will be significantly harder. A cluster of smaller earthquakes could also impact the area, which, sustained over months, could have serious implications for the local economy.

While the Bay Area has become more resilient to earthquake damage, we are still at risk from a significant earthquake devastating the region. Now is the time for Bay Area residents to come together to develop innovative approaches and ensure resilience in the face of the next major earthquake.

RMS To Launch Global Tsunami Scenario Catalog

The 2011 Tohoku earthquake and its accompanying mega-tsunami highlighted how a single magnitude 9.0 (Mw9) event can impact multiple regions and lines of business. The size of the earthquake had been considered beyond what was possible on that plate boundary, yet there are many areas worldwide where a massive earthquake and accompanying tsunami could impact coastal exposures over a very wide area.

Global coastal exposure is increasing rapidly, including port cities, refineries, power plants, hotels, and beach resorts. In regions around the Pacific and parts of the Indian and Atlantic Oceans, some of these exposure accumulations are on the front line of risk from the mega-tsunamis that would accompany magnitude 9.0 (Mw9) earthquakes.

Later this year, RMS will release a Global Tsunami Scenario Catalog to provide (re)insurers with a broad and relevant set of tsunami scenarios that include both local and ocean-wide impacts. The tsunami scenarios have been generated by modeling the fault rupture and sea-floor deformation associated with earthquakes on the principal subduction zones worldwide, with magnitudes ranging from M8.0 to M9.5.

For each scenario, the tsunami is modeled in three stages: (a) the initial water-level changes caused by sudden movements of the seafloor, (b) tsunami wave propagation, and (c) the flooding inundation of coastlines.
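
To give a feel for stage (b), here is a toy one-dimensional linear shallow-water sketch, in which the wave speed depends on water depth as √(g·h). The grid, depth profile, and initial wave are invented, and real propagation models are two-dimensional and far more sophisticated:

```python
import numpy as np

# Toy 1-D linear shallow-water propagation -- an illustration of stage
# (b), not the RMS model. Domain, depth profile, and initial hump are
# made up; dt satisfies the CFL condition for sqrt(g * max depth).
g = 9.81
nx, dx, dt = 500, 1000.0, 2.0            # 500 km domain, 1 km cells, 2 s steps
depth = np.linspace(4000.0, 50.0, nx)     # ocean shoaling toward the coast

x = np.arange(nx) * dx
eta = np.exp(-((x - 100e3) / 2e4) ** 2)   # initial 1 m hump, 100 km offshore
u = np.zeros(nx)                          # depth-averaged velocity

for _ in range(3000):
    # Momentum: du/dt = -g * d(eta)/dx ; continuity: d(eta)/dt = -d(h*u)/dx
    u[1:] -= dt * g * (eta[1:] - eta[:-1]) / dx
    eta[:-1] -= dt * (depth[1:] * u[1:] - depth[:-1] * u[:-1]) / dx

print(f"Max water level in the nearshore 50 km: {eta[-50:].max():.2f} m")
```

Even this toy exhibits the qualitative behavior that matters for stage (c): as the water shallows toward the coast, the wave slows and its amplitude grows.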

For each scenario, the tsunami flood is represented as the elevation of the water level at each onshore location in the tsunami’s path. The data also include the maximum expected inundation depth, so that users can estimate the level of destruction to different building categories. The tsunami modeling capability has been extensively tested to show that the method reproduces the coastal water heights observed in recent tsunamis.
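
The translation from inundation depth to damage is typically expressed through depth-damage curves; here is a minimal sketch with hypothetical curves (not RMS vulnerability functions):

```python
import numpy as np

# Hypothetical depth-damage curves -- illustrative only. Each maps
# inundation depth (m) to a mean damage ratio via linear interpolation.
CURVES = {
    "light_wood_frame":    ([0.0, 0.5, 1.0, 2.0, 4.0],
                            [0.0, 0.15, 0.40, 0.75, 1.0]),
    "reinforced_concrete": ([0.0, 0.5, 1.0, 2.0, 4.0],
                            [0.0, 0.05, 0.15, 0.35, 0.6]),
}

def mean_damage_ratio(building_category: str, depth_m: float) -> float:
    depths, ratios = CURVES[building_category]
    return float(np.interp(depth_m, depths, ratios))

print(mean_damage_ratio("light_wood_frame", 1.5))     # 0.575
print(mean_damage_ratio("reinforced_concrete", 1.5))  # 0.25
```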

A key element of the work to create the new Global Tsunami Scenario Catalog involved identifying where Mw9 earthquakes have the potential to occur, and hence which coastal regions are at risk from mega-tsunamis. These regions include cities with high insurance penetration such as Hong Kong and Macao, the main Taiwanese port of Kaohsiung, the island of Barbados, and Muscat, Oman. Our research also shows that a mega-tsunami as large as Tohoku’s could occur in the Eastern Mediterranean; in fact, one was generated in this region in 365 A.D. A repeat of such a tsunami could impact a wide stretch of coastal cities, from Alexandria, Egypt to Kalamata, Greece and Antalya, Turkey.

The Tohoku earthquake and tsunami surprised the world because they occurred on a plate boundary that was not considered capable of producing a giant earthquake. The lessons from Tohoku should be applied to other ‘dormant’ subduction-zone plate boundaries worldwide where M9 earthquakes have the potential to occur even though none has been experienced in the past few hundred years. The region-wide loss correlations associated with some of these events have the potential to affect multiple lines of property and marine exposure in diverse coastal locations, potentially spanning several countries in a single loss. (Re)insurers wishing to manage their regional coastal exposures should be testing their exposure accumulations against a credible set of the largest-scale earthquake and tsunami scenarios.

RMS and 100 Resilient Cities at the Clinton Global Initiative

I’ve just returned from the Clinton Global Initiative (CGI) annual meeting in New York. Every September, political, corporate, and non-profit leaders from around the world gather to discuss pressing challenges, form partnerships, and make commitments to action. It was inspiring to see the tremendous work already being done and the new commitments being made across a wide range of issues, from containing the Ebola epidemic, to increasing access to education, to combatting climate change, to helping Haiti develop a self-sustaining economy.

One prevailing theme at the event this year was the importance of cross-sector partnerships in successfully tackling such complex issues. Not surprisingly, data crunched by the CGI team on commitments made over the past 10 years shows that partnerships succeed at a markedly higher rate than go-it-alone approaches.

In this spirit, we announced an RMS commitment last week to partner with the Rockefeller Foundation’s 100 Resilient Cities initiative to help increase the resilience of cities around the world. We will make our RMS(one) platform and our catastrophe models available to cities in the 100RC network so that they can better understand their exposures, assess their risk from catastrophic events and climate change, and prioritize investments in mitigating and managing that risk.

As the saying goes, “if you can measure it, you can manage it.” From our 25 years of experience helping the insurance industry better measure and then manage catastrophe risk, we believe there is a largely untapped opportunity for the public sector to similarly leverage exposure management and catastrophe modeling technology to establish more informed policies for managing risk and increasing resilience in cities throughout the world, both in developed and emerging economies.

It was also clear this week that the conversation in corporate boardrooms is increasingly moving beyond a sole focus on the financial bottom line toward also having a positive impact on the world in a way that is strategically aligned with the core mission of the business.

Our partnership with 100RC, along with the partnerships with the UNISDR and the World Bank that we announced this summer, is another step in our own version of this journey. Through both our direct philanthropic support of Build Change and their admirable work to improve construction practices in developing countries and through the leveraging of our technology and the expertise of our colleagues to help the public sector, we are aligning all of our activities in support of our core mission to increase the resilience of our society.

Many of our clients have shared with us that they are on similar journeys, building on traditional support for local organizations to implement more strategic programs with broader impact and employee engagement. In particular, the insurance industry is uniquely positioned to understand the value of proactively investing in mitigation and in increasing resilience, instead of waiting until a tragedy has occurred and all that can be done is to support humanitarian response efforts.

With this common frame of reference, we look forward to increasingly partnering with our clients in the coming years not just to help them manage their own risk but to collectively help increase resilience around the world.

Matching Modeled Loss Against Historic Loss in European Windstorm Data

To be Solvency II compliant, re/insurers must validate the models they use, which can include comparisons to historical loss experience. In working toward model validation, companies may find that their experience of European windstorm losses does not match the modeled loss. This seeming discrepancy does not necessarily mean something is wrong with the model or with the company’s loss data: the underlying timelines of the two datasets may simply differ, which can have a significant influence for a variable peril like European windstorm.

Most re/insurers’ claims records only date back 10 to 20 years, whereas European windstorm models use much longer datasets, generally up to 50 years of the hazard. Over this shorter term, the last 15 years have represented a relative lull in windstorm activity, particularly when compared to the more extreme events of the very active 1980s and 1990s.

[Figure: Netherlands windstorm variability]

RMS has updated its European windstorm model specifically to support Solvency II model validation. The enhanced RMS model includes the RMS reference view, which is based on the most up-to-date, long-term historical record, as well as a new shorter historical dataset that is based on the activity of the last 25 years.

By using the shorter-term view, re/insurers gain a deeper understanding of how historical variability can impact modeled losses. Re/insurers can also perform a like-for-like validation of the model against their loss experience, and develop confidence in the model’s core methodology and data. Alternate views of risk also support a deeper understanding of risk uncertainty, which enhances model validation and provides greater confidence in the models that are used for risk selection and portfolio management.
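
The timeline effect is easy to demonstrate with simulated numbers (illustrative draws, not actual European windstorm losses): the same loss-generating process, viewed over a long record versus a recent quiet window, yields very different average annual losses:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated annual windstorm losses (arbitrary units) -- illustrative
# only. A heavier-tailed draw mimics the active 1980s/90s; the "recent"
# window is sampled from a deliberately quieter regime.
active_years = rng.lognormal(mean=3.0, sigma=1.0, size=35)
lull_years = rng.lognormal(mean=2.4, sigma=0.8, size=15)
all_years = np.concatenate([active_years, lull_years])

print(f"Long-term (50-yr) average annual loss: {all_years.mean():.1f}")
print(f"Recent (15-yr) average annual loss:    {lull_years.mean():.1f}")
# Validating only against the recent window would suggest the long-term
# model "overstates" the risk, when the timelines simply differ.
```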

Beyond Solvency II validation, the model also empowers companies to explore the hazard variability, which is vitally important for a variable peril like European windstorm. If a catastrophe model and a company rely on different but equally valid assumptions, the model can present a different perspective to provide a more complete view of the risk.

Serial Clustering Activity around the Baja Peninsula during September 2014

In the past two weeks, two major hurricanes have impacted the Baja Peninsula in Mexico. Hurricane Norbert bypassed a large portion of the west coast of the peninsula from September 5 to 7, and Hurricane Odile made landfall near Cabo San Lucas on September 14 as a Category 3 hurricane on the Saffir-Simpson Wind Scale. A third system, Hurricane Polo, formed on Tuesday, September 16, and is forecast to follow a track similar to Norbert’s and Odile’s, making it the third such tropical cyclone to develop in the region since the beginning of the month.

This serial cluster of storms has been driven primarily by steady, favorable conditions for tropical cyclone development and consistent atmospheric patterns over the Eastern Pacific. A serial cluster is a set of storms that form in the same part of a basin and follow one another in an unbroken sequence over a relatively short period of time. To qualify as a cluster, there must be measurable consistency between the tracks, typically the result of steady, predominant atmospheric steering currents, which play a major role in determining the speed and direction of tropical cyclones. One example of a serial cluster is the four major hurricanes (Charley, Frances, Ivan, and Jeanne) that impacted Florida during a six-week period in 2004.
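
One crude way to quantify “measurable consistency between tracks” (an illustrative metric of our own, not an RMS definition) is the mean great-circle separation between storms compared at matching points along their tracks:

```python
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def mean_track_separation(track_a, track_b):
    """Average separation at matching positions along equal-length tracks."""
    return sum(haversine_km(p, q) for p, q in zip(track_a, track_b)) / len(track_a)

# Rough, made-up (lat, lon) waypoints loosely inspired by Norbert and Odile.
norbert_like = [(15.0, -105.0), (18.0, -108.0), (21.0, -111.0), (24.0, -113.0)]
odile_like = [(14.5, -104.0), (17.5, -107.0), (20.5, -110.0), (23.5, -112.5)]
print(f"Mean separation: {mean_track_separation(norbert_like, odile_like):.0f} km")
```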

During this recent two-week period, the area off the west coast of Mexico has maintained high sea-surface temperatures near 85.1 degrees Fahrenheit and limited vertical wind shear, creating an active tropical development region. A mid-level atmospheric ridge over northern Mexico has provided a consistent steering pattern toward the north-northwest, producing the similar observed tracks of Norbert and Odile and the forecast track for Polo.

Devastating amounts of rainfall have accompanied these storms. Hurricane Odile dropped nearly 18 inches of rain in areas around Cabo San Lucas, nearly 21 months’ worth of typical rainfall. The cluster, while generating significant wind and flood damage along the Baja Peninsula, has also caused torrential rainfall in the southwestern U.S., including Arizona, southern Nevada, and southern California. Last week, Phoenix, AZ, one of the hardest-hit areas, received over three inches of rain in a seven-hour span from the remnants of Hurricane Norbert. That was the most rain to fall on the city in a 24-hour period since 1911, an event the National Oceanic and Atmospheric Administration estimates as a 1-in-200-year occurrence. Significant rainfall and inland flooding are forecast to continue as the remnants of Odile and Polo move inland, which may lead to widespread flood losses and the potential for compound post-event loss amplification.