Are fears of a global Ebola pandemic warranted?

Ebola is a hot topic in the media right now, with multiple cases reported outside West Africa and much public confusion about the real level of danger. So, are the fear and sensationalism warranted?

RMS models infectious diseases and recently developed the world’s first probabilistic model for the current West African Ebola outbreak. While Ebola is indeed a very scary and relatively deadly disease, with a case fatality rate between 69 and 73 percent according to the WHO, RMS modeling shows that it is unlikely the outbreak will become a significant threat globally.

The spread of Ebola in West Africa is due in part to misconceptions and fear surrounding the disease, as well as a lack of established public health practices. Because Ebola is transmitted solely via bodily fluids, the risk of unknowingly contracting the disease is low.

A belief persists in some West African communities that Ebola is a hoax, or that it is being used deliberately to wipe out certain ethnic groups, leading families to hide sick relatives from healthcare and aid workers. Customary burial practices, in which family members kiss and interact with the dead, have also contributed to Ebola's spread. Persuading the populace in these countries to trust foreigners who are telling them to abandon their customs has been an uphill struggle.

In more developed countries, where health care is more advanced and better understood, the chances of transmission are far smaller because stringent containment measures are taken. Controlling the spread of the disease comes down to a question of logistics: if the medical community can manage existing cases and trace the contacts of carriers, spread is much less likely. For example, the case in Texas can be contained as long as every person who came into contact with the patient is tracked.

There is also a speculative fear of the virus mutating into an airborne pathogen. In fact, the chances of the virus changing its mode of transmission, from fluid contact to airborne passage, are very low – of a similar order of magnitude to the chance that a different, highly virulent novel pathogen emerges.

Vincent Racaniello, a prominent virologist at Columbia University, wrote:

“When it comes to viruses, it is always difficult to predict what they can or cannot do. It is instructive, however, to see what viruses have done in the past, and use that information to guide our thinking. Therefore, we can ask: has any human virus ever changed its mode of transmission? The answer is no. We have been studying viruses for over 100 years, and we’ve never seen a human virus change the way it is transmitted.”

The tipping point in modeling a virus like Ebola is the point at which the resources deployed to mitigate the threat outpace the growth in new cases. Getting ahead of the epidemic is a race against a moving target, but as long as people get into treatment centers, progress will be made in outpacing the illness.
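
To make the tipping-point idea concrete, here is a minimal sketch of the dynamic (a toy illustration only, not the RMS model; the growth factor, per-bed effect, and capacity numbers are invented assumptions):

```python
# Toy sketch: weekly case growth vs. expanding treatment capacity.
# All parameters are illustrative assumptions, not RMS model values.

r_uncontrolled = 1.3          # weekly growth factor with no intervention (assumed)
reduction_per_bed = 0.0005    # growth reduction per treatment bed (assumed)
beds, bed_growth = 500, 1.15  # initial beds and weekly capacity growth (assumed)
cases = 1000.0

for week in range(1, 31):
    # Effective growth falls as more patients are isolated in treatment centers
    r_eff = max(r_uncontrolled - reduction_per_bed * beds, 0.5)
    cases *= r_eff
    beds *= bed_growth
    if r_eff < 1.0:
        print(f"Tipping point in week {week}: capacity now outpaces "
              f"new infections ({cases:.0f} active cases).")
        break
else:
    print(f"No tipping point within 30 weeks; cases at {cases:.0f}.")
```

The tipping point arrives when the effective growth factor drops below one – that is, when resources are being added faster than the caseload is growing.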

So, while Ebola is a very scary and dangerous illness, it is not something we expect to become a global pandemic. However, while the current outbreak is not expected to spread significantly beyond West Africa, it still has the potential to be the deadliest infectious disease outbreak in a century, and it could have drastic economic impacts on the communities that suffer Ebola outbreaks. In fact, the economic impacts are likely to be worse than the direct impacts of the disease, owing to disruptions to trade and inter-community relations.

The key is to contain the disease where it is, reach the tipping point as quickly as possible, and promote safe practices around infected persons. Through travel control measures and the development of several new drugs to combat the virus, the danger of epidemic should be drastically reduced in Africa and, as a result, the rest of the world.

Your Excellent Questions On Earthquakes

Today marks the 25th anniversary of the magnitude 6.9 Loma Prieta earthquake, which rocked California's San Francisco Bay Area on October 17, 1989. To commemorate the anniversary and raise awareness about earthquake resilience, Dr. Robert Muir-Wood, RMS chief research officer, and Dr. Patricia Grossi, RMS senior director of global earthquake modeling, hosted a Reddit Science AMA (Ask Me Anything).

They discussed a number of topics; participants were curious not just about routine details, like the best immediate action in the event of a quake, but also about which fault lines are at risk and the finer points of earthquake insurance.

Here are just a few of the subjects they tackled in a conversation that generated close to 200 comments by Thursday afternoon – you can also read the entire Reddit thread.

Is the Bay Area better prepared [now] than it was for the Loma Prieta quake? What role have you (or other scientists) played in planning?

Grossi: There's been a lot of work by PG&E, BART, and other agencies to mitigate earthquake risk – as well as the new span of the Bay Bridge. In addition, the California Earthquake Authority has been encouraging mitigation and offers incentives if you retrofit your home to withstand earthquake ground shaking. Scientists can help by creating strategic plans or performing cost-benefit analyses for mitigation and retrofit.

Is there a link between fracking and earthquakes?

Muir-Wood: The term ‘earthquake’ can cover an enormous range of sizes of energy release. Fracking may sometimes trigger small shallow earthquakes or tremors. One day there might be a bigger earthquake nearby and people will argue over whether it was linked to the fracking. The link, however, will remain tenuous.

Am I being overcharged for earthquake insurance? I was charged $1,500 a year with a 15 percent deductible.

Grossi: Premiums for the coverage seem high (as premiums generally are here in California), but they reflect risk-based pricing. The coverage is meant to be a 'minimum' coverage – providing protection for the worst-case scenario.

Is Tokyo due for another big earthquake?

Muir-Wood: The Big One happened beneath Tokyo in 1923, and before that a similar Big One (not quite on the same fault) occurred in 1703. The 1923 earthquake is not likely to recur soon. However, an M7 earthquake occurred right under Tokyo in 1855, and that may be the type of damaging earthquake we can expect; it could do a lot of damage.

Was there anything we missed you wanted to discuss? Please let us know in the comments. 

The Next Big One: Expert Advice On Planning For The Inevitable

The 25th anniversary of the Loma Prieta earthquake provides an opportunity to remember and reflect on what we lost. It also offers a chance to think about how we can better plan and prepare for an inevitable earthquake on the Bay Area's precarious fault lines.

While we can't accurately predict when an earthquake will strike, we can say there's more at risk now than there was 25 years ago: the Bay Area's population has grown 25 percent, and the value of residential property is now $1.2 trillion. A worst-case, magnitude 7.9 earthquake on the San Andreas Fault could strike an urban center with 32 times the destructive force of Loma Prieta, potentially causing commercial and residential property losses of over $200 billion.
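
The "32 times" figure follows from the standard relationship between earthquake magnitude and radiated seismic energy, under which each whole unit of magnitude corresponds to roughly a 32-fold increase in energy:

```latex
% Radiated seismic energy scales as log10(E) = 1.5 M + const, so:
\frac{E_{7.9}}{E_{6.9}} = 10^{1.5\,(7.9 - 6.9)} = 10^{1.5} \approx 31.6 \approx 32
```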

As part of our activities around the Loma Prieta anniversary, we gathered experts at a roundtable to discuss how to improve resilience in the Bay Area. Here are some of their lessons and observations:

Patrick Otellini, Chief Resilience Officer, San Francisco
Think about people when crafting public policy:

Preparing for an earthquake is an enormous task. San Francisco is working to retrofit 4,800 buildings during the next seven years. You have to get the right people at the table when crafting policy changes and understand how citizens will be affected. There needs to be a dual focus: protect the public interest while building consensus on changes that protect safety and health.

Dr. Patricia Grossi, Earthquake expert and senior director of product model management, RMS
Don't shortchange risk modeling:

Risk modeling helps us assess how we are planning for the next big event, highlights uncertainties and leads to thorough preparation. But any analysis shouldn’t just consider dollar signs; it should analyze the worst-case scenario and what an earthquake would do to our lives in the immediate days and weeks after.

Kristina Freas, Director of Emergency Preparedness, Dignity Health
Retrofit hospitals and prepare to help the most vulnerable:

Hospitals are little cities. The same supply and logistics issues that affect metropolitan areas in a disaster would affect hospitals. Hospitals need plans to mitigate damage from water and power loss and to protect patients.

Danielle Hutchings Mieler, Resilience Program Coordinator for the Association of Bay Area Governments
Bridge the private and public gap in infrastructure repair:

There’s been progress in retrofitting public buildings. But many private facilities – homes, businesses and private schools – are vulnerable. This is problematic because the Bay Area is growing in areas like the shoreline, which are close to fault lines and at greater risk. Work is needed to ensure that all types of buildings – both private and public – are well prepared and sturdy.

Lewis Knight, planning and urban design practice leader, Gensler
Think differently about infrastructure and retrofitting:

Many engineering firms answer to Wall Street and big infrastructure interests. They aren't truly considering the changes needed to protect communities affected by both earthquake risk and climate change. There need to be frank discussions about how infrastructure can be part of a defense against natural disasters.

What else is crucial to consider when thinking about the next earthquake?

Infographic: When the "Big One" Hits

The Need for Preparation and Resiliency in the Bay Area

With the recent August 24, 2014, M6.0 Napa earthquake, the San Francisco Bay Area was reminded of the importance of preparing for the next significant earthquake. The largest earthquake in recent memory in the Bay Area is the 1989 Loma Prieta earthquake, but in a future earthquake the impacts on property and people at risk would be higher than ever. Since 1989, the population of the region has grown 25 percent, along with the value of property at risk, and according to the United States Geological Survey there is a 63 percent chance that a magnitude 6.7 or larger earthquake will hit the Bay Area in the next 30 years.
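
For readers who want to translate the USGS 30-year figure into an annual probability, a back-of-the-envelope conversion (assuming, for simplicity, a constant annual probability that is independent from year to year) runs as follows:

```latex
1 - (1 - p)^{30} = 0.63
\;\Rightarrow\;
p = 1 - 0.37^{1/30} \approx 0.033
% i.e., roughly a 3.3% chance of an M6.7+ Bay Area earthquake in any given year
```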

The next major earthquake could strike anywhere – potentially closer to urban centers than the 1989 Loma Prieta event. As part of the commemoration of the 25th anniversary of the earthquake, RMS has developed a timeline of how events could unfold in a worst-case scenario impacting the entire Bay Area region.

In the “Big One’s” Aftermath

Prepare

This black swan scenario is extreme and is meant to get stakeholders in the earthquake risk management arena to consider the long-term ramifications of very uncertain outcomes. According to RMS modeling, a likely location of the next big earthquake to impact the San Francisco Bay Area is the Hayward fault, which could produce a magnitude 7.0 event. An event of this size could cause hundreds of billions of dollars of damage, with only tens of billions covered by insurance. Without significant earthquake insurance penetration to facilitate rebuilding, recovery from a major earthquake will be significantly harder. A cluster of smaller earthquakes could also impact the area, which, sustained over months, could have serious implications for the local economy.

While the Bay Area has become more resilient to earthquake damage, we are still at risk from a significant earthquake devastating the region. Now is the time for Bay Area residents to come together to develop innovative approaches and ensure resilience in the face of the next major earthquake.

RMS To Launch Global Tsunami Scenario Catalog

The 2011 Tohoku earthquake and its accompanying mega-tsunami highlighted how a single magnitude 9.0 (Mw9) event could impact multiple regions and lines of business. The size of the earthquake had been considered beyond what was possible on this plate boundary, and there are many areas worldwide where a massive earthquake and accompanying tsunami could impact coastal exposures over a very wide area.

Global coastal exposure is increasing rapidly, including port cities, refineries, power plants, hotels, and beach resorts. In regions around the Pacific and parts of the Indian and Atlantic Oceans, some of these exposure accumulations are on the front line of risk from the mega-tsunamis that would accompany magnitude 9.0 (Mw9) earthquakes.

Later this year, RMS will release a Global Tsunami Scenario Catalog to provide (re)insurers with a broad and relevant set of tsunami scenarios covering both local and ocean-wide impacts. The tsunami scenarios have been generated by modeling the fault rupture and sea floor deformation associated with earthquakes on the principal subduction zones worldwide, with magnitudes ranging from M8.0 to M9.5.

For each scenario the tsunami is modeled in three stages: a) the initial water-level changes generated by sudden deformation of the seafloor, b) tsunami wave propagation, and c) the flooding inundation of coastlines.

For each scenario, the tsunami flood is represented as the elevation of the water level at each onshore location in the tsunami's path. The data also include the maximum expected inundation depth, so that users can estimate the level of destruction to different building categories. The tsunami modeling capability has been extensively tested to confirm that the method reproduces observed coastal water heights from recent tsunamis.
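
To give a feel for the physics behind stages (b) and (c) – this is a simplified textbook sketch, not the RMS methodology – two standard approximations are the shallow-water wave speed, which depends only on ocean depth, and Green's law, which describes how wave height amplifies as the water shallows (the example depths and wave height below are illustrative):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def wave_speed(depth_m: float) -> float:
    """Shallow-water tsunami wave speed (m/s) at a given ocean depth."""
    return math.sqrt(G * depth_m)

def greens_law_height(offshore_height_m: float,
                      offshore_depth_m: float,
                      nearshore_depth_m: float) -> float:
    """Green's law: wave height grows as depth^(-1/4) during shoaling."""
    return offshore_height_m * (offshore_depth_m / nearshore_depth_m) ** 0.25

# Example: a 1 m wave crossing 4,000 m deep ocean, arriving in 10 m of water
print(f"Open-ocean speed: {wave_speed(4000) * 3.6:.0f} km/h")         # ~713 km/h
print(f"Nearshore height: {greens_law_height(1.0, 4000, 10):.1f} m")  # ~4.5 m
```

These simple relations explain why a tsunami that is barely noticeable in the open ocean can arrive at the coast both fast and high.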

A key element of the work to create the new Global Tsunami Scenario Catalog involved identifying where Mw9 earthquakes have the potential to occur, and hence which coastal regions are at risk from mega-tsunami. These regions include cities with high insurance penetration such as Hong Kong and Macao, the main Taiwanese port of Kaohsiung, the island of Barbados, and Muscat, Oman. Our research also shows that a mega-tsunami as large as Tohoku could even occur in the Eastern Mediterranean – in fact, a mega-tsunami was generated in this region in 365 A.D. A repeat of such a tsunami could impact a wide stretch of coastal cities from Alexandria, Egypt to Kalamata, Greece and Antalya, Turkey.

The Tohoku earthquake and tsunami surprised the world because they occurred on a plate boundary that was not considered capable of producing a giant earthquake. The lessons from Tohoku should be applied to other 'dormant' subduction zone plate boundaries worldwide where M9 earthquakes have the potential to occur, even though none has been experienced in the past few hundred years of history. The region-wide loss correlations associated with some of these events have the potential to affect multiple lines of property and marine exposure in diverse coastal locations, potentially spanning several countries in a single loss. (Re)insurers wishing to manage their regional coastal exposures should be testing their exposure accumulations against a credible set of the largest-scale earthquake and tsunami scenarios.

RMS and 100 Resilient Cities at the Clinton Global Initiative

I've just returned from the Clinton Global Initiative (CGI) annual meeting in New York. Every September, political, corporate, and non-profit leaders from around the world gather to discuss pressing challenges, form partnerships, and make commitments to action. It was inspiring to see the tremendous work already being done and the new commitments being made to address a wide range of issues, from containing the Ebola epidemic, to increasing access to education, to combating climate change, to helping Haiti develop a self-sustaining economy.

One prevailing theme at the event this year was the importance of cross-sector partnerships in successfully tackling such complex issues. Not surprisingly, data crunched by the CGI team on commitments made over the past 10 years shows a markedly higher success rate for partnerships than for go-it-alone approaches.

In this spirit, we announced an RMS commitment last week to partner with the Rockefeller Foundation’s 100 Resilient Cities initiative to help increase the resilience of cities around the world. We will be making our RMS(one) platform and our catastrophe models available to cities in the 100RC network so that they can better understand their exposures, assess risk to catastrophic events as well as climate change, and prioritize investments in mitigating and managing that risk.

As the saying goes, “if you can measure it, you can manage it.” From our 25 years of experience helping the insurance industry better measure and then manage catastrophe risk, we believe there is a largely untapped opportunity for the public sector to similarly leverage exposure management and catastrophe modeling technology to establish more informed policies for managing risk and increasing resilience in cities throughout the world, both in developed and emerging economies.

It was also clear this week that the conversation in corporate boardrooms is increasingly moving from being focused solely on the financial bottom line to also having a positive impact on the world in a way that is strategically aligned with the core mission of the business.

Our partnership with 100RC, along with the partnerships with the UNISDR and the World Bank that we announced this summer, is another step in our own version of this journey. Through both our direct philanthropic support of Build Change and their admirable work to improve construction practices in developing countries and through the leveraging of our technology and the expertise of our colleagues to help the public sector, we are aligning all of our activities in support of our core mission to increase the resilience of our society.

Many of our clients have shared with us that they are on similar journeys, building on traditional support for local organizations to implement more strategic programs with broader impact and employee engagement. In particular, the insurance industry is uniquely positioned to understand the value of proactively investing in mitigation and in increasing resilience, instead of waiting until a tragedy has occurred and all that can be done is to support humanitarian response efforts.

With this common frame of reference, we look forward to increasingly partnering with our clients in the coming years not just to help them manage their own risk but to collectively help increase resilience around the world.

Matching Modeled Loss Against Historic Loss in European Windstorm Data

To be Solvency II compliant, re/insurers must validate the models they use, which can include comparisons against historical loss experience. In working toward model validation, companies may find that their experience of European windstorm hazard does not match the modeled loss. This seeming discrepancy does not necessarily mean something is wrong with the model or with the company's loss data: the underlying timelines of the two datasets may simply differ, which can have a significant influence for a variable peril like European windstorm.

Most re/insurers' claims records date back only 10 to 20 years, whereas European windstorm models use much longer datasets – generally up to 50 years of hazard data. Over the short term, the last 15 years represented a relative lull in windstorm activity, particularly when compared to the more extreme events of the very active 1980s and 1990s.
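
A simple Monte Carlo sketch shows why a short claims record can legitimately sit well away from a long-term modeled average (the frequency and severity parameters here are invented for illustration, not a calibrated windstorm model):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_losses(n_years: int) -> np.ndarray:
    """Illustrative annual losses: Poisson storm counts, lognormal severities."""
    counts = rng.poisson(lam=2.0, size=n_years)
    return np.array([rng.lognormal(mean=1.0, sigma=1.5, size=c).sum()
                     for c in counts])

annual = simulate_annual_losses(45_000)
window_means = annual.reshape(-1, 15).mean(axis=1)  # 3,000 distinct 15-year windows

print(f"Long-run average annual loss: {annual.mean():.1f}")
print(f"5th-95th percentile of 15-year averages: "
      f"{np.percentile(window_means, 5):.1f} to "
      f"{np.percentile(window_means, 95):.1f}")
```

Even with a perfectly stationary model, the 15-year averages scatter widely around the long-run mean, so a quiet decade and a half of claims is entirely consistent with a higher modeled loss.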

[Figure: Netherlands windstorm variability]

RMS has updated its European windstorm model specifically to support Solvency II model validation. The enhanced RMS model includes the RMS reference view, which is based on the most up-to-date, long-term historical record, as well as a new shorter historical dataset that is based on the activity of the last 25 years.

By using the shorter-term view, re/insurers gain a deeper understanding of how historical variability can impact modeled losses. Re/insurers can also perform a like-for-like validation of the model against their loss experience, and develop confidence in the model’s core methodology and data. Alternate views of risk also support a deeper understanding of risk uncertainty, which enhances model validation and provides greater confidence in the models that are used for risk selection and portfolio management.

Beyond Solvency II validation, the model also empowers companies to explore hazard variability, which is vitally important for a variable peril like European windstorm. If a catastrophe model and a company rely on different but equally valid assumptions, the model can offer a different perspective and a more complete view of the risk.

Serial Clustering Activity around the Baja Peninsula during September 2014

In the past two weeks, two major hurricanes have impacted the Baja Peninsula in Mexico. Hurricane Norbert passed along a large portion of the west coast of the peninsula from September 5 to 7, and Hurricane Odile made landfall near Cabo San Lucas on September 14 as a Category 3 hurricane on the Saffir-Simpson Wind Scale. A third system, Hurricane Polo, formed on Tuesday, September 16, and is forecast to follow a similar track to Norbert and Odile, making it the third such tropical cyclone to develop in the region since the beginning of the month.

This serial cluster of storms has been driven primarily by steady, favorable conditions for tropical cyclone development and consistent atmospheric patterns present over the Eastern Pacific. A serial cluster is defined as a set of storms that form in the same part of a basin, and subsequently follow one another in an unbroken sequence over a relatively short period of time. To qualify as a cluster, there needs to be measurable consistency between the tracks. This is typically a result of steady, predominant atmospheric steering currents, which play a major role in influencing the speed and direction of tropical cyclones. One example of a serial cluster is the four major hurricanes (Charley, Francis, Ivan, and Jeanne) that impacted Florida during a six-week period in 2004.
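
One simple way to make "measurable consistency between the tracks" concrete (an illustrative metric, not a formal definition from the literature) is to compare storm tracks point by point using great-circle distances; the track points below are hypothetical, simplified positions:

```python
import math

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

def mean_track_separation(track_a, track_b):
    """Average point-by-point separation of two equal-length storm tracks."""
    return sum(haversine_km(p, q) for p, q in zip(track_a, track_b)) / len(track_a)

# Hypothetical simplified (lat, lon) track points for two Baja-bound storms
storm_1 = [(15.0, -105.0), (18.0, -108.0), (22.0, -111.0)]
storm_2 = [(14.5, -104.0), (18.5, -107.5), (23.0, -110.5)]
print(f"Mean track separation: {mean_track_separation(storm_1, storm_2):.0f} km")
```

A small mean separation across a sequence of storms within a short time window is the kind of signal that would mark them as a serial cluster.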

During this recent two-week period, the area off the west coast of Mexico has maintained high sea-surface temperatures near 85.1 degrees Fahrenheit and limited vertical wind shear, creating an active tropical development region. A mid-level atmospheric ridge over northern Mexico has provided a consistent steering pattern toward the north-northwest, producing similar observed tracks for Norbert and Odile and a similar forecast track for Polo. Devastating amounts of rainfall have accompanied these storms. Hurricane Odile dropped nearly 18 inches of rain in areas around Cabo San Lucas – nearly 21 months' worth of typical rainfall. The cluster, while generating significant wind and flood damage along the Baja Peninsula, has also caused torrential rainfall in the southwestern U.S., including Arizona, southern Nevada, and southern California. Last week Phoenix, AZ, one of the hardest-hit areas, experienced over 3 inches of rain in a 7-hour span due to the remnants of Hurricane Norbert – the most rainfall to occur in a 24-hour period in the city since 1911, and estimated by the National Oceanic and Atmospheric Administration to be a 1-in-200-year event. Significant rainfall and inland flooding are forecast to continue as the remnants of Odile and Polo move inland, which may lead to widespread flood losses and the potential for compound post-event loss amplification.
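
The "21 months" figure is easy to sanity-check. Taking roughly 10 inches as Cabo San Lucas's typical annual rainfall (an approximate climatological figure, assumed here for illustration):

```latex
\frac{18\ \text{in}}{10\ \text{in/yr} \,/\, 12\ \text{mo/yr}}
\approx \frac{18\ \text{in}}{0.83\ \text{in/mo}}
\approx 21.6\ \text{months of typical rainfall}
```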

Using Network Theory to Understand the Interconnectivity of Financial Risk

For today’s regulators, systemic risk remains a major issue. Tracing the connections between financial institutions and understanding how different mechanisms of financial contagion might flow through the system is complex.

Modern finance is the collective activity of tens of thousands of individual enterprises, all interacting in a "living" system. Today, nobody truly understands this system. It is organic and market-driven, but the fundamental processes that drive it occasionally collapse in a financial crisis that affects us all.

The increasing risk of contagion in the financial industry has triggered a new research discipline – network theory in financial risk management – which is quickly gathering pace. These studies aim to identify and analyze the possible connections between financial institutions, and how their interconnectivity can contribute to crisis propagation.

Later this month, Risk.net will launch the Journal of Network Theory in Finance. This journal will compile the key papers of financial risk studies worldwide to provide industry participants with a balanced view of how network theory in finance can be applied to business.

Papers from the inaugural edition of the new journal will be showcased on September 23 at the Financial Risk & Network Theory conference, which is hosted by the Centre for Risk Studies at the University of Cambridge. I will be presenting a keynote on how catastrophe modeling methodologies can be applied to model financial risk contagion.

Our financial institutions are connected in a multitude of ways: for example, by holding similar portfolios of investments, using common settlement mechanisms, owning shares in one another's companies, and lending to one another.

As the interconnectivity of the world’s financial institutions and markets deepens, financial risk managers and macro-economic planners need to know the likelihood and severity of potential future downturns, particularly the “tail” events of economic catastrophe. Companies must continually understand how they are exposed to the risk of contagion; many were surprised by how fast contagion spread through the financial system during the 2008 credit crunch.

The regulator's role in limiting the risk of future financial crises includes identifying Systemically Important Financial Institutions (SIFIs) and understanding which aspects of a SIFI's business to monitor. Regulators have already pioneered network modeling to identify the core banks and rank their systemic importance, and can now demand much higher standards of risk management from the SIFIs. Increasingly, similar models are being used by risk practitioners and investment managers.
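
To give a flavor of how such network models work, here is a minimal threshold-cascade sketch (the exposure matrix, capital buffers, and loss-given-default are invented numbers, and production interbank models, such as Eisenberg-Noe clearing, are considerably richer):

```python
import numpy as np

# exposure[i, j] = amount bank i is owed by bank j (illustrative numbers)
exposure = np.array([
    [ 0, 30, 10,  0],
    [20,  0, 25,  5],
    [ 5, 15,  0, 20],
    [10,  0, 15,  0],
], dtype=float)
capital = np.array([25.0, 10.0, 20.0, 5.0])  # loss-absorbing buffers (assumed)
loss_given_default = 0.6                     # fraction of claims lost (assumed)

failed = {0}  # seed the cascade: bank 0 fails
while True:
    # Each surviving bank writes down a fraction of its claims on failed banks
    losses = [loss_given_default * sum(exposure[i, j] for j in failed)
              for i in range(len(capital))]
    newly_failed = {i for i in range(len(capital))
                    if i not in failed and losses[i] >= capital[i]}
    if not newly_failed:
        break
    failed |= newly_failed

print(f"Banks failed after cascade: {sorted(failed)}")  # here: all four banks
```

Even in this four-bank toy system, a single seed failure can propagate through the web of exposures and take down institutions with no direct claim on the original defaulter – the essence of contagion risk.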

Studies of network theory in financial risk management, such as those carried out by the Centre for Risk Studies, give all practitioners involved in managing financial risk a robust scientific foundation from which to understand, model, and, ultimately, manage that risk effectively.

A Message from Matthew Grant: An RMS(one) and Model Update

As I mentioned in my last blog post, tomorrow RMS is presenting an update on the progress of RMS(one)® and the momentum of our modeling agenda at the investor day for our parent company, DMGT. I want to share some of the highlights of that presentation.

We have committed to our clients that we won't rush RMS(one) to market and that we are taking the time to get it right. To ensure that RMS(one) meets the needs of all our clients, we are working closely with our Joint Development Partners (JDPs) on an agile program of incremental deliverables throughout 2015. We will demonstrate concrete progress to our JDPs on a quarterly basis, and we are confident this flexible, agile approach will also meet the needs of our broader client base and ecosystem partners.

We are taking advantage of new technologies to simplify key elements of the design and to improve the performance, functionality, and openness of the system, delivering a highly configurable platform. At a minimum, RMS(one) will be ready to deliver our first HD models, and those of our ecosystem partners, to all of our clients in 2015. We will make it easy for those who wish for continuity to access and use the HD models with RiskLink and/or in-house tools.

Client feedback from the beta test program is also being incorporated to ensure that RMS(one) is a platform that supports:

  • A system of record for exposure data
  • All current models and the next generation High Definition models
  • Exposure analytics
  • Loss analytics
  • An ecosystem of risk management solutions

In addition, we continue to expand the RMS(one) Developer Network, which includes client developers, third-party modeling, and application development partners.

Our Modeling Momentum Continues

In Spring 2015 we will release RiskLink 15, which includes updates to our flagship North Atlantic Hurricane Model and incorporates lessons from Superstorm Sandy. Also included in RiskLink 15 is our updated Europe Windstorm Model, which has been enhanced to support Solvency II model validation. Both models are complete and are now in the testing phase.

We have made strong progress on the development of our powerful HD models, which will provide the high resolution needed to model tail risk. In 2015, our new HD models will include:

  • New Pan-European Flood
  • Upgraded Japan Typhoon
  • Upgraded New Zealand Earthquake

In 2016, our HD models will include:

  • North America Earthquake
  • U.S. Flood
  • Earthquake and basin-wide typhoon models for APAC

Our expanding model development team, which grew 25% this year, continues to work closely with clients as we build out our models; for example, we recently announced our technical collaboration with China Re to reduce uncertainty in our China Typhoon and Coastal Flood model.

Our presence in Asia is expanding. We recently opened an office in Singapore and have released new modeling products that include our Economic Exposure Databases and Industrial Clusters Catalogs.

I am encouraged by the continued support for RMS(one) from you, our clients, and from our parent company, DMGT. I look forward to providing you with more details about RMS(one) in the months to come. We have a full agenda for Exceedance, April 27-30, 2015, in Miami, FL, that includes RMS(one) and our model roadmap, and I hope to see you there.