Monthly Archives: September 2015

Reflections from Rendezvous: Innovation to Drive Growth in the Global (Re)Insurance Industry

Each year, the (re)insurance industry meets at the Rendezvous in Monte Carlo to discuss pressing issues facing the market. This year, my colleagues and I had lively discussions about the future of our industry, explored what’s top of mind for our clients and partners, and shared our own perspectives.


Over the course of the week, a number of themes emerged.

The industry is at an inflection point, poised for growth

The (re)insurance industry is at an inflection point. While the existing market remains soft, there was a growing recognition at the Rendezvous that the real issue is innovation for growth. We heard time and again that too much of the world’s risk is uninsured, and that (re)insurers need strategies to expand coverage to catastrophic events, not only in the developing world but also in established markets such as the U.S. and Europe.

Flood risk was a particular focus of discussion at the event. Against the backdrop of a changing climate and a growing concentration of exposures, flood losses nearly doubled in the decade from 2000 to 2009 compared with the decade prior. With better data and models, (re)insurers are growing confident that they can underwrite, structure, and manage flood risks and provide solutions to meet growing global demand.

In many conversations we shared our thesis that the world’s exposures are evolving from assets@risk to systems@risk. Economic growth and activity are vulnerable to disruption in systems, and innovation, supported by models, data, and analytics, is needed to provide new forms of coverage. Take cyber, for example. Insurers see significant opportunities for new forms of cyber risk coverage, but there are fundamental gaps in the industry’s understanding of the risk. When the market is better able to understand cyber risks and to model and manage accumulations, cyber could really take off.

Alternative capital is no longer alternative

Amidst a general sense of stability—in part due to greater acceptance of the “new normal” after falling prices and a number of mergers and acquisitions, and in part due to a very benign catastrophe loss environment—there is a shifting dynamic between insurance-linked securities (ILS) and reinsurance. Alternative capital is now mainstream. In fact, one equity analyst called the use of third party capital a “fiduciary duty.”

Risk is opportunity

I was motivated by how many industry leaders see their market as primed for innovation-driven growth. This is not to overlook present day challenges, but to recognize that the industry can combine capital and know-how, increasingly informed by data analytics, to develop new solutions to expand coverage to an increasingly risky and interconnected world. As I recently wrote, risk is opportunity.

The Ever-present Threat of Tsunami: Are We Prepared?

Last week’s Mw8.3 earthquake offshore of the Coquimbo region of central Chile served as a reminder that many coastal regions are exposed to earthquakes and the tsunami hazard that can follow.

While the extent of damage and loss of life from the recent Chile earthquake and tsunami continues to emerge and is tragic in itself, it is safe to say that things could have been much worse. After all, this is the same subduction zone that produced the 1960 Valdivia earthquake (or “Great Chilean earthquake”) 320 miles further to the south—the most powerful earthquake in recorded history.

The 1960 Valdivia earthquake had a magnitude of Mw9.6 and triggered a localized tsunami that battered the Chilean coast with waves in excess of 20 meters, as well as a far-field tsunami around the Pacific Ocean. Many events of M8.5+ produce tsunami that are truly global in nature: waves several meters in height can reach coastlines more than 10,000 kilometers from the event source, highlighting the need for international tsunami warning systems and for awareness among populations, city planners, and engineers in coastal areas.

Coastlines At Risk of Tsunami

Tsunami and their deadly consequences have been with us since the beginning of mankind. What’s new, however, is the increasing awareness of the economic and insured losses that tsunami can cause. There are several mega cities in developed and emerging nations that are in the path of a future mega-tsunami, as reported by Dr. Robert Muir-Wood in his report Coastlines at Risk of Giant Earthquakes & Their Mega-Tsunami.

The 2011 earthquake and tsunami off the Pacific coast of Tohoku, Japan acted as a wake-up call to the insurance industry, moving tsunami out of its quasi-niche status. With more than 15,000 lives lost, more than USD 300 billion in economic losses, and roughly USD 40 billion in insured losses, clients wanted to know where other similarly high-magnitude earthquakes and subsequent tsunami could occur, and what they would look like.

In response, RMS studied a multitude of high magnitude (Mw8.9-Mw9.6) event sources around the world and modeled the potential resulting tsunami scenarios. The scenarios are included in the RMS® Global Tsunami Scenario Catalog and include both historical and potential high-magnitude tsunami events that can be used to identify loss accumulations and guide underwriting decisions.

For example, below is output showing the potential impact of a recurrence of the 1877 Chile Mw9.1 earthquake (Fig 1a) and of a potential future M9 scenario originating on the Nankai Trough and impacting the coast at Toyohashi, Japan (Fig 1b).

Fig 1a: Re-simulation of the 1877 Chile Mw9.1 Earthquake. Coquimbo area shown. The inundation from this event would impact the entire Chilean coastline and exceed 9 meters inundation depth (further to the North). Fig 1b: M9 scenario originating on the Nankai Trough south of Japan, impacting the city of Toyohashi (population ~376 thousand), with inundation going far inland and exceeding 6 meters in height.

With rapid advances in science and engineering enabling a deeper understanding of tsunami risk, the insurance industry, city planners and local communities can better prepare for devastating tsunami, implementing appropriate risk mitigation strategies to reduce fatalities and the financial shocks that could be triggered by the next “big one.”

Exposure Data: The Undervalued Competitive Edge

High-quality catastrophe exposure data is key to a resilient and competitive insurance business. It can improve a wide range of risk management decisions, from basic geographical risk diversification to more advanced deterministic and probabilistic modeling.

The need to capture and use high-quality exposure data is not new to insurance veterans. It is often referred to as the “garbage-in-garbage-out” principle, highlighting the dependency of a catastrophe model’s output on reliable, high-quality exposure data.

The underlying logic of this principle is echoed in the EU’s Solvency II directive, which requires firms to have a quantitative understanding of the uncertainties in their catastrophe models, including a thorough understanding of the uncertainties propagated by the data that feeds the models.

The competitive advantage of better exposure data

The implementation of Solvency II will lead to a better understanding of risk, increasing the resilience and competitiveness of insurance companies.

Firms see this, and more insurers are no longer passively reacting to the changes brought about by Solvency II. Increasingly, firms see the changes as an opportunity to proactively implement measures that improve exposure data quality and exposure data management.

And there is good reason for doing so: The majority of reinsurers polled recently by EY (formerly known as Ernst & Young) said quality of exposure data was their biggest concern. As a result, many reinsurers apply significant surcharges to cedants that are perceived to have low-quality exposure data and exposure management standards. Conversely, reinsurers are more likely to provide premium credits of 5 to 10 percent or offer additional capacity to cedants that submit high-quality exposure data.

Rating agencies and investors also expect more stringent exposure management processes and higher exposure data standards. Sound exposure data practices are, therefore, increasingly a priority for senior management, and changes are driven with the mindset of benefiting from the competitive advantage that high-quality exposure data offers.

However, managing the quality of exposure data over time can be a challenge. During its life cycle, exposure data degrades: it is frequently reformatted and re-entered as it is passed between different insurance entities along the insurance chain (and between departments within each touch point on the chain).

To fight this degradation, insurers spend considerable time and resources re-formatting and re-entering exposure data. Yet because of the different systems, data standards, and contract definitions in place, much of this work remains manual and repetitive, inviting human error.
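Since much of this degradation stems from manual re-entry, even simple automated validation at each hand-off can catch it early. The sketch below is purely illustrative: the field names, checks, and thresholds are hypothetical, not an RMS or industry standard.

```python
def validate_exposure_record(rec):
    """Return a list of data-quality issues found in one exposure record.

    Illustrative checks only -- real exposure schemas and validation
    rules vary by insurer and data standard.
    """
    issues = []
    # A record needs either an address or a precise geocode to be located.
    if not rec.get("address") and rec.get("geocode_level") != "coordinate":
        issues.append("no address and no precise geocode")
    # Construction class drives vulnerability; "UNKNOWN" degrades model output.
    if rec.get("construction_class") in (None, "", "UNKNOWN"):
        issues.append("missing construction class")
    # A zero or negative total insured value is almost certainly an entry error.
    if rec.get("total_insured_value", 0) <= 0:
        issues.append("non-positive total insured value")
    # Year built outside a plausible range suggests a re-keying mistake.
    if rec.get("year_built") and not (1800 <= rec["year_built"] <= 2015):
        issues.append("implausible year built")
    return issues

portfolio = [
    {"address": "1 Main St", "geocode_level": "street",
     "construction_class": "RC", "total_insured_value": 2_500_000,
     "year_built": 1978},
    {"address": "", "geocode_level": "postcode",
     "construction_class": "UNKNOWN", "total_insured_value": 0},
]

for i, rec in enumerate(portfolio):
    for issue in validate_exposure_record(rec):
        print(f"record {i}: {issue}")
```

Running checks like these before data is handed to the next party in the chain flags exactly the kind of silent degradation described above.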

In this context, RMS’ new data standards, exposure management systems, and contract definition languages will be of interest to many insurers, not only because they will help tackle the data quality issue, but also because they can bring considerable savings through reduced overhead expenditure, enabling clients to focus on their core insurance business.

Asia’s Costliest Cyclones: The Curse of September

The northwest Pacific is the most active tropical cyclone basin in the world, having produced some of the most intense and costly cyclone events on record. The 2015 typhoon season has been particularly active due to this year’s strong El Niño conditions.

Sea surface temperature in the equatorial Pacific Ocean. El Niño is characterized by unusually warm temperatures in the equatorial Pacific. (NOAA)

The unpredictable nature of the El Niño phenomenon, which affects the genesis and pathways of tropical cyclones, and the complexity of tropical cyclone systems underscore the need to fully understand typhoon risk—particularly in Japan, where exposure concentrations are high. Catastrophe models such as the forthcoming RMS® Japan Typhoon Model, which use a basin-wide event set to model the three key correlated perils—wind, inland flood, and coastal flood—are more effective in enabling firms to price and manage the ever-evolving exposures at risk from this multifaceted peril.

The Significance of September

Peak typhoon season in the northwest Pacific basin runs from July to October, but it is September that typically sees the highest number of strong category 3–5 typhoons making landfall: eight of the ten greatest insured losses from northwest Pacific tropical cyclones since 1980 occurred in September.

In El Niño years, September brings a significantly higher proportion of landfalls on Guam, and Japan and Taiwan experience a slight increase owing to shifts in the genesis and pathways of tropical cyclones. While wind is the primary driver of tropical cyclone loss in Japan, inland and coastal flooding also contribute substantially to the loss.

In September 1999, Typhoon Bart caused $3.5 billion in insured losses due to strong winds, heavy rainfall, and one of the highest storm surges on record at the time. The storm surge reached 3.5 meters in Yatsushiro Bay, western Japan, destroying coastal defences and inundating vast areas of land.

Five years later, in September 2004, Typhoon Songda caused insured losses of $4.7 billion. Much of the loss was caused by rain and flooding across South Korea and Japan, including more than 10,000 flooded homes in the Chugoku region of western Honshu.

Table 1: Top 10 Costliest Tropical Storms in Asia (1980–2014)

| Date | Event | Affected Area | Maximum Strength (SSHWS) | Insured Loss ($mn) |
|---|---|---|---|---|
| Sept 1991 | Mireille | Japan | Cat 4 | 6,000 |
| Sept 2004 | Songda | Japan, South Korea | Cat 4 | 4,700 |
| Sept 1999 | Bart | Japan, South Korea | Cat 5 | 3,500 |
| Sept 1998 | Vicki | Japan, Philippines | Cat 2 | 1,600 |
| Oct 2004 | Tokage | Japan | Cat 4 | 1,300 |
| Sept 2011 | Roke | Japan | Cat 4 | 1,200 |
| Aug–Sept 2004 | Chaba | Japan, Russia | Cat 5 | 1,200 |
| Sept 2006 | Shanshan | Japan, South Korea | Cat 4 | 1,200 |
| Sept 2000 | Saomai | Japan, South Korea, Guam, Russia | Cat 5 | 1,100 |
| Sept 1993 | Yancy | Japan | Cat 4 | 980 |

Source: Munich Re
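The September concentration can be read directly off Table 1 above; a quick tally of the landfall months (illustrative only, using the table's own entries) confirms it:

```python
# Landfall months and event names transcribed from Table 1.
from collections import Counter

events = [
    ("Sept", "Mireille"), ("Sept", "Songda"), ("Sept", "Bart"),
    ("Sept", "Vicki"), ("Oct", "Tokage"), ("Sept", "Roke"),
    ("Aug-Sept", "Chaba"), ("Sept", "Shanshan"), ("Sept", "Saomai"),
    ("Sept", "Yancy"),
]

counts = Counter(month for month, _ in events)
print(counts["Sept"], "of", len(events), "events in September")  # 8 of 10 events in September
```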

September 2015 – A Costly Landfall for Japan?

This September we have already seen Tropical Storm Etau, which brought heavy rains to Aichi Prefecture on Honshu Island, flooding more than 16,000 buildings and triggering dozens of landslides and mudslides.

The increased tropical cyclone activity in the northwest Pacific this year has been attributed to an El Niño event that is forecast to strengthen further. Two factors linked to El Niño events suggest that this September could still see a costly landfall in Japan:

  • El Niño conditions drive the formation of tropical cyclones further eastward, increasing the time and distance typhoons travel over water and giving rise to more intense events.
  • More northward recurving of storms produces tropical cyclones that track towards Japan, increasing the number of typhoons that could make landfall.

Combined, the above conditions increase the number of strong typhoons that make landfall in Japan.

Damaging Typhoons Don’t Just Occur In September

Damaging typhoons don’t just occur in September or El Niño years – they can happen under any conditions.

Of the ten costliest events, only Typhoon Mireille in 1991 and Typhoons Songda, Chaba, and Tokage, all of which made landfall in 2004, occurred during El Niño years.

Look out for more information on this topic in the RMS paper “Effects of the El Niño Southern Oscillation on Typhoon Landfalls in the Northwest Pacific”, due to be published in October.

A Coup Against Flu?

Recent articles from two separate research groups, published in Science and Nature Medicine, report major breakthroughs in flu vaccine research. The advances could ultimately lead to the holy grail of influenza prevention: a universal flu vaccine.

By conferring immunity against large numbers of flu strains, the new vaccines have the potential to reduce the severity of seasonal flu outbreaks and vastly reduce the risk of novel pandemics.  Using the RMS Infectious Disease Model, we calculated that if such vaccines were able to confer immunity to 50% of people globally, the risk of a novel flu pandemic outbreak could be reduced by as much as 75%.
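The mechanism behind such estimates can be illustrated with the standard epidemiological relationship between vaccination coverage and the effective reproduction number. This simple sketch is not the RMS Infectious Disease Model; the basic reproduction number R0 = 2.0 is an assumed, commonly cited value for pandemic influenza.

```python
# Vaccination lowers the effective reproduction number:
#   R_eff = R0 * (1 - coverage * efficacy)
# An outbreak can only sustain itself when R_eff > 1.

def effective_r(r0, coverage, efficacy=1.0):
    """Effective reproduction number given vaccination coverage (0-1)."""
    return r0 * (1.0 - coverage * efficacy)

r0 = 2.0  # assumed value for a pandemic flu strain
for coverage in (0.0, 0.25, 0.5):
    r_eff = effective_r(r0, coverage)
    status = "can spread" if r_eff > 1 else "cannot sustain spread"
    print(f"coverage {coverage:.0%}: R_eff = {r_eff:.2f} ({status})")
```

With a fully effective vaccine at 50% coverage, R_eff drops to the epidemic threshold of 1.0 in this toy calculation, which is why broad-coverage universal vaccines could so sharply cut pandemic risk.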

This would be a huge success in reducing the risk of excess mortality events and improving global health. Though I should emphasise that while Edward Jenner invented the smallpox vaccine in 1796, it took until 1980 for smallpox to be eradicated from the wild. Beyond development of effective broad-spectrum vaccines, there is a lot of work to do to make the world safe from flu.

A high proportion of flu victims are elderly, so significantly reducing deaths from flu would disproportionately reduce old-age mortality. This is particularly interesting: not only is it an important milestone in improving old-age public health, it is also relevant to old-age care and budgeting for retirement.

Influenza Is The Most Likely Source of Future Pandemic Sickness and Mortality

In the U.S., the average number of flu-related deaths in a single season is 30,000–40,000, peaking at 47,000 deaths in some seasons. This does not account for the viruses that can cause major pandemics: the death toll in the 1918 “Spanish Flu” event reached as high as 50–100 million people worldwide.

Widespread use of a universal vaccine conferring lifelong immunity could eliminate these deaths, making a meaningful contribution to reducing infectious disease mortality.

Structure of the influenza virus, showing hemagglutinin (HA).

The marvel of the new vaccines under development is their potential to confer immunity against many strains, including ones that have not yet emerged. They work by using cutting-edge molecular machinery to target the stem of the hemagglutinin protein on the virus’ surface. The vaccines have so far been tested only in animal models and only on a small scale, but have worked well in reducing viral loads and mortality in these tests.

If this breakthrough translates into future vaccines that prove efficacious in clinical trials, they could become immensely powerful in combatting seasonal flu and reducing the likelihood of new flu pandemics.

Today, beyond seasonal flu, there are no vaccines capable of preventing novel flu pandemics. However, the production pipeline for the current seasonal flu vaccine can be put to use in a pandemic, with current capacity estimated to produce decisive quantities of vaccine within three months of a pandemic outbreak.

As quantified in the RMS Infectious Disease Model, while this current technology has the potential to substantially reduce the total caseload of a pandemic, it is not a panacea. Three months is a long time for a highly transmissible virus, so very large numbers of people could be infected in this interval. Even more infections would occur during the roll-out period, before the vaccine has been given to enough people to halt the spread. Furthermore, complications could emerge during production that either extend the timeline beyond three months or mean the vaccine confers only partial immunity.
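The importance of that three-month window can be illustrated with simple exponential growth. The parameters here are hypothetical round numbers, not output from the RMS model:

```python
# If cases double roughly weekly while an outbreak is uncontrolled, the
# caseload multiplies by 2**(lag / doubling_time) before vaccine arrives.
doubling_time_days = 7.0     # assumed doubling time for a transmissible flu
production_lag_days = 90.0   # ~three months to first vaccine deliveries

growth_factor = 2 ** (production_lag_days / doubling_time_days)
print(f"growth in caseload over the lag: ~{growth_factor:,.0f}x")
```

Under these assumptions the caseload grows by a factor of several thousand before the first doses are even available, which is why a faster pipeline, or a stockpiled universal vaccine, would be so valuable.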

RMS created the world’s first probabilistic model of pandemic influenza and the first probabilistic model of vaccine development, delivery, and efficacy. The recent breakthroughs in flu vaccine research are welcome news and RMS scientists are closely monitoring the developments.

Understanding the Principles of Earthquake Modeling from the 1999 Athens Earthquake Event

The 1999 Athens Earthquake occurred on September 7, 1999, registering a moment-magnitude of 6.0 (USGS). The tremor’s epicenter was located approximately 17km to the northwest of the city center. Its proximity to the Athens Metropolitan Area resulted in widespread structural damage.

More than 100 buildings, including three major factories, collapsed across the area. Overall, 143 people lost their lives and more than 2,000 were treated for injuries in what became Greece’s deadliest natural disaster in almost half a century. The event caused economic losses of $3.5 billion, while insured losses were $130 million (AXCO).


Losses from such events can be difficult to estimate; historical experience alone is inadequate to predict future losses. Earthquake models can assist in effectively managing this risk, but they must account for the unique features of earthquake hazard, as the 1999 Athens event highlights.

Background seismicity must be considered to capture all potential earthquake events

The 1999 event took Greek seismologists by surprise, as it came from a previously unknown fault. Such events present a challenge to (re)insurers, who may not be aware of the risk to properties in the area and have no historical basis for comparison. Effective earthquake models must not only incorporate events on known fault structures, but also capture the background seismicity. This allows potential events on unknown or complicated fault structures to be represented, ensuring that the full spectrum of possible earthquake events is captured.

Hazard can vary greatly over a small geographical distance due to local site conditions

Soil type had significant implications in this event. Athens has grown tremendously, with the population expanding into suburbs with poorer soils and many industrial areas concentrated along the alluvial basins of the Kifissos and Ilisos rivers. This has greatly increased the seismic hazard, as such soils amplify the ground motions of an earthquake.

The non-uniform soil conditions across the Athens region resulted in an uneven distribution of severe damage. The town of Adames in particular, located on the eastern side of the Kifissos river canyon, experienced unexpectedly heavy damage, whereas other towns at an equal distance from the epicenter, such as Kamatero, experienced only slight damage (Assimaki et al., 2005).

Earthquake models must take such site-specific effects into account to provide a local view of the hazard. To achieve this, high-resolution geotechnical data, including information on soil type, is used to determine how local site conditions modify ground shaking at a specific location, allowing for effective differentiation between risks at the location level.

Building properties have a large impact upon damageability

The 1999 Athens event resulted in severe structural damage to, and in some cases the partial or total collapse of, a number of reinforced concrete frame structures. Most of these severely damaged structures were designed to older seismic codes and were able to withstand only significantly lower forces than those experienced during the earthquake (Elenas, 2003).

A typical example of structural damage to a three-story residential reinforced-concrete building at about 8km from the epicentre on soft soil. (Tselentis and Zahradnik, 2000)

Earthquake models must account for such differences in building construction and age. Because of variations in local seismic codes and construction practices, the vulnerability of structures can differ greatly between countries and regions, and it is important to factor in these geographical contrasts. Earthquake models can capture these differences through the regionalization of vulnerability.

Additionally, the Athens earthquake predominantly affected low- and mid-rise buildings of two to four stories. The measured spectral acceleration (a measure of the maximum acceleration experienced by a building during an earthquake) decreased rapidly for buildings of five stories or more, indicating that this particular event did not severely affect high-rise buildings (Anastasiadis et al., 1999).

A spectral-response-based methodology estimates damage most accurately because it models a building’s actual response to ground motions. This response is highly dependent on building height: because low- and mid-rise buildings oscillate with a shorter natural period, they respond more strongly to higher-frequency seismic waves, such as those generated by the 1999 Athens event, while high-rise buildings respond most strongly to long-period seismic waves.
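The height dependence can be illustrated with a common engineering rule of thumb, which approximates the fundamental period of a reinforced-concrete frame building as roughly 0.1 seconds per story. This is an illustrative approximation, not the RMS methodology:

```python
# Rule of thumb: fundamental period T ~ 0.1 s per story for an RC frame.
# Short-period (low-rise) buildings resonate with high-frequency shaking,
# like the 1999 Athens event; tall buildings respond to long-period waves.

def approx_period_seconds(stories):
    """Very rough fundamental period of an RC frame building, in seconds."""
    return 0.1 * stories

for stories in (2, 4, 10, 30):
    t = approx_period_seconds(stories)
    print(f"{stories:>2} stories: T ~ {t:.1f} s "
          f"(resonant frequency ~ {1.0 / t:.1f} Hz)")
```

On this approximation, the two-to-four-story buildings hit hardest in Athens have periods of roughly 0.2–0.4 s (resonant frequencies of a few hertz), matching the high-frequency content of that event, while a 30-story tower with a period of around 3 s would be far less excited by it.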

The key features of the RMS Europe Earthquake Models ensure the accurate modeling of events such as the 1999 Athens Earthquake, providing a tool to effectively underwrite and manage earthquake risk across the breadth of Europe.