Reflections from Rendezvous: Innovation to Drive Growth in the Global (Re)Insurance Industry

Each year, the (re)insurance industry meets at the Rendezvous in Monte Carlo to discuss pressing issues facing the market. This year, my colleagues and I had lively discussions about the future of our industry, explored what’s top of mind for our clients and partners, and shared our own perspectives.


Over the course of the week, a number of themes emerged.

The industry is at an inflection point, poised for growth

The (re)insurance industry is at an inflection point. While the existing market remains soft, there was a growing recognition at the Rendezvous that the real issue is innovation for growth. We heard time and again that too much of the world’s risk is uninsured, and that (re)insurers need strategies to expand coverage of catastrophic events, not only in the developing world but also in established markets such as the U.S. and Europe.

Flood risk was a particular focus of discussion at the event. Against the backdrop of a changing climate and a growing concentration of exposures, flood losses nearly doubled in the decade from 2000 to 2009 compared with the decade prior. With better data and models, (re)insurers are growing confident that they can underwrite, structure, and manage flood risks and provide solutions to meet growing global demand.

In many conversations we shared our thesis that the world’s exposures are evolving from assets@risk to systems@risk. Economic growth and activity are vulnerable to disruption in systems, and innovation, supported by models, data, and analytics, is needed to provide new forms of coverage. Take cyber, for example. Insurers see significant opportunities for new forms of cyber risk coverage, but there are fundamental gaps in the industry’s understanding of the risk. Once the market can better understand cyber risks and model and manage accumulations, cyber could really take off.

Alternative capital is no longer alternative

Amidst a general sense of stability—owing partly to greater acceptance of the “new normal” after falling prices and a number of mergers and acquisitions, and partly to a benign period for catastrophe losses—there is a shifting dynamic between insurance-linked securities (ILS) and reinsurance. Alternative capital is now mainstream. In fact, one equity analyst called the use of third-party capital a “fiduciary duty.”

Risk is opportunity

I was motivated by how many industry leaders see their market as primed for innovation-driven growth. This is not to overlook present day challenges, but to recognize that the industry can combine capital and know-how, increasingly informed by data analytics, to develop new solutions to expand coverage to an increasingly risky and interconnected world. As I recently wrote, risk is opportunity.

The Ever-present Threat of Tsunami: Are We Prepared?

Last week’s Mw8.3 earthquake offshore of the Coquimbo region of central Chile served as a reminder that many coastal regions are exposed to earthquake and subsequent tsunami hazard.

While the extent of damage and loss of life from the recent Chile earthquake and tsunami continues to emerge and is tragic in itself, it is safe to say that things could have been much worse. After all, this is the same subduction zone that produced the 1960 Valdivia earthquake (or “Great Chilean earthquake”) 320 miles further to the south—the most powerful earthquake in recorded history.

The 1960 Valdivia earthquake had a magnitude of Mw9.6 and triggered a local tsunami that battered the Chilean coast with waves in excess of 20 meters, as well as a far-field tsunami that traveled across the Pacific Ocean. Many events of M8.5+ produce tsunami that are truly global in nature: waves several meters in height can reach coastlines more than 10,000 kilometers from the event source, highlighting the need for international tsunami warning systems and for awareness among populations, city planners, and engineers in coastal areas.
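
That trans-oceanic reach follows from simple physics: in the deep ocean a tsunami travels at roughly the shallow-water wave speed, the square root of gravitational acceleration times water depth. The back-of-envelope sketch below uses illustrative ocean depths (an assumption, not figures from this post) to show why a tsunami can cross 10,000 kilometers in well under a day.

```python
# Back-of-envelope sketch: tsunami travel time across deep ocean using the
# shallow-water wave speed c = sqrt(g * depth). Depth values are illustrative
# averages, not taken from this post.
import math

G = 9.81  # gravitational acceleration, m/s^2

def travel_time_hours(distance_km: float, depth_m: float) -> float:
    """Hours for a tsunami to cover distance_km over water of uniform depth_m."""
    speed_ms = math.sqrt(G * depth_m)          # phase speed in m/s
    return (distance_km * 1000.0) / speed_ms / 3600.0

if __name__ == "__main__":
    for depth in (2000.0, 4000.0):             # typical ocean depths, m
        t = travel_time_hours(10_000.0, depth)
        print(f"depth {depth:.0f} m: ~{t:.0f} hours to cross 10,000 km")
```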

Coastlines at Risk of Tsunami

Tsunami and their deadly consequences have been with us since the beginning of mankind. What’s new, however, is the increasing awareness of the economic and insured losses that tsunami can cause. Several megacities in developed and emerging nations lie in the path of a future mega-tsunami, as reported by Dr. Robert Muir-Wood in his report Coastlines at Risk of Giant Earthquakes & Their Mega-Tsunami.

The 2011 earthquake and tsunami off the Pacific coast of Tohoku, Japan acted as a wake-up call to the insurance industry, moving tsunami out of its quasi-niche status. With more than 15,000 lives lost, more than USD 300 billion in economic losses, and roughly USD 40 billion in insured losses, clients wanted to know where other similar high-magnitude earthquakes and subsequent tsunami could occur, and what they would look like.

In response, RMS studied a multitude of high magnitude (Mw8.9-Mw9.6) event sources around the world and modeled the potential resulting tsunami scenarios. The scenarios are included in the RMS® Global Tsunami Scenario Catalog and include both historical and potential high-magnitude tsunami events that can be used to identify loss accumulations and guide underwriting decisions.
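
A scenario footprint becomes actionable once it is overlaid on a portfolio. As a minimal sketch of the idea (the field names, depth threshold, and damage ratios below are hypothetical assumptions, not the format or values of the RMS catalog), an accumulation check might sum insured values at locations inside the modeled inundation footprint:

```python
# Minimal sketch of accumulation analysis against a tsunami scenario footprint.
# Field names, depth thresholds, and damage ratios are hypothetical and are not
# the RMS Global Tsunami Scenario Catalog format.
locations = [
    {"id": "A", "insured_value": 25_000_000, "inundation_depth_m": 0.0},
    {"id": "B", "insured_value": 40_000_000, "inundation_depth_m": 2.5},
    {"id": "C", "insured_value": 15_000_000, "inundation_depth_m": 7.0},
]

def damage_ratio(depth_m: float) -> float:
    """Toy step function: deeper water, larger share of value lost."""
    if depth_m <= 0.5:
        return 0.0
    if depth_m <= 3.0:
        return 0.3
    return 0.8

exposed = [loc for loc in locations if loc["inundation_depth_m"] > 0.5]
accumulation = sum(loc["insured_value"] for loc in exposed)
estimated_loss = sum(loc["insured_value"] * damage_ratio(loc["inundation_depth_m"])
                     for loc in locations)

print(f"Exposed limit in footprint: {accumulation:,}")
print(f"Scenario loss estimate:     {estimated_loss:,.0f}")
```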

For example, the output below shows the potential impact of a recurrence of the 1877 Chile Mw9.1 earthquake (Fig 1a) and of a potential future M9 scenario originating on the Nankai Trough and impacting the coast near Toyohashi, Japan (Fig 1b).

Fig 1a: Re-simulation of the 1877 Chile Mw9.1 earthquake, Coquimbo area shown. The inundation from this event would impact the entire Chilean coastline and exceed 9 meters in depth further to the north.
Fig 1b: M9 scenario originating on the Nankai Trough south of Japan, impacting the city of Toyohashi (population ~376,000), with inundation reaching far inland and exceeding 6 meters in height.

With rapid advances in science and engineering enabling a deeper understanding of tsunami risk, the insurance industry, city planners and local communities can better prepare for devastating tsunami, implementing appropriate risk mitigation strategies to reduce fatalities and the financial shocks that could be triggered by the next “big one.”

Exposure Data: The Undervalued Competitive Edge

High-quality catastrophe exposure data is key to a resilient and competitive insurer’s business. It can improve a wide range of risk management decisions, from basic geographical risk diversification to more advanced deterministic and probabilistic modeling.

The need to capture and use high-quality exposure data is not new to insurance veterans. It is often referred to as the “garbage-in, garbage-out” principle, highlighting the dependency of a catastrophe model’s output on reliable, high-quality exposure data.

The underlying logic of this principle is echoed in the EU’s Solvency II directive, which requires firms to have a quantitative understanding of the uncertainties in their catastrophe models, including a thorough understanding of the uncertainties propagated by the data that feeds the models.
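
One simple way to build such a quantitative understanding is to propagate data uncertainty directly: perturb the declared exposure values and observe how a modeled loss estimate responds. The sketch below illustrates the idea only; the error distribution and damage ratio are made-up assumptions, not a Solvency II calculation or an RMS model.

```python
# Minimal sketch of propagating exposure-data uncertainty into a loss estimate:
# apply a random multiplicative error to each declared sum insured and observe
# the spread of a toy loss metric. The error distribution and damage ratio are
# illustrative assumptions only.
import random
import statistics

random.seed(42)

declared_values = [1_000_000, 2_500_000, 750_000, 4_000_000]  # hypothetical sums insured
mean_damage_ratio = 0.05                                      # toy event damage ratio

def simulated_loss(value_error_sd: float) -> float:
    """One trial: perturb each declared value with a lognormal error factor."""
    total = 0.0
    for value in declared_values:
        error = random.lognormvariate(0.0, value_error_sd)
        total += value * error * mean_damage_ratio
    return total

trials = [simulated_loss(value_error_sd=0.3) for _ in range(10_000)]
print(f"mean loss:  {statistics.mean(trials):,.0f}")
print(f"loss stdev: {statistics.stdev(trials):,.0f}")
```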

The competitive advantage of better exposure data

The implementation of Solvency II will lead to a better understanding of risk, increasing the resilience and competitiveness of insurance companies.

Firms see this, and more insurers are no longer passively reacting to the changes brought about by Solvency II. Increasingly, firms see the changes as an opportunity to proactively implement measures that improve exposure data quality and exposure data management.

And there is good reason for doing so: the majority of reinsurers recently polled by EY (formerly known as Ernst & Young) said the quality of exposure data was their biggest concern. As a result, many reinsurers apply significant surcharges to cedants perceived to have low-quality exposure data and weak exposure management standards. Conversely, reinsurers are more likely to provide premium credits of 5 to 10 percent, or to offer additional capacity, to cedants that submit high-quality exposure data.

Rating agencies and investors also expect more stringent exposure management processes and higher exposure data standards. Sound exposure data practices are, therefore, increasingly a priority for senior management, and changes are driven with the mindset of benefiting from the competitive advantage that high-quality exposure data offers.

However, managing the quality of exposure data over time can be a challenge: during its life cycle, exposure data degrades as it is repeatedly reformatted and re-entered while being passed between different entities along the insurance chain.

To fight this decrease in data quality, insurers spend considerable time and resources re-formatting and re-entering exposure data as it is passed along the insurance chain (and between departments at each touch point on the chain). However, because of the different systems, data standards, and contract definitions in place, much of this work remains manual and repetitive, inviting human error.
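
Automated quality checks at each hand-off can catch many of these re-keying errors before they propagate. The sketch below shows the kind of completeness and range checks involved; the field names and rules are hypothetical, not an RMS or industry data standard.

```python
# Minimal sketch of automated exposure-data quality checks that can flag
# re-keying errors before data is passed along the chain. Field names and
# validation rules are hypothetical examples only.
REQUIRED_FIELDS = ("location_id", "latitude", "longitude", "construction", "sum_insured")

def validate_record(record: dict) -> list[str]:
    """Return a list of issues found in one exposure record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is not None and not -90 <= lat <= 90:
        issues.append("latitude out of range")
    if lon is not None and not -180 <= lon <= 180:
        issues.append("longitude out of range")
    if (record.get("sum_insured") or 0) <= 0:
        issues.append("non-positive sum insured")
    return issues

sample = {"location_id": "LOC-1", "latitude": 35.7, "longitude": 139.7,
          "construction": "", "sum_insured": 1_200_000}
print(validate_record(sample))  # ['missing construction']
```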

In this context, RMS’ new data standards, exposure management systems, and contract definition languages will be of interest to many insurers, not only because they will help to tackle the data quality issue, but also because they bring considerable savings through reduced overhead, enabling clients to focus on their core insurance business.

Asia’s Costliest Cyclones: The Curse of September

The northwest Pacific is the most active tropical cyclone basin in the world, having produced some of the most intense and costly cyclone events on record. The 2015 typhoon season has been particularly active due to this year’s strong El Niño conditions.

Sea surface temperature in the equatorial Pacific Ocean. El Niño is characterized by unusually warm temperatures in the equatorial Pacific. (NOAA)

The unpredictable nature of the El Niño phenomenon, which affects the genesis and pathway of tropical cyclones, and the complexity of tropical cyclone systems underscore the need to fully understand typhoon risk—particularly in Japan, where exposure concentrations are high. Catastrophe models such as the forthcoming RMS® Japan Typhoon Model, which use a basin-wide event set to model the three key correlated perils—wind, inland flood, and coastal flood—are better able to help firms price and manage the ever-evolving exposures at risk from this multifaceted peril.

The Significance of September

Peak typhoon season in the northwest Pacific basin runs from July to October, but it’s September that typically sees the highest number of strong category 3-5 typhoons making landfall: eight of the ten greatest insured losses from northwest Pacific tropical cyclones since 1980 occurred in September (see Table 1 below).

In September during El Niño years, Guam sees a significantly higher proportion of landfalls, and Japan and Taiwan experience a slight increase, owing to shifts in the genesis and pathway of tropical cyclones. While wind is the primary driver of tropical cyclone loss in Japan, inland and coastal flooding also contribute substantially to the loss.

In September 1999, Typhoon Bart caused $3.5 billion in insured losses due to strong winds, heavy rainfall, and one of the highest storm surges on record at the time. The storm surge reached 3.5 meters in Yatsushiro Bay, western Japan, destroying coastal defences and inundating vast areas of land.

Five years later, in September 2004, Typhoon Songda caused insured losses of $4.7 billion. Much of the loss was caused by rain-related damage and the flooding of more than 10,000 homes across South Korea and Japan’s Chugoku region in western Honshu.

Table 1: Top 10 Costliest Tropical Storms in Asia (1980-2014)

Date           | Event    | Affected Area                    | Maximum Strength (SSHWS) | Insured Loss ($mn)
Sept 1991      | Mireille | Japan                            | Cat 4                    | 6,000
Sept 2004      | Songda   | Japan, South Korea               | Cat 4                    | 4,700
Sept 1999      | Bart     | Japan, South Korea               | Cat 5                    | 3,500
Sept 1998      | Vicki    | Japan, Philippines               | Cat 2                    | 1,600
Oct 2004       | Tokage   | Japan                            | Cat 4                    | 1,300
Sept 2011      | Roke     | Japan                            | Cat 4                    | 1,200
Aug-Sept 2004  | Chaba    | Japan, Russia                    | Cat 5                    | 1,200
Sept 2006      | Shanshan | Japan, South Korea               | Cat 4                    | 1,200
Sept 2000      | Saomai   | Japan, South Korea, Guam, Russia | Cat 5                    | 1,100
Sept 1993      | Yancy    | Japan                            | Cat 4                    | 980

Source: Munich Re
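
The September pattern noted above can be read straight off Table 1. As a quick check (with the values copied from the table; the share-of-loss figure is simply arithmetic on those numbers):

```python
# Quick tally of Table 1: how many of the ten costliest northwest Pacific
# tropical cyclones (1980-2014) made landfall in September, and their share of
# the top-ten insured loss. Values are copied from the table above ($ millions).
top_ten = [
    ("Sept 1991", "Mireille", 6000), ("Sept 2004", "Songda", 4700),
    ("Sept 1999", "Bart", 3500),     ("Sept 1998", "Vicki", 1600),
    ("Oct 2004", "Tokage", 1300),    ("Sept 2011", "Roke", 1200),
    ("Aug-Sept 2004", "Chaba", 1200), ("Sept 2006", "Shanshan", 1200),
    ("Sept 2000", "Saomai", 1100),   ("Sept 1993", "Yancy", 980),
]

september_events = [name for date, name, _ in top_ten if date.startswith("Sept ")]
september_loss = sum(loss for date, _, loss in top_ten if date.startswith("Sept "))
total_loss = sum(loss for *_, loss in top_ten)

print(f"September events: {len(september_events)} of {len(top_ten)}")   # 8 of 10
print(f"September share of top-ten insured loss: {september_loss / total_loss:.0%}")
```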

September 2015 – A Costly Landfall for Japan?

This September we have already seen Tropical Storm Etau, which brought heavy rains to Aichi Prefecture on Honshu Island, flooding more than 16,000 buildings and triggering dozens of landslides and mudslides.

The increased tropical cyclone activity in the northwest Pacific this year has been attributed to an El Niño event that is forecast to strengthen further. Two factors linked to El Niño events suggest that this September could still see a costly landfall in Japan:

  • El Niño conditions drive the formation of tropical cyclones further eastward, increasing the travel times and distances of typhoons over water and giving rise to more intense events.
  • More northward recurving of storms produces tropical cyclones that track towards Japan, increasing the number of typhoons that could make landfall.

Combined, the above conditions increase the number of strong typhoons that make landfall in Japan.

Damaging Typhoons Don’t Just Occur In September

Damaging typhoons don’t just occur in September or El Niño years – they can happen under any conditions.

Of the ten costliest events, only Typhoon Mireille in 1991 and Typhoons Songda, Chaba, and Tokage, all of which made landfall in 2004, occurred during El Niño years.

Look out for more information on this topic in the RMS paper “Effects of the El Niño Southern Oscillation on Typhoon Landfalls in the Northwest Pacific”, due to be published in October.

A Coup Against Flu?

Recent articles from two separate research groups published in Science and Nature Medicine report major breakthroughs in flu vaccine research. The advances could ultimately lead to the holy grail of influenza prevention: a universal flu vaccine.

By conferring immunity against large numbers of flu strains, the new vaccines have the potential to reduce the severity of seasonal flu outbreaks and vastly reduce the risk of novel pandemics.  Using the RMS Infectious Disease Model, we calculated that if such vaccines were able to confer immunity to 50% of people globally, the risk of a novel flu pandemic outbreak could be reduced by as much as 75%.
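
To see why broad coverage has such an outsized effect, consider a far simpler calculation than the RMS model: vaccination scales down the effective reproduction number, and in a standard branching-process approximation a strain with R_eff above 1 sparks a major outbreak with probability roughly 1 - 1/R_eff. The sketch below uses hypothetical R0, coverage, and efficacy values purely for illustration.

```python
# Minimal sketch (not the RMS Infectious Disease Model): how broad vaccine
# coverage suppresses the chance that a newly emerged flu strain sparks a
# pandemic. Uses a simple branching-process approximation in which a strain
# with effective reproduction number R_eff > 1 causes a major outbreak with
# probability roughly 1 - 1/R_eff. All numbers are hypothetical.

def outbreak_probability(r0: float, coverage: float, efficacy: float) -> float:
    """Approximate probability that one introduction grows into a major outbreak."""
    r_eff = r0 * (1.0 - coverage * efficacy)
    return 1.0 - 1.0 / r_eff if r_eff > 1.0 else 0.0

if __name__ == "__main__":
    coverage, efficacy = 0.5, 0.8  # hypothetical: 50% of people vaccinated, 80% efficacy
    for r0 in (1.5, 2.0, 2.5):     # hypothetical reproduction numbers for novel flu
        base = outbreak_probability(r0, 0.0, 0.0)
        vacc = outbreak_probability(r0, coverage, efficacy)
        reduction = 100.0 * (1.0 - vacc / base) if base > 0 else 0.0
        print(f"R0={r0}: outbreak risk {base:.2f} -> {vacc:.2f} ({reduction:.0f}% lower)")
```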

This would be a huge success in reducing the risk of excess mortality events and improving global health. I should emphasise, though, that while Edward Jenner invented the smallpox vaccine in 1796, it took until 1980 for smallpox to be eradicated from the wild. Beyond development of effective broad-spectrum vaccines, there is a lot of work to do to make the world safe from flu.

A high proportion of flu victims are the elderly, so significantly reducing deaths from flu would disproportionately reduce old-age mortality. This is particularly interesting: not only would it be an important milestone in improving old-age public health, it would also be relevant to old-age care and budgeting for retirement.

Influenza Is the Most Likely Source of Future Pandemic Sickness and Mortality

In the U.S., the average number of flu-related deaths in a single season is 30,000-40,000, with some past seasons reaching 47,000 deaths. This does not take account of the viruses that can cause major pandemics: the death toll from the 1918 “Spanish Flu” event reached as high as 50-100 million people worldwide.

Widespread use of a universal vaccine conferring lifelong immunity could eliminate these deaths, making a meaningful contribution to reducing infectious disease mortality.

Structure of the influenza virus, showing haemagglutinin (HA).

The marvel of the new vaccines under development is their potential to confer immunity against many strains, including ones that have not yet emerged. They work by using cutting-edge molecular machinery to target the stem of the haemagglutinin protein on the virus’ surface. The vaccines have only been tested on animal models and only on a small scale so far, but have worked well in reducing viral loads and mortality in these tests.

If this breakthrough translates into future vaccines that prove efficacious in clinical trials, they could become immensely powerful in both combating seasonal flu and reducing the likelihood of new flu pandemics.

Today, beyond seasonal flu, there are no vaccines capable of preventing novel flu pandemics. However, the production pipeline for the current seasonal flu vaccine can be put to use in a pandemic, with current capacity estimated to be able to produce decisive quantities of vaccine within three months of a pandemic outbreak.

As quantified in the RMS Infectious Disease Model, while this current technology has the potential to substantially reduce the total caseload of a pandemic, it is not a panacea. Three months is a relatively long time for highly transmissible viruses, so very large numbers of people could be infected in this interval. Even more infections would occur during the roll-out period, before the vaccine has been given to enough people to halt the spread. Furthermore, complications could emerge during production that either mean it takes longer than three months or that such a vaccine confers only partial immunity.
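
The scale of that problem is easy to see with a toy epidemic calculation (emphatically not the RMS Infectious Disease Model): even a modestly transmissible flu strain, once seeded, can accumulate a very large number of infections before a three-month production window closes. All parameters below are hypothetical.

```python
# Toy SIR illustration (not the RMS Infectious Disease Model) of why a
# three-month vaccine production window matters: a transmissible flu strain
# can infect a huge number of people before doses arrive. Parameters
# (R0, infectious period, seed cases, population) are hypothetical.
POP = 8_000_000_000        # world population, order of magnitude
R0 = 1.5                   # hypothetical pandemic flu reproduction number
INFECTIOUS_DAYS = 4.0      # hypothetical mean infectious period
beta = R0 / INFECTIOUS_DAYS
gamma = 1.0 / INFECTIOUS_DAYS

s, i, r = POP - 1000.0, 1000.0, 0.0   # seed with 1,000 initial cases
for day in range(90):                 # three months before vaccine delivery
    new_infections = beta * s * i / POP
    recoveries = gamma * i
    s -= new_infections
    i += new_infections - recoveries
    r += recoveries

print(f"Cumulative infections after 90 days: {POP - s:,.0f}")
```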

RMS created the world’s first probabilistic model of pandemic influenza and the first probabilistic model of vaccine development, delivery, and efficacy. The recent breakthroughs in flu vaccine research are welcome news and RMS scientists are closely monitoring the developments.

Understanding the Principles of Earthquake Modeling from the 1999 Athens Earthquake Event

The 1999 Athens Earthquake occurred on September 7, 1999, registering a moment-magnitude of 6.0 (USGS). The tremor’s epicenter was located approximately 17km to the northwest of the city center. Its proximity to the Athens Metropolitan Area resulted in widespread structural damage.

More than 100 buildings, including three major factories, collapsed across the area. Overall, 143 people lost their lives and more than 2,000 were treated for injuries in what became Greece’s deadliest natural disaster in almost half a century. The event caused economic losses of $3.5 billion, of which $130 million was insured (AXCO).


Losses from such events can often be difficult to predict; historical experience alone is an inadequate guide to future losses. Earthquake models can assist in effectively managing this risk, but they must take into account the unique features of earthquake hazard, as the 1999 Athens event highlights.

Background seismicity must be considered to capture all potential earthquake events

The 1999 event took Greek seismologists by surprise, as it came from a previously unknown fault. Such events present a challenge to (re)insurers, who may not be aware of the risk to properties in the area and have no historical basis for comparison. Effective earthquake models must not only incorporate events on known fault structures, but also capture the background seismicity. This allows potential events on unknown or complex fault structures to be represented, ensuring that the full spectrum of possible earthquakes is captured.

Hazard can vary greatly over a small geographical distance due to local site conditions

Soil type had significant implications in this event. Athens has grown tremendously as its population has expanded into suburbs with poorer soils, and many industrial areas are concentrated along the alluvial basins of the Kifissos and Ilisos rivers. This has greatly increased the seismic hazard, because such soils amplify the ground motions of an earthquake.

The non-uniform soil conditions across the Athens region resulted in an uneven distribution of severe damage. The town of Adames in particular, located on the eastern side of the Kifissos river canyon, experienced unexpectedly heavy damage, whereas other towns at a similar distance from the epicenter, such as Kamatero, experienced only slight damage. (Assimaki et al., 2005)

Earthquake models must take such site-specific effects into account in order to provide a local view of the hazard. To achieve this, high-resolution geotechnical data, including information on soil type, is used to determine how bedrock ground motions translate into shaking at a specific site, allowing for effective differentiation between risks at the location level.
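
Conceptually, this amounts to scaling the modeled bedrock motion by a factor that depends on the local site condition. The sketch below illustrates the idea only; the soil classes and amplification factors are invented for the example and are not values from the RMS Europe Earthquake Models.

```python
# Minimal sketch of site-condition amplification: scale a bedrock ground-motion
# value by a factor that depends on local soil class. Soil classes and factors
# here are purely illustrative, not values from the RMS Europe Earthquake Models.
ILLUSTRATIVE_AMPLIFICATION = {
    "rock": 1.0,          # stiff sites pass shaking through largely unamplified
    "stiff_soil": 1.3,
    "soft_soil": 1.8,     # soft alluvial deposits amplify shaking the most
}

def site_shaking(bedrock_pga_g: float, soil_class: str) -> float:
    """Toy conversion of bedrock PGA (in g) to surface shaking at a site."""
    return bedrock_pga_g * ILLUSTRATIVE_AMPLIFICATION[soil_class]

for soil in ("rock", "soft_soil"):
    print(f"{soil}: bedrock 0.20 g -> surface {site_shaking(0.20, soil):.2f} g")
```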

Building properties have a large impact upon damageability

The 1999 Athens event resulted in severe structural damage to, and in some cases the partial or total collapse of, a number of reinforced concrete frame structures. Most of these severely damaged structures were designed to older seismic codes and were able to withstand only significantly lower forces than those experienced during the earthquake. (Elenas, 2003)

A typical example of structural damage to a three-story residential reinforced-concrete building at about 8km from the epicentre on soft soil. (Tselentis and Zahradnik, 2000)

Earthquake models must account for such differences in building construction and age. Because local seismic codes and construction practices vary, the vulnerability of structures can differ greatly between countries and regions, and it is important to factor in these geographical contrasts. Earthquake models can capture these differences in building codes through the regionalization of vulnerability.

Additionally, the Athens earthquake predominantly affected low- and mid-rise buildings of two to four stories. The measured spectral acceleration (a measure of the maximum acceleration experienced by a structure of a given natural period during an earthquake) decreased rapidly for buildings of five stories or more, indicating that this particular event did not severely affect high-rise buildings. (Anastasiadis et al., 1999)

A spectral-response-based methodology estimates damage most accurately because it models a building’s actual response to ground motions, a response that is highly dependent on building height. Because low- and mid-rise buildings oscillate, or sway, with shorter natural periods, they respond more strongly to higher-frequency seismic waves such as those generated by the 1999 Athens event; high-rise buildings behave in the opposite way, responding most to long-period seismic waves.
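
A rough rule of thumb makes that height dependence concrete: a frame building’s fundamental period is often approximated as about 0.1 seconds per story (an approximation, and an assumption here, not an RMS model parameter). Buildings whose period falls closest to the dominant period of the shaking are driven nearest to resonance, as the sketch below illustrates with a hypothetical short-period ground motion.

```python
# Rough illustration of why building height governs spectral response. Uses the
# common rule-of-thumb fundamental period of ~0.1 s per story (an approximation,
# not an RMS model parameter) and a hypothetical dominant ground-motion period.
def fundamental_period_s(stories: int) -> float:
    """Very rough estimate of a frame building's fundamental period."""
    return 0.1 * stories

DOMINANT_GROUND_MOTION_PERIOD_S = 0.3   # hypothetical short-period shaking

for stories in (2, 4, 10, 30):
    t = fundamental_period_s(stories)
    # Buildings whose natural period sits near the dominant period of the
    # shaking are driven closest to resonance and respond most strongly.
    ratio = t / DOMINANT_GROUND_MOTION_PERIOD_S
    print(f"{stories:>2} stories: T ~ {t:.1f} s, T/T_ground = {ratio:.1f}")
```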

The key features of the RMS Europe Earthquake Models ensure the accurate modeling of events such as the 1999 Athens Earthquake, providing a tool to effectively underwrite and manage earthquake risk across the breadth of Europe.

Salafi-Jihadists and Chemical, Biological, Radiological, Nuclear Terrorism: Evaluating the Threat

Chemical, biological, radiological, and nuclear (CBRN) weapons attacks constitute a sizeable portion of the terrorism risk confronting the insurance industry. A CBRN attack is most likely to occur in a commercial business center, potentially generating significant business interruption losses due to evacuation and decontamination, in addition to any property damage or casualties. In the past, there was general agreement among leading counter-terrorism experts that the use of a CBRN weapon by a terrorist group was unlikely, because these armaments are expensive, difficult to acquire, and complicated to weaponize and deploy. Moreover, with the operational environment curtailed by national security agencies, it would be a challenge for any group to orchestrate a large CBRN attack, particularly in the West. However, the current instability in the Middle East may have shifted the paradigm of thought about the use of CBRN weapons by a terrorist group. Here are some reasons:

  1. Aspiring Terrorist Groups

The current instability in the Middle East, particularly the conflict in Syria and the ongoing Sunni insurgency in Iraq, has energized salafi-jihadi groups and emboldened their supporters to orchestrate mass-casualty attacks. More harrowing is the fact that salafi-jihadi groups have been linked to several CBRN terrorist attacks: horrific images and witness accounts have led to claims that local Sunni militants used chemical weapons against Kurdish militants in Syria and security forces in Iraq.

U.N. chemical weapons experts prepare before collecting samples from one of the sites of an alleged chemical weapons attack in Damascus’ suburb of Zamalka. (Bassam Khabieh/Reuters)

CBRN attack modes appeal more to religious terrorist groups than to other types of terrorist organizations because, while more “secular” terrorist groups might hesitate to kill many civilians for fear of alienating their support network, religious terrorist organizations tend to regard such violence as not only morally justified but expedient for the attainment of their goals.

In Iraq and Syria, the strongest salafi-jihadi group is the Islamic State, which holds an even more virulent view of jihad than its counterpart al-Qaida. Several American counter-terrorism experts have warned that the Islamic State has been working to build the capability to execute mass-casualty attacks outside its area of operations—a significant departure from the group’s earlier focus on encouraging lone-wolf attacks beyond its domain.

  2. Access to Financial Resources

To compound the threat, the Islamic State has access to extraordinary levels of funding that make the procurement of supplies to develop CBRN agents a smaller hurdle to overcome. A study by Reuters in October 2014 estimated that the Islamic State possesses assets of more than US$2 trillion, with an annual income of US$2.9 billion. While this is a conservative estimate, and much of these financial resources would be allocated to running the organization and maintaining control of its territory, it still offers ample funding for a credible CBRN program.

  3. Increased Number of Safe Havens

Weak or failing states can offer just such a haven, in which terrorist groups can operate freely and shelter from authorities seeking to disrupt their activities. Currently, the Islamic State controls almost 50% of Syria and has seized much of northern Iraq, including the major city of Mosul. The fear is that individuals are working in the Islamic State-controlled campuses of the University of Mosul, or in some CBRN facility in the Syrian city of Raqqa, the group’s de facto capital, to develop such weapons.

  4. Accessibility of a CBRN Arsenal

Despite commendable efforts by the Organization for the Prohibition of Chemical Weapons (OPCW) to eliminate Syria’s CBRN stockpiles, it is still unclear whether the Assad regime has destroyed its entire CBRN arsenal. As such, access to CBRN materials in Syria remains a significant concern, as there are many potential CBRN sites that could be pilfered by a terrorist group. For example, in April 2013, militants in Aleppo targeted the al-Safira chemical facility, a pivotal production center for Syria’s chemical weapons program.

This problem is not limited to Syria. In Iraq, where security and centralized control are also weak, it was reported in July 2014 that Islamic State fighters had seized more than 80 pounds of uranium from the University of Mosul. Although the material was not enriched to the point of constituting a nuclear threat, the radioactive uranium isotopes could have been used to make a crude radiological dispersal device (RDD).

  5. Role of Foreign Jihadists

The Islamic State’s success in attracting foreigners has been unparalleled, with more than 20,000 foreign individuals joining the group. University-educated foreign jihadists potentially provide the Islamic State with a pool of individuals who have the requisite scientific expertise to develop and use CBRN weapons. In August 2014, a laptop owned by a Tunisian physics student fighting with the Islamic State in Syria was found to contain a 19-page document on how to develop bubonic plague from infected animals and weaponize it. Many in the counter-terrorism field are concerned that individuals with such a background could be given a CBRN agent and trained to orchestrate an attack. They might even return to their countries of origin to conduct attacks in their homeland.

Terrorist groups such as the Islamic State continue to show a keen desire to acquire and develop such weapons, and there is enough credible, if anecdotal, information to suggest that the Islamic State has at least a nascent CBRN program. Fortunately, obtaining a CBRN weapon capable of killing hundreds, much less thousands, remains a significant technical and logistical challenge. Al-Qaida tried and failed to acquire such weapons in the past, while counter-terrorism forces globally have devoted significant resources to preventing terrorist groups from making any breakthrough. Current evidence suggests that salafi-jihadists are still far from such capabilities and at best can produce only crude CBRN agents suited to smaller attacks. However, the Islamic State’s sizeable financial resources, its success in recruiting skilled individuals, and the availability of CBRN materials in Iraq and Syria have increased the probability that it could carry out a successful large CBRN attack. As such, it seems a matter not of “if” but of “when” a mass CBRN attack could occur.

Coastal Flood: Rising Risk in New Orleans and Beyond

As we come up on the tenth anniversary of Hurricane Katrina, a lot of the focus is on New Orleans. But while New Orleans is far from being able to ignore its risk, it’s not the most vulnerable to coastal flood. RMS took a look at six coastal cities in the United States to evaluate how losses from storm surge are expected to change from the present day until 2100 and found that cities such as Miami, New York, and Tampa face greater risk of economic loss from storm surge.

To evaluate risk, we compared the likelihood of each city sustaining at least $15 billion in economic losses from storm surge – the amount of loss that would occur if the same area of Orleans Parish were flooded today as was flooded in 2005. We found that while New Orleans still faces significant risk, with a 1-in-440 chance of at least $15 billion in storm surge losses this year, the risk is 1-in-200 in New York, 1-in-125 in Miami, and 1-in-80 in Tampa.

Looking ahead to 2100, those chances increase dramatically. The chance of sustaining at least $15 billion in storm surge losses in 2100 rises to 1-in-315 in New Orleans, 1-in-45 in New York, and 1-in-30 in both Miami and Tampa.
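
These “1-in-N” figures are annual exceedance probabilities, and they compound over time. The sketch below converts the present-day figures quoted above into the chance of at least one $15 billion storm surge loss over a 30-year horizon, under the simplifying assumption that the annual probability stays fixed (in reality, as the 2100 figures show, it is rising).

```python
# Quick sketch: convert the "1-in-N this year" figures quoted above into the
# chance of seeing at least one $15B+ storm surge loss over a 30-year horizon,
# assuming (simplistically) that the annual probability stays fixed.
CITIES_PRESENT_DAY = {"New Orleans": 440, "New York": 200, "Miami": 125, "Tampa": 80}
HORIZON_YEARS = 30

for city, return_period in CITIES_PRESENT_DAY.items():
    annual_p = 1.0 / return_period
    p_over_horizon = 1.0 - (1.0 - annual_p) ** HORIZON_YEARS
    print(f"{city:<12} annual {annual_p:.2%}, over {HORIZON_YEARS} yrs: {p_over_horizon:.0%}")
```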

Due to flood defences implemented since 2005, the risk in New Orleans is not as dramatic as you might think compared with the other coastal cities evaluated. However, the Big Easy faces another problem in addition to rising sea levels – the city itself is sinking. In fact, it is sinking faster than sea levels are rising, meaning flood heights are increasing faster there than in any other city along the U.S. coast.

Our calculations regarding the risk in New Orleans were made on the assumption that flood defences are raised in step with water levels. If mitigation efforts aren’t made, the risk will be considerably higher.

And there is considerable debate within the scientific community over changing hurricane frequency. As risk modelers, we take a measured, moderate approach, so we have not factored potential changes in frequency into our calculations, as there is not yet scientific consensus. However, some take the view that frequency is changing, which would also affect the expected future risk.

What’s clear is that it’s important to understand this changing risk, as storm surge contributes a growing share of hurricane losses.

From Arlene to Zeta: Remembering the Record-Breaking 2005 Atlantic Hurricane Season

Few in the insurance industry can forget the Atlantic hurricane season of 2005. For many, it is indelibly linked with Hurricane Katrina and the flooding of New Orleans. But looking beyond these tragic events, the 2005 season was remarkable on many levels, and the facts are just as compelling in 2015 as they were a decade ago.

In the months leading up to June 2005, the insurance industry was still evaluating the impact of a very active season in 2004. Eight named storms made landfall in the United States and the Caribbean (Mexico was spared), including four major hurricanes in Florida over a six-week period. RMS was engaged in a large 2004-season claims evaluation project as the beginning of the 2005 season approached.

An Early Start

The season got off to a relatively early start with the first named storm—Arlene—making landfall on June 8 as a strong tropical storm in the panhandle of Florida. Three weeks later, the second named storm—Bret—made landfall as a weak tropical storm in Mexico. Although higher than the long-term June average of less than one named storm, June 2005 raised no eyebrows.

July was different.

Climatologically speaking, July is usually one of the quietest months of the entire season, with the long-term average number of named storms at less than one. But in July 2005, there were no fewer than five named storms, three of which were hurricanes. Of these, two—Dennis and Emily—were major hurricanes, reaching categories 4 and 5 on the Saffir-Simpson Hurricane Scale. Dennis made landfall on the Florida panhandle, and Emily made landfall in Mexico. This was the busiest July on record for tropical cyclones.

The Season Continued to Rage

In previous years when there was a busy early season, we comforted ourselves by remembering that there was no correlation between early- and late-season activity. Surely, we thought, in August and September things would calm down. But, as it turned out, 10 more named storms occurred by the end of September—five in each month—including the intense Hurricane Rita and the massively destructive Hurricane Katrina.

In terms of the overall number of named storms, the season was approaching record levels of activity—and it was only the end of September! As the industry grappled with the enormity of Hurricane Katrina’s devastation, there were hopes that October would bring relief. However, it was not to be.

Seven more storms developed in October, including Hurricane Wilma, which had the lowest pressure ever recorded for an Atlantic hurricane (882 mb) and blew through the Yucatan Peninsula as a category 5 hurricane. Wilma then made a remarkable right turn and a second landfall (still as a major hurricane) in southwestern Florida, maintaining hurricane strength as it crossed the state and exited into the Atlantic near Miami and Fort Lauderdale.

We were now firmly in record territory, surpassing the previous most-active season in 1933. The unthinkable had been achieved: The season’s list of names had been exhausted. October’s last two storms were called Alpha and Beta!

Records Smashed

Four more storms were named in November and December, bringing the total for the year to 28 (see Figure 1). By the time the season was over, the Atlantic, Caribbean and Gulf of Mexico had been criss-crossed by storms (see Figure 2), and many long-standing hurricane-season records were shattered: the most named storms, the most hurricanes, the highest number of major hurricanes, and the highest number of category 5 hurricanes (see Table 1). It was also the first time in recorded history that more storms were recorded in the Atlantic than in the western North Pacific basin. In total, the 2005 Atlantic hurricane season caused more than $90 billion in insured losses (adjusted to 2015 dollars).

The 2005 Atlantic Hurricane Season: The Storm Before the Calm

The 2005 season was, in some ways, the storm before the current calm in the Atlantic, particularly as it has affected the U.S. No major hurricane has made landfall in the U.S. since 2005. That’s not to say that major hurricanes have not developed in the Atlantic or that damaging storms haven’t happened—just look at the destruction wreaked by Hurricane Ike in 2008 (over $13 billion in today’s dollars) and by Superstorm Sandy in 2012, which caused more than $20 billion in insured losses. We should not lower our guard.

Figure 1: Number of named storms by month during the 2005 Atlantic hurricane season

Table 1: Summary of the number of named storms in the Atlantic hurricane basin in 2005 and average season activity through 2014
* Accumulated Cyclone Energy (ACE): a measure of the total energy in a hurricane season based on number of storms, duration, and intensity

Figure 2: Tracks of named storms in the 2005 Atlantic hurricane season
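
For readers unfamiliar with the ACE metric referenced in the Table 1 footnote, it is calculated by summing the squares of a storm’s 6-hourly maximum sustained winds (in knots) while the storm is at tropical-storm strength or above, scaled by 10^-4. The sketch below applies that formula to a hypothetical wind history, not an actual 2005 track.

```python
# Sketch of the Accumulated Cyclone Energy (ACE) calculation: sum the squares of
# 6-hourly maximum sustained winds (in knots) for periods at tropical-storm
# strength or above, scaled by 10^-4. The wind values below are hypothetical,
# not an actual 2005 storm track.
def ace(six_hourly_winds_kt: list[float]) -> float:
    """ACE contribution of one storm, in units of 10^4 kt^2."""
    return sum(v ** 2 for v in six_hourly_winds_kt if v >= 35.0) * 1e-4

hypothetical_track = [30, 40, 55, 70, 90, 110, 95, 65, 45, 30]  # knots, every 6 hours
print(f"ACE for this hypothetical storm: {ace(hypothetical_track):.2f}")
```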

“Super” El Niño – Fact vs. Fiction

The idea of a “super” El Niño has become a hot topic, with many weighing in. What’s drawing all of this attention is the forecast of an unusually warm phase of the El Niño Southern Oscillation (ENSO). Scientists believe that this forecasted El Niño phase could be the strongest since 1997, bringing intense weather this winter and into 2016.

Anomalies represent deviations from normal temperature values, with unusually warm temperatures shown in red and unusually cold anomalies shown in blue. Source: NOAA

It’s important to remember the disclaimer “could.” With all of the information out there, I thought it was a good time to sort through the news and try to separate fact from fiction regarding a “super” El Niño. Here are some of the things that we know—and a few others that don’t pass muster.

Fact: El Niño patterns are strong this year

Forecasts and models show that El Niño is strengthening. Meteorologist Scott Sutherland wrote on The Weather Network that there is a 90 percent chance that El Niño conditions will persist through winter and an over 80 percent chance that it will still be active next April. Forecasts say El Niño will be significant, “with sea surface temperatures likely reaching at least 1.5°C (2.7°F) above normal in the Central Pacific – the same intensity as the 1986/87 El Niño (which, coincidentally also matches the overall pattern of this year’s El Niño development).”

A “strong” El Niño is identified when the Oceanic Niño Index (ONI), an index tracking the average sea surface temperature anomaly in the Niño 3.4 region of the Pacific Ocean over a three-month period, is above 1.5°C. A “super” El Niño, like the one seen in 1997/98, is associated with an ONI above 2.0°C. The ONI for the latest May-June-July period was 1.0°C, indicating that the El Niño conditions currently present are of “moderate” strength, with the model forecast consensus for the peak anomaly at around 2.0°C.
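
Taken together, these thresholds give a simple classification of ENSO strength from the ONI. The sketch below encodes the 1.5°C “strong” and 2.0°C “super” levels quoted above; the weak and moderate boundaries beneath them are common conventions and an assumption here.

```python
# Minimal sketch of classifying El Niño strength from the Oceanic Niño Index
# (ONI), using the "strong" (1.5°C) and "super" (2.0°C) thresholds quoted above.
# The weak and moderate boundaries below those are common conventions and an
# assumption here.
def classify_enso(oni_c: float) -> str:
    """Map a three-month ONI anomaly (in degrees C) to a rough ENSO category."""
    if oni_c >= 2.0:
        return "super El Niño"
    if oni_c >= 1.5:
        return "strong El Niño"
    if oni_c >= 1.0:
        return "moderate El Niño"
    if oni_c >= 0.5:
        return "weak El Niño"
    if oni_c <= -0.5:
        return "La Niña"
    return "neutral"

for oni in (1.0, 1.6, 2.1):   # e.g., the May-June-July reading and forecast peaks
    print(f"ONI {oni:+.1f} C -> {classify_enso(oni)}")
```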

Fiction: A “super” El Niño is a cure-all for drought plaguing Western states

Not necessarily. The conventional wisdom is that a “super” El Niño means more rain for drought-ravaged California, and a potential end to water woes that have hurt the state’s economy and even made some consider relocation. But, we don’t know exactly how this El Niño will play out this winter.

Will it be the strongest on record? Will it be a drought buster?

Some reports suggest that a large pool of warm water in the northeast Pacific Ocean and a persistent high-pressure ridge over the West Coast of the U.S., which has driven dry, hot conditions, could hamper drought-busting rain.

The Washington Post has a good story detailing why significant rain from a “super” El Niño might not pan out for the Golden State.

And if the rain does come, could it have devastating negative impacts? RMS’ own Matthew Nielsen recently wrote an article in Risk and Insurance regarding the potential flood and mudslide consequences of heavy rains during an El Niño.

Another important consideration is El Niño’s impact on the Sierra snowpack, a vital source of California’s water reserves. Significant uncertainty exists around when and where snow would fall, or even whether the warm temperatures associated with El Niño would allow for measurable snowpack accumulation. Without the snowpack, the rainwater falling during an El Niño would only be a short-term fix for a long-term problem.

Fact: It’s too early to predict doomsday weather

There are a vast number of variables needed to produce intense rain, storms, flooding, and other severe weather patterns. El Niño is just one piece of the puzzle. As writer John Erdman notes, “El Niño is not the sole driver of the atmosphere at any time. Day-to-day variability in the weather pattern, including blocking patterns, forcing from climate change and other factors all work together with El Niño to determine the overall weather experienced over the timeframe of a few months.”

Fiction: A “super” El Niño will cause a mini ice age

This theory has appeared around the Internet, on blogs, and peppered through social media. While some similarities between El Niño weather patterns and ice-age conditions were reported more than a decade ago, you can’t assume we’re closing in on another big chill. The El Niño cycle repeats every three to 10 years; shifts to an ice age occur over millennia.

What other Super El Niño predictions have you heard this year? Share and discuss in the comments section.