See How Quickly and Easily You Can Access the Exposure Metrics That Matter

Exposure Manager is a risk management solution that provides executives, underwriters, risk analysts, and other decision-makers with the exposure analytics needed to gain a comprehensive view of risk and understand loss potential.

As the first solution released on the RMS(one) platform, Exposure Manager was developed based on the understanding that organizations not only need quick and reliable assessments of exposure concentrations, but also the right tools to ensure they can access key metrics and insights.

The videos below illustrate two key capabilities: building portfolio intuition faster and quickly accessing the metrics that matter most.

Build Portfolio Intuition Faster shows how Exposure Manager enables customers to derive deeper portfolio insights quickly and efficiently through an intuitive, user-friendly interface.

[Video: Build Portfolio Intuition Faster]

Exposure Manager’s customizable interface conveys the information that is most important to each user, and its analytics – enabled by an intuitive, best-in-class user experience – can be configured without knowledge of SQL or support from IT.

This makes it easy for customers to generate quick portfolio insights or perform a deep dive into their book for more detailed assessments.

Access Metrics That Matter shows how Exposure Manager leverages the RMS financial model to provide an exposed limit metric. This offers a consistent view of loss potential to enable precise identification of loss drivers.

[Video: Access Metrics That Matter]
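
To make the exposed limit idea concrete, here is a toy roll-up of policy limits by region – an illustration only, not the RMS financial model; the field names and figures are hypothetical, and a real calculation would also account for deductibles, attachment points, and reinsurance.

```python
# Illustrative only: a toy roll-up of "exposed limit" by region.
# Field names and figures are hypothetical; a real financial model would
# also net out deductibles, attachment points, and reinsurance.
from collections import defaultdict

policies = [
    {"region": "Wellington",   "limit": 5_000_000},
    {"region": "Wellington",   "limit": 2_000_000},
    {"region": "Christchurch", "limit": 3_000_000},
]

exposed_limit = defaultdict(float)
for policy in policies:
    # Treat the full policy limit as exposed within its region.
    exposed_limit[policy["region"]] += policy["limit"]

for region, total in sorted(exposed_limit.items()):
    print(f"{region}: {total:,.0f}")
```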

The flexible interface provides users with precise control to quickly make informed decisions about their book and help identify threats and opportunities in the portfolio.

All of these benefits allow customers to develop a sharper, more incisive view of their portfolio.

Earthquake Hazard: What Has New Zealand’s Kaikoura Earthquake Taught Us So Far?

The northeastern end of the South Island is a tectonically complex region, with plate motion accommodated primarily through a series of crustal faults. On November 14, as the Kaikoura earthquake shaking began, multiple faults ruptured in a single event, culminating in an Mw 7.8 earthquake (as reported by GNS Science).

The last two weeks have been busy for earthquake modelers. The paradox of our trade is that while we exist to help avoid the damage this natural phenomenon causes, the only way we can fully understand this hazard is to see it in action so that we can refine our understanding and check that our science provides the best view of risk. Since November 14 we have been looking at what Kaikoura tells us about our latest, high-definition New Zealand Earthquake model, which was designed to handle such complex events.

Multiple-Segment Ruptures

With the Kaikoura earthquake’s epicenter at the southern end of the identified faults, the rupture moved from south to north along this series of interlinked faults (see graphic below). Multi-fault rupture is not unique to this event; the same process occurred during the 2010 Mw 7.1 Darfield earthquake. Such ruptures are important to consider in risk modeling because they produce events of larger magnitude – and therefore affect a larger area – than individual faults would on their own.
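
Why joint ruptures raise magnitude comes down to arithmetic: seismic moments add, while magnitude is logarithmic in moment. A back-of-envelope note using the standard moment magnitude relation (a general seismological result, not specific to any model):

```latex
% Moment magnitude from seismic moment M_0 (in newton-meters):
M_w = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right)
% Moments add, so if n similar segments rupture together the combined
% moment is n M_0 and the magnitude increase is:
\Delta M_w = \tfrac{2}{3}\log_{10} n
% e.g. two segments give about +0.2 magnitude units; ten give about +0.67.
```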

Map showing the faults identified by GNS Science as experiencing surface fault rupture in the Kaikoura earthquake.
Source: http://info.geonet.org.nz/display/quake/2016/11/16/Ruptured+land%3A+observations+from+the+air

In keeping with the latest scientific thinking, the New Zealand Earthquake HD Model provides an expanded suite of events that represent complex ruptures along multiple faults. For now, these are included only for high-slip fault segments in regions with exposure concentrations, but their addition increases the robustness of the tail of the Exceedance Probability (EP) curve, giving clients a better view of the risk from the most damaging but lower-probability events.
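
To see why adding rare, high-loss events firms up the tail, here is a minimal sketch of how an occurrence EP curve can be assembled from an event loss table, assuming independent Poisson event occurrence; the rates and losses are invented for illustration, not drawn from any RMS model.

```python
# Minimal sketch: build an occurrence exceedance probability (OEP) curve
# from a toy event loss table. Rates and losses are invented; this is
# not the RMS implementation.
import math

# (annual rate, loss) pairs - the large-loss, low-rate entries at the
# end are the kind of multi-fault events an expanded suite adds.
event_loss_table = [
    (0.10,  1e8),
    (0.02,  5e8),
    (0.005, 2e9),
    (0.001, 8e9),   # complex multi-segment rupture
]

def exceedance_probability(threshold):
    """Annual probability of at least one event with loss >= threshold,
    assuming independent Poisson occurrence of each event."""
    rate = sum(r for r, loss in event_loss_table if loss >= threshold)
    return 1.0 - math.exp(-rate)

for t in (1e8, 5e8, 2e9, 8e9):
    print(f"loss >= {t:.0e}: annual exceedance prob = {exceedance_probability(t):.4f}")
```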

Landsliding and Liquefaction

While most property damage has been caused directly by shaking, infrastructure has been heavily impacted by landsliding and, to a lesser extent, liquefaction. Landslides and slumps have occurred across the region, most notably over Highway 1, an arterial route. The infrastructure impacts of the Kaikoura earthquake are a likely dress rehearsal for the expected event on the Alpine Fault. This major fault runs 600 km along the western coast of the South Island and is expected to produce an Mw 8+ event with a probability of 30% in the next 50 years, according to GNS Science.
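
As a rough check on what that probability implies, assuming earthquake occurrence follows a Poisson process:

```latex
% Converting "30% in 50 years" to an annual rate, assuming Poisson occurrence:
P(\text{at least one event in } T \text{ years}) = 1 - e^{-\lambda T}
\quad\Rightarrow\quad
\lambda = -\frac{\ln(1 - 0.30)}{50} \approx 0.0071 \text{ per year}
% i.e. a mean recurrence interval of roughly 140 years.
```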

As many as 80,000 to 100,000 landslides have been reported in the upper South Island, with some creating temporary dams across rivers and, in some cases, temporary lakes (see below). These dams can fail catastrophically, sending a sudden surge of water downstream.

Examples of rivers blocked by landslides photographed by GNS Science researchers.

Source: http://info.geonet.org.nz/display/quake/2016/11/18/Landslides+and+Landslide+dams+caused+by+the+Kaikoura+Earthquake

Liquefaction occurred in discrete areas across the region impacted by the Kaikoura earthquake. The Port of Wellington experienced both lateral and vertical deformation likely due to liquefaction processes in reclaimed land. There have been reports of liquefaction near the upper South Island towns (Blenheim, Seddon, Ward), but liquefaction will not be a driver of loss in the Kaikoura event to the extent it was in the Christchurch earthquake sequence.

RMS’ New Zealand Earthquake HD Model includes a new liquefaction component that was derived using the immense amount of new borehole data collected after the Canterbury Earthquake Sequence in 2010-2011. This new methodology considers additional parameters, such as depth to the groundwater table and soil-strength characteristics, that lead to better estimates of lateral and vertical displacement. The HD model is the first probabilistic model with a landslide susceptibility component for New Zealand.

Tsunami

The Kaikoura earthquake generated tsunami waves that were observed at 2.5 m in Kaikoura, 1 m in Christchurch, and 0.5 m in Wellington. The waves arrived in Kaikoura significantly earlier than in Christchurch and Wellington, indicating that the tsunami was generated near Kaikoura. They were likely generated by offshore faulting, but may also be associated with submarine landsliding. Fortunately, the scale of the tsunami waves did not produce significant damage. RMS’ latest New Zealand Earthquake HD Model captures tsunami risk due to local ocean-bottom deformation caused by fault rupture, and is the first model in the New Zealand market to do so using a fully hydrodynamic model.

Next Generation Earthquake Modeling at RMS

Thankfully the Kaikoura earthquake seems to have produced less damage than we might have seen had it struck a more heavily populated area of New Zealand with greater exposures – for detail on damage, please see my other blog on this event.

But what Kaikoura has shown us is that our latest HD model offers an advanced view of risk. Released only in September 2016, it was designed to handle complex events such as the Kaikoura earthquake, featuring multiple-segment ruptures, a new very-high-resolution liquefaction model, and the first landslide susceptibility model for New Zealand.

New Zealand’s Kaikoura Earthquake: What Have We Learned So Far About Damage?

The Kaikoura earthquake of November 14 occurred in a relatively low-population region of New Zealand, situated between Christchurch and Wellington. The largest town close to the epicentral region is Blenheim, with a population of about 30,000.

Early damage reports indicate there has been structural damage in the northern part of the South Island as well as to numerous buildings in Wellington. While most of this has been caused directly by shaking, infrastructure and ports across the affected region have been heavily impacted by landsliding and, to a lesser extent, liquefaction. Landslides and slumps have occurred across the northeastern area of the South Island, most notably over Highway 1, severing land routes to Kaikoura – a popular tourist destination.

The picture of damage is still unfolding as access to badly affected areas improves. At RMS we have been comparing what we have learned from this earthquake to the view of risk provided by our new, high-definition New Zealand Earthquake model, which is designed to improve damage assessment and loss quantification at location-level resolution.

No Damage to Full Damage

The earthquake shook a relatively low-population area of the South Island and, while it was felt keenly in Christchurch, there have been no reports of significant damage in the city. The earthquake ruptured approximately 150 km along the coast, propagating north towards Wellington. The capital experienced ground shaking intensities at the threshold for damage, producing façade and internal non-structural damage in the central business district. Although the shaking intensities were close to those experienced during the Cook Strait sequence in 2013, which mostly affected short and mid-rise structures, the longer duration and different frequency content of the larger-magnitude Kaikoura event caused more damage to taller structures, which have longer natural periods.
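
The link between height and period can be made concrete with a common engineering rule of thumb (an approximation, not how any particular model computes structural response):

```latex
% A common rule of thumb for a building's fundamental period
% (N = number of stories):
T \approx 0.1\,N \ \text{seconds}
% A 2-story house responds near T of 0.2 s, while a 15-story tower
% responds near 1.5 s, so longer-period ground motion from a larger,
% longer-duration event loads taller buildings more heavily.
```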

From: Wellington City Council

Within Wellington, cordons are currently in place around a few buildings in the CBD (see above) as engineers carry out more detailed inspections. Some are being demolished or are set to be, including a nine-story structure on Molesworth Street and three city council buildings. It should be noted that most of the damage has been to buildings on reclaimed land close to the harbor where ground motions were likely amplified by the underlying sediments.

From: http://www.stuff.co.nz/national/86505695/quakehit-wellington-building-at-risk-of-collapse-holds-up-overnight; The building on Molesworth street before the earthquake (L) and after on November 16 (R).

Isolated incidences of total damage in an area of otherwise minor damage demonstrate why RMS is moving to the new HD financial modeling framework. The RMS RiskLink approach applies a low mean damage ratio across the whole area, whereas RMS HD damage functions allow for zero or total loss – as well as a distribution in between, which is sampled for each event at each location. The HD financial modeling framework is thus able to capture a more realistic pattern of gross losses.
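
A minimal sketch of that sampling idea follows, with invented mixture weights and beta parameters rather than the calibrated RMS damage functions:

```python
# Minimal sketch of HD-style damage sampling: instead of applying one
# mean damage ratio everywhere, sample each location's damage from a
# distribution with probability mass at 0 (no damage) and 1 (total
# loss) plus a continuous spread in between. All parameters are
# invented for illustration.
import random

def sample_damage_ratio(p_zero=0.80, p_total=0.02, alpha=2.0, beta=5.0):
    u = random.random()
    if u < p_zero:
        return 0.0                           # most buildings: no loss
    if u < p_zero + p_total:
        return 1.0                           # a few: total loss
    return random.betavariate(alpha, beta)   # the rest: partial damage

random.seed(42)
values = [100_000] * 10_000                  # toy portfolio of building values
losses = [v * sample_damage_ratio() for v in values]

mean_ratio = sum(losses) / sum(values)
total_losses = sum(1 for loss in losses if loss == 100_000)
print(f"portfolio mean damage ratio: {mean_ratio:.3f}")
print(f"locations with total loss:   {total_losses}")
```

Run over a large portfolio, the sampled losses reproduce a low mean damage ratio while still producing the occasional total loss that a single fixed ratio cannot.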

Business Interruption

The Kaikoura earthquake will produce business interruption losses from a variety of causes, such as direct property or contents damage, relocation costs, or loss of access to essential services (e.g., power and water utilities, information technology) that cripple operations in otherwise structurally sound buildings. How quickly businesses are able to recover depends on how quickly these utilities are restored. Extensive landslide damage to roads means access to Kaikoura itself will be restricted for months. The New Zealand government has announced financial assistance packages for small businesses to help them through the critical period immediately after the earthquake. Similar assistance was provided to businesses in Christchurch after the Canterbury Earthquake Sequence in 2010-2011.

That earthquake sequence and others around the world have provided valuable insights on business interruption, allowing our New Zealand Earthquake HD model to better capture these impacts. For example, during the Canterbury events, lifelines were found to be repaired much more quickly in urban areas than in rural areas, and areas susceptible to liquefaction were associated with longer down times due to greater damage to underground services. The new business interruption model provides a more accurate assessment of these risks by accounting for the influence of both property and contents damage as well as lifeline downtime.

It remains to be seen how significant any supply chain or contingent business interruption losses will be. Landslide damage to the main road and rail route from Christchurch to the inter-island ferry terminal at Picton has disrupted supply routes across the South Island. Alternative, longer routes with less capacity are available.

Next Generation Earthquake Modeling at RMS

RMS designed the update to its New Zealand Earthquake High Definition (HD) model, released in September 2016, to enhance location-level damage assessment and improve the gross loss quantification with a more realistic HD financial methodology. The model update was validated with billions of dollars of claims data from the 2010-11 Canterbury Earthquake Sequence.

Scientific and industry lessons learned following damaging earthquakes such as last month’s in Kaikoura and the earlier events in Christchurch increase the sophistication and realism of our understanding of earthquake risk, allowing communities and businesses to adapt – and so become more resilient to future catastrophic events.

Prudential Regulation Authority on the Challenges Facing Cyber Insurers

Most firms lack clear strategies and appetites for managing cyber risk, with a shortage of cyber domain knowledge noted as a key area of concern. So said the Prudential Regulation Authority, the arm of the Bank of England which oversees the insurance industry, in a letter to CEOs last week.

This letter followed a lengthy consultation with a range of stakeholders, including RMS, and identified several key areas where insurance firms could and should improve their cyber risk management practices. It focussed on the two distinct types of cyber risk: affirmative and silent.

Affirmative cover is explicit cyber coverage, offered either as a stand-alone policy or as an endorsement to more traditional lines of business. Silent risk arises where cover is provided “inadvertently” through a policy that was never designed for it. But this isn’t the only source of silent risk: it can also leak into policies where existing exclusions are not completely exhaustive. A good example is policies with NMA 2914 applied, which excludes cyber losses except where a cyber-attack causes physical damage (e.g., by fire or explosion).

The proliferation of this silent risk across the market is highlighted by the PRA as one of its key areas of concern. It believes the risk is not only material but likely to increase over time, with the potential to cause losses across a wide range of classes – a sentiment we at RMS would certainly echo.

The PRA intervention shines a welcome spotlight and adds to the growing pressure on firms to do more to improve their cyber risk management practices. These challenges have faced the market for some time – but how do we help the industry address them?

The PRA suggests firms with cyber exposure should have a clearly defined strategy and risk appetite owned by the board and risk management practices that include quantitative and qualitative elements.

At RMS our cyber modeling has focussed on providing precisely this insight, helping many of the largest cyber writers to quantify both their silent and affirmative cyber risk, thus allowing them to focus on growing cyber premiums.

If you would like to know more about the RMS Cyber Accumulation Management System (released February 2016), please contact cyberrisk@rms.com.

Shrugging Off a Hurricane: A Three Hundred Year Old Culture of Disaster Resilience

If a global prize were to be awarded to the city or country that achieves the peak of disaster resilience, Bermuda might be a fitting first winner.

This October’s Hurricane Nicole made direct landfall on the island. The eyewall tracked over Bermuda with maximum measured windspeeds close to 120 mph. Nonetheless, there were no casualties. The damage tally was principally fallen trees, roadway debris, some smashed boats, and many downed utility poles. The airport reopened within 24 hours, with the island’s ferries operating the following day.

Bermuda’s performance through Nicole was exemplary. What’s behind that?

Since its foundation in 1609, when 150 colonists and crew were shipwrecked on the island, Bermuda has grown used to its situation at the heart of hurricane alley. The island comprises 21 square miles of reef and lithified dunes, sitting out in the Atlantic some 650 miles east of Cape Hatteras, and a hurricane hits it on average once every six or seven years. Mostly these are glancing blows, but once or twice a century Bermuda sustains a direct hit at Category 3 or 4 intensity. Hurricane Fabian in 2003 was the worst of the recent storms, causing $300 million of damage (estimated to be worth $650 million today, accounting for higher prices and greater property exposure). The cost of the damage from Hurricane Gonzalo in 2014 was about half this amount.

How did Bermuda’s indigenous building style come to adopt such a high standard of wind resistance? It seems to go back to a run of four hurricanes at the beginning of the 18th century. First, in September 1712, a hurricane persisted for eight hours, destroying the majority of wooden buildings. Then twice in 1713, and again more strongly in 1715, hurricane winds ruined the newly rebuilt churches. One hurricane can seem like an exception; four become a trend. In response, houses were constructed with walls of massive reef limestone blocks, covered by roofs tiled with thick slabs of coral stone – traditional house styles that have been sustained ever since.

The frequency of hurricanes has helped stress-test the building stock and ensure the traditional construction styles have endured. More recently, a robust and well-policed building code has ensured adequate wind resistance for all new construction on the island.

Yet resilience is more than strong buildings. It also requires hardened infrastructure, and that is where Bermuda has some room for improvement. The island is still dependent on overhead power lines, and 90 percent of its 27,000 houses lost power in Hurricane Nicole – although half of these had been reconnected by the following morning and the remainder through that day. Mobile phone and cable networks were also back in operation over a similar timescale. Experience of recent hurricanes has ensured an adequate stockpile of cable and poles.

Expert Eyes on the Island

It helps that there is an international reinsurance industry on the island, with many specialists in the science of hurricanes and the physics and engineering of building performance on hand to scrutinize the application of improved resilience. Almost every building is insured, giving underwriters oversight of building standards. Most importantly, the very functioning of global reinsurance depends on uninterrupted connection with the rest of the world, as well as on ensuring that on-island staff are not distracted by having to attend to their families’ welfare.

Bermuda’s experience during Nicole would merit the platinum standard of resilience adopted by the best businesses: that all functions can be restored within 72 hours of a disaster. The Bermuda Business Development Agency and the Association of Bermuda Insurers and Reinsurers were effusive in their praise for how the island had withstood the hurricane. The strong and widely owned culture of preparedness reflects the experience of recent storms like Gonzalo and Fabian.

Stephen Weinstein, general counsel at RenaissanceRe, commented: “It’s remarkable that one day after a major hurricane strike, Bermuda is open for business, helping finance disaster risk worldwide, and poised to welcome back business visitors and vacationers alike.”

In early 2017, RMS will issue an update to Bermuda wind vulnerability in the version 17 software release as part of a broader update to the 33 islands and territories covered by the North Atlantic Hurricane Models. Updates to Bermuda vulnerability will consider past hurricane observations and the latest building code research.

Terrorism Insurance Under a Trump Presidency

It is likely that very few of the 60 million U.S. citizens who voted for Donald Trump did so because of his stance on terrorism insurance – if only because terrorism insurance is too arcane an issue to have come up in the presidential debates. However, many of the nation’s wavering voters may have been swayed by his pledge to make America safer from the scourge of terrorism. Under his presidency, border security will surely be tightened – even if no frontier wall is ever built – and changes made to entry decisions for Syrian Muslim refugees seeking to enter the United States.

Reauthorization of TRIA – Talks Start in 2018

On January 12, 2015, the Terrorism Risk Insurance Program Reauthorization Act of 2015 was signed into law by President Obama. This third extension of the original 2002 Terrorism Risk Insurance Act (TRIA) will sunset at the end of 2020, coinciding with the end of the first term of the Trump presidency. In the drafting of the 2015 reauthorization bill, detailed consideration was given by the House Financial Services Committee to alternative wordings that would have reduced the coverage provided by the U.S. government insurance backstop. One such alternative would have focused U.S. government involvement in the terrorism insurance market on covering terrorism losses from extreme attacks using weapons of mass destruction. When the future of terrorism risk insurance is raised once more on Capitol Hill in 2018, the Republican White House and Congress are likely to seek to further extend the private terrorism insurance market – though I consider this contingent on President Trump keeping his pledge to keep America safe until then.

Balancing Civil Liberties in the Face of Reducing Terrorism Risk

In the democracies of the western alliance, the balance between keeping people safe from terrorism and preserving civil liberties is a much-debated issue. After the July 2005 London transport bombings, the head of the British security service, MI5, warned that ‘there needs to be a debate on whether some erosion of civil liberties may be necessary to improve the chances of our citizens not being blown apart as they go about their daily lives’. A similar debate played out on a national scale across America during the 2016 U.S. presidential election. It may seem that in this instance the champion of civil liberties, minority rights, and political correctness lost to the conservative advocate of oppressive counter-terrorism action and profiling of terrorist suspects.

Regardless of who occupies the White House, however, terrorist plots against the U.S. will persist, and terrorists must be stopped before they move on their attack targets. Success in interdicting these plots depends crucially on intelligence gathered from electronic surveillance. It is well documented that more intrusive surveillance can increase the chances of lone wolf plots being stopped. And President-elect Trump has already affirmed his readiness to authorize more surveillance. He can claim a public mandate for this: for America to be great again, it has to be safe again – even from lone wolf terrorist plots. After the Orlando nightclub attack on June 12, 2016, perpetrated by the radicalized son of an Afghan immigrant, Donald Trump said that ‘we cannot afford to be politically correct anymore’. And in fighting global Islamist extremism vigorously, he may be able to count on President Putin’s support. While the two leaders differ on geopolitics, their mutual respect as presidents may be maintained through abrasive counter-terrorism action.

When Michael Chertoff was appointed Secretary of Homeland Security, President George W. Bush told him not to let 9/11 happen again – and he didn’t. President-elect Trump will expect a similarly impressive clean sheet. On a more personal level, he also has a special interest in increased security against terrorist attacks: his own real estate empire includes some notable potential terrorist targets, including high-profile landmark buildings bearing his name. While security at the New York Stock Exchange is too tight for it to be attacked, the Trump Building on Wall Street has easy public access. There are numerous opportunities for terrorist target substitution.

New Zealand Earthquake – Early Perspectives

On Monday, November 14, 2016, Dr. Robert Muir-Wood, RMS chief research officer and a specialist in earthquakes and catastrophe risk management, made the following observations about the earthquake near Amberley:

SCALE
“The November 13 (UTC) earthquake was assigned a magnitude 7.8 by the United States Geological Survey. That makes it more than fifty times bigger than the February 2011 earthquake which occurred directly beneath Christchurch. However, it was still around forty times smaller than the Great Tohoku earthquake off the northeast coast of Japan in March 2011.”
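
The “times bigger” comparisons follow from the logarithmic scaling of radiated seismic energy with magnitude; the exact ratios depend on the magnitudes assumed:

```latex
% Radiated seismic energy scales with moment magnitude roughly as:
E_2 / E_1 \approx 10^{\,1.5\,(M_2 - M_1)}
% Kaikoura (Mw 7.8) vs. February 2011 Christchurch (Mw 6.3):
10^{\,1.5 \times 1.5} \approx 180
% Tohoku 2011 (Mw 9.0) vs. Kaikoura (Mw 7.8):
10^{\,1.5 \times 1.2} \approx 63
```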

CASUALTIES, PROPERTY DAMAGE & BUSINESS INTERRUPTION
“Although it was significantly bigger than the Christchurch earthquake, the source of the earthquake was further from major exposure concentrations. The northeast coast of South Island has a very low population and the earthquake occurred in the middle of the night when there was little traffic on the coast road. Characteristic of such an earthquake in steep mountainous terrain, there have been thousands of landslides, some of which have blocked streams and rivers – there is now a risk of flooding downstream when these “dams” break.

In the capital city, Wellington, liquefaction and slumping on man-made ground around the port has damaged some quays and made it impossible for the ferry that runs between North and South Island to dock. The most spectacular damage has come from massive landslides blocking the main coast road Highway 1 that is the overland connection from the ferry port opposite Wellington down to Christchurch. This will take months or even years to repair. Therefore it appears the biggest consequences of the earthquake can be expected to be logistical, with particular implications for any commercial activity in Christchurch that is dependent on overland supplies from the north. As long as the main highway remains closed, ferries may have to ship supplies down to Lyttelton, the main port of Christchurch.”

SEISMOLOGY
“The earthquake appears to have occurred principally along the complex fault system in the north-eastern part of the South Island, where the plate tectonic motion between the Pacific and Australian plates transfers from subduction along the Hikurangi Subduction Zone to strike-slip along the Alpine Fault System. Faults in this area strike predominantly northeast-southwest and show a combination of thrust and strike-slip motion. From its epicenter, about 200 km from the capital city Wellington, the rupture unzipped towards the northeast for about 100-140 km.”

WHAT NOW?
“Given the way the rupture spread to the northeast there is some potential for a follow-on major earthquake on one of the faults running beneath Wellington. The chances of a follow-on major earthquake are highest in the first few days after a big earthquake, and tail off exponentially. Aftershocks are expected to continue to be felt for months.”

MODELING
“These events occurred on multiple fault segments in close proximity to one another. The technology to model this type of complex rupture is now available in the latest RMS high-definition New Zealand Earthquake Model (2016), in which fault segments may interconnect under certain conditions.”

India’s Need for Disaster Risk Reduction: Can it Turn a Plan into Action?

This was the first time I’d ever heard a Prime Minister praise the benefits of “risk mapping.” Mid-morning on Thursday, November 3, in a vast tent in the heart of New Delhi, the Indian Prime Minister, Narendra Modi, was delivering an introductory address to welcome four thousand delegates to the 2016 Asian Ministerial Conference on Disaster Risk Reduction.

Modi mentioned his own personal experience of disaster recovery after the 2001 Gujarat earthquake in which more than 12,000 people died, before presenting a ten-point plan of action in response to the 2015 Sendai Framework for disaster risk reduction. There were no guarantees of new regulations or changes in policy, but three of his ten points were particularly substantive.

First, there was a call for appropriate protections against the relevant hazards at each location to be applied to all government-sponsored construction of infrastructure or housing. Second, he called for “work towards” achieving universal “coverage” (insurance, if not by name?) against disasters – from the poorest villager to big industries and state governments. Third, he called for standardized hazard and risk mapping to be developed not only for earthquake but for other perils: chemical hazards, cyclones, all varieties of floods, and forest fires.

More Economic Development Means More Exposure to Risk

India is at a development threshold, comparable to that reached by Japan at the end of the 1950s and China in the 1990s. Rapid economic growth has led to a dramatic expansion of building and value in harm’s way, and there now needs to be a significant compensatory focus on measures to reduce risk and expand protections, whether through insurance systems or flood walls. Development in India has been moving too fast to hope that adequate building standards are being consistently followed – there are not enough engineers or inspectors.

The Chennai floods at the end of 2015 highlight this disaster-prone landscape. Heavy end-of-year monsoonal downpours fell onto ground already saturated by weeks of rainfall; the runoff, ponded by choked drainage channels and illegal development, swamped hundreds of thousands of buildings along with roads and even the main airport. The city was cut off, economic losses totaled billions of U.S. dollars, and more than 1.8 million people were displaced.

Sorting out Chennai will take co-ordinated government action and money: to implement new drainage systems, relocate or raise those at highest risk, and apply flood zonations. Chennai provides a test of whether Disaster Risk Reduction really is a priority, as Mr. Modi’s speech suggested. The response will inevitably encounter opposition from those who cannot see why they should be forced to relocate or pay more in taxes to construct flood defenses.

The one community notably missing from Prime Minister Modi’s call to action was the private sector, even though a pre-conference session the day before, organized by the Federation of Indian Chambers of Commerce and Industry (FICCI), had identified that 80% of construction was likely to be privately financed.

I gave two talks at the conference – one in the private sector session – on how modelers like RMS have taken a lead in developing those risk maps and models for India, including high resolution flood models that will help extend insurance. Yet armed with information by which to differentiate risk and identify the hot spots, the government may need to step in and provide its own coverages for those deemed too high risk by private insurers.

Auditing Disaster Risk Reduction with Cat Models

In a side meeting at the main conference I presented on the need for independent risk audits of states and cities to measure progress in achieving their disaster risk reduction goals, in particular for earthquake mortality. Experience from the last few decades gives no perspective on the true risk of potentially large and destructive future earthquakes in India; this is where probabilistic catastrophe models are invaluable. The Nepal earthquake of 2015 highlighted the significant vulnerability of ordinary brick and concrete buildings in the region.

I came away seeing the extraordinary opportunity to reduce and insure risk in India, if ten-point lists can truly be converted into co-ordinated action.

Meanwhile, as if to test the government’s resolve, in the days leading up to the conference Delhi was shrouded in its worst-ever smog: a toxic concoction of traffic fumes, coal smoke, and Diwali fireworks, enriched to extremely dangerous levels of micro-particles. The smog was so thick and pervasive that it seeped inside buildings, prompting several attendees to ask why it was not itself being classified and treated as a true “manmade disaster.”

Hurricane Drought Over? Not So Fast

After a relatively quiet start, the 2016 Atlantic hurricane season grabbed the attention of the insurance industry during September and October. On October 6, all eyes were fixed on Hurricane Matthew, a Category 4 storm barreling towards Florida and presenting the greatest threat to the U.S. insurance industry since Sandy in 2012. Despite Matthew’s high winds and floods, caused by both storm surge and rainfall, the storm’s offshore track spared Florida and the southeast U.S. from a potential worst-case scenario.

Matthew was preceded by Hurricane Hermine, a Category 1 storm that made landfall along Florida’s Panhandle on September 2. These storms ended a nearly 11-year period in which no hurricanes made landfall in the state of Florida.

While the soft insurance market is expected to weather the effects of Hermine and Matthew, modelers at RMS are investigating whether these events provide valuable clues about future, near-term Atlantic hurricane frequency. Did either of these storms finally end the United States’ well-publicized hurricane drought?

It’s The Major Storms That Matter

The term “hurricane drought” first appeared in Geophysical Research Letters in research by Tim Hall and Kelly Hereid, and is defined as the absence of major hurricanes – Category 3 or greater on the Saffir-Simpson Hurricane Wind Scale – making landfall in the U.S.

Based on this definition – which itself ignited academic debate and drew ire from some meteorologists – the drought continues. Although Matthew was a Category 4 hurricane on its approach to Florida, it did not officially make landfall in the U.S. until striking South Carolina at Category 1 intensity.

Scientists agree that the Atlantic Basin entered a prolonged period of above-average hurricane frequency in 1995. To insurers, this translates into a period of increased likelihood of elevated damages and loss. As the drought stretched into record territory, scientists and insurers alike wondered whether this era had come to a close. Colorado State University’s Phil Klotzbach points out that it’s the major hurricanes, those driving the drought, that could provide us with the first clues.

Annual frequency of Atlantic Basin and U.S. landfalling major (category 3-5) hurricanes over known periods of high frequency (1926-1969, 1995-2011) and low frequency (1900-1925, 1970-1994).

An average of 2.7 major hurricanes have formed each year in the Atlantic Basin since 1950. With three major hurricanes – Gaston, Matthew, and Nicole – the 2016 season was the first since 2011 to exceed this average. But the four preceding seasons featured below-average major hurricane activity. You have to go back to the last low period of hurricane activity, around a quarter of a century ago, to see such a run of quiet years.

Annual counts of Atlantic Basin major (category 3-5) hurricanes since 1995. Dashed lines represent the 1950-2015 average (black, 2.7 hurricanes), the 1995-2011 average (red, 3.8 hurricanes), and the 2012-2016 average (blue, 1.8 hurricanes). The 2016 season is inclusive of all activity up to the end of October.

The Most Intriguing Statistic

This four-year period is important to RMS modelers monitoring the medium-term rate (MTR), our scientific reference view of hurricane landfall frequency looking ahead five years. The MTR is the product of 13 individual forecast models; the contribution of each model is weighted according to its ability to predict the historical fluctuations in activity.
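
The mechanics of such skill-weighted blending can be sketched in a few lines; the model names, forecasts, and skill scores below are invented for illustration and do not represent the actual 13 MTR models.

```python
# Sketch of a skill-weighted ensemble: the general idea behind blending
# several forecast models into one rate. All names and numbers are
# invented for illustration.
forecasts = {          # each model's forecast of major hurricanes per year
    "shift_model":    1.8,
    "sst_regression": 2.4,
    "climatology":    2.7,
}
skill = {              # hindcast skill on historical activity (higher = better)
    "shift_model":    0.9,
    "sst_regression": 0.7,
    "climatology":    0.4,
}

# Normalize skill scores into weights that sum to one.
total_skill = sum(skill.values())
weights = {model: s / total_skill for model, s in skill.items()}

blended_rate = sum(weights[m] * forecasts[m] for m in forecasts)
print(f"blended medium-term rate: {blended_rate:.2f} major hurricanes/yr")
```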

These forecast models include “shift” models that support the theory of cyclical Atlantic hurricane frequency. Based on the lack of recent major hurricanes, these shift models identify the seasons since 2011 as statistically distinct from the acknowledged more-active periods observed since 1950.

It Ain’t Over ‘Til It’s Over

One month remains in the current season, so it is possible that more hurricanes could inform our view of the bigger picture.

November, the final month of the hurricane season, has produced some notable Atlantic hurricanes, proving that the season’s latter stage requires close observation. Our attention to cyclogenesis turns primarily to the Gulf of Mexico and the Caribbean, as evidenced by these past events:

  • Hurricane Kate, the latest-in-season landfalling U.S. hurricane in modern history, peaked in intensity as a Category 3 storm before weakening ahead of its landfall near Mexico Beach, Florida on November 21, 1985.
  • Hurricane Lenny, a Category 4 hurricane forming in 1999, is best known for its “wrong way” track that crossed the Caribbean Sea from west to east.
  • Hurricane Michelle, the most intense hurricane of the 2001 season, made landfall in western Cuba as a Category 4 storm on November 4, causing over $2 billion in economic damage.
  • Hurricane Paloma, following a track just to the east of Michelle, reached Category 4 strength at the height of its lifecycle, later impacting the Cayman Islands and Cuba in November 2008.

Shift models inform only one driver of activity considered by the MTR methodology. RMS plans to publish the findings of its annual medium-term rate forecast review in early 2017, after considering the most recent activity and other drivers of near-term hurricane behavior. This forecast will contribute to the updated RMS view of hurricane risk, forthcoming in spring 2017 as part of the version 17 software release.

Tracking Matthew – The Devil in the Detail

Hurricane Matthew aptly demonstrated that slight shifts in a tropical cyclone’s timing, track, and wind field extent can make a huge difference in its overall impact to exposures at risk.

As Matthew bore down on the U.S. after devastating Haiti, it had the makings of another industry-altering event. Had the storm made landfall along the Florida coast, likely as a Category 4 storm, insured losses could have been ten times larger than the $1.5 billion to $5 billion range currently projected by RMS.

Given that Matthew’s strongest winds were confined to a small area within its inner core, its path proved to be critical. A difference in track of just a few dozen miles translated to a material reduction in wind impacts along the coastline and into interior portions of Florida. The fact that the storm stayed just offshore significantly limited overall damage throughout the state – and losses to the (re)insurance industry at large.

Storms like Matthew underscore the importance of accurately tracking dynamic tropical cyclone characteristics, position, and damage potential as a storm unfolds, in order to help communities and businesses adequately prepare and respond.

There is a wealth of public and private data to inform real-time tropical cyclone wind field assessments and event response processes, but some data provides more insight than others. Commonly used public sources include worldwide and national tropical cyclone centers, numerical weather prediction models, and numerous forecast offices or research organizations.

In the U.S., one of the better-known public sources for tropical cyclone data is the National Hurricane Center (NHC) in Miami, Florida. A branch of the National Oceanic and Atmospheric Administration, the NHC provides a range of tropical cyclone data, tools, analyses, and forecasts to inform real-time tropical cyclone assessments in the Atlantic and East Pacific basins.

There are also private sources of tropical cyclone wind field data spanning a wide breadth and depth of useful information, though few provide insight that goes beyond what the NHC offers.

One exception is HWind, formerly known as HWind Scientific. Acquired by RMS in 2015, this provider of tropical cyclone wind field data develops observation-based data products for both real-time and historical wind field analyses in the Atlantic, East Pacific, and Central Pacific basins.

During a real-time event, HWind provides regularly updated snapshots of wind field conditions leading up to and following landfall, as well as post-event wind hazard footprints one to three days after the storm impacts land. Each analysis is informed by an observational data network spanning more than 30 land-, air-, and sea-based platforms, all of which are subject to stringent independence and quality-control testing.

On average, tens of thousands of observations are used for each event, depending on the availability and the storm’s proximity to land.
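
As an illustration of how scattered observations become a gridded wind field, here is a toy inverse-distance-weighting interpolation – a generic technique shown for intuition, not the actual HWind analysis method; the coordinates and speeds are invented.

```python
# Toy sketch of gridding scattered wind observations by inverse-distance
# weighting (IDW). A generic technique for illustration, not the HWind
# algorithm; coordinates and speeds are invented.
obs = [  # (x_km, y_km, wind_mph) relative to the storm center
    (-30.0,  10.0,  95.0),
    ( 20.0, -15.0, 110.0),
    (  5.0,  40.0,  80.0),
    (-10.0, -35.0,  70.0),
]

def idw_wind(x, y, power=2.0):
    """Inverse-distance-weighted estimate of wind speed at (x, y)."""
    num = den = 0.0
    for ox, oy, w in obs:
        d2 = (x - ox) ** 2 + (y - oy) ** 2
        if d2 == 0.0:
            return w                        # exactly at an observation point
        wt = 1.0 / d2 ** (power / 2.0)      # weight = 1 / distance**power
        num += wt * w
        den += wt
    return num / den

# Evaluate on a coarse 3x3 grid around the storm center.
for gy in range(-40, 41, 40):
    print(" ".join(f"{idw_wind(gx, gy):6.1f}" for gx in range(-40, 41, 40)))
```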

Figure 1: GIF animation of all RMS HWind snapshots for Hurricane Matthew (September 28 through October 9, 2016). Wind is represented as maximum 1-minute sustained winds over open water for marine exposure and over open terrain on land.

HWind products tend to represent wind hazard characteristics with more frequency, accuracy, and granularity than many publicly available sources, including the NHC.

From a frequency perspective, HWind snapshots are created and refreshed as often as every three hours once aircraft reconnaissance begins, allowing users to track changing storm conditions as the event evolves.

The data also resolve important factors such as storm location with a high degree of granularity and precision, often correcting for center-position errors and biases evident in some observational data sources, or adjusting wind speeds to account for the impact of terrain.

Each snapshot also includes a high-resolution representation of local wind speeds and hazard bands.


Figure 2: Preliminary wind hazard footprint for Hurricane Matthew (2016) based on the NHC (left) and RMS HWind (right), where winds are represented as maximum 1-minute sustained winds in knots (left) and mph (right).

During events like Hurricane Matthew – and those yet to come – private sources like HWind can provide the additional, timely insight needed to understand the aspects of wind hazard that matter most to a (re)insurer’s business and event response processes.

Using this information, risk managers can more accurately quantify exposure accumulations at risk during or immediately following landfall. Crucially, this allows them to anticipate the potential severity of loss claims with more precision and to position claims adjusters or recovery assets more effectively.
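
That accumulation step can be illustrated with a toy overlay of a wind footprint on exposure locations, totaling insured value per hazard band; the band edges, locations, and values below are invented for illustration.

```python
# Sketch of accumulating exposure by wind band: look up each location's
# footprint wind speed and total the insured value per band. Bands,
# locations, and values are invented for illustration.
from bisect import bisect_right
from collections import defaultdict

# Saffir-Simpson-style band edges in mph (illustrative).
band_edges = [39, 74, 96, 111, 130, 157]
band_names = ["<TS", "TS", "Cat1", "Cat2", "Cat3", "Cat4", "Cat5"]

locations = [  # (total insured value, footprint wind speed in mph)
    (2_000_000,  55.0),
    (5_500_000,  92.0),
    (1_250_000, 118.0),
    (3_000_000,  33.0),
]

accumulation = defaultdict(float)
for tiv, wind in locations:
    # bisect_right finds which band the wind speed falls into.
    accumulation[band_names[bisect_right(band_edges, wind)]] += tiv

for band in band_names:
    if band in accumulation:
        print(f"{band:>5}: {accumulation[band]:>12,.0f}")
```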

Collectively, this could mean the difference between being proactive and being reactive when the next event strikes.