The 2015 U.K. Budget and Terrorism Insurance

On 18 March, the Chancellor of the Exchequer, George Osborne, delivered his pre-election budget. Billions of further public spending cuts are needed. Several weeks earlier, Pool Re, the U.K. terrorism insurance pool, announced its first ever purchase of reinsurance in the commercial market.

These two announcements are not unconnected.

Pool Re was set up in 1993, after the IRA bombing of the Baltic Exchange in 1992. Since the pool was established, it has built up a substantial surplus; claims have been low thanks to the vigilance of the security and intelligence services. Almost all the major plots since the September 11, 2001 attacks have been foiled.

Terrorism insurance is effectively insurance against counter-terrorism failure, and the huge sums spent on blanket, indiscriminate surveillance have helped to minimize terrorism insurance losses. The low level of losses is not coincidental, nor due to some unpredictable whim of terrorist behavior, but readily explainable: too many terrorists spoil the plot. The type of plot capable of causing terrorism insurance losses of a billion pounds or more would involve a sizeable number of operatives.

As the NSA whistleblower Edward Snowden has revealed, the level of surveillance of electronic communications is so intensive that sizeable terrorist social networks end up being tracked by the NSA and GCHQ. Lesser plots involving lone wolves or a few operatives are the most likely to succeed. A string of these has struck the Western alliance over the past months in Ottawa, Sydney, Paris, and Copenhagen. Besides causing terror, these attacks have attracted global media publicity, inspiring Jihadi recruitment. But terrorism insurance covers property loss, not the spread of fear or growth in the ranks of Islamic State.

Given this tough security environment, it is unsurprising that the U.K. Government should question its continuing exposure to terrorism insurance risk in an age of austerity. Pool Re's three-year program provides £1.8bn of reinsurance cover, diminishing this exposure. More cover might have been purchased, but this was the market limit, given that Chemical-Biological-Radiological-Nuclear (CBRN) risks were included.

The idea of separating out extreme CBRN terrorism risks was considered in Washington by the House Financial Services Committee in the discussions last year over the renewal of the Terrorism Risk Insurance Act. The objective was to provide a federal safety net for such extreme risks, whilst encouraging further private sector solutions for conventional terrorist attacks. This idea was considered at some length, but was a step too far for this TRIA renewal. It might be a step for Pool Re.

The modus operandi of the IRA was to avoid killing civilians, which would have alienated its Catholic community support. Bomb warnings, genuine and hoax, were often given. Thus the metric of IRA attacks was typically physical destruction and economic loss. Islamist militants of all persuasions have no such qualms about killing civilians; indeed, gruesome killings are celebrated. Terrorists follow the path of least resistance in their actions. For Islamic State, this is the path of brutal murder rather than property damage.

Winter 2015: A Season to Remember (or Forget)

This winter has brought a barrage of storms and Arctic air to more than half of the U.S., notably the New England region, resulting in record amounts of snow, sleet, freezing rain, and bitterly cold temperatures.

Arguably, no other major city has been more directly impacted than Boston, Massachusetts. As of March 9, the city has received 105.7 inches of snow this season – over three times the average seasonal total for the region! It's the second snowiest season on record, behind only 1995-1996, which brought 107.6 inches of snow. Further, February 2015 was the snowiest month on record (more than 60 inches) and the second coldest February on record.

Damage reports from this season’s snowstorms include roof collapses, building collapses, burst pipes, power outages, and business interruption. The Massachusetts Emergency Management Agency reported more than 160 collapsed buildings or buildings at risk of collapse since February 9 with damage mainly driven by dense snow pack and strong winds. As of February, Boston has already spent a record $35 million on snow removal – almost double the allotted total of $18.5 million.


U.S. Winterstorm Risk Map. Loss cost per $1000 for Residential lines at the ZIP code level, based on RMS U.S. Winterstorm Model output using the RMS 2011 U.S. Winterstorm IED.

Businesses and supply chains have been interrupted as well. A combination of snowstorms, cold weather, and ice has closed thousands of businesses, resulting in lost wages for hourly workers. These events have disrupted all forms of travel, restricting trucks and air freight from reaching their destinations and leading to increased prices for certain goods.

All in all, these types of impacts can result in significant economic and insured damages. According to a study by IHS Global Insight, a one-day snow-related shutdown would cost some states as much as $300-700 million in economic losses.

Insured loss estimates from the cluster of February storms (five in total) that swept through parts of the Ohio Valley, Mid-Atlantic, and Northeast are likely to exceed $1 billion, which is in line with annual averages. RMS model analysis shows that on average, about $2-3 billion in U.S. annual insured losses are caused by winter storms, which can produce a combination of snow, ice, freezing rain, and frigid temperatures. This is about 5-10% of the overall U.S. average annual insured loss from all modeled perils, including hurricanes, severe convective storms, floods, and winter storms.

Whether for the harsh winters of the last few years or for winters to come, it is important for the (re)insurance industry to be adequately prepared so that insured losses remain at a minimum.

The Journey to Sendai and Beyond

Sendai is a city of a million people two hours north of Tokyo on the Shinkansen bullet train. From March 14-17, 2015 it will attract seven thousand people to the 3rd UN World Conference on Disaster Risk Reduction (WCDRR). Twelve heads of state (including one king and one emperor), seven prime ministers, and 135 ministers and vice ministers will be present to launch a fifteen-year program of coordinated action around disaster risk reduction.

The conference is being hosted in Sendai because of the city's recent experience of a mega-catastrophe. Just four years after the great Tohoku earthquake and tsunami of March 2011, the coastal villages adjacent to Sendai still bear the scour marks where the great tsunami surged inland through the pine forests, sweeping many buildings off their foundations.

The original International Decade for Natural Disaster Reduction ran from 1990-1999. The second decade, from 2005-2015, renewed at Kobe ten years after its devastating 1995 earthquake, was called the Hyogo Framework for Action. The continuation of this international program is designed to last for fifteen years. The fact that the frameworks have been renewed reflects reality: while there have been successes for particular regions and perils, the broader goals of worldwide disaster risk reduction have not been met. For example, the 2011 Tohoku earthquake was not anticipated, and as a result had grievous consequences in terms of loss of life and damage to the Fukushima nuclear power plants.

RMS will have four people at the Sendai WCDRR conference. We have obtained a coveted presentation slot on the main IGNITE stage—the equivalent of a “TED talk.” I will also be speaking on two panel sessions, one organized by The Geneva Association and Tokio Marine, “Insurance as contributors to problem solving and impact reduction,” and a second on the launch of the global set of catastrophe models developed by the UNISDR agency, for which RMS has provided high-level input. We have offered to host these worldwide UNISDR catastrophe models on RMS(one), which will open up access to the models for public officials in developing countries.

We have also worked on a couple of papers (for example, ‘Setting, Measuring and Monitoring: Targets for Disaster Risk Reduction: Recommendations for post-2015 international policy frameworks’) articulating how to measure progress in disaster risk reduction. At present, international frameworks have shied away from setting numerical commitments. We have argued that only probabilistic methods, which simulate thousands of possible events, can show baseline levels of risk, what actions will achieve progress, and whether targets have been achieved. We take our cue from Michael Bloomberg’s foreword to the Risky Business report: “if you can’t measure it, you can’t manage it.”

The work by the UNISDR on catastrophe modeling highlights the accelerated recognition of the role of modeling in managing and reducing disaster risk. There is now a real focus on public-private partnerships in achieving disaster reduction. With RMS' rich and deep experience in catastrophe modeling, there is much we can offer to these expanded applications. For users of models in governments, public organizations, and NGOs, models are required to:

  • explore how to manage a wide range of potential disasters
  • perform cost benefit analyses of alternative actions to reduce risk of loss of life or economic impacts
  • explore potential implications of climate change
  • explore holistically the potential for significant financial shocks to national economies

If you are attending the conference, come and visit us at our booth on the 6th floor of the Sendai International Center where we will be distributing information about our proposals for disaster risk modeling, and articulating our role as leaders in catastrophe risk modeling. It will be a highly publicized event with 500 journalists and around 300 private sector members, including several of our key clients. We will also be meeting with other organizations with which we are affiliated, including the UN Principles for Sustainable Insurance and the Rockefeller Foundation’s 100 Resilient Cities initiative.

We look forward to sharing more insight after the event.

Rising Storm Surge Losses in the U.S. Northeast

Co-authored by Anaïs Katz and Oliver Withers, analysts, Capital Market Solutions, RMS

A recent article in Nature Communications, picked up by the BBC, identified a record sea-level rise of 5 inches along the coastline north of New York City during 2009-10. Sea levels fluctuate from year to year; however, a swing of this size was unprecedented.

This extreme rise in 2009-2010 has been attributed to a slowdown of a major ocean current system, the Atlantic meridional overturning circulation (AMOC). Because sea levels are sensitive to multiple factors, there is volatility around this increase. The AMOC is one of the ocean dynamics known to have changed greatly over time, and its weakening and variation have been linked to increases in greenhouse gas emissions.

Sea level rise is one of the most tangible effects of climate change and climate models suggest that even with the mitigation of greenhouse gas emissions, sea levels will continue to increase this century.

A study by Kopp et al. has attempted to determine probabilistic bands for sea rise. The figure below shows the distribution of possible sea-level rises at New York City’s Battery Park throughout the century compared to selected historical hurricane surges. The 50th percentile projection of sea level rise is represented as the red line in the figure.

New York is already one of the U.S. cities most vulnerable to hurricanes; New York State ranks among the top five U.S. states for average annual loss, and, due to a combination of factors, the region will likely experience a sea level rise greater than the national average.

A 5-inch rise in sea levels corresponds to the 99th percentile, almost the worst case, of the distribution of projected sea level rises for New York in 2020. That this can happen over a two-year period shows that this is an issue the market should not only be aware of but also be considering in its risk management.

In the Northeast as a whole, RMS estimates that storm surge accounts for approximately 1 in 10 dollars of average annual insured losses under current policy terms, which often exclude surge; this contribution would increase with a rise in sea levels. Based on the analysis RMS conducted for the Risky Business report, which used Kopp's estimates to modify the RMS North Atlantic storm surge model, surge losses could triple by 2100.

The largest recent hurricane loss occurred on October 29, 2012, when Superstorm Sandy made landfall near Atlantic City, NJ. Based on the RMS best loss estimate, Sandy caused insured losses of between $20 and $25 billion, with much of the damage due to storm surge rather than wind.

Though a surge like Sandy’s at New York City’s Battery Park was approximately a 1-in-450 year event, it could be closer to a 1-in-100 year event by the end of the century. The figure below shows the return-periods for a storm surge as high as Sandy’s occurring at New York City’s Battery Park, under different sea-level assumptions.
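The size of this shift can be sketched with a simple extreme-value calculation. In the sketch below, the Gumbel parameters are purely illustrative assumptions (not RMS model output): the annual-maximum water level distribution is calibrated so that Sandy's level is roughly a 1-in-450 year event today, and sea-level rise is treated as a uniform upward shift of that distribution.

```python
import math

def gumbel_return_period(level_m, mu, beta):
    """Return period (years) of exceeding level_m under a Gumbel
    distribution of annual-maximum water levels."""
    cdf = math.exp(-math.exp(-(level_m - mu) / beta))
    return 1.0 / (1.0 - cdf)

# Illustrative parameters only: location and scale in meters,
# calibrated so Sandy's level is ~1-in-450 years at today's sea level.
MU, BETA = 1.0, 0.35
SANDY_LEVEL = MU + BETA * math.log(450)

for rise_m in (0.0, 0.25, 0.5):  # assumed sea-level rise scenarios
    t = gumbel_return_period(SANDY_LEVEL, MU + rise_m, BETA)
    print(f"+{rise_m:.2f} m rise: ~1-in-{t:.0f} year event")
```

Under these assumed parameters, about half a meter of sea-level rise turns a roughly 1-in-450 year water level into roughly a 1-in-100 year one, illustrating how a modest mean shift compresses surge return periods.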

Climate change is a politically contentious topic, but the magnitude of the changes seen recently has material effects on the risk to insurers. With the understanding of climate change coming to the fore, the next decades may pose a research challenge for the insurance industry. Though there is uncertainty in the risk, what industry could be better poised to deal with it?


Anaïs Katz

Analyst, Capital Market Solutions, RMS
As a member of the advisory team within capital market solutions, Anaïs works on producing capital markets deal commentary and expert risk analysis. Based in Hoboken, she provides transaction characterizations to clients for bonds across the market and supports the deal team in modeling transactions. She has worked on notable deals for clients such as Tradewynd Re and Golden State Re. Anaïs has also helped to model and develop her group's internal collateralized insurance pricing model, which provides mark-to-market prices for private transactions. Anaïs holds a BA in physics from New York University and an MSc in Theoretical Systems Biology and Bioinformatics from Imperial College London.

Measuring Disaster Risk for Global UN Goals

A dispiriting part of the aftermath of a disaster is hearing about the staggering number of deaths and seemingly insurmountable economic losses. Many of the disaster risk reduction programs that implement disaster prevention and preparedness capabilities are helping to create more resilient communities. These worthwhile programs require ongoing financing, and their success must be measured and evaluated to continue to justify the allocation of limited funds.

There are two global UN frameworks being renewed this year.

Both frameworks will run for 15 years. This is the first time explicit numerical targets have been set around disaster risk, and consequently, there is now a more pressing need to measure the progress of disaster risk reduction programs to ensure the goals are being achieved.

The most obvious way to measure the progress of a country’s disaster risk reduction would be to observe the number of deaths and economic losses from disasters.

However, as the insurance industry learned in the early 1990s, this approach presents big problems around data sampling. A few years or even decades of catastrophe experience do not give a clear indication of the level of risk in a country or region, because catastrophes have a huge and volatile range of outcomes. An evaluation based purely on observed deaths or losses can give a misleading impression of success or failure if a country or region is either lucky in avoiding, or unlucky in experiencing, severe disaster events during the period measured.

A good example is the 2010 Haiti earthquake, which claimed more than 200,000 lives and cost more than $13 billion. Yet for more than 100 years prior to this devastating event, earthquakes in Haiti had claimed fewer than 10 lives.

Haiti shows that it is simply not possible to determine the true level of risk from 15 years of observations for a single country. Even looking at worldwide data, certain events dominate the disaster mortality data, and progress cannot be measured.

Global disaster-related mortality rate (per million global population), 1980–2013 (From Setting, measuring and monitoring targets for disaster risk reduction: recommendations for post-2015 international policy frameworks. Source: adapted from www.emdat.be)
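This sampling problem can be demonstrated with a toy simulation (every number below is invented for illustration, not taken from EM-DAT): annual disaster deaths are usually modest, a Haiti-scale catastrophe strikes about once a century, and most 15-year observation windows therefore look deceptively safe.

```python
import random

random.seed(42)

def annual_deaths():
    """Toy annual disaster deaths: a modest background level plus a
    rare, Haiti-scale catastrophe (all numbers illustrative)."""
    deaths = max(random.gauss(50, 10), 0.0)   # background losses
    if random.random() < 0.01:                # ~1-in-100-year catastrophe
        deaths += 200_000
    return deaths

years = [annual_deaths() for _ in range(100_000)]
true_mean = sum(years) / len(years)

# Means over disjoint 15-year observation windows
windows = [sum(years[i:i + 15]) / 15 for i in range(0, 100_000, 15)
           if i + 15 <= 100_000]
lucky = sum(1 for w in windows if w < 100) / len(windows)

print(f"long-run average: {true_mean:,.0f} deaths/yr")
print(f"15-yr windows that look almost disaster-free: {lucky:.0%}")
```

In this toy world, most 15-year windows record almost no disaster deaths even though the long-run average is in the thousands per year: exactly the kind of "lucky" record that would wrongly suggest successful risk reduction.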

A more reliable way to measure the progress of disaster risk reduction programs is to use probabilistic methods, which rely on a far more extensive range of possibilities, simulating tens of thousands of catastrophic events. These can then be combined with data on exposures and vulnerabilities to produce metrics of specific interest for disaster risk reduction, such as houses or lives lost. Such metrics can be used to:

  • Measure disaster risk in a village, city, or country and how it changes over time
  • Analyze the cost-benefit of mitigation measures:
    • For a region: For example, the average annual savings in lives due to a flood defense or earthquake early warning system
    • For a location: For example, choosing which building has the biggest reduction in risk if retrofitted
  • Quantify the impact of climate change and how these risks are expected to vary over time

In the long term, probabilistic catastrophe modeling will be an important way to ensure improved measurement and, therefore, management of disaster risk, particularly in countries and regions at greatest risk.

The immediate focus should be on educating government bodies and NGOs in the valuable use of probabilistic methods. For the 15-year frameworks being renewed this year, serious consideration should be given to how to implement a useful and practical probabilistic method of measuring progress in disaster risk reduction, for example by using hazard maps. See here for further recommendations: http://www.preventionweb.net/english/professional/publications/v.php?id=39649

2015 is an important year for measuring disaster risk: let’s get involved.

High Tides a Predictor for Storm Surge Risk

On February 21, 2015, locations along the Bristol Channel experienced their highest tides of the first quarter of the 21st century, which were predicted to reach as high as 14.6 m in Avonmouth. When high tides are coupled with stormy weather, the risk of devastating storm surge is at its peak.

Storm surge is an abnormal rise of water above the predicted astronomical tide generated by a storm, and the U.K. is subject to some of the largest tides in the world, which makes its coastlines very prone to storm surge.


A breach at Erith, U.K. after the 1953 North Sea Flood

The sensitivity of storm surge to extreme tides is an important consideration for managing coastal flood risk. While it's not possible to reliably predict the occurrence or track of windstorms—even a few days before they strike land—it is at least possible to identify, well in advance, the years with a higher probability of extreme storm surge, which can help in risk mitigation planning, insurance risk management, and pricing.

Perfect timing is the key to a devastating storm surge. The point at which a storm strikes a coast relative to the time and magnitude of the highest tide will dictate the size of the surge impact. A strong storm on a neap tide can produce a very large storm surge without producing dangerously high water levels. Conversely, a medium storm on a spring tide may produce a smaller storm surge but a higher total water level, which can lead to extensive flooding. The configuration of the coastal geometry, topography, bathymetry, and sea defenses can all have a significant impact on the damage caused and the extent of any coastal flooding.
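This timing effect can be sketched numerically. In the toy model below, a Gaussian surge pulse is added to an idealized semi-diurnal tide and the peak total water level is found; all amplitudes, durations, and arrival times are assumed for illustration only.

```python
import math

def tide(t_hours, amplitude_m):
    """Idealized semi-diurnal astronomical tide (period ~12.42 h)."""
    return amplitude_m * math.cos(2 * math.pi * t_hours / 12.42)

def surge(t_hours, peak_m, arrival_h, duration_h=6.0):
    """Gaussian-shaped surge pulse peaking at arrival_h."""
    return peak_m * math.exp(-((t_hours - arrival_h) / duration_h) ** 2)

def peak_water_level(amplitude_m, surge_peak_m, arrival_h):
    """Maximum of tide + surge over 48 hours, sampled every 6 minutes."""
    times = [i / 10 for i in range(481)]
    return max(tide(t, amplitude_m) + surge(t, surge_peak_m, arrival_h)
               for t in times)

# Strong storm on a neap tide vs. medium storm on a spring tide
print(f"neap tide + 3 m surge at high water:   "
      f"{peak_water_level(1.5, 3.0, 0.0):.2f} m")
print(f"spring tide + 2 m surge at high water: "
      f"{peak_water_level(4.0, 2.0, 0.0):.2f} m")
print(f"spring tide + 2 m surge at low water:  "
      f"{peak_water_level(4.0, 2.0, 6.2):.2f} m")
```

The medium storm arriving at high water on the spring tide gives the highest total level; the same storm arriving at low water is far less dangerous, despite an identical surge.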

This weekend’s high tides in the U.K. remind us of the prevailing conditions of the catastrophic 1607 Flood, which also occurred in winter. The tides reached an estimated 14.3 m in Avonmouth which, combined with stormy conditions at the time, produced a storm surge that caused the largest loss of life in the U.K. from a sudden onset natural catastrophe. Records estimate between 500 and 2,000 people drowned in villages and isolated farms on low-lying coastlines around the Bristol Channel and Severn Estuary. The return period of such an event is probably over 500 years and potentially longer.

The catastrophic 1953 Flood is another example of a U.K. storm surge event. These floods caused unprecedented property damage along the North Sea coast of the U.K. and claimed more than 2,000 lives along northern European coastlines. This flood occurred close to a spring tide, but not on an exceptional tide. Water level return periods along the east coast varied, peaking at just over 200 years in Essex and just under 100 years in the Thames. So, while the 1953 event is rightfully a benchmark event for the insurance industry, it was not as "extreme" as the 1607 Flood, which coincided with an exceptionally high astronomical tide.

Thankfully, there were no strong storms that struck the west coast of the U.K. this weekend. So, while the high tides may have caused some coastal flooding, they were not catastrophic.

RMS(one): Tackling a Unique Big Data Problem

I am thrilled to join the team at RMS as CTO, with some sensational prospects for growth ahead of us. I originally came to RMS in a consulting role with CodeFutures Corporation, tapped to advise RMS on the development of RMS(one). In that role, I became fascinated by RMS as a company, by the vision for RMS(one), and by the unique challenges and opportunities that it presented. I am delighted to bring my experience and expertise in-house, where my primary focus is continuing the development of the RMS(one) platform and ensuring a seamless transition from our existing core product line.

I have tackled many big data problems in my previous role as CEO and COO of CodeFutures, where we created a big data platform designed to remove the complexity and limitations of current data management approaches. In my more than 20 years of experience with advanced software architectures, I worked with many of the most innovative and brilliant people in high-performance computing; I have helped organizations address the challenges of big data performance and scalability, encouraging effective applications of emerging technologies to fields including social networking, mobile applications, gaming, and complex computing systems.

Each big data problem is unique, but RMS’ is particularly intriguing. Part of what attracted me to the CTO role at RMS was the idea of tackling head-on the intense technical challenges of delivering a scalable risk management platform to an international group of the world’s leading insurance companies. Risk management is unique in the type and scale of data it manages; traditional big data techniques fall far short when tackling this problem. Not only do we need to handle data and processing at tremendous scale, we need to do it with the speed that meets customer expectations. RMS has customers all around the world and we need to deliver a platform they can all leverage to get results they need and expect.

The primary purpose of RMS(one) is to enable companies in the insurance, reinsurance, and insurance-linked securities industries to run RMS next-generation HD catastrophe models. It will also allow them to implement their own models and give them access to models built by third-party developers in an ever-growing ecosystem. It is designed as an open exposure and risk management platform on which users can define the full gamut of their exposures and contracts, and then implement their own analytics on a highly scalable, purpose-built, cloud-based platform. RMS(one) will offer unprecedented flexibility, as well as truly real-time and dynamic risk management processes that will generate more resilient and profitable portfolios—very exciting stuff!

During development of RMS(one), we have garnered outstanding support and feedback from key customers and joint development partners; we know the platform is the first of its kind—a truly integrated and scalable platform for managing risk has never been accomplished before. Through beta testing we obtained hands-on feedback from these customers that we are incorporating into our new designs and capabilities. The idea is to give risk managers new means to change how they work, providing better results while expending less effort and time.

I work closely with several teams within the company, including software development, model development, product management, sales, and others to deliver on the platform’s objectives. The most engaging part of this work is turning the plans into workable designs that can then be executed by our teams. There is a tremendous group of talented individuals at RMS, and a big part of my job is to coalesce their efforts into a great final product, leveraging the brilliant ideas I encounter from many parts of the company. It is totally exciting, and our focus is riveted on delivering against the plan for RMS(one).

The Challenges of Modeling European Windstorm Clustering for the (Re)insurance Industry

In December I wrote about Lothar and Daria, a cluster of windstorms that emphasized the significance of ‘location’ when assessing windstorm risk. This month marks the 25th anniversary of the most damaging cluster of European windstorms on record: Daria, Herta, Wiebke, and Vivian.

This cluster of storms highlighted the need for the insurance industry to better understand the potential impact of clustering.

At the time of the events the industry was poorly prepared to deal with a cluster of four extreme windstorms striking in rapid succession. However, since then we have not seen clustering of such significance again, so how important is this phenomenon really over the long term?

In recent years there has been plenty of discourse over what makes a cluster of storms significant, how clustering should be defined, and how it should be modeled.

Today the industry accepts the need to consider the impact of clustering on the risk, and assess its importance when making decisions on underwriting and capital management. However, identifying and modeling a simple process to describe cyclone clustering is still proving to be a challenge for the modeling community due to the complexity and variety of mechanisms that govern fronts and cyclones.

What is a cluster of storms?

Broadly, a cluster can be defined as a group of cyclones that occur close in time.

But the insurance industry is mostly concerned with the severity of the storms. So how do we define a severe cluster? Are we talking about severe storms, such as those in 1990 and 1999, which had very extensive and strong wind footprints? Or storms like those in the winter 2013/2014 season, which were not extremely windy but instead very wet and generated flooding in the U.K.? There are actually multiple descriptions of storm clustering, in terms of storm severity or spatial hazard variability.

Without a clearly identified precedence among these features, defining a unique modeled view of clustering has been complicated and brings uncertainty into the modeled results. This issue also exists in other aspects of wind catastrophe modeling, but in the case of clustering, the limited amount of calibration data available makes the problem particularly challenging.

Moreover, the frequency of storms is affected by climate variability, and as a result different valid assumptions could be applied in modeling, depending on the activity time frame replicated in the model. For example, the 1980s and 1990s were more active than the most recent decade; a model calibrated against an active period will produce higher losses than one calibrated against a period of lower activity.
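One simple way to represent a clustered view, shown here purely as a sketch with assumed parameters, is to replace independent (Poisson) annual storm counts with overdispersed negative binomial counts, obtained by mixing the Poisson rate over a gamma distribution. The long-run mean frequency is unchanged, but active multi-storm winters, which drive tail losses, become markedly more likely:

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Knuth's Poisson sampler (the stdlib has none)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

MEAN = 4.0      # severe storms per winter (assumed)
SHAPE = 2.0     # gamma shape; smaller = stronger clustering (assumed)
N = 50_000

plain = [poisson(MEAN) for _ in range(N)]
clustered = [poisson(random.gammavariate(SHAPE, MEAN / SHAPE))
             for _ in range(N)]

for name, sample in (("independent", plain), ("clustered", clustered)):
    mean = sum(sample) / N
    busy = sum(1 for k in sample if k >= 8) / N   # winters with 8+ storms
    print(f"{name:>11}: mean {mean:.2f}, P(8+ storms) {busy:.1%}")
```

With the same mean of four storms per winter, the clustered view roughly triples the probability of an extreme eight-storm winter, which is why the choice between the two views matters for tail risk and capital.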

Due to the underlying uncertainty in the model impact, the industry should be cautious of assessing only a clustered or only a non-clustered view of risk until future research demonstrates that one view of clustering is superior to the others.

How does RMS help?

RMS offers clustering as an optional view that reflects well-defined and transparent assumptions. By having different views of risk available to them, users can deepen their understanding of how clustering will impact a particular book of business and explore the impact of the uncertainty around this topic, helping them make more informed decisions.

This transparent approach to modeling is very important in the context of Solvency II and helping (re)insurers better understand their tail risk.

Right now there are still many unknowns surrounding clustering, but ongoing investigation, in both academia and industry, will help modelers better understand clustering mechanisms and dynamics, and their impacts on model components, to further reduce the prevalent uncertainty that surrounds windstorm hazard in Europe.

 

Fighting Emerging Pandemics With Catastrophe Bonds

By Dr. Gordon Woo, catastrophe risk expert

When a fire breaks out in a city, there needs to be a prompt firefighting response to contain the fire and prevent it from spreading. The outbreak of a major fire is the wrong time to hold discussions on the pay of firefighters, to raise money for the fire service, or to consider fire insurance. It is too late.

Like fire, infectious disease spreads at an exponential rate. On March 21, 2014, an outbreak of Ebola was confirmed in Guinea. In April, it would have cost a modest sum of $5 million to control the disease, according to the World Health Organization (WHO). In July, the cost of control had reached $100 million; by October, it had ballooned to $1 billion. Ebola acts both as a serial killer and loan shark. If money is not made available rapidly to deal with an outbreak, many more will suffer and die, and yet more money will be extorted from reluctant donors.
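A back-of-envelope fit to these WHO figures (taking April, July, and October as months 4, 7, and 10 of 2014) shows roughly exponential growth in the cost of control, with a doubling time of under a month:

```python
import math

# WHO-cited cost-of-control estimates for the 2014 Ebola outbreak
costs = {4: 5e6, 7: 100e6, 10: 1e9}   # month of 2014 -> USD

months = sorted(costs)
rates = [math.log(costs[b] / costs[a]) / (b - a)
         for a, b in zip(months, months[1:])]
avg_rate = sum(rates) / len(rates)     # continuous monthly growth rate
doubling = math.log(2) / avg_rate

print(f"average growth rate: {avg_rate:.2f} per month")
print(f"cost doubling time: ~{doubling:.1f} months")
```

At that pace, every month of delay multiplies the eventual bill by roughly two and a half, which is the financial case for pre-arranged funding.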

Photo credits: Flickr/©afreecom/Idrissa Soumaré

An Australian nurse, Brett Adamson, working for Médecins Sans Frontières (MSF), summed up the frustration of medical aid workers in West Africa: “Seeing the continued failure of the world to respond fast enough to the current situation I can only assume I will see worse. And this I truly dread.”

One of the greatest financial investments that can be made is for the control of emerging pandemic disease. The return can be enormous: one dollar spent early can save twenty dollars or more later. Yet the Ebola crisis of 2014 was marked by unseemly haggling by governments over the failure of others to contribute their fair share to the Ebola effort. The World Bank has learned the crucial risk management lesson: finance needs to be put in place now for a future emerging pandemic.

At the World Economic Forum held in Davos between January 21-24, 2015, the World Bank president, Jim Yong Kim, himself a physician, outlined a plan to create a global fund that would issue bonds to finance important pandemic-fighting measures, such as training healthcare workers in advance. The involvement of the private sector is a key element in this strategy. Capital markets can force governments and NGOs to be more effective in pandemic preparedness. Already, RMS has had discussions with the START network of NGOs over the issuance of emerging pandemic bonds to fund preparedness. One of their brave volunteers, Pauline Cafferkey, has just recovered from contracting Ebola in Sierra Leone.

The market potential for pandemic bonds is considerable; there is a large volume of socially responsible capital to be invested in these bonds, as well as many companies wishing to hedge pandemic risks.

RMS has unique experience in this area. Our LifeRisks models are the only stochastic excess mortality models to have been used in a 144A transaction, and we have undertaken the risk analyses for all 144A excess mortality capital markets transactions issued since the 2009 (swine) flu pandemic.

Excess mortality (XSM) bonds modeled by RMS:

  • Vita Capital IV Ltd (2010)
  • Kortis Capital Ltd (2010)
  • Vita Capital IV Ltd. (Series V and VI) (2011)
  • Vita Capital V (2012)
  • Mythen Re Ltd. (Series 2012-2) (2012)
  • Atlas IX Capital Limited (Series 2013-1) (2013)

With this unique experience, RMS is best placed to undertake the risk analysis for this new developing market, which some insiders believe has the potential to grow bigger than the natural catastrophe bond market.

Winter Storm Juno: Three Facts about “Snowmageddon 2015”

By Jeff Waters, meteorologist and senior analyst, business solutions

There were predictions that Winter Storm Juno—which many in the media and on social media dubbed “Snowmageddon 2015”—would be one of the worst blizzards to ever hit the East Coast. By last evening, grocery stores from New Jersey to Maine were stripped bare and residents were hunkered down in their homes.

Blizzard of 2015: Bus Snow Prep. Photo: Metropolitan Transportation Authority / Patrick Cashin

It turns out the blizzard, while a wallop, wasn't nearly as bad as expected. The storm ended up tracking 50 to 75 miles farther east than forecast, sparing many areas that had anticipated a bludgeoning and reducing potential damages.

Here are highlights of what we're seeing so far:

The snowstorm didn't cripple Manhattan, but it brought blizzard conditions to Long Island and more than two feet of snow to parts of New York, Connecticut, and Massachusetts.

The biggest wind gust thus far in the New York City forecast area has been 60 mph, which occurred just after 4:00 am ET this morning.

From The New York Times: “For some it was a pleasant break from routine, but for others it was a burden. Children stayed home from school, even in areas with hardly enough snow on the ground to build a snowman. Parents, too, were forced to take a day off.”

Slightly north, The Hartford Courant received reader reports of as much as 27 inches of snow in several locations and as little as five inches in others. The paper asked readers to offer snow tallies and posted the results in an interactive map.

Massachusetts was hit hardest, with heavy snow and a hurricane force wind gust reported in Nantucket.

The biggest wind gust overall has been 78 mph in Nantucket, MA, which is strong enough to be hurricane force.

From The Boston Globe: “By mid-morning, with the snow still coming down hard, the National Weather Service had fielded unofficial reports of 30 inches in Framingham, 28 inches in Littleton, and 27 inches in Tyngsborough. A number of other communities recorded snow depths greater than 2 feet, including Worcester, where the 25 inches recorded appeared likely to place it among the top 5 ever recorded there.”

There’s more snow to come, but the economic impact is likely to be less than anticipated.

Notable snowfall totals have been recorded across the East Coast. Many of these areas, particularly in coastal New England (including Boston), will see another 6-12 inches throughout the day today.

It’s too early to provide loss estimates, and damages are still likely as snow melts and flooding begins, particularly in hard hit areas of New England like Providence and Boston. However, with New York City spared, the impact is likely far less significant than initially anticipated.