Category Archives: Regulatory Compliance

Outcomes from The Solvency II “Lessons Learned” Insurance Conference in Slovenia

For insurers and consumers in the European Union, 2016 is a key year: it is when the industry gets its first real experience of Solvency II, the newly implemented risk-based supervisory system. After a decade in the making, Solvency II officially came into force on January 1, 2016. Meeting that deadline was a scramble for the industry, but ten months on, as the road becomes less bumpy, what have we learned?

Insurers have met their numerous reporting requirements under the new regime, calculated the Solvency Capital Requirement (SCR), prepared Own Risk and Solvency Assessments (ORSAs), and set out their risk management frameworks and rules of governance. Although this may appear a straightforward task, in reality the introduction of Solvency II has created a significant paradigm shift in insurance regulation, the biggest in decades – with a corresponding cultural and strategic challenge for firms that do business in the European Union.

In September, I attended a conference in Slovenia’s capital, Ljubljana, where participants from across the industry gathered to assess its progress.

Has Anything Gone Wrong?

According to Europe’s regulatory umbrella body, the answer to this question is an emphatic “no.” Manuela Zweimueller, Head of Regulations at the European Insurance and Occupational Pensions Authority (EIOPA), added that although Solvency II is not quite perfect, regulators are continuing to refine the requirements. The main challenge, according to EIOPA, is that Solvency II needs to be equally understood by regulators and (re)insurers over the next five years in order to close the pockets of inefficiency and provide a level playing field for all involved. EIOPA terms this “supervisory convergence.”

From the standpoint of European insurers and national regulators, there are several core challenges. The German Federal Financial Supervisory Authority considers that the combined demands of a complex internal model approval process, the need to work through complicated and lengthy reports and data, and the need to train staff appropriately create challenges for supervised firms. From an industry perspective, Italian insurer Generali revealed that the main issues it faces are the complexity of internal model requirements and documentation. Both sides agree, however, that despite the burden of regulatory compliance and the high level of technical detail involved, using an internal model under Solvency II to measure risk provides substantial benefits for management, governance, and strategic decision-making. This makes the internal model approach the only long-term solution for almost all insurers. For a brief discussion of the benefits of internal models, see my earlier blog post.

The additional demands of complying with Solvency II have, however, partly given rise to a surge in M&A activity. When a firm comes under the wing of a larger business, only one solvency return needs to be filed, which produces efficiencies and cost savings. According to the Association of British Insurers (ABI), firms in the U.K. alone have already invested at least £3 billion (US$3.7 billion) to comply with the new solvency regulations. Strategic M&A activity is likely to rise, especially among small to medium-sized insurers, which face problems maintaining the profitability levels they enjoyed before Solvency II and are seeking ways to defend their positions in the market.

What Does the Future Hold?

What’s needed next, according to EIOPA, is a period of stability for Solvency II – though many challenges still lie ahead. In the short term, for instance, insurance firms will undoubtedly feel the pinch, with many needing to invest more time and money in reporting their solvency ratios efficiently to the regulators. But there will be a preliminary review of the new directive in 2018, when EIOPA will address some of the complexities.

More widely, fears are increasing over the economic reality of low interest rates (which are hitting the life insurance market hardest), falling corporate bond yields, and stock market volatility around Brexit. Although the consequences of Brexit have so far not been as bad as expected, these factors will still need to be managed on the balance sheet.

And despite all the difficulties that lie ahead for the industry as a whole, EIOPA stresses that the ultimate goal of Solvency II is not just to create a single EU insurance market, but to increase consumer protection – and adopting a consumer-centered approach benefits everyone.

The Rising Cost of Hurricanes – and America’s Ability to Pay

Future hurricanes are going to cost the U.S. more money, and if we don’t act to address this, the government will be left struggling to cope. That is the finding of a recent Congressional Budget Office (CBO) report, which put it starkly:

“…over time, the costs associated with hurricane damage will increase more rapidly than the economy will grow. Consequently, hurricane damage will rise as a share of gross domestic product (GDP)…”
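
The arithmetic behind that claim is simple compounding. As an illustrative sketch (the growth rates here are assumed for exposition, not taken from the CBO’s projections): if hurricane damage \(D_t\) grows at rate \(g_D\) while GDP \(Y_t\) grows at rate \(g_Y\), then

\[ \frac{D_t}{Y_t} \;=\; \frac{D_0}{Y_0}\left(\frac{1+g_D}{1+g_Y}\right)^{t}, \]

so whenever \(g_D > g_Y\) the damage share of GDP keeps rising. For example, damage growing at 4% a year against GDP growth of 2% a year roughly doubles the damage share within 35 years, since \((1.04/1.02)^{35} \approx 2\).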

The CBO identified two core drivers of the escalating costs: climate change, which will drive just under half of the potential increase in hurricane damage, and coastal development, which will account for just over half. The four variables with the most impact were identified as:

  • changes in sea levels for different U.S. states;
  • changes in the frequency of hurricanes of various intensities;
  • population growth in coastal areas; and
  • per capita income in coastal areas.

Using Catastrophe Models to Calculate the Future Cost of Hurricanes

To inform the CBO’s research and its construction of a range of possible hurricane scenarios based on future changes to the four key variables, RMS hurricane and storm surge risk experts provided the CBO with data from the RMS North Atlantic Hurricane Model and Storm Surge Model.

Through RMS’ previous work with the Risky Business Initiative, we were able to provide state-specific “damage functions,” which were used to translate possible future hurricane events, state-specific sea levels, and current property exposure into expected damage. While we usually produce loss estimates for catastrophes, we didn’t provide the CBO with estimated losses ourselves – rather, we built a tool so the CBO could “own” its own assumptions about changes in all the factors, a critical aspect of the CBO’s need to remain impartial and objective.
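
To make the mechanics concrete, here is a minimal sketch of how such a tool might combine the variables. The functional form, coefficients, and scenario values are hypothetical illustrations, not the RMS damage functions or the CBO’s assumptions:

```python
# Hypothetical damage-function sketch: all numbers are illustrative.

def expected_damage(wind_speed_ms, sea_level_rise_m, exposure_usd):
    """Translate a storm's peak wind, a sea level assumption, and the
    property exposure in its path into an expected damage estimate."""
    # Mean damage ratio rises nonlinearly with wind above a ~25 m/s threshold.
    wind_ratio = min(1.0, max(0.0, (wind_speed_ms - 25.0) / 60.0) ** 2)
    # Higher seas amplify storm surge losses.
    surge_uplift = 1.0 + 0.5 * sea_level_rise_m
    return exposure_usd * wind_ratio * surge_uplift

# The CBO can "own" its assumptions by supplying its own scenario inputs:
scenarios = [
    {"state": "FL", "wind": 65.0, "slr": 0.3, "exposure": 50e9},
    {"state": "NC", "wind": 50.0, "slr": 0.2, "exposure": 20e9},
]
for s in scenarios:
    dmg = expected_damage(s["wind"], s["slr"], s["exposure"])
    print(f"{s['state']}: expected damage ${dmg / 1e9:.1f}B")
```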

Solutions to Increase Coastal Urban Resilience

The CBO’s report includes suggested policies that could decrease the pressure on federal spending. The policies range from global initiatives to limit greenhouse gas emissions to more direct mechanisms that could shift costs to state and local governments and private entities, as well as investment in structural changes to reduce vulnerabilities. Such approaches bring to the forefront the role of local resilience in tackling a global problem.

However, therein lies the challenge. Many of the options open to society for increasing resilience against catastrophes could dampen parts of the economy. It’s an issue that has been central to the wider debate about reducing the impacts of climate change. Limiting greenhouse gas emissions has direct effects on the oil and gas industry. Likewise, curbing coastal development impacts developers and local economies. It has led states such as North Carolina to ban the use of future sea level rise projections as the basis for policies on coastal development.

Overcoming Political Resistance

Creating resilience in U.S. towns and communities needs to be a multi-faceted effort. While initiatives to fortify the building stock and curb global climate change and sea level rise are moving ahead, there is strong resistance from the political arena. To overcome that resistance, solutions must be found to transition the economy to new forms of energy and to adapt the current workforce to the jobs of the future. City leaders and developers should partner on sustainable initiatives for urban growth, to ease fears that coastal cities will wither and die under new coastal-use restrictions.

Initiating these conversations will go a long way toward removing the barriers to success in curbing greenhouse gas emissions and limiting coastal growth. With the political debate on climate change already polarized, this CBO report may provoke further controversy about how to deal with the factors behind the increase in future hurricane damage costs. One conclusion, though, is inescapable: catastrophe losses are going up, and we will all be footing the bill.

This post was co-authored by Paul Wilson and Matthew Nielsen.

Matthew Nielsen

Senior Director of Global Governmental and Regulatory Affairs, RMS

Matthew Nielsen leads Governmental and Regulatory Affairs. He is responsible for maintaining relationships with regulators, legislators, and rating agencies on behalf of the company to establish open channels of communication around RMS models and solutions. Matthew is a meteorologist and geographer with extensive experience in North American catastrophe risk. In his prior role at RMS, he was responsible for developing the RMS climate peril models for the Americas, including the severe convective storm, winter storm, flood, and hurricane models. He has conducted field reconnaissance for major catastrophes including Hurricanes Katrina and Sandy. Before joining RMS, Matthew conducted remote sensing in satellite meteorology research at the Cooperative Institute for Research in the Atmosphere (CIRA). He holds a BS in physics from Ripon College, where he won the Henry Knop Award in Physics, and an MS in atmospheric science from Colorado State University. Matthew is a member of the American Meteorological Society (AMS), the International Society of Catastrophe Managers (ISCM), and the American Association of Geographers (AAG).

European Windstorm: Such A Peculiarly Uncertain Risk for Solvency II

Europe’s windstorm season is upon us. As always, the risk is particularly uncertain, and with Solvency II due to land smack in the middle of the season, there is a greater imperative to really understand the uncertainty surrounding the peril – and to manage windstorm risk actively. Business can benefit, too: new modeling tools for exploring uncertainty could help (re)insurers better assess how much risk they can assume without loading their solvency capital.

Spikes and Lulls

The variability of European windstorm seasons can be seen in the record of the past few years. 2014-15 was quiet until storms Mike and Niklas hit Germany in March 2015, right at the end of the season. Though insured losses were moderate,[1] had their tracks been different, losses could have been far more severe.

In contrast, 2013-14 was busy. The intense rainfall brought by some storms caused significant inland flooding, though wind losses overall were moderate, since most storms matured before hitting the UK. The exceptions were Christian (known as St Jude in Britain) and Xaver, both of which dealt large wind losses to the UK. These two storms were outliers within a general lull in European windstorm activity that has lasted about 20 years.

During this quieter period, the average annual European windstorm loss in Western Europe has fallen by roughly 35%, but it is not safe to presume a “new normal” is upon us. Spiky losses like those from Niklas could occur in any year, possibly in clusters, so this is no time for complacency.

Under Pressure

The unpredictable nature of European windstorm activity clashes with the demands of Solvency II, putting increased pressure on (re)insurance companies to get to grips with model uncertainties. Under the new regime, they must validate modeled losses against historical loss data. Unfortunately, however, companies’ claims records rarely reach back more than twenty years. That is simply too little loss information to validate a European windstorm model, especially given the recent lull, which has left the industry with scant recent claims data. That exacerbates the challenge for companies building their own view based only on their own claims.
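
To see why a twenty-year record is so weak a basis for validation, consider a minimal sketch. All figures are hypothetical, the exponential distribution is just a stand-in for a catastrophe model’s output, and a real Solvency II validation exercise is far more extensive:

```python
# A minimal sketch with hypothetical figures: a short claims record gives
# only a noisy estimate of the true average annual loss.
import random
import statistics

random.seed(42)

# Twenty years of observed windstorm claims, in EUR millions (hypothetical).
historical = [12.0, 3.5, 0.8, 45.2, 7.1, 2.3, 19.8, 0.0, 5.6, 9.4,
              1.1, 0.0, 33.0, 4.7, 8.8, 0.6, 15.2, 2.9, 0.0, 6.3]

# Stand-in for a catastrophe model: a large sample of simulated annual losses.
simulated = [random.expovariate(1 / 9.0) for _ in range(50_000)]

print(f"historical mean annual loss: {statistics.mean(historical):5.1f}m")
print(f"modeled mean annual loss:    {statistics.mean(simulated):5.1f}m")

# With only ~20 observations, the sampling error on the historical mean is
# large, so close agreement with the model is weak evidence either way.
stderr = statistics.stdev(historical) / len(historical) ** 0.5
print(f"standard error of historical mean: {stderr:4.1f}m")
```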

In March we released an updated RMS Europe Windstorm model that reflects both recent and historic wind history. The model includes the most up-to-date long-term historical wind record, going back 50 years, and incorporates improved spatial correlation of hazard across countries together with an enhanced vulnerability regionalization, which is crucial for risk carriers with regional or pan-European portfolios. For Solvency II validation, it also includes an additional view based on storm activity in the past 25 years. Pleasingly, we’re hearing from our clients that the updated model is proving successful for Solvency II validation as well as risk selection and pricing, allowing informed growth in an uncertain market.

Making Sense of Clustering

Windstorm clustering – the tendency for cyclones to arrive one after another, like taxis – is another complication when dealing with Solvency II. It adds to the uncertainties surrounding capital allocation for catastrophic events, especially given the current lack of detailed understanding of the phenomenon and the limited amount of available data. To chip away at the uncertainty, we have been leading industry discussion on European windstorm clustering risk, collecting new observational datasets, and developing new modeling methods. We plan to present a new view on clustering, backed by scientific publications, in 2016. These new insights will inform a forthcoming RMS clustered view, but at this stage it will still be offered as an additional view in the model, rather than becoming our reference view of risk. We will continue to research clustering uncertainty, which may lead us to revise our position should a solid validation of a particular view of risk be achieved.
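
A simple way to see why clustering matters for capital is to compare independent storm arrivals with an over-dispersed season. The sketch below uses a gamma-mixed Poisson (i.e. a negative binomial) to mimic clustered seasons with the same average storm count; the parameters are assumptions for illustration, not the RMS clustering view:

```python
# Illustrative only: parameters are assumptions, not a calibrated model.
import math
import random

random.seed(1)
MEAN_STORMS = 4.0   # assumed average number of damaging storms per season
DISPERSION = 2.0    # negative binomial shape; smaller => stronger clustering

def poisson(lam):
    """Knuth's method; fine for the small rates used here."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def clustered_count(mean, shape):
    """Gamma-mixed Poisson: draw a season 'activity level', then the
    number of storms given that level."""
    lam = random.gammavariate(shape, mean / shape)
    return poisson(lam)

N = 100_000
independent = [poisson(MEAN_STORMS) for _ in range(N)]
clustered = [clustered_count(MEAN_STORMS, DISPERSION) for _ in range(N)]

for name, sample in (("independent", independent), ("clustered", clustered)):
    tail = sum(1 for n in sample if n >= 8) / N
    print(f"{name:11s}: mean {sum(sample) / N:.2f}, P(8+ storms) {tail:.2%}")
```

Both season models produce the same average, but the clustered one puts noticeably more probability on very busy seasons, which is exactly what drives higher capital needs.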

Ongoing Learning

The scientific community is still learning what drives an active European storm season. Some patterns and correlations are now better understood, but even with powerful analytics and the most complete datasets possible, we cannot yet forecast seasonal activity. However, our recent model update allows (re)insurers to maintain an up-to-date view and to gain a deeper understanding of the variability and uncertainty involved in managing this challenging peril. That knowledge is key not only to meeting the requirements of Solvency II, but also to growing risk portfolios without attracting the need for additional capital.

[1] Currently estimated by PERILS at €895 million, which aligns with the RMS loss estimate from April 2015.

Exposure Data: The Undervalued Competitive Edge

High-quality catastrophe exposure data is key to a resilient and competitive insurance business. It can improve a wide range of risk management decisions, from basic geographical risk diversification to more advanced deterministic and probabilistic modeling.

The need to capture and use high-quality exposure data is not new to insurance veterans. It is often summed up as the “garbage in, garbage out” principle, highlighting how dependent a catastrophe model’s output is on reliable, high-quality exposure data.

The underlying logic of this principle is echoed in the EU’s Solvency II directive, which requires firms to have a quantitative understanding of the uncertainties in their catastrophe models, including a thorough understanding of the uncertainties propagated by the data that feeds those models.
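
As a toy illustration of what “propagated data uncertainty” means in practice (the portfolio value, error size, and damage ratio below are hypothetical assumptions, not a real model):

```python
# Toy sketch: perturb an uncertain exposure input many times and watch how
# the uncertainty propagates into the modeled loss distribution.
import random

random.seed(7)

def modeled_loss(total_insured_value):
    return 0.02 * total_insured_value  # stand-in: 2% mean damage ratio

portfolio_tiv = 1.0e9    # recorded total insured value, USD
data_uncertainty = 0.15  # assume +/-15% (1 sigma) error in captured values

losses = sorted(modeled_loss(portfolio_tiv * random.gauss(1.0, data_uncertainty))
                for _ in range(10_000))
print(f"median loss:     ${losses[len(losses) // 2] / 1e6:.1f}M")
print(f"95th percentile: ${losses[int(0.95 * len(losses))] / 1e6:.1f}M")
```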

The Competitive Advantage of Better Exposure Data

The implementation of Solvency II will lead to a better understanding of risk, increasing the resilience and competitiveness of insurance companies.

Firms see this, and many insurers are no longer passively reacting to the changes brought about by Solvency II. Increasingly, they see the changes as an opportunity to proactively improve exposure data quality and exposure data management.

And there is good reason for doing so: The majority of reinsurers polled recently by EY (formerly known as Ernst & Young) said quality of exposure data was their biggest concern. As a result, many reinsurers apply significant surcharges to cedants that are perceived to have low-quality exposure data and exposure management standards. Conversely, reinsurers are more likely to provide premium credits of 5 to 10 percent or offer additional capacity to cedants that submit high-quality exposure data.

Rating agencies and investors also expect more stringent exposure management processes and higher exposure data standards. Sound exposure data practices are, therefore, increasingly a priority for senior management, and changes are driven with the mindset of benefiting from the competitive advantage that high-quality exposure data offers.

However, managing the quality of exposure data over time can be a challenge: during its life cycle, exposure data degrades as it is repeatedly reformatted and re-entered while being passed between different insurance entities along the insurance chain.

To counter this decline in data quality, insurers spend considerable time and resources re-formatting and re-entering exposure data as it is passed along the insurance chain (and between departments at each touch point on the chain). However, because of the differing systems, data standards, and contract definitions in place, much of this work remains manual and repetitive, inviting human error.
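
Much of that manual checking can be pushed to the point of data entry. The sketch below shows the idea; the field names and rules are illustrative assumptions, not an RMS or industry standard:

```python
# Validate exposure rows at intake so errors don't propagate along the chain.
REQUIRED = {"location_id", "latitude", "longitude", "occupancy", "tiv_usd"}

def validate_exposure_row(row):
    """Return a list of problems; an empty list means the row is accepted."""
    errors = [f"missing field: {field}" for field in REQUIRED - row.keys()]
    if errors:
        return errors
    if not -90 <= row["latitude"] <= 90:
        errors.append("latitude out of range")
    if not -180 <= row["longitude"] <= 180:
        errors.append("longitude out of range")
    if row["tiv_usd"] <= 0:
        errors.append("non-positive total insured value")
    return errors

row = {"location_id": "LOC-001", "latitude": 51.5, "longitude": 0.1,
       "occupancy": "residential", "tiv_usd": 350_000}
print(validate_exposure_row(row) or "row accepted")
```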

In this context, RMS’ new data standards, exposure management systems, and contract definition languages will be of interest to many insurers – not only because they help tackle the data quality issue, but also because they bring considerable savings through reduced overhead, enabling clients to focus on their core insurance business.

What Can the Insurance Market Teach Banks About Stress Tests?

In the last eight years the national banks of Iceland, Ireland, and Cyprus have failed. Without government bailouts, the banking crisis of 2008 would also have destroyed major banks in the United Kingdom and United States.

Yet in more than 20 years, despite many significant events, every insurance company has been able to pay its claims following a catastrophe.

The stress tests used by banks since 1996 to manage their financial stability were clearly ineffective at helping them withstand the 2008 crisis. And many consider the new tests introduced each year in an attempt to prevent future financial crises to be inadequate.

In contrast, the insurance industry has been quietly using stress tests to good effect since 1992.

Why Has the Insurance Industry Succeeded While Banks Continue to Fail?

For more than 400 years the insurance industry was effective at absorbing losses from catastrophes.

In 1988 everything changed.

The Piper Alpha oil platform exploded and Lloyd’s took most of the $1.9 billion loss. The following year Lloyd’s suffered again from Hurricane Hugo, the Loma Prieta earthquake, the Exxon Valdez oil spill, and decades of asbestos claims. Many syndicates collapsed and Lloyd’s itself almost ceased to exist. Three years later, in 1992, Hurricane Andrew slammed into southern Florida causing a record insurance loss of $16 billion. Eleven Florida insurers went under.

Since 1992, insurers have continued to endure record insured losses from catastrophic events, including the September 11, 2001 terrorist attacks on the World Trade Center ($40 billion), 2005 Hurricane Katrina ($60 billion—the largest insured loss to date), the 2011 Tohoku earthquake and tsunami ($40 billion), and 2012 Superstorm Sandy ($35 billion).

Despite the overall increase in the size of losses, insurers have still been able to pay claims, without a disastrous impact to their business.

So what changed after 1992?

Following Hurricane Andrew, A.M. Best required all U.S. insurance companies to report their modeled losses. In 1995, Lloyd’s introduced the Realistic Disaster Scenarios (RDS), a series of stress tests that today contains more than 20 different scenarios. The ten-page A.M. Best Supplemental Rating Questionnaire provides detailed requirements for reporting on all major types of loss potential, including cyber risk.

These requirements might appear to be a major imposition on insurance companies, restricting their ability to trade efficiently and creating additional costs. But this is not the case.

Why Are Stress Tests Working For Insurance Companies?

Unlike in banking, stress tests are at the core of how insurance companies operate. Insurers, regulators, and modeling firms collaborate to decide on suitable stress tests. The tests are based on the same risk models that insurers use to select and price insurance risks.

And above all, the risk models provide a common currency for trading and for regulation.

How Does This Compare With the Banking Industry? 

In 1996, the Basel Capital Accord allowed banks to run their own stress tests. But the 2008 financial crisis proved that self-regulation would not work. So, in 2010, the Dodd-Frank Act was introduced in the U.S., followed by Basel III in Europe, passing authority to regulators to perform stress tests on banks.

Each year, the regulators introduce new stress tests in an attempt to prevent future crises. These include scenarios such as a 25% decline in house prices, a 60% drop in the stock market, and increases in unemployment.
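
Mechanically, applying such a scenario is straightforward. The sketch below stresses a toy balance sheet; all holdings and shock sizes are hypothetical, and real regulatory tests model far richer transmission channels (house prices, for instance, do not map one-to-one onto mortgage book values):

```python
# Toy stress test: apply regulator-style shocks to a simple balance sheet.
portfolio = {"mortgages": 500.0, "equities": 200.0, "cash": 100.0}  # USD m
liabilities = 700.0                                                 # USD m

scenario = {
    "mortgages": -0.25,  # 25% decline in house prices
    "equities": -0.60,   # 60% drop in the stock market
    "cash": 0.00,
}

stressed_assets = sum(value * (1 + scenario[asset])
                      for asset, value in portfolio.items())
capital = stressed_assets - liabilities
print(f"stressed assets: {stressed_assets:.0f}m, capital: {capital:+.0f}m")
print("PASS" if capital > 0 else "FAIL: capital exhausted under this scenario")
```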

Yet these remain externally mandated requirements, detached from day-to-day trading in the banks. Some industry participants criticize the tests for being too rigorous; others, for not providing a broad enough measure of risk exposure.

What Lessons Can the Banking Industry Learn From Insurers?

The Bank of England is only a five-minute walk from Lloyd’s, but the banking world seems to have a long journey ahead before managing risk is seen as a competitive advantage rather than an unwelcome overhead.

The banking industry needs to embrace stress tests as a valuable part of daily commercial decision-making. Externally imposed stress tests cannot continue to be treated as an unwelcome interference in the success of the business.

And ultimately, as the insurance industry has shown, collaboration between regulators and practitioners is the key to preventing financial failure.

Water, Water Everywhere: The Effect of Climate Change on Florida

Climate change has been a hot topic in Florida for quite some time. Just last week, President Obama visited the Everglades to discuss the need to address climate change now.

RMS partnered with the Risky Business Initiative to quantify and publicize the economic risks the United States faces from the impacts of a changing climate. In Florida, there is a 1% chance that by 2100, 17% of current Florida property value will be underwater, causing a $20.7 billion increase in annual flooding losses and $681 billion worth of property loss due to sea level rise.
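
Figures like “a 1% chance by 2100” are read off an ensemble of simulated futures. Here is a minimal sketch of that mechanic, with entirely hypothetical numbers rather than the Risky Business results:

```python
# The "1% chance" outcome is the 99th percentile of many simulated futures.
import random

random.seed(3)
# Hypothetical simulated share of today's Florida property value underwater
# by 2100, one draw per simulated future:
outcomes = sorted(random.betavariate(2, 30) for _ in range(10_000))
p99 = outcomes[int(0.99 * len(outcomes))]
print(f"1%-chance outcome: {p99:.1%} of current property value underwater")
```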

Bob Correll, principal at the Global Environment Technology Foundation, leading the Center for Energy and Climate Solutions: Just last week, a report commissioned by the G7, titled “A New Climate for Peace: Taking Action on Climate and Fragility Risk,” was released to the foreign ministers, including Secretary of State John Kerry. It outlines seven things we need to worry about as the changing climate becomes more evident, including sea level rise and coastal degradation.

Brian Soden, atmospheric sciences professor, University of Miami: Sea level rise is the impact of climate change that I’m most worried about. The rate of sea level rise has almost doubled in Miami over the past decade. We are the canary in the coal mine. If you increase sea level by just three feet, which is in the middle of the range of projections, the Everglades would pretty much be gone.

Robert Muir-Wood, chief research officer, RMS: At RMS we attempt to be completely objective about risk. We attempt to take the full scientific understanding and translate it into information about risk and the associated cost. Financial markets are smart. Future risk is already starting to affect the current value of property.

Matthew Nielsen, senior director of global governmental and regulatory affairs, RMS: Regulations generally fall into two buckets: curbing emissions so we can temper this problem, and thinking about future development and planning to account for future sea level rise.

But what do we do now? There are a lot of things to think about – one is drainage issues. Another is access to fresh water.

Paul Wilson, senior director of model development and lead modeler for the Risky Business Initiative, RMS: It will be interesting to see how things play out – if the response will come as a result of science and gradual sea level rise, or only after a major catastrophe.

Muir-Wood: It’s very hard for communities to take action until they’ve had a disaster. As we’ve seen with Hurricane Katrina and Superstorm Sandy, suddenly there’s all sorts of enlightened thinking about future risk, such as investment in sea defenses. Unfortunately, it often takes a catastrophe to influence decisions about mitigating risk.

Paul VanderMarck, chief products officer, RMS: You can only build a sea wall so high before it’s not worth living here anymore.

Soden: The biggest question I ask myself is “when do I sell?”

Correll: A year ago the WEF came to us and asked if we would be willing to work with their young global leaders. We had the head of all Shell operations in the Middle East. We had the former head of GE operations in India. They are getting the message. They walked away saying, “we need to rethink our business plans to plan for the future.”

Modeling provides a lot of the underpinnings to make decisions that are outside of the norm. The past is no longer a prologue to the future.

Managing Risk from Regulatory Requirements

A study last year by the Centre for the Study of Financial Innovation, in collaboration with PricewaterhouseCoopers, identified regulation as the number one risk after surveying life and non-life insurers, reinsurers, brokers, regulators, consultants, and service providers across North America, Bermuda, Latin America, Europe, Africa, the Middle East, and Asia.

We have been seeing an increase in regulatory requirements across the world – Solvency II in Europe, ORSA in the U.S., APRA’s horizontal requirement in Australia, and OSFI’s B-9 earthquake requirement in Canada – to name just a few. There has also been a push in Asia toward a Solvency II style of regulation, with the Chinese regulator announcing its intent to introduce a regime based on a three-pillar system, and Japan aiming for Solvency II equivalence, at least for reinsurance.

Respondents were concerned that these new regulations come at a time when the industry is seeing reduced profitability due to poor investment performance in an uncertain macroeconomic environment. Some respondents felt that the sheer volume of the new regulations is creating a whole new class of risk—regulatory compliance risk.

Last week, Ernst & Young published its European Solvency II survey, spanning 20 countries and participants from more than 170 insurance companies. The study focused on Solvency II preparedness, and determined that the Pillar 3 regulatory requirement, which requires institutions to disclose details on the scope of application, capital, risk exposures, risk assessment processes, and the capital standing of the institution, still presents a major challenge across the industry. EY concluded that the challenges of reporting and ensuring robust data and information technology remain very significant.

This is not surprising, as we’re familiar with how the industry currently manages data. Multiple databases, missing or incorrect exposure data, risk clash, and an inability to consistently analyze or report across different businesses and entities are only symptoms of the malaise. Despite multiple industry initiatives, we have not managed to resolve the data quality issue.

Tracking of data, audit trails, the ability to roll back changes, and role-based user access are simple mechanisms that most other industries have widely embraced. Utilizing one system of record for all exposure data, no matter what the line of business or risk, has the obvious benefit of reducing errors and inconsistencies while creating a single source of risk data for modeling and other business applications. The ability to integrate insights from claims data into specific model adjustments, rather than having to tamper with exposure data, will further the integrity of exposure data as a single source of truth.

Taking the concept of a single system of record further, enabling catastrophe modeling and capital modeling tools to access the same underlying exposure data, with clearly defined hierarchies, can largely get rid of today’s versioning and inconsistency headaches. Even better, such a system of record could provide up-to-the-minute, “live” exposure.
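
As a minimal sketch of what such a system of record might look like (the structure is an assumption for illustration, not a description of any specific RMS product), an append-only design gives the audit trail and rollback described above almost for free:

```python
# Every change is appended rather than overwritten, so any prior state
# can be read back and audited.
from datetime import datetime, timezone

class ExposureRecord:
    """A single location's exposure, with every change kept in history."""

    def __init__(self, location_id, tiv_usd, user):
        self.location_id = location_id
        self.history = []                    # append-only audit trail
        self._append(tiv_usd, user, "create")

    def _append(self, tiv_usd, user, action):
        self.history.append({"ts": datetime.now(timezone.utc),
                             "user": user, "action": action,
                             "tiv_usd": tiv_usd})

    def update(self, tiv_usd, user):
        self._append(tiv_usd, user, "update")

    def as_of(self, version):
        """Roll back: read the insured value at any earlier version."""
        return self.history[version]["tiv_usd"]

record = ExposureRecord("LOC-001", 350_000, user="underwriter_a")
record.update(420_000, user="cat_analyst_b")
print(record.as_of(0), "->", record.as_of(1))  # 350000 -> 420000
```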

The last piece of the puzzle is efficient reporting to internal and external stakeholders. Customizable dashboards, reporting apps for various regulatory and rating purposes, and APIs to communicate with external websites provide the necessary arsenal to meet multiple reporting requirements across group entities around the globe.

A well-designed system and infrastructure that helps companies meet regulatory requirements and achieve resilient risk management objectives is the holy grail of the industry.