
For (re)insurers, getting an enterprise-level view of their exposure is essential. It sheds light across the entire book of business – identifying concentrations of risks, portfolio blind spots, and how losses flow through insurance and reinsurance portfolios – especially when an event strikes.

Given the time, effort, and resources required to produce an enterprise-level view of exposure, the end result may be inaccurate, incomplete, or simply out of date. So, what are the five main challenges in building an accurate enterprise-level view of exposure, and how can they be avoided?

1. Portfolio Blind Spots Due to Siloed Systems

Business decisions benefit from a comprehensive and consistent view of risk, yet exposure managers are required to collate information from multiple disparate systems.

(Re)insurers typically structure their risk teams to focus on a specific region. Teams often then work in parallel to understand portfolio and reinsurance losses independently, creating a disjointed view of global risk across the organization.

For example, a large global insurer might underwrite European wind cover locally in Europe, but the global cover may be underwritten out of the U.S. or Bermuda. And this global cover may or may not include the same European exposure covered in the Europe-specific policy, depending on the layer written and any subconditions. Without a unified enterprise view, each region cannot easily access the total client exposure at the time of underwriting.

To add to the complexity, existing tools typically manage either insurance or reinsurance entities, but not both, and with no centralized repository. These siloed systems can result in critical portfolio blind spots, leading to underwriters unknowingly exceeding their risk thresholds or leaving the company vulnerable to unexpected large losses following an event.
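The value of the unified roll-up described above can be sketched in a few lines. This is a hypothetical illustration, not RMS data structures: the client name, regional books, perils, and limits are all invented.

```python
# Hypothetical sketch: rolling up one client's exposure across regional books.
# Without this cross-book view, each region sees only its own limit.
from collections import defaultdict

regional_policies = [
    {"client": "AcmeCo", "region": "Europe",  "peril": "EU Windstorm", "limit": 50_000_000},
    {"client": "AcmeCo", "region": "Bermuda", "peril": "EU Windstorm", "limit": 30_000_000},
    {"client": "AcmeCo", "region": "US",      "peril": "US Hurricane", "limit": 75_000_000},
]

def total_exposure_by_peril(policies, client):
    """Sum limits per peril for one client across all regional books."""
    totals = defaultdict(int)
    for p in policies:
        if p["client"] == client:
            totals[p["peril"]] += p["limit"]
    return dict(totals)

print(total_exposure_by_peril(regional_policies, "AcmeCo"))
# The EU Windstorm exposure sits in two separate books; only the roll-up
# reveals that the combined limit to this client is 80M, not 50M or 30M.
```

In practice the aggregation also has to respect layers and sub-conditions, as the text notes, but even this simple sum is impossible when the two books live in systems that cannot see each other.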

2. IT Bottlenecks Make Building a Holistic View of Risk Difficult

Current tools make accumulations easy on a single-client basis, but when a roll-up of the total group-level portfolio is needed, it can be a lengthy, complex process. Typically, each portfolio is stored in a separate Exposure Data Module (EDM), some of which are not immediately available due to internal archiving rules. Yet, all these individual EDMs need to be consolidated to analyze the group-level view.

(Re)insurers often experience performance (or IT) bottlenecks when third-party systems are called on to generate an enterprise view. The systems struggle with the volume of data, and combining large portfolios on systems designed for single portfolios slows the process or makes it crash.

As a work-around, analysts run portfolio accumulations through the back end, often using SQL (Structured Query Language) databases tuned to handle the data loads.

The sheer amount of data can be overwhelming, especially for some smaller insurers. Due to the scale of the task, insurers can typically produce an analysis only once a quarter. Shortcuts abound, with data aggregated and consolidated at lower resolutions, such as state or county level, plus manual work-arounds reliant on ad hoc specialist expertise. The resulting lack of confidence in the output can leave a company underperforming compared to the rest of the market.
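The SQL back-end work-around and the low-resolution shortcut described above can be illustrated together in a minimal sketch, using SQLite in place of an enterprise database. The table name, column names, and exposure values are invented for illustration.

```python
# Minimal sketch of the back-end SQL accumulation work-around.
# SQLite stands in for the enterprise database; all data is illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE exposures (portfolio TEXT, state TEXT, tiv REAL)")
conn.executemany(
    "INSERT INTO exposures VALUES (?, ?, ?)",
    [("EU_Book", "FL", 12e6), ("US_Book", "FL", 8e6), ("US_Book", "TX", 15e6)],
)

# Group-level roll-up aggregated at state resolution -- the 'shortcut' that
# trades location-level detail for a query the database can actually run.
rows = conn.execute(
    "SELECT state, SUM(tiv) FROM exposures GROUP BY state ORDER BY state"
).fetchall()
print(rows)  # [('FL', 20000000.0), ('TX', 15000000.0)]
```

The query runs in seconds, but location-level detail is gone: a concentration within a single city is invisible once everything is summed to the state.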

3. Poor Consistency in Financial Modeling Leads to Lack of Confidence in Analytics

Consistency in modeling is key, and the myriad third-party tools in use all need to unite around a common financial model. But many (re)insurers do not have a financial model robust enough to capture and accurately apply complex policy conditions and inward treaty structures.

And even if one tool does have a robust enough financial engine, the same engine will not be utilized across the multiple tools used by analysts. This results in inconsistencies in how treaty conditions are applied, and even in how specific financial terms are defined.

As a work-around, analysts often just run the gross loss in the vendor probabilistic model, and then manually code how the more sophisticated treaties are supposed to work against their portfolio in the back end using SQL. These required manual work-arounds not only slow renewal modeling at the busiest times of the year, but they increase the potential for inaccurate results – reducing overall confidence in the analytics. 
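The manual work-around described above, in essence, applies treaty terms to gross model output outside the vendor model. A minimal sketch for a single per-occurrence excess-of-loss layer, with invented attachment, limit, and event losses:

```python
# Hedged sketch of the manual work-around: applying a per-occurrence
# excess-of-loss (XoL) treaty to gross event losses outside the vendor model.
# Attachment, limit, and loss values are illustrative.

def xol_recovery(gross_loss, attachment, limit):
    """Ceded recovery for an XoL layer: loss above the attachment, capped at the limit."""
    return min(max(gross_loss - attachment, 0.0), limit)

def net_loss(gross_loss, attachment, limit):
    """Net of reinsurance loss retained by the cedant."""
    return gross_loss - xol_recovery(gross_loss, attachment, limit)

gross_event_losses = [5e6, 25e6, 60e6]        # from the probabilistic model
treaty = {"attachment": 10e6, "limit": 30e6}  # a 30M xs 10M layer

for g in gross_event_losses:
    print(g, "->", net_loss(g, **treaty))
# Losses below the attachment are retained in full; losses above
# attachment + limit spill back to the cedant above the layer.
```

Real inward structures add reinstatements, aggregate terms, and inuring order, which is exactly why hand-coding them in SQL at renewal time is both slow and error-prone.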

4. Inability to Optimize Outward Reinsurance Makes for Expensive Reinsurance Purchases

Organizations have very little visibility into what different outward reinsurance structures might look like against their portfolio before purchase. There is also little opportunity to test different scenarios and weigh them against alternative business strategies and their potential costs.

In the reinsurance world, there is also a very short window in which to complete renewals, roll up the portfolio, generate analytics, and place outward reinsurance – everything must be done within about eight weeks. With hundreds of treaties to be reanalyzed, there is a heavy reliance on brokers to assist with structuring outward reinsurance programs.

Outward reinsurance is often purchased at different levels across the business. A treaty might be purchased locally, but then also at the group level – perhaps covering different perils and regions. For the analysis, the details of the outward reinsurance are typically stored in a separate system or in an Excel spreadsheet, and this vital information must be applied manually (i.e., outside of any third-party modeling tool), which takes an enormous amount of time.

Overall, an inability to accurately capture outward reinsurance structures means organizations cannot produce a true net of reinsurance loss. So they often resort to taking an overly conservative approach to their reinsurance purchase, perhaps paying more than necessary.
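The scenario testing that is currently missing can be sketched as follows: run the same set of modeled event losses through two candidate outward structures and compare total retained loss plus premium. Every layer and premium figure here is invented for illustration.

```python
# Illustrative scenario test: comparing two hypothetical outward XoL
# structures against the same modeled event losses. All values invented.

def xol_recovery(gross, attachment, limit):
    """Ceded recovery for an XoL layer."""
    return min(max(gross - attachment, 0.0), limit)

def total_cost(structure, event_losses):
    """Retained loss across all events plus the structure's premium."""
    retained = sum(
        g - xol_recovery(g, structure["attachment"], structure["limit"])
        for g in event_losses
    )
    return retained + structure["premium"]

events = [8e6, 20e6, 45e6]  # modeled gross event losses
option_a = {"attachment": 10e6, "limit": 20e6, "premium": 4.0e6}  # lower layer, pricier
option_b = {"attachment": 20e6, "limit": 30e6, "premium": 2.5e6}  # higher attachment, cheaper

for name, s in [("A", option_a), ("B", option_b)]:
    print(name, total_cost(s, events))
# Against this event set, the cheaper Option B actually costs more overall,
# because it retains the full 20M mid-sized event.
```

Even this toy comparison shows why visibility before purchase matters: the cheaper premium is not automatically the cheaper structure once retained losses are counted.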

5. Integration of Real-Time Data Slows Event Response Operations

Exposure managers are in the spotlight during a cat event as they move through stage after stage, waiting for and preparing third-party event response assets, rolling up portfolios, and reformatting data to be compatible with their vendor model or GIS (geographic information system) tools.

With a fast-moving event, the latest information changes rapidly, and exposure managers might not even complete their analysis before the next event footprint arrives.

At this point, with the business in the spotlight, the final challenge arises: if it takes days for an organization to produce a loss estimate, especially a net of reinsurance loss estimate, it changes a rating agency's perception of the insurer. When there is more uncertainty around the loss estimate, the rating agency assumes the organization is more volatile.

This means that an agency may expect an organization to hold higher capital reserves, which could be hundreds of millions of dollars depending on the size of the business. The ability to quickly provide more accurate loss estimates to key external stakeholders saves money and adds credibility.
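The footprint-driven quick estimate at the heart of event response can be reduced to a simple join of hazard intensity against exposure, with an assumed damage function. The counties, intensities, damage ratios, and exposure values below are all invented for illustration.

```python
# Minimal sketch of a footprint-based quick loss estimate: event intensity
# per area joined to exposure, scaled by an assumed damage ratio.
# All names and values are illustrative.

damage_ratio = {"low": 0.01, "moderate": 0.05, "severe": 0.20}  # assumed vulnerability

# Event footprint: which areas were hit, and how hard.
footprint = {"Miami-Dade": "severe", "Broward": "moderate", "Palm Beach": "low"}

# Exposure roll-up by county (total insured value).
exposure_tiv = {"Miami-Dade": 40e6, "Broward": 25e6, "Palm Beach": 30e6, "Orange": 50e6}

gross_estimate = sum(
    tiv * damage_ratio[footprint[county]]
    for county, tiv in exposure_tiv.items()
    if county in footprint  # counties outside the footprint contribute no loss
)
print(gross_estimate)
```

The arithmetic itself is trivial; as the section above describes, the real delay lies in getting the footprint, the rolled-up exposure, and the reinsurance terms into one place fast enough to matter, and each new footprint revision restarts the cycle.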

Overcoming Challenges with the RMS ExposureIQ Application

How do insurers overcome these challenges? Integration of systems is vital, as most of the hard work in the current process is bringing all the business data together, ready for analysis across the latest portfolios. The tools used need to be highly performant, especially during an event, with the power to process the sheer volume of data. And the analysis needs to be consistent, using one financial model that can account for all the reinsurance structures in place.

Many RMS® clients are benefiting from the ExposureIQ™ application on the cloud-native RMS Intelligent Risk Platform. With portfolios, exposures, and treaty structures all held on the platform alongside the risk models, obtaining gross and net losses across an entire portfolio is greatly simplified.

And when an event occurs, the event footprints automatically flow into the ExposureIQ application for quick analysis across an entire enterprise of hundreds of portfolios and thousands of cedants, providing confidence to your stakeholders.

Talk to us about taking the strain out of enterprise-level exposure analysis – arrange a demo of the ExposureIQ application, or watch the recent Insider Engage webinar conducted in partnership with RMS: Enterprise-Wide Exposures: Exploring the Top Challenges.

Chloe Garrish
Senior Product Marketing Manager

Chloe is a senior product marketing manager at Moody's RMS, where she helps customers develop data-driven strategies to better understand their risk. She has more than a decade of experience in catastrophe modeling for the (re)insurance industry.

Previously, Chloe was a senior account director for CoreLogic, managing catastrophe modeling clients across Europe, Asia, and the U.S. Prior to CoreLogic, Chloe worked in various catastrophe modeling roles within a Lloyd’s Syndicate, AIG, and Aspen Re.

Chloe has a bachelor’s degree in Geography with Anthropology from Oxford Brookes University.
