For (re)insurers, the efficient modeling of property exposure data, especially during renewal seasons, has emerged as a competitive differentiator.
Once an exposure data file is received, the race is on to ensure the data is cleaned, precisely geocoded, includes the correct building attributes, and is fully validated and structured – ready for risk modeling. Speed and accuracy matter: in a competitive market, being the first to respond to a client or a broker with a quote is a prerequisite for business success.
But what happens when the exposure data files start flowing in? That’s when the process of getting data ready, right up to the point it is analyzed in risk models, can reveal a patchwork of manual workarounds, multiple discrete automation tools, and various incompatible applications. A huge amount of time and effort can be spent moving through the different stages of exposure modeling, with every stage introducing the risk of errors from manual data handling.
In a recent survey of U.S. insurers, a third of respondents indicated that their data analysts spend more than 70 percent of their time searching for, cleaning, and enhancing data for analysis. Another third indicated that 50 to 70 percent of their time is spent preparing data for modeling – instead of delivering advanced risk insights.
To speed up the process, it’s tempting to accept lower-resolution geocoding, use broadly defined construction and occupancy codes, skip the secondary modifiers, forgo data validation, and so on. But this just transfers the problems to the risk modeling stage, increasing uncertainty in the model outcome.
The exposure modeling process is ripe for automation – to deliver the speed, accuracy, and consistency required – so the focus is on risk insights rather than data handling. How can automation help?
Why Automation Matters
Automation is making a real difference for RMS® Analytical Services, one of the largest providers of catastrophe risk modeling services globally, delivering data enrichment, account modeling, portfolio analytics, and regulatory reporting services. Our more than 350 risk analysts serving over 100 clients use RMS models and software solutions to offer a flexible, scalable service to enhance or build modeling capacity.
In our ambition to deliver a competitive edge to our clients, we recently implemented an integrated automation workflow – and both reductions in processing times and improvements in data quality are significant.
This is not just about automating each process with handoffs at each stage. Instead, we use an analytic process automation platform from Alteryx focused on integration and delivering business outcomes. Automated processes extend across secured data transfer/import, cleaning, geocoding, data coding, modeling, and reporting.
Automation starts with clients securely transferring files to our data exchange portal, with some clients choosing to use application programming interfaces (APIs) integrated within their applications that send data directly to RMS Analytical Services. A customized application built using RMS Location Intelligence prepares address information in a model-ready format and adds missing details of the property address.
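To illustrate what “model-ready” address preparation can involve, here is a minimal Python sketch: it splits a raw address string into structured fields and fills in a missing state from a ZIP-code prefix. The field names and the tiny lookup table are illustrative assumptions for demonstration only – production geocoders such as RMS Location Intelligence work from full reference datasets.

```python
import re

# Hypothetical ZIP-prefix-to-state lookup; a real geocoder uses a
# comprehensive reference dataset rather than a hand-built table.
ZIP_PREFIX_TO_STATE = {"770": "TX", "331": "FL", "100": "NY"}

def prepare_address(raw: str) -> dict:
    """Split a raw, comma-separated address into structured fields
    and infer a missing state from the ZIP prefix."""
    parts = [p.strip() for p in raw.split(",")]
    record = {"street": parts[0] if parts else "",
              "city": "", "state": "", "zip": ""}
    for part in parts[1:]:
        zip_match = re.search(r"\b(\d{5})\b", part)
        state_match = re.search(r"\b([A-Z]{2})\b", part)
        if zip_match:
            record["zip"] = zip_match.group(1)
        if state_match:
            record["state"] = state_match.group(1)
        if not zip_match and not state_match:
            record["city"] = part
    # Enrichment step: fill a missing state using the ZIP prefix.
    if not record["state"] and record["zip"]:
        record["state"] = ZIP_PREFIX_TO_STATE.get(record["zip"][:3], "")
    return record

print(prepare_address("100 Main St, Houston, 77002"))
# -> {'street': '100 Main St', 'city': 'Houston', 'state': 'TX', 'zip': '77002'}
```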
Algorithms trained on thousands of data points that RMS has processed over many years are then used to translate English descriptions of primary modifiers, such as construction and occupancy codes, into model-readable codes.
Our data validation engine then applies over 300 validation checks to discover any anomalies before importing the exposure data into the risk model. The policy coding module applies policy conditions that our analysts have summarized in a template using policy slips. And once modeled, customized reports are generated by the reporting module, with our model output integrated with the client’s applications.
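A rule-based validation engine of this kind can be pictured as a battery of independent checks run over every exposure record, with anomalies collected for review before model import. The sketch below is a minimal Python illustration; the field names and the three rules are assumptions for demonstration, not the actual RMS validation set.

```python
# Each check inspects one record and returns an anomaly message, or None.
def check_tiv_positive(rec):
    if rec.get("total_insured_value", 0) <= 0:
        return "total insured value must be positive"

def check_latitude(rec):
    lat = rec.get("latitude")
    if lat is None or not -90 <= lat <= 90:
        return "latitude missing or out of range"

def check_year_built(rec):
    year = rec.get("year_built")
    if year is not None and not 1800 <= year <= 2025:
        return "implausible year built"

CHECKS = [check_tiv_positive, check_latitude, check_year_built]

def validate(records):
    """Run every check against every record; return (index, message) pairs."""
    anomalies = []
    for i, rec in enumerate(records):
        for check in CHECKS:
            message = check(rec)
            if message:
                anomalies.append((i, message))
    return anomalies

records = [
    {"total_insured_value": 5_000_000, "latitude": 29.76, "year_built": 1998},
    {"total_insured_value": 0, "latitude": 120.0},  # two anomalies
]
print(validate(records))
# -> [(1, 'total insured value must be positive'),
#     (1, 'latitude missing or out of range')]
```

Keeping each check as a small, independent function is what makes a library of several hundred rules manageable: new checks are appended to the list without touching existing ones.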
Currently, we are integrating RMS Risk Modeler APIs into the process workflows, which will enable automated modeling as soon as the exposure data is available. In the future, it is conceivable that simple datasets could be modeled straight through – from receipt of data to reporting of results – with negligible manual intervention.
Driving Process Efficiency
(Re)insurers want to get to modeled risk insights as quickly as possible, with the assurance that accuracy, consistency, and quality are maintained. So what impact has automation had on our processes and on our ability to deliver on our clients’ demands?
Data enrichment processing time has improved, and we expect it to decrease by 20 to 25 percent over the next few quarters. For large datasets with more than 10,000 locations, the time spent by analysts could fall even further.
With consistent quality of output and more time spent on analysis and review, our ongoing quality assurance processes have seen a further drop in the error rate. During the busy renewal period, the increased efficiency delivered by automation gives RMS the operational flexibility to scale data volumes easily as required.
RMS Analytical Services provides clients with a combination of infrastructure, model IP, and services. With automated workflows and a large resource pool, we offer clients the ability to operationalize their business faster without having to invest significant sums in building in-house capability.
To learn more about automation in catastrophe modeling processes and how RMS Analytical Services can help, contact us.