
Helen Yates
September 22, 2021
Deconstructing Social Inflation

After the loss creep associated with Hurricane Irma in 2017, (re)insurers are keen to quantify how social inflation could exacerbate claims costs in the future. The challenge lies in isolating the more persistent, longer-term trends so that these factors can be explicitly modeled, reducing the “surprise factor” the next time a major storm blows through.

A few days after Hurricane Irma passed over Marco Island, Florida, on September 10, 2017, RMS® deployed a reconnaissance team to offer some initial feedback on the damage that was sustained. Most properties on the island had clay tile roofs, and while the team noted some dislodged or broken tiles, damage did not appear to be severe. A year later, when Peter Datin, senior director of modeling at RMS, decided to revisit the area, he was shocked by what he saw. “There were so many roofing contractors still on the island, and almost every house seemed to be getting a full roof replacement. We found out that US$900 million worth of roofing permits for repairs had been filed in Marco Island alone.”

Trying to find the exact shape and color for tile replacements was a challenge, forcing contractors to replace the entire roof for aesthetic reasons. Then there is Florida's “25 percent rule,” which previously applied to High-Velocity Hurricane Zones in South Florida (Miami-Dade and Broward Counties) before expanding statewide under the 2017 Florida Building Code. Under the rule, if a loss assessor or contractor determines that a quarter or more of the roof has been damaged in the last 12 months, it cannot simply be repaired, and 100 percent must be replaced.

This begins to explain why, in the aftermath of Hurricane Irma and, to a lesser extent, Hurricane Michael in 2018, claims severity and loss creep were such an issue. “We looked at some modeling aspects in terms of the physical meaning of this,” says Datin. “If we were to directly implement an engineering or physics-based approach, what does that mean? How does it impact the vulnerability curve?

“We went through this exercise last summer and found that if you hit that threshold of the 25 percent roof damage ratio, particularly for low wind speeds, that's a fourfold increase on your claims. At certain wind speeds, it can therefore have a very material increase on the losses being paid. It’s not quite that straightforward to implement on the vulnerability curve, but it is very significant.”

But issues such as the 25 percent rule do not tell the whole story, and in a highly litigious market such as Florida, determining whether a roof needs a complete replacement is not just down to physics. Increasingly, the confluence of additional factors that fall under the broad description of “social inflation” is also having a meaningful impact on the total cost of claims.

What Is Social Inflation?

Broadly, social inflation refers to all the ways in which insurers’ claims costs rise over and above general economic inflation (i.e., growth in wages and prices), which will influence the cost of repairs and/or replacing damaged property. It therefore captures the growth in costs connected to the following: unanticipated emerging perils associated with, for example, new materials or technologies; shifts in the legal environment; evolving social attitudes and preferences towards equitable risk absorption; and demographic and political developments. (Source: Geneva Association)
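To make the mechanics of the 25 percent rule discussed above more concrete, the sketch below applies a simple repair-versus-replace threshold to a set of hypothetical roof damage ratios. The values are placeholders rather than figures from any RMS vulnerability curve; the point is only that the relative uplift is largest at the lower wind speeds, where roofs are partially rather than totally damaged.

```python
import numpy as np

# Hypothetical modeled roof damage ratios by wind-speed bin
# (placeholder values, not output from any RMS model).
wind_speed_mph = np.array([60, 80, 100, 120, 140])
roof_damage    = np.array([0.05, 0.20, 0.30, 0.55, 0.85])

def claim_with_25pct_rule(damage_ratio, threshold=0.25):
    # If a quarter or more of the roof is damaged, the whole roof is
    # replaced, so the paid claim reflects 100 percent of the roof
    # rather than the damaged share.
    return np.where(damage_ratio >= threshold, 1.0, damage_ratio)

claims = claim_with_25pct_rule(roof_damage)
print(np.round(claims / roof_damage, 1))   # [1.  1.  3.3 1.8 1.2]
# The relative uplift is largest just above the threshold, i.e., at lower
# wind speeds where the roof is only partially damaged - consistent with
# the "fourfold" effect Datin describes.
```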
Florida's “David and Goliath” Law

Major drivers include the assertive strategies of the plaintiffs' bar, compounded by the three-year window in which to file a claim and the use, and potential abuse, of practices such as assignment of benefits (AOB). The use of public adjusters and broader societal attitudes towards insurance claiming also need to be taken into consideration. Meanwhile, the expansion of coverage terms and conditions in the loss-free years between 2005 and 2017 and generous policy interpretations play their part in driving up claims frequency and severity.

What Is Assignment of Benefits (AOB)?

An assignment of benefits, or AOB, is a document signed by a policyholder that allows a third party, such as a water extraction company, a roofer or a plumber, to “stand in the shoes” of the insured and seek payment directly from the policyholder's insurance company for the cost of repairs. AOBs have long been part of Florida’s insurance marketplace. (Source: Florida Office of Insurance Regulation)

More recently, the effects of COVID-19 have impacted the cost of repairs, in turn increasing insurers' loss ratios.

(Re)insurers naturally want to better understand how social inflation is likely to impact their cost of claims. But determining the impact of social inflation on the “claims signal” is far from simple. From a modeling perspective, the first step is separating out the different elements that contribute toward social inflation and understanding which trends are longer term in nature.

The recently released Version 21 of the RMS North Atlantic Hurricane Models incorporates an alternative view of vulnerability for clients and reflects the changing market conditions applicable to Florida residential lines, including the 25 percent roof replacement rule. However, the effects of social inflation are still largely considered non-modeled. They are removed from available data where possible during the model development process, and any residual impacts are implicitly represented in the model.

“Quantifying the impacts of social inflation is a complex task, partly because of the uncertainty in how long these factors will persist,” says Jeff Waters, senior product manager at RMS. “The question is, going forward, how much of an issue is social inflation really going to be for the next three, five or 10 years? Should we start thinking more about ways in which to explicitly account for these social inflation factors or give model users the ability to manually fold in these different factors?

“One issue is that social inflation really ramped up over the last few years,” he continues. “It's especially true in Florida following events like Hurricanes Irma and Michael. At RMS, we have been working hard trying to determine which of these signals are caused by social inflation and which are caused by other things happening in Florida. Certainly, the view of vulnerability in Version 21 starts to reflect these elevated risk factors.”

AOB had a clear impact on claims from Irma and Michael. Florida's “David and Goliath” law was intended to level the playing field between policyholders and economically powerful insurers, notes the Insurance Information Institute's Jeff Dunsavage. Instead, the law offered an incentive for attorneys to file thousands of AOB-related suits. The ease with which unscrupulous contractors can “find” damage and make claims within three years of a catastrophe loss has further exacerbated the problem.
Waters points out that in 2006 there were only around 400 AOB lawsuits in the market. By 2018, that number had risen to over 135,000. In a decade that had seen very few storms, it was difficult to predict how significant an impact AOB would have on hurricane-related claims, until Irma struck. Of the Irma and Michael claims investigated by RMS, roughly 20 percent were impacted by AOB. “From a claims severity standpoint, the cost of those claims increased up to threefold on average compared to claims that were not affected by AOB,” says Waters.

Insurers on the Brink

The problem is not limited to recent hurricane events. Due to the Sunshine State's increased litigation, insurers are continuing to face a barrage of AOB non-catastrophe claims, including losses relating to water and roof damage. Reforms introduced in 2019 initially helped rein in the more opportunistic claims, but notifications dialed back up again after attorneys were able to find and exploit loopholes. Amid pressures on the court system due to COVID-19, reform efforts are continuing. In April 2021, the Florida Legislature passed a new law intended to curb abusive litigation and roofing contractor practices, among other reforms. Governor Ron DeSantis said the law was a reaction to “mounting insurance costs” for homeowners.

As loss ratios rose, carriers passed some of the additional costs back onto policyholders in the form of additional premiums (around US$680 per family on average). Meanwhile, some carriers have begun to offer policies with limited AOB rights, or none at all, in an effort to gain more control over the spiraling situation. “There are some pushes in the legislature to try to curb some of the more litigious behavior on the part of the trial lawyers,” says Matthew Nielsen, senior director, regulatory affairs at RMS.

Nielsen thinks the 2021 hurricane season could be telling in terms of separating out some of the more permanent changes in the market where social inflation is concerned. The National Oceanic and Atmospheric Administration (NOAA) still predicts another above-average season in the North Atlantic, but currently does not anticipate the historic level of storm activity seen in 2020.

“What's going to happen when the next hurricane makes landfall, and which of these elements are actually going to still be here?” asks Nielsen. “What nobody wants to see again is the kind of chaos that came after 2004 and 2005, when there were questions about the health of the insurance market and what the roles of the Florida Hurricane Catastrophe Fund (FHCF) and Florida Citizens Property Insurance Corporation were going to be.”

“Ultimately, we're trying to figure out which of these social inflation signals are going to stick around, and the difficulty is separating the long-term signals from the short-term ones,” he continues. “The 25 percent roof replacement rule is written into legislation, and so that is going to be the new reality going forward. On the other hand, we don't want to include something that is a temporary blip on the radar.”
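A back-of-the-envelope way to read the AOB figures quoted above (roughly 20 percent of claims affected, severity up to three times higher) is to blend them into a portfolio-level severity uplift. The numbers below simply restate those two inputs and are not an RMS calculation.

```python
# Illustrative blend of the AOB statistics cited above (not an RMS figure).
aob_share      = 0.20   # share of Irma/Michael claims impacted by AOB
aob_multiplier = 3.0    # upper-end severity increase for AOB-affected claims
base_severity  = 1.0    # non-AOB claim severity, normalized

portfolio_uplift = (1 - aob_share) * base_severity + aob_share * base_severity * aob_multiplier
print(f"Portfolio average severity vs. baseline: {portfolio_uplift:.0%}")   # 140%
# Even with only one claim in five affected, an upper-bound threefold severity
# pushes average claim cost up by roughly 40 percent.
```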

NIGEL ALLEN
May 05, 2021
Data From the Ashes

Five years on from the wildfire that devastated Fort McMurray, the event has proved critical to developing a much deeper understanding of wildfire losses in Canada.

In May 2016, Fort McMurray, Alberta, became the location of Canada’s costliest wildfire event to date. In total, some 2,400 structures were destroyed by the fire, with a similar number designated as uninhabitable. Fortunately, the evacuation of the 90,000-strong population meant that no lives were lost as a direct result of the fires.

From an insurance perspective, the estimated CA$4 billion loss elevated wildfire risk to a whole new level. The figure was now comparable to the extreme fire losses experienced in wildfire-exposed regions such as California, and it established wildfire as a peak natural peril second only to flood in Canada. However, the event also exposed gaps in the market’s understanding of wildfire events and highlighted the lack of actionable exposure data. In the U.S., significant investment had been made in enhancing the scale and granularity of publicly available wildfire data through bodies such as the United States Geological Survey, but the resolution of data available through equivalent agencies in Canada was not at the same standard.

A Question of Scale

Making direct wildfire comparisons between the U.S. and Canada is difficult for multiple reasons. Take, for example, population density. Canada’s total population is approximately 37.6 million, spread over a landmass of 9.985 million square kilometers (3.855 million square miles), while California has a population of around 39.5 million, inhabiting an area of 423,970 square kilometers (163,668 square miles). The potential for wildfire events impacting populated areas is therefore significantly lower in Canada. In fact, in the event of a wildfire in Canada—due to the reduced potential exposure—fires are typically allowed to burn for longer and over a wider area, whereas in the U.S. there is a significant focus on fire suppression. This willingness to let fires burn has the benefit of reducing levels of vegetation and fuel buildup. Also, the fires that burn the largest areas in the country tend to result from natural rather than human-caused ignitions and occur in hard-to-access areas with low population exposure, even though around 60 percent of fires in Canada are attributed to human causes.

But as Fort McMurray showed, the potential for disaster clearly exists. In fact, the event was one of a series of large-scale fires in recent years that have impacted populated areas in Canada, including the Okanagan Mountain Fire, the McLure Fire, the Slave Lake Fire, and the Williams Lake and Elephant Hills Fire.

“The challenge for the insurance industry in Canada,” explains Michael Young, senior director, product management, at RMS, “is therefore more about measuring the potential impact of wildfire on smaller pockets of exposure, rather than the same issues of frequency and severity of event that are prevalent in the U.S.”

Regions at Risk

What is interesting to note is just how much of Canada’s populated territory is potentially exposed to wildfire events, despite a relatively low population density overall.
A 2017 report entitled Mapping Canadian Wildland Fire Interface Areas, published by the Canadian Forest Service, stated that the threat of wildfire impacting populated areas will inevitably increase as a result of the combined impacts of climate change and the development of more interface area “due to changes in human land use.” This includes urban and rural growth, the establishment of new industrial facilities and the building of more second homes. According to the study, the wildland-human interface in Canada spans 116.5 million hectares (288 million acres), which is 13.8 percent of the country’s total land area or 20.7 percent of its total wildland fuel area. In terms of the wildland-urban interface (WUI), this covers 32.3 million hectares (79.8 million acres), which is 3.8 percent of land area or 5.8 percent of fuel area. The WUI for industrial areas (known as WUI-Ind) covers 10.5 million hectares (25.9 million acres), which is 1.3 percent of land area or 1.9 percent of fuel area. In terms of the provinces and territories with the largest interface areas, the report highlighted Quebec, Alberta, Ontario and British Columbia as being most exposed. At a more granular level, it stated that in populated areas such as cities, towns and settlements, 96 percent of locations had “at least some WUI within a five-kilometer buffer,” while 60 percent also had over 500 hectares (1,200 acres) of WUI within a five-kilometer buffer (327 of the total 544 areas). Data: A Closer Look Fort McMurray has, in some ways, become an epicenter for the generation of wildfire-related data in Canada. According to a study by the Institute for Catastrophic Loss Reduction, which looked at why certain homes survived, the Fort McMurray Wildfire “followed a well-recognized pattern known as the wildland/urban interface disaster sequence.” The detailed study, which was conducted in the aftermath of the disaster, showed that 90 percent of properties in the areas affected by the wildfire survived the event. Further, “surviving homes were generally rated with ‘Low’ to ‘Moderate’ hazard levels and exhibited many of the attributes promoted by recommended FireSmart Canada guidelines.” FireSmart Canada is an organization designed to promote greater wildfire resilience across the country. Similar to FireWise in the U.S., it has created a series of hazard factors spanning aspects such as building structure, vegetation/fuel, topography and ignition sites. It also offers a hazard assessment system that considers hazard layers and adoption rates of resilience measures. According to the study: “Tabulation by hazard level shows that 94 percent of paired comparisons of all urban and country residential situations rated as having either ‘Low’ or ‘Moderate’ hazard levels survived the wildfire. Collectively, vegetation/fuel conditions accounted for 49 percent of the total hazard rating at homes that survived and 62 percent of total hazard at homes that failed to survive.” Accessing the Data In many ways, the findings of the Fort McMurray study are reassuring, as they clearly demonstrate the positive impact of structural and topographical risk mitigation measures in enhancing wildfire resilience—essentially proving the underlying scientific data. Further, the data shows that “a strong, positive correlation exists between home destruction during wildfire events and untreated vegetation within 30 meters of homes.” “What the level of survivability in Fort McMurray showed was just how important structural hardening is,” Young explains. 
“It is not simply about defensible space, managing vegetation and ensuring sufficient distance from the WUI. These are clearly critical components of wildfire resilience, but by factoring in structural mitigation measures you greatly increase levels of survivability, even during urban conflagration events as extreme as Fort McMurray.” What the level of survivability in Fort McMurray showed was just how important structural hardening is Michael Young, senior director, product management, RMS From an insurance perspective, access to these combined datasets is vital to effective exposure analysis and portfolio management. There is a concerted drive on the part of the Canadian insurance industry to adopt a more data-intensive approach to managing wildfire exposure. Enhancing data availability across the region has been a key focus at RMS® in recent years, and efforts have culminated in the launch of the RMS® Canada Wildfire HD Model. It offers the most complete view of the country’s wildfire risk currently available and is the only probabilistic model available to the market that covers all 10 provinces. “The hazard framework that the model is built on spans all of the critical wildfire components, including landscape and fire behavior patterns, fire weather simulations, fire and smoke spread, urban conflagration and ember intensity,” says Young. “In each instance, the hazard component has been precisely calibrated to reflect the dynamics, assumptions and practices that are specific to Canada. “For example, the model’s fire spread component has been adjusted to reflect the fact that fires tend to burn for longer and over a wider area in the country, which reflects the watching brief that is often applied to managing wildfire events, as opposed to the more suppression-focused approach in the U.S.,” he continues. “Also, the urban conflagration component helps insurers address the issue of extreme tail-risk events such as Fort McMurray.” Another key model differentiator is the wildfire vulnerability function, which automatically determines key risk parameters based on high-resolution data. In fact, RMS has put considerable efforts into building out the underlying datasets by blending multiple different information sources to generate fire, smoke and ember footprints at 50-meter resolution, as opposed to the standard 250-meter resolution of the publicly available data. Critical site hazard data such as slope, distance to vegetation, and fuel types can be set against primary building modifiers such as construction, number of stories and year built. A further secondary modifier layer enables insurers to apply building-specific mitigation measures such as roof characteristics, ember accumulators and whether the property has cladding or a deck. Given the influence of such components on building survivability during the Fort McMurray Fire, such data is vital to exposure analysis at the local level. A Changing Market “The market has long recognized that greater data resolution is vital to adopting a more sophisticated approach to wildfire risk,” Young says. 
“As we worked to develop this new model, it was clear from our discussions with clients that there was an unmet need to have access to hard data that they could ‘hang numbers from.’ There was simply too little data to enable insurers to address issues such as potential return periods, accumulation risk and countrywide portfolio management.” The ability to access more granular data might also be well timed in response to a growing shift in the information required during the insurance process. There is a concerted effort taking place across the Canadian insurance market to reduce the information burden on policyholders during the submission process. At the same time, there is a shift toward risk-based pricing. “As we see this dynamic evolve,” Young says, “the reduced amount of risk information sourced from the insured will place greater importance on the need to apply modeled data to how insurance companies manage and price risk accurately. Companies are also increasingly looking at the potential to adopt risk-based pricing, a process that is dependent on the ability to apply exposure analysis at the individual location level. So, it is clear from the coming together of these multiple market shifts that access to granular data is more important to the Canadian wildfire market than ever.”
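To make the data requirements described above more tangible, here is a hedged sketch of the kind of location-level record an insurer might assemble: site hazard attributes plus primary and secondary building modifiers. The field names, categories and scoring weights are hypothetical and are not the vulnerability function of the RMS Canada Wildfire HD Model.

```python
from dataclasses import dataclass

@dataclass
class WildfireLocation:
    # Site hazard attributes
    slope_pct: float
    distance_to_vegetation_m: float
    fuel_type: str             # e.g., "boreal_spruce", "grass"
    # Primary building modifiers
    construction: str          # e.g., "wood_frame", "masonry"
    num_stories: int
    year_built: int
    # Secondary (mitigation) modifiers
    roof_class: str            # e.g., "class_a", "untreated_shake"
    ember_accumulators: bool   # decks, gutters, etc.
    cladding_combustible: bool

def relative_susceptibility(loc: WildfireLocation) -> float:
    """Toy relative score (1.0 = neutral); illustrative weights only."""
    score = 1.0
    if loc.distance_to_vegetation_m < 30: score *= 1.5   # untreated vegetation nearby
    if loc.roof_class == "class_a":       score *= 0.7   # hardened roof
    if not loc.ember_accumulators:        score *= 0.85
    if loc.cladding_combustible:          score *= 1.2
    return score

loc = WildfireLocation(8.0, 20.0, "boreal_spruce", "wood_frame", 2, 1995,
                       "class_a", False, False)
print(round(relative_susceptibility(loc), 2))   # ~0.89
```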

NIGEL ALLEN
February 11, 2021
Location, Location, Location: A New Era in Data Resolution

The insurance industry has reached a transformational point in its ability to accurately understand the details of exposure at risk. It is the point at which three fundamental components of exposure management are coming together to enable (re)insurers to systematically quantify risk at the location level: the availability of high-resolution location data, access to the technology to capture that data and advances in modeling capabilities to use that data. Data resolution at the individual building level has increased considerably in recent years, including the use of detailed satellite imagery, while advances in data sourcing technology have provided companies with easier access to this more granular information. In parallel, the evolution of new innovations, such as RMS® High Definition Models™ and the transition to cloud-based technologies, has facilitated a massive leap forward in the ability of companies to absorb, analyze and apply this new data within their actuarial and underwriting ecosystems. Quantifying Risk Uncertainty “Risk has an inherent level of uncertainty,” explains Mohsen Rahnama, chief modeling officer at RMS. “The key is how you quantify that uncertainty. No matter what hazard you are modeling, whether it is earthquake, flood, wildfire or hurricane, there are assumptions being made. These catastrophic perils are low-probability, high-consequence events as evidenced, for example, by the 2017 and 2018 California wildfires or Hurricane Katrina in 2005 and Hurricane Harvey in 2017. For earthquake, examples include Tohoku in 2011, the New Zealand earthquakes in 2010 and 2011, and Northridge in 1994. For this reason, risk estimation based on an actuarial approach cannot be carried out for these severe perils; physical models based upon scientific research and event characteristic data for estimating risk are needed.” A critical element in reducing uncertainty is a clear understanding of the sources of uncertainty from the hazard, vulnerability and exposure at risk. “Physical models, such as those using a high-definition approach, systematically address and quantify the uncertainties associated with the hazard and vulnerability components of the model,” adds Rahnama. “There are significant epistemic (also known as systematic) uncertainties in the loss results, which users should consider in their decision-making process. This epistemic uncertainty is associated with a lack of knowledge. It can be subjective and is reducible with additional information.” What are the sources of this uncertainty? For earthquake, there is uncertainty about the ground motion attenuation functions, soil and geotechnical data, the size of the events, or unknown faults. Rahnama explains: “Addressing the modeling uncertainty is one side of the equation. Computational power enables millions of events and more than 50,000 years of simulation to be used, to accurately capture the hazard and reduce the epistemic uncertainty. Our findings show that in the case of earthquakes the main source of uncertainty for portfolio analysis is ground motion; however, vulnerability is the main driver of uncertainty for a single location.” The quality of the exposure data as the input to any mathematical models is essential to assess the risk accurately and reduce the loss uncertainty. However, exposure could represent the main source of loss uncertainty, especially when exposure data is provided in aggregate form. 
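As a minimal, purely illustrative sketch of what working with aggregate exposure can involve, the snippet below spreads a single reported sum insured across candidate locations using an assumed proxy weight; the weights and values are invented and do not represent an RMS method.

```python
# Illustrative disaggregation of an aggregate exposure (not an RMS method).
aggregate_sum_insured = 50_000_000  # reported at, say, postal-code level

# Assumed proxy weights per candidate location (e.g., relative footprint area).
location_weights = {"bldg_A": 0.5, "bldg_B": 0.3, "bldg_C": 0.2}

disaggregated = {loc: aggregate_sum_insured * w for loc, w in location_weights.items()}
print(disaggregated)
# {'bldg_A': 25000000.0, 'bldg_B': 15000000.0, 'bldg_C': 10000000.0}
# Each location can now be paired with its own hazard data, but the split
# itself remains an assumption and therefore a source of residual uncertainty.
```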
Assumptions can be made to disaggregate exposure using other sources of information, which helps to some degree reduce the associated uncertainty. Rahnama concludes, “Therefore, it is essential in order to minimize the uncertainty related to exposure to try to get location-level information about the exposure, in particular for the region with the potential of liquification for earthquake or for high-gradient hazard such as flood and wildfire.”  A critical element in reducing that uncertainty, removing those assumptions and enhancing risk understanding is combining location-level data and hazard information. That combination provides the data basis for quantifying risk in a systematic way. Understanding the direct correlation between risk or hazard and exposure requires location-level data. The potential damage caused to a location by flood, earthquake or wind will be significantly influenced by factors such as first-floor elevation of a building, distance to fault lines or underlying soil conditions through to the quality of local building codes and structural resilience. And much of that granular data is now available and relatively easy to access. “The amount of location data that is available today is truly phenomenal,” believes Michael Young, vice president of product management at RMS, “and so much can be accessed through capabilities as widely available as Google Earth. Straightforward access to this highly detailed satellite imagery means that you can conduct desktop analysis of individual properties and get a pretty good understanding of many of the building and location characteristics that can influence exposure potential to perils such as wildfire.” Satellite imagery is already a core component of RMS model capabilities, and by applying machine learning and artificial intelligence (AI) technologies to such images, damage quantification and differentiation at the building level is becoming a much more efficient and faster undertaking — as demonstrated in the aftermath of Hurricanes Laura and Delta. “Within two days of Hurricane Laura striking Louisiana at the end of August 2020,” says Rahnama, “we had been able to assess roof damage to over 180,000 properties by applying our machine-learning capabilities to satellite images of the affected areas. We have ‘trained’ our algorithms to understand damage degree variations and can then superimpose wind speed and event footprint specifics to group the damage degrees into different wind speed ranges. What that also meant was that when Hurricane Delta struck the same region weeks later, we were able to see where damage from these two events overlapped.” The Data Intensity of Wildfire Wildfire by its very nature is a data-intensive peril, and the risk has a steep gradient where houses in the same neighborhood can have drastically different risk profiles. The range of factors that can make the difference between total loss, partial loss and zero loss is considerable, and to fully grasp their influence on exposure potential requires location-level data. The demand for high-resolution data has increased exponentially in the aftermath of recent record-breaking wildfire events, such as the series of devastating seasons in California in 2017-18, and unparalleled bushfire losses in Australia in 2019-20. 
Such events have also highlighted myriad deficiencies in wildfire risk assessment, including the failure to account for structural vulnerabilities, the inability to assess exposure to urban conflagrations, insufficient high-resolution data and the lack of a robust modeling solution to provide insight about fire potential given the many years of drought.

Wildfires in 2018 devastated the town of Paradise, California

In 2019, RMS released its U.S. Wildfire HD Model, built to capture the full impact of wildfire at high resolution, including the complex behaviors that characterize fire spread, ember accumulation and smoke dispersion. Able to simulate over 72 million wildfires across the contiguous U.S., the model creates ultrarealistic fire footprints that encompass surface fuels, topography, weather conditions, moisture and fire suppression measures.

“To understand the loss potential of this incredibly nuanced and multifactorial exposure,” explains Michael Young, “you not only need to understand the probability of a fire starting but also the probability of an individual building surviving.

“If you look at many wildfire footprints,” he continues, “you will see that sometimes up to 60 percent of buildings within that footprint survived, and the focus is then on what increases survivability — defensible space, building materials, vegetation management, etc. We were one of the first modelers to build mitigation factors into our model, such as those building and location attributes that can enhance building resilience.”

Moving the Differentiation Needle

In a recent study with the Center for Insurance Policy Research, the Insurance Institute for Business and Home Safety and the National Fire Protection Association, RMS applied its wildfire model to quantify the benefits of two mitigation strategies — structural mitigation and vegetation management — assessing hypothetical loss reduction benefits in nine communities across California, Colorado and Oregon.

Young says: “By knowing what the building characteristics and protection measures are within the first 5 feet and 30 feet at a given property, we were able to demonstrate that structural modifications can reduce wildfire risk up to 35 percent, while structural and vegetation modifications combined can reduce it by up to 75 percent. This level of resolution can move the needle on the availability of wildfire insurance as it enables development of robust rating algorithms to differentiate specific locations — and means that entire neighborhoods don’t have to be non-renewed.”

While acknowledging that modeling mitigation measures at a 5-foot resolution requires an immense granularity of data, RMS has demonstrated that its wildfire model is responsive to data at that level. “The native resolution of our model is 50-meter cells, which is a considerable enhancement on the zip-code-level underwriting grids employed by some insurers. That cell size in a typical suburban neighborhood encompasses approximately three to five buildings.
By providing the model environment that can utilize information within the 5-to-30-foot range, we are enabling our clients to achieve the level of data fidelity to differentiate risks at that property level. That really is a potential market game changer.”

Evolving Insurance Pricing

It is not hyperbolic to suggest that being able to combine high-definition modeling with high-resolution data can be market changing. The evolution of risk-based pricing in New Zealand is a case in point. The series of catastrophic earthquakes in the Christchurch region of New Zealand in 2010 and 2011 provided a stark demonstration of how insufficient data meant that the insurance market was blindsided by the scale of liquefaction-related losses from those events.

“The earthquakes showed that the market needed to get a lot smarter in how it approached earthquake risk,” says Michael Drayton, consultant at RMS, “and invest much more in understanding how individual building characteristics and location data influenced exposure performance, particularly in relation to liquefaction.

“To get to grips with this component of the earthquake peril, you need location-level data,” he continues. “To understand what triggers liquefaction, you must analyze the soil profile, which is far from homogenous. Christchurch, for example, sits on an alluvial plain, which means there are multiple complex layers of silt, gravel and sand that can vary significantly from one location to the next. In fact, across a large commercial or industrial complex, the soil structure can change significantly from one side of the building footprint to the other.”

Extensive building damage in downtown Christchurch, New Zealand, after the 2011 earthquake

The aftermath of the earthquake series saw a surge in soil data as teams of geotech engineers conducted painstaking analysis of layer composition. With data from multiple events to draw on, it was possible to assess which areas suffered soil liquefaction and at which specific ground-shaking intensity.

“Updating our model with this detailed location information brought about a step change in assessing liquefaction exposures. Previously, insurers could only assess average liquefaction exposure levels, which was of little use where you have highly concentrated risks in specific areas. Through our RMS® New Zealand Earthquake HD Model, which incorporates 100-meter grid resolution and the application of detailed ground data, it is now possible to assess liquefaction exposure potential at a much more localized level.”

This development represents a notable market shift from community-based to risk-based pricing in New Zealand. With insurers able to differentiate risks at the location level, companies such as Tower Insurance have been able to more accurately adjust premium levels to reflect the risk to an individual property or area. In its annual report in November 2019, Tower stated: “Tower led the way 18 months ago with risk-based pricing and removing cross-subsidization between low- and high-risk customers. Risk-based pricing has resulted in the growth of Tower’s portfolio in Auckland while also reducing exposure to high-risk areas by 16 percent.
Tower’s fairer approach to pricing has also allowed the company to grow exposure by 4 percent in the larger, low-risk areas like Auckland, Hamilton, and Taranaki.” Creating the Right Ecosystem The RMS commitment to enable companies to put high-resolution data to both underwriting and portfolio management use goes beyond the development of HD Models™ and the integration of multiple layers of location-level data. Through the launch of RMS Risk Intelligence™, its modular, unified risk analytics platform, and the Risk Modeler™ application, which enables users to access, evaluate, compare and deploy all RMS models, the company has created an ecosystem built to support these next-generation data capabilities. Deployed within the Cloud, the ecosystem thrives on the computational power that this provides, enabling proprietary and tertiary data analytics to rapidly produce high-resolution risk insights. A network of applications — including the ExposureIQ™ and SiteIQ™ applications and Location Intelligence API — support enhanced access to data and provide a more modular framework to deliver that data in a much more customized way. “Because we are maintaining this ecosystem in the Cloud,” explains Michael Young, “when a model update is released, we can instantly stand that model side-by-side with the previous version. As more data becomes available each season, we can upload that new information much faster into our model environment, which means our clients can capitalize on and apply that new insight straightaway.” Michael Drayton adds: “We’re also offering access to our capabilities in a much more modular fashion, which means that individual teams can access the specific applications they need, while all operating in a data-consistent environment. And the fact that this can all be driven through APIs means that we are opening up many new lines of thought around how clients can use location data.” Exploring What Is Possible There is no doubt that the market is on the cusp of a new era of data resolution — capturing detailed hazard and exposure and using the power of analytics to quantify the risk and risk differentiation. Mohsen Rahnama believes the potential is huge. “I foresee a point in the future where virtually every building will essentially have its own social-security-like number,” he believes, “that enables you to access key data points for that particular property and the surrounding location. It will effectively be a risk score, including data on building characteristics, proximity to fault lines, level of elevation, previous loss history, etc. Armed with that information — and superimposing other data sources such as hazard data, geological data and vegetation data — a company will be able to systematically price risk and assess exposure levels for every asset up to the portfolio level.” “The only way we can truly assess this rapidly changing risk is by being able to systematically evaluate exposure based on high-resolution data and advanced modeling techniques that incorporate building resilience and mitigation measures” — Mohsen Rahnama, RMS Bringing the focus back to the here and now, he adds, the expanding impacts of climate change are making the need for this data transformation a market imperative. “If you look at how many properties around the globe are located just one meter above sea level, we are talking about trillions of dollars of exposure. 
The only way we can truly assess this rapidly changing risk is by being able to systematically evaluate exposure based on high-resolution data and advanced modeling techniques that incorporate building resilience and mitigation measures. How will our exposure landscape look in 2050? The only way we will know is by applying that data resolution underpinned by the latest model science to quantify this evolving risk.”
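Returning to the mitigation figures quoted earlier in this piece (structural modifications reducing wildfire risk by up to 35 percent, structural plus vegetation modifications by up to 75 percent), the hedged sketch below shows one way such factors might be applied to a location-level expected loss. The baseline AAL and the treatment of the published percentages as direct multipliers are assumptions made purely for illustration.

```python
# Illustrative application of the published mitigation reductions, assumed to
# act as simple multipliers on a location's wildfire AAL (not an RMS method).
baseline_aal = 2_400.0   # hypothetical annual average loss for one home, USD

reductions = {
    "none": 0.00,
    "structural_only": 0.35,              # up to 35% reduction
    "structural_plus_vegetation": 0.75,   # up to 75% reduction
}

for scenario, r in reductions.items():
    print(f"{scenario:28s} AAL = ${baseline_aal * (1 - r):,.0f}")
# none                         AAL = $2,400
# structural_only              AAL = $1,560
# structural_plus_vegetation   AAL = $600
```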

Helen Yates
September 06, 2019
Severe Convective Storms: A New Peak Peril?

Severe convective storms (SCS) have driven U.S. insured catastrophe losses in recent years with both attritional and major single-event claims now rivaling an average hurricane season. EXPOSURE looks at why SCS losses are rising and asks how (re)insurers should be responding At the time of writing, 2019 was already shaping up to be another active season for U.S. severe convective storms (SCS), with at least eight tornadoes daily over a period of 12 consecutive days in May. It was the most May tornadoes since 2015, with no fewer than seven outbreaks of SCS across central and eastern parts of the U.S. According to data from the National Oceanic and Atmospheric Administration (NOAA), there were 555 preliminary tornado reports, more than double the average of 276 for the month in the period of 1991-2010. According to the current numbers, May 2019 produced the second-highest number of reported tornadoes for any month on record after April 2011, which broke multiple records in relation to SCS and tornado touchdowns. It continues a trend set over the past two decades, which has seen SCS losses increasing significantly and steadily. In 2018, losses amounted to US$18.8 billion, of which US$14.1 billion was insured. This compares to insurance losses of US$15.6 billion for hurricane losses in the same period. While losses from SCS are often the buildup of losses from multiple events, there are examples of single events costing insurers and reinsurers over US$3 billion in claims. This includes the costliest SCS to date, which hit Tuscaloosa, Alabama, in April 2011, involving several tornado touchdowns and causing US$7.9 billion in insured damage. The second-most-costly SCS occurred in May of the same year, striking Joplin, Missouri, and other locations, resulting in insured losses of nearly US$7.6 billion. “The trend in the scientific discussion is that there might be fewer but more-severe events” Juergen Grieser RMS According to RMS models, average losses from SCS now exceed US$15 billion annually and are in the same range as hurricane average annual loss (AAL), which is also backed up by independently published scientific research. “The losses in 2011 and 2012 were real eye-openers,” says Rajkiran Vojjala, vice president of modeling at RMS. “SCS is no longer a peril with events that cost a few hundred million dollars. You could have cat losses of US$10 billion in today’s money if there were events similar to those in April 2011.”  Nearly a third of all average annual reported tornadoes occur in the states of Texas, Oklahoma, Kansas and Nebraska, all states that are within the “Tornado Alley.” This is where cold, dry polar air meets warm, moist air moving up from the Gulf of Mexico, causing strong convective activity. “A typical SCS swath affects many states. So the extent is large, unlike, say, wildfire, which is truly localized to a small particular region,” says Vojjala. Research suggests the annual number of Enhanced Fujita (EF) scale EF2 and stronger tornadoes hitting the U.S. has trended upward over the past 20 years; however, there is some doubt over whether this is a real meteorological trend. One explanation could be that increased observational practices simply mean that such weather phenomena are more likely to be recorded, particularly in less populated regions.  According to Juergen Grieser, senior director of modeling at RMS, there is a debate whether part of the increase in claims relating to SCS could be attributed to climate change. 
“A warmer climate means a weaker jet stream, which should lead to less organized convection while the energy of convection might increase,” he says. “The trend in the scientific discussion is that there might be fewer but more-severe events.” Claims severity rather than claims frequency is a more significant driver of losses relating to hail events, he adds. “We have an increase in hail losses of about 11 percent per year over the last 15 years, which is quite a lot. But 7.5 percent of that is from an increase in the cost of individual claims,” explains Grieser. “So, while the claims frequency has also increased in this period, the individual claim is more expensive now than it was ever before.”  Claims go ‘Through the Roof’ Another big driver of loss is likely to be aging roofs and the increasing exposure at risk of SCS. The contribution of roof age was explored in a blog last year by Stephen Cusack, director of model development at RMS. He noted that one of the biggest changes in residential exposure to SCS over the past two decades has been the rise in the median age of housing from 30 years in 2001 to 37 years in 2013. A changing insurance industry climate is also a driver for increased losses, thinks Vojjala. “There has been a change in public perception on claiming whereby even cosmetic damage to roofs is now being claimed and contractors are chasing hailstorms to see what damage might have been caused,” he says. “So, there is more awareness and that has led to higher losses. “The insurance products for hail and tornado have grown and so those perils are being insured more, and there are different types of coverage,” he notes. “Most insurers now offer not replacement cost but only the actual value of the roofs to alleviate some of the rising cost of claims. On the flip side, if they do continue offering full replacement coverage and a hurricane hits in some of those areas, you now have better roofs.” How insurance companies approach the peril is changing as a result of rising claims. “Historically, insurance and reinsurance clients have viewed SCS as an attritional loss, but in the last five to 10 years the changing trends have altered that perception,” says Vojjala. “That’s where there is this need for high-resolution modeling, which increasingly our clients have been asking for to improve their exposure management practices. “With SCS also having catastrophic losses, it has stoked interest from the ILS community as well, who are also experimenting with parametric triggers for SCS,” he adds. “We usually see this on the earthquake or hurricane side, but increasingly we are seeing it with SCS as well.” 
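One way to read Grieser's hail figures above is as a multiplicative decomposition of the annual loss trend into severity and frequency components; the sketch below simply restates that arithmetic under that assumption and is not an RMS calculation.

```python
# Decomposing the ~11% annual growth in hail losses cited above into
# severity and frequency components (illustrative arithmetic only,
# assuming loss = frequency x severity).
total_growth    = 0.11    # annual growth in hail losses
severity_growth = 0.075   # annual growth in cost per individual claim

frequency_growth = (1 + total_growth) / (1 + severity_growth) - 1
print(f"Implied claims-frequency growth: {frequency_growth:.1%}")        # ~3.3%

# Compounded over 15 years, severity alone multiplies claim cost by:
print(f"15-year severity factor: {(1 + severity_growth) ** 15:.1f}x")    # ~3.0x
```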

Helen Yates
September 06, 2019
Ridgecrest: A Wake-Up Call

Marleen Nyst and Nilesh Shome of RMS explore some of the lessons and implications from the recent sequence of earthquakes in California On the morning of July 4, 2019, the small town of Ridgecrest in California’s Mojave Desert unexpectedly found itself at the center of a major news story after a magnitude 6.4 earthquake occurred close by. This earthquake later transpired to be a foreshock for a magnitude 7.1 earthquake the following day, the strongest earthquake to hit the state for 20 years. These events, part of a series of earthquakes and aftershocks that were felt by millions of people across the state, briefly reignited awareness of the threat posed by earthquakes in California. Fortunately, damage from the Ridgecrest earthquake sequence was relatively limited. With the event not causing a widespread social or economic impact, its passage through the news agenda was relatively swift.  But there are several reasons why an event such as the Ridgecrest earthquake sequence should be a focus of attention both for the insurance industry and the residents and local authorities in California.  “If Ridgecrest had happened in a more densely populated area, this state would be facing a far different economic future than it is today” Glenn Pomeroy California Earthquake Authority “We don’t want to minimize the experiences of those whose homes or property were damaged or who were injured when these two powerful earthquakes struck, because for them these earthquakes will have a lasting impact, and they face some difficult days ahead,” explains Glenn Pomeroy, chief executive of the California Earthquake Authority. “However, if this series of earthquakes had happened in a more densely populated area or an area with thousands of very old, vulnerable homes, such as Los Angeles or the San Francisco Bay Area, this state would be facing a far different economic future than it is today — potentially a massive financial crisis,” Pomeroy says. Although one of the most populous U.S. states, California’s population is mostly concentrated in metropolitan areas. A major earthquake in one of these areas could have repercussions for both the domestic and international economy.  Low Probability, High Impact Earthquake is a low probability, high impact peril. In California, earthquake risk awareness is low, both within the general public and many (re)insurers. The peril has not caused a major insured loss for 25 years, the last being the magnitude 6.7 Northridge earthquake in 1994. California earthquake has the potential to cause large-scale insured and economic damage. A repeat of the Northridge event would likely cost the insurance industry today around US$30 billion, according to the latest version of the RMS® North America Earthquake Models, and Northridge is far from a worst-case scenario. From an insurance perspective, one of the most significant earthquake events on record would be the magnitude 9.0 Tōhoku Earthquake and Tsunami in 2011. For California, the 1906 magnitude 7.8 San Francisco earthquake, when Lloyd’s underwriter Cuthbert Heath famously instructed his San Franciscan agent to “pay all of our policyholders in full, irrespective of the terms of their policies”, remains historically significant. Heath’s actions led to a Lloyd’s payout of around US$50 million at the time and helped cement Lloyd’s reputation in the U.S. market. RMS models suggest a repeat of this event today could cost the insurance industry around US$50 billion. 
But the economic cost of such an event could be around six times the insurance bill — as much as US$300 billion — even before considering damage to infrastructure and government buildings, due to the surprisingly low penetration of earthquake insurance in the state. Events such as the 1906 earthquake and even Northridge are too far in the past to remain in public consciousness. And the lack of awareness of the peril’s damage potential is demonstrated by the low take-up of earthquake insurance in the state. “Because large, damaging earthquakes don’t happen very frequently, and we never know when they will happen, for many people it’s out of sight, out of mind. They simply think it won’t happen to them,” Pomeroy says. Across California, an average of just 12 percent to 14 percent of homeowners have earthquake insurance. Take-up varies across the state, with some high-risk regions, such as the San Francisco Bay Area, experiencing take-up below the state average. Take-up tends to be slightly higher in Southern California and is around 20 percent in Los Angeles and Orange counties.  Take-up will typically increase in the aftermath of an event as public awareness rises but will rapidly fall as the risk fades from memory. As with any low probability, high impact event, there is a danger the public will not be well prepared when a major event strikes.  The insurance industry can take steps to address this challenge, particularly through working to increase awareness of earthquake risk and actively promoting the importance of having insurance coverage for faster recovery. RMS and its insurance partners have also been working to improve society’s resilience against risks such as earthquake, through initiatives such as the 100 Resilient Cities program. Understanding the Risk While the tools to model and understand earthquake risk are improving all the time, there remain several unknowns which underwriters should be aware of. One of the reasons the Ridgecrest Earthquake came as such a surprise was that the fault on which it occurred was not one that seismologists knew existed.  Several other recent earthquakes — such as the 2014 Napa event, the Landers and Big Bear Earthquakes in 1992, and the Loma Prieta Earthquake in 1989 — took place on previously unknown or thought to be inactive faults or fault strands. As well as not having a full picture of where the faults may lie, scientific understanding of how multifaults can link together to form a larger event is also changing.  Events such as the Kaikoura Earthquake in New Zealand in 2016 and the Baja California Earthquake in Mexico in 2010 have helped inform new scientific thinking that faults can link together causing more damaging, larger magnitude earthquakes. The RMS North America Earthquake Models have also evolved to factor in this thinking and have captured multifault ruptures in the model based on the latest research results. In addition, studying the interaction between the faults that ruptured in the Ridgecrest events will allow RMS to improve the fault connectivity in the models.  A further learning from New Zealand came via the 2011 Christchurch Earthquake, which demonstrated how liquefaction of soil can be a significant loss driver due to soil condition in certain areas. The San Francisco Bay Area, an important national and international economic hub, could suffer a similar impact in the event of a major earthquake. 
Across the area, there has been significant residential and commercial development on artificial landfill over the last 100 years, and these areas are prone to significant liquefaction damage, similar to what was observed in Christchurch.

Location, Location, Location

Clearly, the location of an earthquake is critical to the scale of damage and the insured and economic impact of an event. Ridgecrest is situated roughly 200 kilometers north of Los Angeles. Had the recent earthquake sequence occurred beneath Los Angeles instead, it is plausible that the insured cost could have been in excess of US$100 billion.

The Puente Hills Fault, which sits underneath downtown Los Angeles, wasn’t discovered until around the turn of the century. A magnitude 6.8 Puente Hills event could cause an insured loss of US$78.6 billion, and a Newport-Inglewood magnitude 7.3 would cost an estimated US$77.1 billion, according to RMS modeling. These are just two examples from the RMS stochastic event set with magnitudes similar to the Ridgecrest events that could have a significant social, economic and insured loss impact if they took place elsewhere in the state.

The RMS model estimates that magnitude 7 earthquakes in California could cause insurance industry losses ranging from US$20,000 to US$20 billion, but the maximum loss could be over US$100 billion if one occurred in a high-population center such as Los Angeles. The losses from the Ridgecrest events were at the low end of that range because they occurred in a sparsely populated area. For the California Earthquake Authority’s portfolio in Los Angeles County, a loss event of US$10 billion or greater can be expected approximately every 30 years.

As with any major catastrophe, several factors can drive up the insured loss bill, including post-event loss amplification and contingent business interruption, given the potential scale of disruption. In Sacramento, there is also a risk of failure of the levee system. Fire following earthquake was a significant cause of damage after the 1906 San Francisco Earthquake and was estimated to account for around 40 percent of the overall loss from that event. It is, however, expected that fire would make a much smaller contribution to future events, given modern construction materials and methods and fire suppression systems.

Political pressure to settle claims could also drive up the loss total from an event. Lawmakers could put pressure on the CEA and other insurers to settle claims quickly, as has been the case in the aftermath of other catastrophes, such as Hurricane Sandy. The California Earthquake Authority has recommended that homes built prior to 1980 be seismically retrofitted to make them less vulnerable to earthquake damage.

“We all need to learn the lesson of Ridgecrest: California needs to be better prepared for the next big earthquake because it’s sure to come,” Pomeroy says. “We recommend people consider earthquake insurance to protect themselves financially,” he continues. “The government’s not going to come in and rebuild everybody’s home, and a regular residential insurance policy does not cover earthquake damage. The only way to be covered for earthquake damage is to have an additional earthquake insurance policy in place.

“Close to 90 percent of the state does not have an earthquake insurance policy in place. Let this be the wake-up call that we all need to get prepared.”
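As a simple, hedged illustration of how a figure like the one quoted above for the CEA portfolio ("a US$10 billion or greater loss approximately every 30 years") translates into the exceedance-probability terms modelers work in, the conversion below assumes Poisson event arrivals and is not drawn from the RMS model itself.

```python
import math

# Converting a stated return period into an annual exceedance probability,
# assuming Poisson arrivals (illustrative only).
return_period_years = 30          # "US$10 billion or greater ... every 30 years"
annual_rate = 1.0 / return_period_years

annual_exceedance_prob = 1 - math.exp(-annual_rate)
print(f"Annual probability of a $10B+ loss: {annual_exceedance_prob:.1%}")   # ~3.3%

# Probability of at least one such loss over a 10-year horizon:
print(f"Over 10 years: {1 - math.exp(-annual_rate * 10):.0%}")               # ~28%
```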

Helen Yates
September 06, 2019
Shaking Up Workers' Compensation

Are (re)insurers sufficiently capitalized to withstand a major earthquake in a metropolitan area during peak hours?

The U.S. workers’ compensation insurance market continues to generate underwriting profit. According to Fitch Ratings, 2019 is on track to mark the fifth consecutive year of profits, following a statutory combined ratio of 86 percent in 2018. Since 2015, the market has achieved an annual average combined ratio of 93 percent.

The market’s size has increased considerably since the 2008 financial crisis sparked a flurry of activity in the workers’ compensation arena. Over the last 10 years, written premiums have risen 50 percent from approximately US$40 billion to almost US$60 billion, aided by low unemployment and growth in rates and wages.

Yet market conditions are changing. The pricing environment is deteriorating, prior-year reserve releases are slowing and severity is ticking upward. And while loss reserves currently top US$150 billion, questions remain over whether these are sufficient to bear the brunt of a major earthquake in a highly populated area.

The Big One

California represents over 20 percent of the U.S. workers’ compensation market. The Workers’ Compensation Insurance Rating Bureau of California (WCIRB) forecasts a written premium pot of US$15.7 billion for 2019, a slight decline on 2018’s US$17 billion figure.

“So, the workers’ compensation sector’s largest premium is concentrated in the area of the U.S. most exposed to earthquake risk,” explains Nilesh Shome, vice president at RMS. “This problem is unique to the U.S., since in most other countries occupational injury is covered by government insurance schemes instead of the private market. Further, workers’ compensation policies have no limits, so they can be severely impacted by a large earthquake.”

Workers’ compensation insurers enjoy relatively healthy balance sheets, with adequate profitability and conservative premium-to-surplus ratios. But when you assess the industry’s exposure to large earthquakes in more detail, the surplus base starts to look a little smaller.

“We are also talking about a marketplace untested in modern times,” he continues. “The 1994 Northridge Earthquake in Los Angeles, for example, while causing major loss, occurred at 4:30 a.m. when most people were still in bed, so it had limited impact from a workers’ compensation perspective.”

Analyzing the Numbers

Working with the WCIRB, RMS modeled earthquake scenarios using Version 17 of the RMS® North America Earthquake Casualty Model, which incorporates the latest science in earthquake hazard and vulnerability research. The portfolio provided by the WCIRB contained exposure information for 11 million full-time-equivalent employees, including occupation details for each.

The analysis showed that the average annual estimated insured loss is US$29 million, which corresponds to 0.5 cents per US$100 of payroll and US$2.50 per employee. The 1-in-100-year insured loss is expected to exceed US$300 million, with around 5,000 casualties including 300 fatalities; at peak work-time hours, the loss could rise to US$1.5 billion. For a 1-in-250-year event, the loss could top US$1.4 billion with more than 1,000 fatalities, rising to US$5 billion at peak work-time hours.

For a repeat of the magnitude 7.8 San Francisco Earthquake of 1906, which struck at 5:12 a.m., the figures would be 7,300 injuries, 1,900 fatalities and around US$1 billion in loss. At peak work hours, this would rise to 22,000 casualties, 5,800 fatalities and a US$3 billion loss.
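The per-employee and per-payroll figures quoted above follow directly from the modeled AAL and the exposure in the WCIRB portfolio; the sketch below reproduces that arithmetic. The implied payroll is backed out from the published rate, so treat it as an approximation rather than a reported figure.

```python
# Reproducing the per-employee and per-payroll arithmetic cited above
# (illustrative; the implied payroll is inferred, not a published figure).
aal_usd       = 29_000_000      # modeled average annual insured loss
employees_fte = 11_000_000      # full-time-equivalent employees in the portfolio

print(f"Loss per employee: ${aal_usd / employees_fte:.2f}")               # ~$2.64, quoted as ~$2.50

rate_per_100_payroll = 0.005    # 0.5 cents per US$100 of payroll
implied_payroll = aal_usd / (rate_per_100_payroll / 100)
print(f"Implied total payroll: ${implied_payroll / 1e9:.0f} billion")     # ~$580 billion
print(f"Implied average wage:  ${implied_payroll / employees_fte:,.0f}")  # ~$52,700
```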
To help reduce the impact of major earthquakes, RMS is working with the Berkeley Research Lab and the United States Geological Survey (USGS) to research the benefits of an earthquake early warning system (EEWS) and safety measures such as drop-cover-hold and evacuating buildings after an EEWS alarm. Initial studies indicate that an EEWS alert for large, faraway earthquakes, such as a recurrence of the 1857 magnitude 7.9 Fort Tejon Earthquake near Los Angeles, could reduce injuries by 20 to 50 percent.

Shome concludes: “It is well known in the industry that the workers’ compensation loss distribution has a long tail, and at conferences RMS has demonstrated how our modeling best captures this tail. The model considers many low-probability, high-consequence events by accurately modeling the latest USGS findings.”

Nigel Allen
link
September 06, 2019
What a Difference

As the insurance industry’s Dive In Festival continues to gather momentum, EXPOSURE examines the factors influencing the speed at which the diversity and inclusion dial is moving.

September 2019 marks the fifth Dive In Festival, a global movement in the insurance sector to support the development of inclusive workplace cultures. An industry phenomenon, it has ballooned from a London-only initiative attracting 1,700 people in 2015 to an international event spanning 27 countries and reaching over 9,000 people in 2018.

That the event has gathered such momentum clearly demonstrates a market that is moving forward. There is now an industrywide acknowledgement of the need to better reflect the diversity of the customer base within the industry’s professional ranks.

The Starting Point

As Pauline Miller, head of talent development and inclusion (D&I) at Lloyd’s, explains, the insurance industry is a market that has, in the past, been slow to change its practitioner profile. “If you look at Lloyd’s, for example, for nearly three hundred years it was a men-only environment, with women only admitted as members in December 1969.

“You also have to recognize that the insurance industry is not as far along the diversity and inclusion journey compared to other sectors,” she continues. “I previously worked in the banking industry, and diversity and inclusion had been an agenda issue in the organization for a number of years. So, we must acknowledge that this is a journey that will require multiple more steps before we really begin breaking down barriers.”

However, she is confident the insurance industry can quickly make up ground. “By its very nature, the insurance market lends itself to the spread of the D&I initiative,” Miller believes. “We are a relationship-based business that thrives on direct contact, and our day-to-day activities are based upon collaboration. We must leverage this to help speed up the creation of a more diverse and inclusive environment.”

The positive effects of collaboration are already evident in how this is evolving. Diversity and inclusion initiatives within other financial sectors have tended to be confined to individual organizations, with few generating the level of industrywide engagement that Dive In, a weeklong focus on diversity and inclusion, has achieved within the insurance sector.

However, as Danny Fisher, global HR business partner and EMEA HR manager at RMS, points out, for the drive to gain real traction there must be marketwide consensus on the direction it is moving in. “There is always a risk,” he says, “that any complex initiative that begins with such positive intent can become derailed if there is not an understanding of a common vision from the start, and the benefits it will deliver.

“There also needs to be better understanding and acknowledgement of the multitude of factors that may have contributed to the uniformity we see across the insurance sector. We have to establish why this has happened and address the flaws in our industry contributing to it.”

It can be argued that the insurance industry is still composed of a relatively homogeneous group of people.
Whether in terms of gender, ethnicity, sexual orientation, cultural or social background, or physical and mental ability, the industry recognizes the need to improve.

Diversity is the range of human differences, including but not limited to race, ethnicity, gender, gender identity, sexual orientation, age, social class, physical ability or attributes, religious or ethical values system, national origin, and political beliefs.

“As a market,” Miller agrees, “there is a tendency to hire people similar to the person who is recruiting. Whether that’s someone of the same gender, ethnicity, sexual orientation or from the same university or social background.”

“You can end up with a very uniform workforce,” adds Fisher, “where people look the same and have a similar view of the world, which can foster ‘groupthink’ and is prone to bias and questionable conclusions. People approach problems and solutions in the same way, with no one looking at an alternative — an alternative that is often greatly needed. So, a key part of the diversity push is the need to generate greater diversity of thought.”

The challenge is also introducing that talent in an inclusive way that promotes the effective development of new solutions to existing and future problems. That broad palette of talent can only be created by attracting and retaining the best and brightest from across the social spectrum, within a framework in which that blend of skills, perspectives and opinions can thrive.

“Diversity is not simply about the number of women, ethnicities, people with disabilities or people from disadvantaged backgrounds that you hire,” believes Miller. “It’s about bringing together the most creative group of people that represent different ways of thinking that have evolved out of the multiple factors that make them different.”

Moving the Dial

There is clearly a desire to make this happen and strong evidence that the industry is moving together. Top-level support for D&I initiatives, coupled with the rapid growth of industrywide networks representing different demographics, is helping firm up the foundations of a more diverse and inclusive marketplace. But what other developments are needed to move the dial further?

“We have to recognize that there is no ‘one-size-fits-all’ to this challenge,” says Miller. “Policies and strategies must be designed to create an environment in which diversity and inclusion can thrive, but fundamentally they must reflect the unique dynamics of your own organization.

“We also must ensure we are promoting the benefits of a career in insurance in a more powerful and enticing way and to a broader audience,” she adds. “We operate in a fantastic industry, but we don’t sell it enough. And when we do get that diversity of talent through the door, we have to offer a workplace that sticks, so they don’t simply walk straight back out again.

“For example, someone from a disadvantaged community coming through an intern program may never have worked in an office environment before, and when they look around are they going to see people like themselves that they can relate to? What role models can they connect with? Are we prepared for that?”

For Fisher, steps can also be taken to change processes and modernize thinking and habits. “We have to be training managers in interview and evaluation techniques and discipline to keep unconscious bias in check. There has to be consistency with meaningful tests to ensure data-driven hiring decisions.
“At RMS, we are fortunate to attract talent from around the world and are able to facilitate bringing them on board to add further variety in solving for complex problems. A successful approach for us, for example, has been accessing talent early, often prior to their professional career.”

There is, of course, the risk that the push for greater diversity leads to a quota-based approach.

“Nobody wants this to become a tick-box exercise,” believes Miller, “and equally nobody wants to be hired simply because they represent a particular demographic. But if we are expecting change, we do need measurements in place to show how we are moving the dial forward. That may mean introducing realistic targets within realistic timeframes that are monitored carefully to ensure we are on track.

“Ultimately,” she concludes, “what we are all working to do is to create the best environment for the broadest spectrum of people to come into what is a truly amazing marketplace. And when they do, offering a workplace that enables them to thrive and enjoy very successful careers that contribute to the advancement of our industry. That’s what we all have to be working toward.”
