Like many communities in California with a mild climate, affordable housing, and scenic wilderness, Butte County (pop. ~230,000) has grown significantly over the past four decades. Broadly, this growth is happening all around the county — both in cities (e.g. Chico, the county seat and largest city, pop. ~94,000) as well as in more rural areas. Looking more closely, however, the specific spatial patterns of Butte’s development reveal conditions that set the stage for the ongoing Camp Fire to become one of the deadliest and most destructive fires in California history.
Chris Folkman, senior director of product management at RMS, was interviewed by Paula Newton on CNN’s Quest Means Business program on Monday, November 12, about the impact of the California wildfires.
Paula asked Chris about the range of factors that made these wildfires so intense, as well as about the potential causes of the fires. Chris explained how the fires could have started and how near-perfect conditions produced such rapid spread. For the Camp Fire in Northern California, deaths were caused by the fire’s sheer speed, which overwhelmed residents as they tried to escape the path of the flames.
Almost one and a half million people have died in natural disasters over the past 20 years. This is a waste of life; a waste of potential.
Natural disasters also have a massive economic impact. Our models suggest natural catastrophes cost the world’s poorest countries almost US$30 billion a year on average. Hard-won development gains are regularly wiped out — and it is the poor and the vulnerable who are most impacted.
In case anyone had forgotten the crippling impacts of natural disasters, 2017 served a painful reminder. Hurricanes Irma and Maria left vulnerable people in the Caribbean devastated. Somalia, Ethiopia and Kenya struggled with drought. Floods and landslides wrecked lives and livelihoods in Sri Lanka and Bangladesh. And then there was Hurricane Harvey which, along with the California wildfires, made 2017 the costliest year for natural disasters on record in the United States.
Whenever and wherever catastrophe strikes, our thoughts are with those so profoundly affected.
We did not, however, need last summer’s tropical cyclones to understand that something is not working. We did not need Irma and Maria to learn that investments in resilience reduce losses from natural disasters. And we did not need the events of 2017 to know that incentives are too often insufficient to drive action in the most vulnerable regions.
Describing the scale and savagery of the wildfires currently burning in California is difficult, but a simple recounting of the statistics is a good starting point:
At the time of writing, fifteen wildfires are burning more than 280,000 acres (~113,000 hectares) in California. Collectively, they have laid waste to almost 7,000 homes and businesses. Thirty-one people have died in the fires, and 300,000 more have been evacuated. Twelve thousand firefighters are working the front lines, making admirable progress at containment.
The biggest of these events, the Camp Fire (named for Camp Creek Road, near its point of origin), is the most destructive wildfire in California history, with 6,700 structures burned. During a period of particularly intense wind, it spread at a rate of more than one football field per second. Entire towns in its path have effectively been destroyed.
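That spread rate is easier to grasp with a back-of-envelope calculation. The snippet below is an illustrative sketch, not from the original article: it assumes a U.S. football field (including end zones) covers roughly 1.32 acres and converts the quoted rate into acres and hectares per hour.

```python
# Illustrative back-of-envelope conversion (assumed figures, not official data):
# a U.S. football field, including end zones, covers roughly 1.32 acres.
FIELD_ACRES = 1.32          # assumed area of one football field, in acres
ACRES_PER_HECTARE = 2.471   # standard unit conversion

acres_per_hour = FIELD_ACRES * 3600            # one field per second, for an hour
hectares_per_hour = acres_per_hour / ACRES_PER_HECTARE

print(f"{acres_per_hour:,.0f} acres/hour")        # 4,752 acres/hour
print(f"{hectares_per_hour:,.0f} hectares/hour")  # 1,923 hectares/hour
```

At that pace, sustained continuously, a fire would cover the roughly 280,000 acres cited above in under 60 hours.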
Catastrophe modeling remains work in progress. With each upgrade we aim to build a better model, employing expanded data sets for hazard calibration, longer simulation runs, more detailed exposure data, and higher resolution digital terrain models (DTMs).
Yet the principal way that the catastrophe model “learns” still comes from the experience of actual disasters. What elements, or impacts, were previously not fully appreciated? What loss pattern is new? How do actual claims relate to the severity of the hazard, or change with time through shifts in the claiming process?
After a particularly catastrophic season we give presentations on “the lessons from last year’s catastrophes.” We should make it a practice, a few years later, to recount how those lessons were implemented in the models.
Images of total devastation from Typhoon Haiyan shocked the global community in 2013, and Haiyan still haunts the Philippines five years on. At 4:40 a.m. local time on Friday, November 8, 2013, the city of Guiuan (pop. ~52,000) in Eastern Samar, in the Eastern Visayas, Philippines, first experienced the full force of Typhoon Haiyan (Super Typhoon Yolanda) as it made landfall. The city’s mayor declared “100 percent damage.” A community found itself homeless as 10,008 structures in Guiuan were destroyed and 1,601 were partially damaged. The Joint Typhoon Warning Center (JTWC) estimated Haiyan’s one-minute sustained winds at 315 kilometers per hour (195 miles per hour) at landfall, which at the time unofficially made Haiyan the strongest tropical cyclone ever observed based on wind speed.
Haiyan was a story of prolific intensification, starting life as an area of low pressure some 3,200 kilometers (2,000 miles) east-southeast of its eventual landfall point just six days earlier. Warmed by the Pacific, Haiyan became a tropical depression on November 3, a tropical storm on November 4, and a typhoon by November 5. Four days into monitoring, by November 6, the JTWC assessed Haiyan as the equivalent of a Category 5 on the Saffir-Simpson Hurricane Wind Scale (SSHWS). It continued to intensify up to landfall.
Today is World Tsunami Awareness Day, designated by the United Nations General Assembly. According to the United Nations Office for Disaster Risk Reduction (UNISDR), tsunami events have, on average, a higher mortality rate than any other hazard. Over the past 20 years (1998–2017), tsunamis have claimed more than 250,000 lives and account for US$280 billion of the US$661 billion in total recorded economic losses from earthquakes and tsunamis. In the preceding 20 years (1978–1997), tsunamis claimed 998 lives and caused US$2.7 billion in losses. Overall, tsunamis are rare, but as the UN points out, when they do occur they are deadly and hugely damaging. This infrequency makes building awareness and preparedness all the more challenging.
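The UNISDR figures quoted above imply both a large share of combined losses and a dramatic jump in mortality between the two 20-year periods. A quick, illustrative calculation using only the numbers in the paragraph makes this concrete:

```python
# Figures taken from the UNISDR statistics quoted above (USD).
tsunami_losses_1998_2017 = 280e9   # tsunami economic losses, 1998-2017
eq_tsunami_losses_total = 661e9    # combined earthquake + tsunami losses
tsunami_deaths_1998_2017 = 250_000
tsunami_deaths_1978_1997 = 998

share = tsunami_losses_1998_2017 / eq_tsunami_losses_total
print(f"Tsunami share of earthquake+tsunami losses: {share:.0%}")  # 42%

ratio = tsunami_deaths_1998_2017 / tsunami_deaths_1978_1997
print(f"Deaths vs. previous 20-year period: {ratio:.0f}x")  # 251x
```

In other words, tsunamis account for over 40 percent of recorded earthquake-and-tsunami losses in that period, and deaths were roughly 250 times higher than in the preceding 20 years.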
The UN has promoted World Tsunami Awareness Day since 2015, and the UN Secretary-General’s Special Representative for Disaster Risk Reduction, Mami Mizutori, stated that “…it is an occasion to promote greater understanding of tsunami risk to avoid future loss of life. This year we also want to bring attention to the economic losses tsunamis can inflict as a result of damage to critical infrastructure located along vulnerable, densely populated coastlines.”
With positive changes under way to improve both public and private carrier participation across the U.S. flood market, many are looking to seize the opportunity the U.S. flood market presents. Insurers, reinsurers, and the capital markets are all exploring it, which, in turn, has created a thirst for knowledge. I had the opportunity to see this first-hand when I was invited by Trading Risk magazine to take part in a panel discussion at the Trading Risk ILS: Reloaded and Resurgent event in New York last month. Sofia Geraghty from Trading Risk served as our moderator, and Joanna Syroka, Director of New Markets at Fermat Capital Management, and Ian Hanson, Vice President of Willis Re, were also on the panel.
One point that the audience wanted to understand was the level of demand to take on flood risk from an investor’s viewpoint, and also whether U.S. flood risk can be a portfolio diversifier. From the insurance-linked securities (ILS) side, Joanna confirmed the demand is there, but as with any peril, the ILS market needs to be able to clearly understand and define the risk to get comfortable enough to invest.
I had the privilege of joining Property Casualty 360 for a Facebook Live video discussion last week, together with my colleague Wallace Hogsett, client manager at RMS. Danielle Ling, associate editor at PC360, hosted the discussion, entitled “2018 Hurricane Season: Where Are We Now?”
We began by providing a perspective on the impacts of this season’s hurricanes. The two big hurricane events to impact the U.S. in 2018 (so far) have obviously been Hurricanes Florence and Michael, but each possessed very different characteristics. Florence maintained Category 4 status on the Saffir-Simpson Hurricane Wind Scale (SSHWS) for around a week, before wind shear tempered it to a Category 1 as it made landfall near Wrightsville Beach, North Carolina on September 14. While many areas were subject to significant wind gusts and storm surge, Florence was primarily a flood event, causing historic rainfall and inland flooding throughout the Carolinas.
On the other end of the scale, Wallace described how Michael was a classic intense hurricane — the most intense to make landfall in the U.S. since Andrew in 1992 — almost reaching Category 5 status upon its landfall at Mexico Beach on October 10. The scenes of structures reduced to their “slabs,” with just their foundations remaining, showed that this was primarily a wind and storm surge event. In total, damage stretched from the Florida Panhandle through the Southeast and the Carolinas.
This RMS article was previously published in Property Casualty 360
The effective use of data is so important to every insurance business — especially as big data and analytics are seen as a “silver bullet” for transformation. But to get on this transformative journey, your approach to data in your business may have to change. The traditional view of data focuses mainly on data collection and storage: how to collect, store, access and arrange the data, with rules and procedures to achieve this.
There is also a tendency to separate data from analytics. If you think of data analytics, the image may be of a hard-pressed team of analysts and IT specialists, working to tight deadlines, “mining the data” to deliver the core reports the business needs.
If any of the above rings true, you may need to change your mindset. First, consider data collection and storage: the cloud has revolutionized the way data is stored, accessed and managed, offering high capacity and high availability, typically on a pay-as-you-use basis. Historically, much of the investment in this area went into in-house infrastructure. With the cloud, that burden has lifted: businesses no longer need to become experts in data storage, or to plan, build and manage the data centers that were once seen as critical in-house infrastructure.