When I was a kid, my favorite breakfast cereal was Kellogg’s Sugar Frosted Flakes. As a teenager in the 1980s, I recall that the name changed to Frosted Flakes. In 1983, to appeal to a more health-conscious consumer, Kellogg simply dropped “Sugar” from the name. Around the same time, Kellogg’s Sugar Smacks became Honey Smacks. There didn’t seem to be a dramatic reduction in sugar: even today, sugar makes up over 55 percent of the total content of Honey Smacks and is the lead ingredient, while honey trails at fourth. The idea was that if consumers didn’t see the word “sugar,” they wouldn’t necessarily conclude that the cereal was loaded with it. One could argue that this was just a marketing ploy – yet most would agree it secured marketing appeal by removing a potential distraction from the name.
With the arrival and full realization of the Information Age, data really has become the new currency, and whether we are individuals or businesses, we are all getting wise to the fact that data has significant intrinsic value. We are witnessing an exponential increase in the volume of data generated by everything being measured, monitored, observed and recorded all around us, and that pace shows no signs of slowing anytime soon. Every day we add to this data mountain, creating new data – whether we are aware of it or not. We give some data explicitly – but more often we now generate data implicitly, simply through our interactions with people, devices, and even locations.
When I was still a teenager – summer brave, full of sport, hot and bold – I hitchhiked from Lithuania to Armenia and back again. Outbound via the former Soviet Union and the Caucasus; home via Turkey and the Balkans.
Time rich and cash poor, I took risks I wouldn’t today. All the same, my gambles paid off and I look back on that adventure fondly.
The journey was filled with comparisons and contrasts. Some things, like being invited in basic Russian to squeeze into a crammed Lada Riva, remained almost constant from country to country. Others, like the landscapes and local delicacies, evolved with every new ride.
When I found myself back in Istanbul last month for the first time since my hitchhiking days, I was again struck by these contrasts. Here I was, a guest of the United Nations, discussing disaster risk reduction financing with the finance ministers of those countries through which I’d once hitchhiked. And here I was, marveling afresh at the cultural, political, economic and geographical diversity of a vast region which yet shares so much.
At the time of writing, the recent Camp and Woolsey Fires in California have burned a combined total of 245,000 acres (~99,000 hectares) — an area about the size of Dallas. These fires have destroyed more than 12,000 homes and businesses, and killed 80 civilians. Ordinarily these would be called extreme events. But these are not ordinary times. After back-to-back record-breaking wildfire seasons, including the Wine Country fires (US$11 billion) and Southern California Fires (US$2.3 billion) in 2017, and the Carr Fire (~US$1.2 billion) and Mendocino Complex fires (~US$200 million) this year in July, California Governor Jerry Brown perfectly summed up the current situation in his state: “This is the new abnormal.”
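As a quick back-of-the-envelope check, the combined burned acreage can be converted with standard factors. This is a minimal Python sketch; the Dallas comparison assumes a land area of roughly 383 square miles, which is an approximation used purely for scale.

```python
# Sanity-check the area figures quoted above using standard conversion factors.
HECTARES_PER_ACRE = 0.404686
ACRES_PER_SQ_MILE = 640

burned_acres = 245_000  # combined Camp + Woolsey footprint at the time of writing

hectares = burned_acres * HECTARES_PER_ACRE
sq_miles = burned_acres / ACRES_PER_SQ_MILE

print(f"{hectares:,.0f} hectares")      # ~99,000 ha
print(f"{sq_miles:,.1f} square miles")  # ~383 sq mi, roughly the land area of Dallas
```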
As firefighters make continuing progress on containment of both fires, the California Department of Forestry and Fire Protection (CAL FIRE) is quickly assembling an inventory of each burned structure to detail the extent of the damage. Based on this data, plus a simulated reconstruction of the event’s wind, moisture, fuel, and fire spread parameters, RMS estimates the insured damage at between US$7.5 billion and US$10 billion for the Camp Fire, and between US$1.5 billion and US$3 billion for the Woolsey Fire. This estimate accounts for burn and smoke damage; structure, contents, business interruption (BI), and additional living expenses (ALE) payouts; damage to autos; and modest post-loss amplification (PLA) that may result from surges in labor costs, ordinance and law endorsements, and related coverage extensions.
Like many communities in California with a mild climate, affordable housing, and scenic wilderness, Butte County (pop. ~230,000) has grown significantly over the past four decades. Broadly, this growth is happening all around the county — both in cities (e.g. Chico, the county seat and largest city, pop. ~94,000) and in more rural areas. Looking more closely, however, the specific spatial patterns of Butte’s development reveal conditions that set the stage for the ongoing Camp Fire to become one of the deadliest and most destructive fires in California history.
Chris Folkman, senior director of product management at RMS, was interviewed by Paula Newton on CNN’s Quest Means Business program on Monday, November 12, about the impact of the California wildfires.
Paula asked Chris about the range of factors that have made these wildfires so intense, and also about the potential causes of the fires. Chris explained how the fires could have started and how near-perfect conditions produced such a rapid spread. For the Camp Fire in Northern California, many of the deaths were caused by the fire’s sheer speed, which overwhelmed residents as they tried to escape the path of the flames.
Almost one and a half million people have died in natural disasters over the past 20 years. This is a waste of life; a waste of potential.
Natural disasters also have a massive economic impact. Our models suggest natural catastrophes cost the world’s poorest countries almost US$30 billion a year on average. Hard-won development gains are regularly wiped out — and it is the poor and the vulnerable who are most impacted.
In case anyone had forgotten the crippling impacts of natural disasters, 2017 served a painful reminder. Hurricanes Irma and Maria left vulnerable people in the Caribbean devastated. Somalia, Ethiopia and Kenya struggled with drought. Floods and landslides wrecked lives and livelihoods in Sri Lanka and Bangladesh. And then there was Hurricane Harvey which, along with the California wildfires, made 2017 the costliest year for natural disasters on record in the United States.
Whenever and wherever catastrophe strikes, our thoughts are with those so profoundly affected.
We did not, however, need last summer’s tropical cyclones to understand that something is not working. We did not need Irma and Maria to learn that investments in resilience reduce losses from natural disasters. And we did not need the events of 2017 to know that incentives are too often insufficient to drive action in the most vulnerable regions.
Describing the scale and savagery of the wildfires currently burning in California is difficult, but a simple recounting of the statistics is a good starting point:
At the time of writing, fifteen wildfires are burning across more than 280,000 acres (~113,000 hectares) in California. Collectively, they have laid waste to almost 7,000 homes and businesses. Thirty-one people have died in the fires, and 300,000 more have been evacuated. Some 12,000 firefighters are working the front lines, making admirable progress on containment.
The biggest of these events, the Camp Fire (named for the road where it originated), is the most destructive wildfire in California history, with 6,700 structures burned. During a period of particularly intense wind, it spread at a rate of more than one football field per second. Entire towns in its path have effectively been destroyed.
Catastrophe modeling remains a work in progress. With each upgrade we aim to build a better model, employing expanded data sets for hazard calibration, longer simulation runs, more detailed exposure data, and higher-resolution digital terrain models (DTMs).
Yet the principal way that the catastrophe model “learns” still comes from the experience of actual disasters. What elements, or impacts, were previously not fully appreciated? What loss pattern is new? How do actual claims relate to the severity of the hazard, or change with time through shifts in the claiming process?
After a particularly catastrophic season we give presentations around “the lessons from last year’s catastrophes.” We should make it a practice, a few years later, to recount how those lessons were implemented in the models.
Images of total devastation from Typhoon Haiyan shocked the global community in 2013, and Haiyan still haunts the Philippines five years on. At 4:40 a.m. local time on Friday, November 8, 2013, the city of Guiuan (pop. ~52,000) in Eastern Samar, in the Eastern Visayas, Philippines, first experienced the full force of Typhoon Haiyan (Super Typhoon Yolanda) as it made landfall. The city’s mayor declared “100 percent damage.” A community found itself homeless as 10,008 structures in Guiuan were destroyed and 1,601 were partially damaged. The Joint Typhoon Warning Center (JTWC) estimated Haiyan’s one-minute sustained winds at 315 kilometers per hour (195 miles per hour) at landfall, and at the time, this unofficially made Haiyan the strongest tropical cyclone ever observed based on wind speed.
Haiyan was a story of prolific intensification, starting life as an area of low pressure some 3,200 kilometers (2,000 miles) east-southeast of its landfall point just six days earlier. Warmed by the Pacific, Haiyan became a tropical depression on November 3, a tropical storm on November 4, and a typhoon by November 5. Four days into monitoring, by November 6, the JTWC assessed Haiyan as the equivalent of a Category 5 on the Saffir-Simpson Hurricane Wind Scale (SSHWS). It continued to intensify before landfall.