How Big Data Means “Bigger is Better” for Weather Safety & Resilience

As conversations about the application of Big Data come out of the 2016 World Economic Forum in Davos this week, I recall how the headline, “Is IBM Building the Most Powerful Weather Service the World Has Ever Seen?”, made perfect sense to me. The article described how IBM purchased The Weather Company’s digital and data assets in a deal inspired by opportunities in Big Data. The Wall Street Journal valued the transaction at more than $2 billion, and IBM described its rationale:

With this acquisition, IBM is going to harness one of the largest big data opportunities in the world — weather. Weather is probably the single largest swing factor in business performance — it impacts 1/3 of the world’s GDP and in the U.S. alone, weather is responsible for about half a trillion dollars in impact. Weather affects every aspect of the economy – energy usage, travel and transportation, new construction, agricultural yields, mall and restaurant traffic, etc.

This move is a profound illustration of the opportunity that Big Data presents in the realm of disaster resilience, a topic we addressed in our paper, Understanding the Intersection of Resilience, Big Data, and the Internet of Things in a Changing Insurance Marketplace.

Truly, with $500 billion (yes, they say billion) in opportunity costs on the line, Big (weather) Data can not only inform, but revolutionize, industries and movements across the world.

In my last blog, I discussed the movement behind resilience metrics—an issue that can be categorized as a “Big Data” problem. Like every other sector, ours is experiencing an explosion of data, and one growing source of that data is the Internet of Things (IoT).

One IoT example in homes is the Nest Thermostat. The device not only indicates the temperature in your house, but it issues alerts if it detects significant temperature swings. The obvious advantage is that you learn of potential system problems right away. Consider the power of learning that your home temperature is dropping during winter weather conditions. That could provide invaluable lead time to prevent costly damage like frozen pipes, especially if you’re away from home when the alert arrives. The device also helps you keep up with routine maintenance by tracking air filter usage and reminding you when filters need to be changed.
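To make the idea concrete, here is a minimal sketch of the kind of threshold rule such a device might apply. The threshold value and function names are illustrative assumptions, not Nest’s actual logic.

```python
# Illustrative only: a simplified temperature-drop alert rule, not Nest's actual logic.
FREEZE_WARNING_F = 45.0  # hypothetical threshold, set to leave lead time before pipes freeze

def check_temperature(reading_f, send_alert):
    """Send an alert if the indoor temperature falls below the warning threshold."""
    if reading_f < FREEZE_WARNING_F:
        send_alert(f"Indoor temperature is {reading_f:.0f}°F - risk of frozen pipes.")
        return True
    return False

# Example usage, with print standing in for a push notification or text message:
if __name__ == "__main__":
    check_temperature(41.0, send_alert=print)
```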

Smart home technology like Nest’s helps us understand the IoT and information (data) creation, but the question of how to harness and leverage all of this new information once we have it takes us back to the Big Data side of the equation.

Big Data issues are taking center stage in our movement as social science efforts explore the interrelation of Big Data and resilience at the United Nations’ Global Pulse, the Rockefeller Foundation’s Bellagio Center, the U.S. National Science Foundation and the Japan Science and Technology Agency, and the World Bank. These initiatives raise common issues around resilience and Big Data, including:

  • How can we protect privacy while still benefitting from the data?
  • How can the different, emerging resiliency efforts integrate into an understandable system?
  • Who are the decision-makers and data owners, and what are their rights?
  • Should insurers/reinsurers heighten engagement or simply design their own system?

It’s clear that tremendous value lies in the use of Big Data, as the IBM weather data deal described above illustrates. What we envision is applying Big Data about building features, managed through a philosophy of transparency, to benefit residents and communities alike. We want to capture and share the relevant building data that drives ultimate home performance during a disaster or over time, and I will cover this in a future blog.

IBM’s vision for Big (weather) Data is just the start. We all must get and stay involved to ensure we leverage Big Data, one of the most promising and powerful tools for creating a reliably strong, safe, and durable built environment. As we know, that is the most essential element of resilience.

If What Gets Measured Gets Done, How Can We Measure Resilience?

Many of us in the disaster-resilience movement have witnessed and participated in the creation of resilience measurement systems. Along the way, we’ve observed that one of the strongest aspects of our movement is also what makes measurement so difficult: diversity. There are so many puzzle pieces that must be considered—economic, physical, political, and social—just to name a few.

In October of this year, we released a paper entitled Understanding the Intersection of Resilience, Big Data, and the Internet of Things in the Changing Insurance Marketplace, in which we explored resilience measurement as well as issues of Big Data and the Internet of Things (IoT). The good news is that the landscape is rich with efforts to apply a measuring stick to resilience from the micro- up to the macro-level. The other news is that the issue is mind-boggling in its enormity.

So over the next couple of posts, I plan to return to an examination of these issues because Big Data and the IoT can transform the disaster-resilience movement, and the implications are almost unfathomable. These concepts and their intersection present opportunities that we must understand and plan for if we are to harvest the benefits, or at the very least prevent unintended consequences.

The first step is to develop uniform, consistent metrics to gauge progress toward resilient communities. We need verifiable, practical, and replicable tools. And we need to apply these tools to quality data.

At FLASH, we are concerned with community-wide resilience, but our work focuses more particularly on strengthening homes and safeguarding families from the ground up with strong building codes, beyond-code mitigation, and personal preparedness as part of a culture of resilience. These are, of course, foundational aspects of community resilience.

But here’s the challenge. Data on building characteristics and performance has historically not been granular enough to derive ongoing insights, except in certain post-disaster situations. Even then, we must obtain onsite analysis. That is why we need post-disaster forensic engineering efforts like FEMA’s Mitigation Assessment Teams to assess structural performance and evaluate failure patterns to inform better building codes and standards in the future.

But this is where Big Data and the quality data supplied by the IoT can make a difference. When IoT-generated data from new or emerging technologies like sensors can give us precise information on how a building performs, how will we leverage that information for better building practices to avoid future losses? Can we use the Big Data generated by the IoT to get ahead of the next disaster instead of learning after the fact? And what are the accepted metrics (if any) to build a credible database for our insights?

One way to understand the potential of Big Data and the IoT is to apply these questions to modern water detection systems. The marketplace is exploding with products like the Fibaro Flood Sensor, Quirky Overflow, Utilitech Leak Detector, Wally, WaterCop, and more. These systems use sensors to detect moisture, temperature, and humidity, and they can identify water leakage from all kinds of sources like dishwashers, frozen pipes, washing machine hoses, and water heaters. A detection triggers an alert to the resident, who can then stop the leak, clean up the water, and prevent or mitigate costly damage and repairs.
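In spirit, the detection logic is simple. The sketch below shows one way such a rule could look; none of the products above publish a common API, so the field names and thresholds here are assumptions for illustration only.

```python
# Illustrative sketch of a leak/freeze alert rule; the field names and thresholds
# are assumptions, not the API of any product named above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReading:
    location: str            # e.g., "water heater closet"
    moisture_detected: bool
    temperature_f: float
    humidity_pct: float

def evaluate(reading: SensorReading) -> Optional[str]:
    """Return an alert message when a reading suggests a leak or freeze risk."""
    if reading.moisture_detected:
        return f"Water detected at {reading.location} - shut off the supply and inspect."
    if reading.temperature_f < 40:
        return f"Pipes near {reading.location} are at risk of freezing."
    if reading.humidity_pct > 85:
        return f"Sustained high humidity at {reading.location} - possible slow leak."
    return None

# Example usage: a wet reading from a hypothetical laundry-room sensor.
print(evaluate(SensorReading("laundry room", True, 68.0, 55.0)))
```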

This is a very meaningful breakthrough in loss prevention, given the billions paid annually in water-related insurance losses.

But in our vision of Big Data and the IoT, we’d take it several steps further. Leak occurrences would be analyzed in the context of appliance type, pipe type, weather conditions, age of the home, installation methods, and so on. Any relevant specifics would be captured to develop insights on better (or worse) performing construction methods, products, and technologies. This data would inform future products and practices.
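As a simple illustration of that kind of analysis, consider grouping hypothetical leak records by a characteristic of interest. The record fields below are assumptions for the sake of the example, not an existing dataset or schema.

```python
# Illustrative only: grouping hypothetical leak records by home characteristics
# to see which configurations fail more often. Field names are assumptions.
from collections import Counter

def leak_counts_by(records, key):
    """Count leak incidents grouped by a chosen characteristic (e.g., pipe type)."""
    return dict(Counter(r[key] for r in records))

leaks = [
    {"appliance": "water heater", "pipe_type": "copper", "home_age_yrs": 32},
    {"appliance": "washing machine", "pipe_type": "PEX", "home_age_yrs": 8},
    {"appliance": "water heater", "pipe_type": "copper", "home_age_yrs": 41},
]

print(leak_counts_by(leaks, "pipe_type"))   # {'copper': 2, 'PEX': 1}
print(leak_counts_by(leaks, "appliance"))   # {'water heater': 2, 'washing machine': 1}
```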

From there, a database of homes with certain characteristics would be built. And this same approach using sensors and tech could be applied to many other failure modes from wind to seismic. All of these databases then would become elements of resilience measurement. Indeed, some already are.

The water detection example shows how resilience metrics are a Big Data problem. The myriad water detection devices cited above create data, but sorting, harvesting, and using that data effectively still has a long way to go before it becomes valuable beyond the individual loss. This type of micro-measurement is just one of the many ways we grapple with resilience measurement.

In our paper, we also examine various macro-resilience metrics/frameworks/indices including the U.S. Resiliency Council’s effort to rank buildings for seismic performance.

The key is working out how the micro-metrics and data can be developed and reliably applied in the macro-context of community resilience. There are existing options. For residential structures, MLS listings could include scores on the structural integrity, durability, and life expectancy of a home, and/or its location relative to natural hazards. Trulia and Zillow could make this information transparent to homeowners, and it would finally become a factor in driving market value. Think about it. How is it that the most important aspect of a home—its structural integrity—is still not transparent to homeowners?

Cisco projects that by the year 2020 there will be 37 billion smart products on the market. Juniper Research predicts that there will be 10 million smart connected devices by 2017. It’s time for us to figure out a formula to harness the explosion of data that is headed our way.

This is an important conversation that is only beginning. Our next blog will look at the larger issue of Big Data and disaster resilience and how these movements are converging. And you may want to join us as we spotlight this issue during our 2016 Annual Conference Meeting with a panel entitled The Next Generation of Resilient Communities. We are bringing together leading voices from government, the private sector, and associations to share their perspectives on the current initiatives, opportunities, and challenges of fostering resilient communities.

Either way, we look forward to the ongoing conversation.


New Federal Alliance for Safe Homes (FLASH)® Paper on Disaster Resilience, Insurance, and Technology

Understanding the Intersection of Resilience, Big Data, and the Internet of Things in the Changing Insurance Marketplace

There has been a recent explosion of data and technology, affecting every aspect of society. The resilience movement is no different. This paper examines the intersection of big data, including efforts to measure resilience and telematics, and the Internet of Things (IoT), including smart home technology.

From the 2005 initiative to score individual houses as part of the $250 million My Safe Florida Home program, to current efforts at the U.S. Resiliency Council to rank buildings for seismic performance, one thing is clear—comprehensive building rating programs are emerging alongside the call for disaster resilience in communities across the globe.

The other side of measuring resilience, and of big data generally, is the IoT. Specifically, the smart home technology movement has the potential to create an enormous amount of data, and that data can revolutionize how we understand risk. This paper explores how smart home technology can make homes not just smarter, but safer and stronger as well.

The potential for smart home technology is limitless. A home can be transformed in a way that both optimizes the functionality of the dwelling and provides previously unknown insights about the behavior of the homeowner to more accurately assess risk. There are crucial considerations for the success of smart home technology, and security and data privacy are essential. The paper also examines the status of telematics and how it brings the discussion full circle back to the role of big data.

This paper provides a framework of considerations for approaching the potentially groundbreaking convergence of big data and the IoT to transform the resilience movement and the overall safety and strength of residential structures.

Click here to download or read the full paper.