3 Ways Geospatial Data is Changing the Way We Manage the Environment Around Us

10 Nov 2022

 


Dr. Michael Flaxman, director of Geodesign Technologies and Geospatial Community Advocate for OmniSci, presents three examples of how newly available geospatial data is being leveraged.

These days we are drenched in data. The hose has been opened wide, thanks to a deluge of connected devices that produce data at an incredible rate—with no sign of slowing down. Research estimates that by 2025 nearly 463 exabytes of data will be created each day globally. Much of this new data has a geospatial component to it; that is, it contains information that identifies a specific geographic location. Traditionally, geospatial data was used primarily by the military and intelligence agencies. Today, however, the use of geospatial information has exploded, crossing over into nearly every industry as more and more markets discover the new levels of insight it can provide. What are some of the ways this newly available geospatial data is being leveraged? From mitigating wildfire risk to planning for future resource demand, here are three examples.

1. Visualization and analysis of deeply temporal spatial data: planning for future water demand

Water infrastructure is very expensive, both to build and to maintain, which makes sizing it correctly a major concern. Recently, Geodesign Technologies used OmniSci, a GPU-accelerated analytics platform, to summarize thirty years of historic data relating water demand to zoning density and building types for the South Florida Water Management District’s (SFWMD) service area. As part of a National Science Foundation project on Water, Sustainability and Climate Change, the firm is developing a set of open spatial planning scenarios that combine climate models with projected urban growth. Such spatial scenarios can be used to plan for future water demand as cities and regions grow, and to target water demand management incentive programs that improve sustainability while lowering ratepayers’ capital infrastructure costs.

Conventional methods for water demand projection are based on historic per capita use, adjusted for income. However, we know from satellite monitoring that water use per person varies widely by location, with roughly half of water consumption relating to outdoor landscaping. A couple living in a large-lot subdivision does not use the same amount of water as a couple living in a condo—even if they have the same family size and income. New housing often includes a variety of water-saving technologies, so construction dates and major retrofits are also strong predictive factors. We also know that climate change is already affecting water demand, and it will likely be an even stronger influence within the lifespan of infrastructure built today.
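To make this concrete, here is a minimal sketch of such a multi-factor demand model. The CSV file and every column name (lot_size_sqft, household_income, year_built, summer_temp_c, annual_water_use_gal) are hypothetical rather than drawn from the SFWMD data; a production model would of course use richer inputs and more careful statistics.

```python
# A minimal sketch of a multi-factor demand model, assuming a hypothetical
# parcel-level CSV; none of these column names come from the SFWMD data.
import pandas as pd
from sklearn.linear_model import LinearRegression

parcels = pd.read_csv("parcels.csv")  # hypothetical parcel-level table

# Predictors beyond per capita use: lot size (a proxy for outdoor
# landscaping), income, construction year (a proxy for water-saving
# fixtures), and a simple climate signal.
X = parcels[["lot_size_sqft", "household_income", "year_built", "summer_temp_c"]]
y = parcels["annual_water_use_gal"]

model = LinearRegression().fit(X, y)
for name, coef in zip(X.columns, model.coef_):
    print(f"{name}: {coef:+.2f} gallons per unit change")
```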

What has stopped water planners from adopting more accurate models? It turns out that the problem is not a lack of data, but rather the challenge of integrating diverse, large new data sources. Utility companies today have access to ‘smart meters,’ parcel information and openly available satellite data that measures outdoor water use directly. The catch is that parcel, satellite and climate model output data are huge in both space and time, requiring new methods to visualize and analyze them. For example, the SFWMD service area includes 6.9 million parcels. A tool like OmniSci can help find patterns or identify particular events in data that is too large to scan visually. Often known as “hot spot” analysis, the ability to view decades’ worth of data and spot anomalous behavior is fundamental to integrated water demand modeling.
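As a rough illustration of that kind of screening, the sketch below flags parcel-months whose metered use deviates sharply from the parcel’s own history. The Parquet file and its columns (parcel_id, month, gallons) are hypothetical, and a simple z-score stands in for whatever statistic a production system would use.

```python
# A minimal sketch of "hot spot" screening: flag parcel-months whose
# water use deviates sharply from that parcel's own history. The Parquet
# file and its columns (parcel_id, month, gallons) are hypothetical.
import pandas as pd

readings = pd.read_parquet("meter_readings.parquet")

# z-score each reading against its parcel's long-run mean and spread
stats = readings.groupby("parcel_id")["gallons"].agg(["mean", "std"])
readings = readings.join(stats, on="parcel_id")
readings["z"] = (readings["gallons"] - readings["mean"]) / readings["std"]

# parcel-months more than three standard deviations above normal
hot_spots = readings[readings["z"] > 3]
print(hot_spots[["parcel_id", "month", "gallons", "z"]].head())
```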

2. Evaluating risk for natural disasters using big data

For the first time ever, we now know the location of every single structure in the United States. Microsoft, as a demonstration of its machine-learning prowess, has released a public dataset of 125 million building footprint polygons. The ripple effects of this contribution are only starting to become clear, but they could be transformative for emergency and disaster planning. Previously, publicly available datasets such as US Census data included only aggregate counts of residences at much coarser spatial grain, and did not include business locations. This led to a poor understanding of risk, particularly to business disruption and in lower-density communities. Yet as recent US natural disasters have made clear, events striking both kinds of areas are common and costly.
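As a taste of what the footprints enable, here is a minimal sketch that counts structures falling inside a flood-hazard layer. The footprints ship as polygon GeoJSON, but the specific file names and the flood-zone layer with its zone_name column are hypothetical.

```python
# A minimal sketch of putting the footprints to work: count structures
# inside a flood-hazard layer. The file names and the zone_name column
# are hypothetical; the footprints themselves ship as polygon GeoJSON.
import geopandas as gpd

buildings = gpd.read_file("Florida.geojson")    # footprint polygons
flood_zones = gpd.read_file("flood_zones.shp")  # hypothetical hazard layer
flood_zones = flood_zones.to_crs(buildings.crs)

# spatial join: attach a hazard zone to every footprint inside one
at_risk = gpd.sjoin(buildings, flood_zones, predicate="within")
print(at_risk.groupby("zone_name").size())
```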

What can we learn from understanding more precisely where people really live and work? For starters, we can and should re-evaluate flood risk, which is both the most costly and the most common form of damage. Conventionally, US efforts have largely focused on mapping “100-year” flood plains and encouraging better practices in those locations. Yet it is becoming increasingly clear that future efforts will need to look more broadly—and dynamically—at flood risk.

US startup EcoAcumen is providing county governments with “dynamic flood map” products based on pre-computed impervious cover and “height above nearest drainage” (HAND) data for every structure. This can be combined with near real-time weather and soil moisture data to give flood estimates based on current conditions, instead of just historic averages. This is important because climate change is producing storms that are often more intense than the “design storms” used in conventional mapping. The catch here is that it is also highly computationally intensive. Consider the changes from conventional practice just in terms of data inputs. Instead of a single “design storm,” actual rainfall data is used, with updates every 6 minutes from the National Oceanic and Atmospheric Administration’s GOES satellites. Rather than estimating run-off once every 20 years or so based on “average” soil moisture, EcoAcumen re-evaluates it every 2-5 days based on satellite data updates. Similarly, they measure land cover with monthly updates at 10 m resolution, whereas conventional studies are based on 30 m data updated every 5 years.
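To show how inputs like these might combine per structure, here is a minimal sketch of a dynamic flood indicator. The weighting scheme is purely illustrative and is not EcoAcumen’s actual model; every parameter name is an assumption.

```python
# A minimal sketch of a per-structure dynamic flood indicator combining
# the inputs above. The weighting is illustrative only and is not
# EcoAcumen's model; all parameter names are assumptions.
import numpy as np

def flood_indicator(hand_m, impervious_frac, rain_mm_hr, soil_moisture_frac):
    """Crude screening score: low height above nearest drainage, high
    impervious cover, heavy current rain and saturated soil all raise risk."""
    drainage_term = np.exp(-hand_m / 2.0)  # risk decays with height above drainage
    # impervious surfaces shed water regardless; pervious ones shed more when saturated
    runoff_term = impervious_frac + soil_moisture_frac * (1.0 - impervious_frac)
    return drainage_term * runoff_term * rain_mm_hr

# a structure 1 m above the nearest drainage, 60% impervious cover,
# 20 mm/hr of rain falling on 80%-saturated soil
print(flood_indicator(hand_m=1.0, impervious_frac=0.6,
                      rain_mm_hr=20.0, soil_moisture_frac=0.8))
```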

The trend in data volumes and update rates is clear—more accurate and useful natural disaster planning data will require a shift from one-off analyses based on a single ‘typical extreme’ to continuous environmental monitoring and data science.  Tools like OmniSci provide a unique analysis and distribution environment, democratizing access to the compute and visualization capabilities needed to substantially improve our natural disaster preparedness.

3. Handling millions of LIDAR points: determining vegetation clearances around power lines for wildfire safety

LIDAR (Light Detection and Ranging) is three-dimensional measurement data obtained from ground-based and aerial active sensors and used to examine the surface of the Earth. These sensors work like bats, sending out signal pulses and measuring distances based on the signal return times. Currently, these sensors are very expensive; however, as the technology is increasingly adopted for self-driving cars, prices are beginning to fall rapidly. LIDAR data can be useful in many ways. For example, LIDAR data from planes and satellites can measure building heights or tree heights across vast areas of land. It can also help combat some of the risks associated with climate change. Climate change and population growth have increased the risk to communities from catastrophic wildfires, and to help mitigate these risks, utility companies should identify vulnerable areas and prioritize vegetation management. LIDAR data can help by measuring horizontal vegetative clearance from power line segments. Visualizing levels of vegetation growth as well as vegetation health in an interactive tool can help utility companies identify areas of wildfire risk.
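Here is a minimal sketch of that clearance check, assuming LIDAR returns already classified as vegetation and projected into a metre-based coordinate system. The coordinates and the 3 m threshold are illustrative only; real programs use regulator-specified clearances and billions of points.

```python
# A minimal sketch of the clearance check: horizontal distance from
# vegetation-classified LIDAR returns to a power line segment, flagging
# points inside a buffer. Coordinates and the 3 m threshold are illustrative.
import numpy as np
from shapely.geometry import LineString, Point

# power line segment in a projected, metre-based coordinate system
power_line = LineString([(0.0, 0.0), (500.0, 0.0)])

# x, y of LIDAR returns already classified as vegetation (hypothetical)
veg_points = np.array([(120.0, 2.1), (250.0, 7.5), (410.0, 1.4)])

CLEARANCE_M = 3.0
for x, y in veg_points:
    d = power_line.distance(Point(x, y))  # horizontal clearance in metres
    if d < CLEARANCE_M:
        print(f"vegetation at ({x:.0f}, {y:.0f}) is {d:.1f} m from the line: trim needed")
```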

About the Author

Dr. Michael Flaxman is the director of Geodesign Technologies, a San Francisco-based environmental planning firm specializing in spatial scenario planning and satellite image analysis. He is also a Geospatial Community Advocate for OmniSci. Recent projects have included incidental take simulation and climate change planning for the Florida Beaches Habitat Conservation Plan (FBHCP), and technical scenario simulation for the Southern Florida Water, Sustainability and Climate project (sponsored by the National Science Foundation) and the Peninsular Florida Landscape Conservation Cooperative.
