Automating Geospatial and Predictive Analytics for Faster Insights and Mission Outcomes

Technology   |   Andy MacIsaac   |   Oct 15, 2020   |   TIME TO READ: 7 MINS

Historically, the federal government has been a primary provider of authoritative geospatial information. But as geospatial information has become crucial to a wide range of federal applications and online services across multiple domains, including military, law enforcement, intelligence, emergency response, agriculture, and weather and climate prediction, a significant cultural shift is underway.

 

Federal agencies now consume geospatial information from a variety of sources rather than primarily producing it. As a result, the federal government's role has also shifted toward coordinating and managing geospatial data and facilitating partnerships among the producers and consumers of geospatial information in government, the private sector, and academia.

 

The sheer volume of geospatial data and the multiplicity of its sources (phones, health devices, vehicles, private satellites, drones, and many other fixed and mobile sensors) make geospatial information dynamic and ever-changing. As government organizations confront challenges ranging from wildfires, hurricanes, COVID-19, and social unrest to traffic congestion, service delivery, and utilities, the importance of accurate, timely, and informative geospatial data will only continue to grow.

 

If data is not consumable, is it real?

One challenge arising from this shift toward consuming third-party geospatial data, and from the extreme variety of available sources, is that agency leaders, especially CIOs and CDOs, must focus on how their platforms make this data available and consumable.

 

The CIO from the U.S. Army Corps of Engineers put it this way, “We want to be able to ensure that the data is consumable, regardless of what form it’s in. So we’re assessing the various repositories, assessing the various sources, and we’re trying to consolidate or collapse our enterprise to begin to pull from those respective repositories of what is authoritative and begin to put out an output that’s consumable by the end user.”

 

An article in NextGov profiled some of the technology needs the National Geospatial-Intelligence Agency (NGA) must address in order to stay ahead of near-peer adversaries that have much of the same access to datasets and are investing in new technologies. As NGA CTO Mark Munsell explained, “To stay ahead of these adversaries, we must bring together our world-class experts at NGA, industry partners with exquisite domain expertise and technical capabilities, and companies who have never worked with government before but whose products could help advance NGA’s mission.”

 

The article goes on to highlight some of the specific capabilities that agencies like the NGA need to acquire. In the area of advanced analytics and modeling, analysts need the ability to rapidly discover and integrate diverse data types from multiple sources in order to identify and characterize relevant patterns. These data teams also require a user-friendly assisted modeling capability that recommends statistical approaches based on observed workflow data, ensuring they are aware of analytical options that may be useful.

 

When it comes to managing the veracity and volume of data, data teams also need the ability to rapidly aggregate diverse data types and schemas from sources across multiple domains to quickly extract insight at scale. From a geospatial perspective, data teams at the NGA and in other agencies need quick and ready access to location-based insights, with the ability to integrate large amounts of data from commercial location-based services into existing workflows to improve their spatial and temporal insights about the physical environment.
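
In Alteryx this kind of integration is built from drag-and-drop workflow tools, but the underlying pattern is easy to illustrate in code. The sketch below shows one way a commercial location-based feed could be joined spatially and temporally to an existing structure inventory using Python and GeoPandas; the file and column names are hypothetical, and this is an illustration of the pattern rather than the platform's implementation.

```python
# Illustrative sketch only: enrich an existing structure inventory with a
# commercial location-based-services (LBS) feed. File and column names are
# hypothetical; an Alteryx workflow would express this with built-in tools.
import geopandas as gpd
import pandas as pd

# Existing workflow data: building footprints keyed by structure_id.
structures = gpd.read_file("structures.gpkg").to_crs(epsg=4326)

# Commercial LBS feed: anonymized point observations with a timestamp.
pings = pd.read_csv("lbs_pings.csv", parse_dates=["observed_at"])
pings = gpd.GeoDataFrame(
    pings,
    geometry=gpd.points_from_xy(pings.lon, pings.lat),
    crs="EPSG:4326",
)

# Spatial join: attach each observation to the structure it falls within.
joined = gpd.sjoin(pings, structures, predicate="within", how="inner")

# Temporal roll-up: daily activity per structure, a simple spatio-temporal signal.
activity = (
    joined.groupby(["structure_id", joined["observed_at"].dt.date])
    .size()
    .rename("ping_count")
    .reset_index()
)
print(activity.head())
```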

 

These requirements sound like a lot, but the Alteryx APA Platform builds them in, enabling the modern-day government analyst, citizen data scientist, traditional data scientist, or any other data worker to access these capabilities and more in a unified advanced analytics platform.

 

Advanced Geospatial Analytics in Disaster Response

The proof comes from robust, real-world geospatial use cases developed by an Alteryx partner, Atkins, a contractor to the U.S. Federal Emergency Management Agency (FEMA). When dual hurricanes hit Puerto Rico in 2017, Atkins used the geospatial and predictive analytics capabilities of the Alteryx APA Platform to perform automated substantial-damage estimates for close to 150,000 structures, identifying and prioritizing those most in need of a physical evaluation.

 

The strategy was to use Alteryx to estimate the structural damage that had occurred, prioritize areas that still needed human inspection, and ultimately reduce the total number of in-person inspections so that the recovery process could begin quickly. Alteryx was used to blend over a dozen datasets and the following variables to predict damage (a simplified sketch of the resulting modeling table follows the list):

  • Building style
  • Wind speed
  • Flood levels
  • Elevation levels
  • Structure type
  • Wind exposure
  • Construction quality

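Atkins' actual schema is not public, but conceptually these variables roll up into a single modeling table with one row per structure. A hypothetical, simplified version might look like this (column names and values are invented for illustration):

```python
# Hypothetical, simplified modeling table implied by the variable list above:
# one row per structure, one column per predictor, plus an observed damage value
# where a field assessment already exists. All names and values are invented.
import pandas as pd

features = pd.DataFrame(
    {
        "structure_id": [101, 102, 103],
        "building_style": ["single_family", "multi_family", "commercial"],
        "structure_type": ["wood_frame", "concrete", "masonry"],
        "construction_quality": ["average", "good", "poor"],
        "wind_speed_mph": [115.0, 142.0, 128.0],
        "wind_exposure": ["B", "C", "C"],
        "flood_depth_ft": [0.0, 3.5, 1.2],
        "elevation_ft": [42.0, 8.0, 15.0],
        "observed_damage_pct": [5.0, None, 60.0],  # None = not yet inspected
    }
)
features.to_csv("structure_features.csv", index=False)
```
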
Atkins determined that it needed to mobilize three functional teams quickly. The first team used a geographic information system (GIS) to find the locations of damaged structures, the second collected information on those structures, and the third built the analytics model in Alteryx.

 

The data feeding the model came from very disparate sources, including the European Union, NOAA, the National Weather Service, FEMA, and the Army Corps of Engineers. This created significant data challenges: overlapping building-stock data, numerous databases of different ages and origins, an inconsistent addressing system, and multiple sources of inundation and flood-depth data. The data was also stored in various formats that required cleansing before blending, as well as the creation of indices between the datasets. Alteryx was used for these data prep and blending tasks.
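
Outside of Alteryx, the addressing and deduplication portion of that prep can be sketched in a few lines of pandas. The normalization rules and file names below are assumptions for illustration, not the cleansing logic Atkins actually built.

```python
# Minimal sketch of the prep described above: normalize an inconsistent
# addressing scheme so disparate building-stock tables can be deduplicated and
# keyed to a single index. All file names, columns, and rules are hypothetical.
import pandas as pd

def normalize_address(addr: str) -> str:
    """Uppercase, collapse whitespace, and expand a few common abbreviations."""
    addr = " ".join(str(addr).upper().split())
    for short, full in {" CLL ": " CALLE ", " AVE ": " AVENIDA ", " #": " NUM "}.items():
        addr = addr.replace(short, full)
    return addr

fema_stock = pd.read_csv("fema_building_stock.csv")        # hypothetical export
local_stock = pd.read_csv("municipal_building_stock.csv")  # hypothetical export

for df in (fema_stock, local_stock):
    df["addr_key"] = df["address"].map(normalize_address)

# Collapse overlapping records into one row per normalized address, then assign
# a surrogate structure_id used as the join index by every later step.
merged = (
    pd.concat([fema_stock, local_stock], ignore_index=True)
    .drop_duplicates(subset="addr_key")
    .reset_index(drop=True)
    .assign(structure_id=lambda d: d.index + 1)
)
```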

 

At the heart of this work was the Alteryx APA Platform’s predictive modeling capability, which filled in the gaps left by incomplete or missing data. The team also leveraged a boosted regression model that searched for patterns in the data across thousands of iterations, selecting the most relevant variables and providing a high degree of control over the entire predictive model.
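
The article does not detail the exact Alteryx configuration, but the general approach, imputing gaps and then fitting a boosted regression over many iterations, can be approximated with scikit-learn. The sketch below is a rough analog under assumed column names (matching the hypothetical table sketched earlier), not the model Atkins deployed.

```python
# Rough analog of the approach described above, using scikit-learn rather than
# the Alteryx predictive tools actually used. Column names follow the
# hypothetical structure_features.csv sketched earlier.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

features = pd.read_csv("structure_features.csv")

numeric = ["wind_speed_mph", "flood_depth_ft", "elevation_ft"]
categorical = ["building_style", "structure_type", "construction_quality", "wind_exposure"]

prep = ColumnTransformer(
    [
        # Fill gaps in incomplete records before modeling.
        ("num", SimpleImputer(strategy="median"), numeric),
        ("cat", Pipeline([
            ("impute", SimpleImputer(strategy="most_frequent")),
            ("encode", OneHotEncoder(handle_unknown="ignore")),
        ]), categorical),
    ],
    sparse_threshold=0.0,
)

model = Pipeline([
    ("prep", prep),
    # Boosted regression: thousands of small trees fit iteratively.
    ("boost", GradientBoostingRegressor(n_estimators=2000, learning_rate=0.01, subsample=0.8)),
])

# Train on structures with an observed damage value, then score everything,
# including structures that were never visited in person.
labeled = features.dropna(subset=["observed_damage_pct"])
model.fit(labeled[numeric + categorical], labeled["observed_damage_pct"])
features["predicted_damage_pct"] = model.predict(features[numeric + categorical])
features.to_csv("scored_structures.csv", index=False)
```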

 

With this model, 146,039 structures were evaluated and sorted into priority groups for physical inspection. Of the structures loaded into the model, just over 30,000 required the dispatch of inspectors. With this geospatial and predictive capability, inspection resources were deployed where they were most needed, saving tens of millions of dollars in inspection costs and, most importantly, speeding up delivery of recovery support to people and communities.
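
The thresholds that separated those priority groups are not given in the article, but the mechanics of the prioritization step are straightforward to sketch; the cut points below are invented purely for illustration.

```python
# Purely illustrative: bin predicted damage into inspection-priority groups as
# described above. The cut points are invented, not the thresholds Atkins used.
import pandas as pd

scored = pd.read_csv("scored_structures.csv")  # hypothetical model output

scored["priority"] = pd.cut(
    scored["predicted_damage_pct"],
    bins=[float("-inf"), 10, 50, float("inf")],
    labels=["no_visit_needed", "secondary_review", "dispatch_inspector"],
)

print(scored["priority"].value_counts())
dispatch_list = scored[scored["priority"] == "dispatch_inspector"]
```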

 

Staying Ahead of the Flood of Data at FEMA

This same ability to ingest, prep, join, and analyze massive geospatial datasets and apply predictive analysis in a unified platform enabled Atkins to work with FEMA to analyze and identify flood risk across 90,000,000 structures throughout all 50 states and six territories. The Alteryx workflow ingested and blended over 300 datasets with 4,000 internal layers, contained over 3,600 automation building blocks, and produced actionable insight to inform federal, state, and local authorities about the risk faced by individual structures across different types of flooding situations.
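
The specific datasets and layers in that workflow are not public, but the processing pattern, joining an enormous structure inventory against many flood hazard layers in manageable chunks, can be sketched as follows. File names, layer names, and the chunk size are assumptions.

```python
# Hedged sketch of the pattern described above: evaluate flood exposure for a
# very large structure inventory by spatially joining point locations against
# flood hazard layers in chunks, so the full inventory never sits in memory at
# once. File names, layer names, and the chunk size are assumptions.
import geopandas as gpd
import pandas as pd

flood_zones = gpd.read_file("flood_hazard_layers.gpkg", layer="combined_zones")

results = []
for chunk in pd.read_csv("national_structures.csv", chunksize=1_000_000):
    points = gpd.GeoDataFrame(
        chunk,
        geometry=gpd.points_from_xy(chunk.lon, chunk.lat),
        crs="EPSG:4326",
    ).to_crs(flood_zones.crs)

    # Left join keeps structures outside any mapped zone, with zone_type = NaN.
    hit = gpd.sjoin(points, flood_zones[["zone_type", "geometry"]],
                    predicate="within", how="left")
    results.append(hit[["structure_id", "zone_type"]])

risk = pd.concat(results, ignore_index=True)
risk["at_risk"] = risk["zone_type"].notna()
print(risk["at_risk"].mean())  # share of structures touching a mapped flood zone
```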

 

Both of these use cases are real-world examples showing how data teams at the NGA, the IC, the DOD, and other federal agencies can seamlessly harness and expand the use of commercially available datasets and leverage easy-to-use spatial and predictive analytics capabilities within a unified analytics platform. What the Atkins use cases prove is that even with large datasets, multiple sources, and high complexity, agencies can create the actionable insight they need to accelerate mission outcomes.

 


 

Want to hear more on the subject?

 

Watch our webinar on-demand: Automating Geospatial Analytics and Predictive Analytics with Alteryx, featuring Michael DePue, Principal Technical Professional, Atkins Global.
