The Defense Department’s Joint Artificial Intelligence Center plans to use its sensor and automation capabilities to help provide intelligence about ongoing wildfires.
Below is an excerpt from an article at Nextgov.com:
…Defense has troves of sensor data, digital video data, digital infrared data and sonar data—all of which are attractive environments for machine-learning algorithms. Through this disaster-relief initiative, the agency plans to fly airborne sensors over wildfires in California and collect full-motion video data of the activity. At the same time, they are going to be automatically using a computer vision algorithm to detect which frames of the video have active wildfire.
Typically, the Federal Emergency Management Agency and other disaster-relief entities try to disseminate maps of the fire to all relevant organizations involved in the efforts once per day.
“We believe we will be able to cut that to about once per hour distributed over an app,” Allen said. “By switching to this airborne sensor, applying an AI computer vision algorithm and converting that to geolocation data that is useful for a map application we are also developing, we’ll really be able to make an impact for our users in a short time frame.”
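The quoted pipeline has two steps: flag which video frames contain active fire, then convert those detections into geolocation data for a map application. The article gives no implementation details, so the following is purely an illustrative sketch under assumed simplifications: a toy color-threshold "detector" standing in for the real computer vision algorithm, and a flat-earth linear interpolation across an assumed rectangular sensor footprint standing in for real geolocation. The function names, thresholds, and footprint model are all hypothetical.

```python
def frame_has_fire(frame, hot_ratio=0.01):
    """Flag a frame if enough pixels look 'fire-like' (bright and red-dominant).

    `frame` is a 2D list of (R, G, B) tuples. A real system would use a
    trained computer vision model here; this threshold is a toy stand-in.
    """
    hot = sum(
        1
        for row in frame
        for (r, g, b) in row
        if r > 200 and r > g + 60 and r > b + 60
    )
    total = sum(len(row) for row in frame)
    return hot / total >= hot_ratio


def pixel_to_geo(x, y, width, height, footprint):
    """Map a pixel coordinate to (lat, lon) by linear interpolation.

    `footprint` is ((lat, lon) of the image's top-left corner,
    (lat, lon) of its bottom-right corner) -- a flat-earth
    approximation of the sensor's ground coverage.
    """
    (lat_tl, lon_tl), (lat_br, lon_br) = footprint
    lat = lat_tl + (lat_br - lat_tl) * (y / height)
    lon = lon_tl + (lon_br - lon_tl) * (x / width)
    return lat, lon


# A 2x2 'frame' with one fire-like pixel out of four (25% >= 1% threshold).
frame = [[(255, 40, 30), (10, 10, 10)],
         [(12, 60, 20), (90, 90, 90)]]
print(frame_has_fire(frame))  # True

# Center pixel of a 100x100 image over a small hypothetical footprint.
print(pixel_to_geo(50, 50, 100, 100, ((39.0, -121.0), (38.9, -120.9))))
```

In a production pipeline, the detector would be a learned model and the geolocation step would use the aircraft's position, attitude, and camera model rather than a fixed footprint; the sketch only shows how per-frame detections could flow into map-ready coordinates.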
The agency expects to begin testing this capability within the next few months with the National Guard.
The article did not specify how the intelligence would be collected, such as by satellites, fixed-wing aircraft, or unmanned aerial systems.