NASA offering online training for satellite observations of wildfires

Fire risk, detection, and analysis

NASA ARSET training

Brock Blevins, the Training Coordinator for the NASA Applied Remote Sensing Training Program (ARSET), asked that we pass along an online training opportunity.

NASA’s ARSET will be offering a new online webinar series: Satellite Observations and Tools for Fire Risk, Detection, and Analysis.

The six-part training, offered in English and Spanish, will cover how remote sensing and Earth observations can be used to monitor conditions before, during, and after fires. Topics covered will include weather and climate conditions, fuel characterization, fire risk, smoke detection, monitoring, forecasting, fire behavior, and post-fire landscapes. This intermediate-level training will provide lectures and case studies focused on the use of Earth observations for operational fire monitoring.

Course Dates in 2021: May 11, 13, 18, 20, 25, 27.

Times and Registration Information:

English Session: 11:00-13:00 EDT (UTC-4)
Spanish Session: 15:00-17:00 EDT (UTC-4)

Learning Objectives: By the end of this training attendees will understand:

  • Terminology regarding the types and components of fire (pre-, during-, and post-fire)
  • Climatic and biophysical conditions pre-, during-, and post-fire
  • The satellites and instruments used in conducting fire science
  • The applications of passive and active remote sensing for fires
  • How to visualize fire emissions and particulate matter
  • The use of tools for active fires, emissions, and burned areas
  • How to acquire data for conducting analysis in a given study area 


Audience: This training is primarily intended for local, regional, state, federal, and international organizations involved in resource and ecosystem management, health and air quality, disaster risk management, disaster response, and those with an interest in applying remote sensing to fire science.

Course Format: Six 2-hour parts

NASA uses UAVs and satellites equipped with radar to monitor recovery from vegetation fires

They observe fire fronts and burn scars during and shortly after fire moves across a landscape

(Image: remote sensing to monitor wildfire recovery.)

For the past few decades, scientists have been using satellite- and airplane-based radar instruments to detect damage caused by wildfires and human-caused blazes. Radar instruments can observe by day or night and can see land through clouds and smoke, so they are helpful for observing fire fronts and burn scars during and shortly after fire moves across a landscape.

Landscape ecologist Naiara Pinto and colleagues at NASA’s Jet Propulsion Laboratory are now taking a longer view. They are trying to decipher where and how well forests and scrublands are recovering in the years after a fire.

Synthetic aperture radar (SAR) instruments send out pulses of microwaves that bounce off Earth's surface. The reflected waves are detected and recorded by the instrument and can help map the shape of the land surface (topography) and the land cover—from cities to ice to forests. By comparing changes in the signals between two separate satellite or airplane overpasses, scientists can observe surface changes like land deformation after earthquakes, the extent of flooding, or the exposure of denuded or bare ground after large fires.
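The change-detection idea described above can be sketched with a toy example. Note that the arrays, values, and the 3 dB threshold below are purely illustrative assumptions, not the processing actually used by the JPL team; a common approach with SAR is to compare the backscatter log-ratio between two overpasses:

```python
import numpy as np

# Hypothetical backscatter images (linear power) from two overpasses
# over the same terrain; real data would come from a SAR product such
# as Sentinel-1 or UAVSAR.
pre_fire  = np.array([[0.20, 0.18], [0.21, 0.19]])   # brush-covered hillsides
post_fire = np.array([[0.05, 0.17], [0.06, 0.18]])   # two pixels now bare

# Log-ratio in decibels: a standard SAR change-detection measure.
# Strongly negative values mark a drop in backscatter, consistent
# with vegetation removed over burned, barren ground.
log_ratio_db = 10.0 * np.log10(post_fire / pre_fire)

# Flag pixels whose backscatter dropped by more than 3 dB
# (an illustrative threshold, not an operational one).
burned_mask = log_ratio_db < -3.0
print(burned_mask)   # left column flagged as changed
```

In practice the two images would first be co-registered and speckle-filtered, but the core comparison between overpasses is this simple per-pixel ratio.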

One of the aircraft NASA equips with synthetic aperture radar and other sensors: the medium-sized SIERRA UAV at NASA Ames Research Center, Moffett Field, California, with NASA SIERRA Pilot and Range Safety Officer Mark Sumitch shown for scale. (Photograph: NASA.)

SAR instruments are carried on the European Space Agency’s Sentinel-1 satellites, while NASA currently deploys its Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) via research aircraft. NASA and the Indian Space Research Organization are planning to launch the NISAR satellite in 2022.

(Image: remote sensing to monitor wildfire recovery.)

Mounted on the bottom of NASA research planes, UAVSAR has been flown over the same portions of Southern California several times since 2009. Pinto and JPL colleagues Latha Baskaran, Yunling Lou, and David Schimel analyzed that data and developed a mapping technique to show the different stages of removal and regrowth of vegetation (chaparral and forest).

The maps above are essentially mosaics of the observations across a decade. Radar signals bounce off burned, barren terrain differently than they reflect from unburned, brush-covered hillsides or from fresh growth. The colors indicate the relative amount of vegetation observed by different UAVSAR flights at different times. Yellow lines on the maps indicate the extent of several major fires: Station, Colby, San Gabriel (SG) Complex, La Tuna, and Bobcat.

“Overall, the colors are telling us that the Angeles National Forest contains a patchwork of plant communities at different stages of regeneration,” said Pinto, who is a science coordinator for UAVSAR. For instance, areas with more red had more vegetation in 2010 than they do now. Areas with more blue and green shading had more vegetation (regrowth) in recent years. Yellow indicates areas burned in 2020 that had a higher volume of vegetation in 2010 and 2017 (red+green) but lower volume in 2020 (blue).
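The color scheme described above amounts to stacking the three acquisition dates into the red, green, and blue channels of one image. A minimal sketch, using made-up vegetation values (the real products are derived from UAVSAR radar backscatter, not these toy arrays):

```python
import numpy as np

# Hypothetical per-pixel vegetation signals (scaled 0-1) from three
# flight campaigns over the same area.
veg_2010 = np.array([[0.9, 0.1], [0.8, 0.2]])
veg_2017 = np.array([[0.2, 0.1], [0.8, 0.2]])
veg_2020 = np.array([[0.1, 0.9], [0.1, 0.2]])

# Assign each date to a color channel: red = 2010, green = 2017,
# blue = 2020. Red-dominated pixels had more vegetation in 2010 than
# now; blue/green-dominated pixels show recent regrowth; yellow
# (high red + green, low blue) marks areas burned in 2020.
rgb = np.dstack([veg_2010, veg_2017, veg_2020])

print(rgb.shape)    # (2, 2, 3): a 2x2 image with three color channels
print(rgb[1, 0])    # high 2010 and 2017, low 2020 -> renders yellow
```

Viewing `rgb` with any image library then reproduces the kind of false-color mosaic shown in the maps.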

(Image: remote sensing to monitor wildfire recovery.)

The image above illustrates how those maps were assembled. Radar data were collected during UAVSAR flights in 2010, 2017, and 2020 over Angeles National Forest and other areas northeast of the greater Los Angeles metropolitan area.

The project has been supported by NASA's Earth Applied Sciences Disasters program, which generates maps and other data products for institutional partners as they work to mitigate and recover from natural hazards and disasters. The SAR technique is still being tested and validated, but the intent is to monitor forest regrowth and fire-scar change over time, important information for forest and fire managers working to manage risk.

NASA Earth Observatory images by Joshua Stevens, using UAVSAR data and imagery courtesy of Anne Marie Peacock, Naiara Pinto, and Yunling Lou and NASA/Caltech UAVSAR. Story by Michael Carlowicz.