Image credit: NASA
New software developed at NASA's Jet Propulsion Laboratory may give firefighters a new tool for spotting forest fires before they have a chance to spread. The software links various NASA Earth science satellites together into a virtual web of sensors. If one satellite spots a blaze, it can instruct the other satellites to take more detailed photographs of the area. Controllers can then report the fire to officials as well as to scientists studying how forest fires behave in their early stages. Similar software is being considered for other natural events, such as floods.
If a forest catches fire and no one is around to see it, can it call for help? The forest cannot call, but thanks to new technology developed by NASA, firefighters may get the word faster through new, high-tech eyes in the sky.
New software developed by NASA’s Jet Propulsion Laboratory, Pasadena, Calif., helps link NASA’s Earth science satellites together to form a virtual web of sensors that can monitor the globe far better than individual satellites. An imaging instrument flying on one satellite can detect a fire or other hazard, and automatically instruct a different satellite capable of more detailed imaging to take a closer look. If the images show that a potential hazard does exist, the responding satellite provides data to ground controllers, who then report the fire to forest officials and to an interested science team.
“Essentially, we are adding the response mechanism to the detection process,” said Dr. Steve Chien, JPL principal scientist in artificial intelligence. “This is a first step to enabling users of satellite remote sensing data to specify the kind of data they want, such as forest fires or floods, rather than the traditional request to, say, look at northern Montana.”
One of the core components in this collaborative effort is the Science Goal Monitor system being developed at NASA’s Goddard Space Flight Center, Greenbelt, Md. The system enables scientists to specify what to look for and how to react in descriptive rather than technical terms. The system then monitors streams of science data to identify occurrences of the key events the scientist specified.
“When an event occurs, the system autonomously coordinates the execution of the scientist’s desired reactions between different observatories or satellites,” said Jeremy Jones, Goddard’s task leader for the monitor system. “This is designed to be adaptable to many different types of phenomena and supports a wide variety of sensor web configurations.”
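The idea of specifying goals descriptively and letting the system coordinate reactions can be sketched in miniature. This is purely illustrative: the class names, the brightness-temperature threshold, and the reaction shown here are assumptions for the sketch, not part of the actual Goddard system.

```python
# Hypothetical sketch of a Science Goal Monitor-style rule engine.
# All names and thresholds here are illustrative, not NASA code.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ScienceGoal:
    """A goal expressed descriptively: what to detect and how to react."""
    name: str
    trigger: Callable[[dict], bool]   # predicate over one observation record
    reaction: Callable[[dict], None]  # e.g. request a follow-up observation

def monitor(stream: List[dict], goals: List[ScienceGoal]) -> List[str]:
    """Scan a stream of observation records, firing each matching reaction."""
    fired = []
    for record in stream:
        for goal in goals:
            if goal.trigger(record):
                goal.reaction(record)
                fired.append(goal.name)
    return fired

# Example: flag thermal anomalies hotter than 360 K as possible fires
# and queue a high-resolution follow-up observation of the coordinates.
followups = []
fire_goal = ScienceGoal(
    name="forest-fire",
    trigger=lambda r: r["brightness_temp_k"] > 360,
    reaction=lambda r: followups.append(("closeup", r["lat"], r["lon"])),
)
events = monitor(
    [{"brightness_temp_k": 290, "lat": 47.0, "lon": -110.5},
     {"brightness_temp_k": 410, "lat": 46.8, "lon": -112.1}],
    [fire_goal],
)
```

The key design point the quote describes is that the scientist supplies only the descriptive rule; the coordination between observatories happens autonomously once a trigger fires.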
Using the sensor web method, investigators no longer have to rely on after-the-fact data analysis to determine what happened. The information can be used to rapidly respond to hazardous events such as forest fires.
For example, moderate-resolution imaging instruments that fly on both NASA’s Terra and Aqua spacecraft observe the entire globe every day. The instruments’ data are automatically processed on the ground within hours of acquisition by the Rapid Response System at the Goddard Space Flight Center. If this processing detects a hot spot, scientific criteria can be used to automatically redirect the Earth Observing 1 satellite to provide high-resolution images. When that information comes back to a scientist for interpretation, it is made available to forest officials to determine the appropriate response. All this can happen in 24 to 48 hours, compared to a typical lead time of 14 days for preplanned observations.
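The detection-to-retasking chain above can be summarized in a short sketch. The function names and the criterion used here are assumptions for illustration; they do not represent any real NASA interface.

```python
# Illustrative sketch of the sensor-web response chain: daily imagery is
# processed on the ground, hot spots are screened against scientific
# criteria, and qualifying detections become high-resolution follow-up
# requests. Names and thresholds are assumptions, not NASA code.

def detect_hotspots(granule):
    """Ground processing step: pick out thermal anomalies from daily imagery."""
    return [px for px in granule if px["temp_k"] > 360]

def meets_criteria(hotspot):
    """Placeholder scientific criterion: a confidence cutoff on the detection."""
    return hotspot.get("confidence", 1.0) >= 0.8

def retask(satellite, hotspot):
    """Queue a high-resolution follow-up observation of the hot spot."""
    return {"satellite": satellite, "target": (hotspot["lat"], hotspot["lon"])}

granule = [
    {"temp_k": 295, "lat": 45.1, "lon": -109.9},
    {"temp_k": 405, "lat": 46.8, "lon": -112.1, "confidence": 0.95},
]
requests = [retask("EO-1", h) for h in detect_hotspots(granule)
            if meets_criteria(h)]
```

Because each step is automatic, the chain can turn a daily global detection into a targeted high-resolution observation within the 24-to-48-hour window the article cites, rather than the 14-day lead time of preplanned observations.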
The satellite sensor web demonstration is a collaborative effort between JPL and the Goddard Space Flight Center. The Rapid Response project is a joint Goddard Space Flight Center effort with the University of Maryland, College Park, led by Dr. Chris Justice.
Original Source: NASA/JPL News Release