
Artificial Intelligence Improves NASA’s Eyes on the Sun

A group of researchers is using artificial intelligence techniques to calibrate some of NASA's images of the Sun, helping scientists improve the data they use for solar research. The new approach was published in the journal Astronomy & Astrophysics on April 13, 2021. A solar telescope has a difficult job: staring at the Sun takes a toll on the instrument, which is constantly bombarded by solar particles and intense sunlight.

Solar telescopes' sensitive lenses and sensors deteriorate over time, so scientists regularly recalibrate these instruments to track how they are changing and to guarantee that the data they send back remains accurate. Since its launch in 2010, NASA's Solar Dynamics Observatory, or SDO, has provided high-resolution images of the Sun.

Its images have given scientists a close look at various solar events that can drive space weather and affect our astronauts and technology, both on Earth and in space. The Atmospheric Imaging Assembly, or AIA, is one of two imaging instruments on SDO. It keeps a constant eye on the Sun, capturing images every 12 seconds across ten wavelengths of ultraviolet light.

This provides a wealth of information about the Sun that no other instrument can match. Still, like all Sun-staring instruments, AIA degrades over time, and its data must be recalibrated regularly. Scientists have been calibrating AIA with sounding rockets since SDO's launch. Sounding rockets are smaller rockets that carry a few instruments and make brief flights into space, usually around 15 minutes. Importantly, sounding rockets fly above most of Earth's atmosphere, allowing the instruments on board to view the ultraviolet wavelengths measured by AIA, which are largely absorbed before reaching the ground.
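To illustrate the idea behind this kind of calibration, here is a minimal sketch (not NASA's actual pipeline): if a calibration flight tells you a detector channel has dimmed to some fraction of its original sensitivity, the observed counts can be divided by that fraction to recover corrected values. The function name and the numbers below are hypothetical, chosen only for illustration.

```python
# Hypothetical sketch of a degradation correction: divide observed
# counts by a wavelength-specific degradation factor (1.0 = like new,
# values below 1.0 mean the detector has dimmed).

def correct_degradation(observed, degradation_factor):
    """Return calibrated intensities given a degradation factor in (0, 1]."""
    if not 0.0 < degradation_factor <= 1.0:
        raise ValueError("degradation factor must be in (0, 1]")
    return [pixel / degradation_factor for pixel in observed]

# Example: a detector that has dimmed to 80% of its launch sensitivity.
observed_row = [40.0, 80.0, 120.0]   # raw counts along one image row
calibrated = correct_degradation(observed_row, 0.8)
print(calibrated)  # [50.0, 100.0, 150.0]
```

In practice the degradation factor varies per wavelength channel and over time, which is why periodic calibration flights, or a model trained to predict the factor, are needed.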
