NASA uses AI to get better pictures of the sun, as space telescopes can be damaged if they stare at it

  • NASA uses artificial intelligence to better see the sun
  • It uses machine learning to calibrate the Atmospheric Imaging Assembly, the imaging instrument on its Solar Dynamics Observatory
  • This lets NASA keep taking accurate photos while correcting for the degrading effects of solar particles and strong sunlight
  • Launched in 2010, the SDO has taken millions of photos of the sun

The sun may be the most powerful source of energy in our solar system, but NASA researchers are using artificial intelligence to get a better look at the giant ball of gas.

The US space agency uses machine learning on its solar telescopes, including the Solar Dynamics Observatory (SDO), which launched in 2010, and the observatory's Atmospheric Imaging Assembly (AIA), an imaging instrument that looks at the sun constantly.

This allows the agency to take incredible photos of the celestial giant while mitigating the effects of solar particles and “intense sunlight,” which degrade its lenses and sensors over time.

NASA uses artificial intelligence to better see the sun and protect its instruments from solar particles and constant sunlight

The machine learning is used on the Solar Dynamics Observatory and its Atmospheric Imaging Assembly imaging instrument (pictured), which constantly looks at the sun

This slider shows the sun as seen by AIA in 304 Angstrom light in 2021 before degradation correction (left) and with corrections from a sounding rocket calibration (right)

WHAT CAUSES THE SOLAR CYCLE?

The sun goes through an 11-year cycle in which it swings from very active to less active.

The cycle is tracked by counting sunspots, and the sun is currently going through a quiet phase.

Researchers from Germany published a recent study claiming the cycle could be driven by the combined gravitational pull of Venus, Earth and Jupiter.

The three planets align in this way only once every 11.07 years, which is probably what sets the rhythm of the solar cycle, they say.

The tidal tug may disturb the sun's sloshing plasma by only one millimeter, but they claim this is enough to alter its motion on a larger scale.

Scientists used to use “sounding rockets” — small rockets that carry only a few instruments and take a 15-minute flight into space — to calibrate the AIA, but they can only be launched so many times.

Because AIA looks at the sun continuously, while a rocket flight can calibrate it only at a single moment, scientists had to come up with a new way to keep the telescope calibrated.

Enter machine learning.

“It’s also important for deep space missions, which won’t have the option of sounding rocket calibration,” Dr. Luiz Dos Santos, a solar physicist at NASA’s Goddard Space Flight Center and the study’s lead author, said in a statement.

“We’re tackling two problems at once.”

The scientists trained the algorithm to recognize solar structures and compare them with AIA data by feeding it images from the sounding rocket calibration flights and telling it how much calibration each one needed.

Once it had been fed enough of this data, the algorithm could determine how much calibration any image needed; it could also compare the same structures, such as a solar flare, across multiple wavelengths of light.

“This was the most important thing,” Dos Santos said. “Instead of just identifying it at the same wavelength, we identify structures across the wavelengths.”
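
In broad strokes, this kind of calibration can be framed as supervised regression: images with rocket-derived “ground truth” calibrations teach a model to predict a correction factor for each wavelength channel. The sketch below is a minimal illustration of that idea, not NASA's actual code; the network, the shapes and the synthetic data are all assumptions.

```python
# Minimal sketch (NOT NASA's code): learn to predict a per-wavelength
# degradation factor from multi-channel solar images, using
# rocket-calibrated images as ground truth. All names, shapes and the
# synthetic data below are illustrative assumptions.
import torch
import torch.nn as nn

N_CHANNELS = 7   # stand-in for AIA's EUV wavelength bands (assumption)
IMG_SIZE = 64    # heavily downsampled toy image size

class DegradationNet(nn.Module):
    """Tiny CNN that outputs one calibration factor per channel."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(N_CHANNELS, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pooling
        )
        self.head = nn.Linear(32, N_CHANNELS)  # one factor per channel

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Synthetic stand-ins for rocket-calibrated training pairs.
images = torch.rand(128, N_CHANNELS, IMG_SIZE, IMG_SIZE)
true_factors = 0.5 + 0.5 * torch.rand(128, N_CHANNELS)  # fake "ground truth"

model = DegradationNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):  # a few toy training steps
    optimizer.zero_grad()
    loss = loss_fn(model(images), true_factors)
    loss.backward()
    optimizer.step()
```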

The algorithm is also able to compare different structures across multiple wavelengths of light, such as a solar flare

This allows researchers to consistently calibrate AIA’s images and improve data accuracy.
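
Applying such a correction is then simple in principle: each wavelength channel of a raw observation is divided by its predicted degradation factor. Again, this is a toy illustration with made-up numbers, not the mission's real pipeline.

```python
# Illustrative correction step (assumption): divide each channel of a
# raw observation by its predicted degradation factor to recover the
# intensities a pristine sensor would have recorded.
import numpy as np

raw = np.random.rand(7, 64, 64)  # toy 7-channel observation
predicted_factors = np.array([0.9, 0.8, 0.95, 0.7, 0.85, 0.9, 0.75])
corrected = raw / predicted_factors[:, None, None]  # broadcast per channel
```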

Their approach could be “adapted to other imaging or spectral instruments operating at other wavelengths,” according to the study’s summary.

The study was published in the journal Astronomy & Astrophysics in April 2021.

Last June, the US agency released a 10-year time-lapse of the sun to mark the 10th anniversary of the SDO in space.

The space agency has a library of the SDO’s biggest shots in the past decade, including strange plasma tornadoes in 2012 and dark spots called “coronal holes,” where extreme ultraviolet emissions are low.

WHAT IS NASA’S SOLAR DYNAMICS OBSERVATORY SATELLITE?

The Solar Dynamics Observatory (SDO) is a NASA mission that has been observing the sun since 2010.

Its ultra-HD cameras convert wavelengths of light that are invisible to the human eye into images people can see; the light is then colorized into a rainbow of colors.
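
As a rough sketch of that colorizing step (a generic technique, not NASA's actual processing pipeline), a single-wavelength intensity map can be pushed through a color map to produce a viewable image:

```python
# Toy false-color step (assumption): map one invisible-wavelength
# intensity image onto colors humans can see.
import numpy as np
import matplotlib.pyplot as plt

intensity = np.random.rand(64, 64)      # stand-in for one EUV channel
rgb = plt.cm.inferno(intensity)         # colormap returns an RGBA image
plt.imsave("sun_false_color.png", rgb)  # save the colorized result
```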

The satellite was launched on February 11, 2010 from Cape Canaveral.

The SDO contains a suite of instruments that provide observations that will lead to a more complete understanding of the solar dynamics driving the variability in the Earth’s environment.

One of the many incredible images provided by the SDO

Among the tasks this suite of instruments can accomplish are measuring the sun’s ultraviolet output, tracking variations in its magnetic field, imaging the chromosphere and inner corona, and recording solar variations across the different timescales of a solar cycle.

It does this using three separate instruments: the Helioseismic and Magnetic Imager, the Atmospheric Imaging Assembly and the Extreme Ultraviolet Variability Experiment.

Science teams receive this data, which they then process, analyse, archive and release to the public.
