How to Avoid ‘Bad Astrophotography:’ Advice from Thierry Legault

Take a look at the collection of images above. All are high-resolution astrophotos of different artificial satellites, taken by renowned astrophotographer Thierry Legault using one of his 10″ telescopes and a simple webcam. The images have been sharpened and enlarged so that it’s easy to see small structures on the satellites, such as antennas or solar panels.

Like this one, which is surely the Soyuz, with solar panels on each side:

Could this object be a Soyuz spacecraft? Credit: Thierry Legault

These are pretty awesome images….

…except Thierry and I are not telling the truth.

These images are not of satellites, but are all pictures of the star Vega.

What you have just seen is an example of what Legault calls “Bad Astrophotography,” a phrase he uses in homage to Phil Plait and his Bad Astronomy blog. Basically, this means that because of image artifacts or over-processing you can be fooled, intentionally or unintentionally, into seeing something that is not really there.

“In any raw image there is noise and if you process this image too strongly, the noise appears and some processing can transform the noise into something that looks like detail – but it is not detail,” said Legault.

So just like the images that have been touted as the Bigfoot on Mars, or even blurry pictures of supposed UFOs, sometimes astrophotos can look like something they are not.

“Many people are not aware that an image is not reality — it is a transformation of reality,” Legault told Universe Today, “and for any image that is taken under difficult conditions or close to the resolution limits of the telescope, the image is less and less reliable and reflects less and less of reality.”

Many things can cause problems in astrophotography:

  • atmospheric turbulence, which can distort images and even create false details or make real ones disappear
  • the unavoidable shaking of the telescope due to manual tracking, especially in satellite imaging
  • noise, the variation of brightness or color in images, due to the sensor and circuitry of a digital camera, or the diffraction of light in the telescope

These problems may be hard to avoid, depending on your equipment and level of skill. So what should an astrophotographer do?

“The solution for these issues is to be careful with processing,” Legault explained. “I’ve often said the best, most skilled person in image processing is not the one that knows all the possibilities of processing, but the person that knows when to stop processing an image.”


Over-processing — such as multiple smoothing, sharpening and enlargement operations, or layer transformations and combinations in Photoshop — can create false details in images.
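To see how repeated sharpening can conjure “detail” out of nothing, here is a minimal numpy sketch (not Legault’s actual workflow): a frame containing pure sensor noise is run through a crude unsharp mask several times, and its contrast grows with every pass, so random speckle begins to look like real structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# A "raw frame" that contains nothing but sensor noise -- no real object.
frame = rng.normal(loc=100.0, scale=5.0, size=(64, 64))

def sharpen(img, amount=2.0):
    """Crude unsharp mask: boost local contrast by subtracting a blurred copy."""
    # 3x3 box blur built from shifted averages (no SciPy needed).
    blur = sum(np.roll(np.roll(img, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return img + amount * (img - blur)

processed = frame.copy()
for _ in range(5):                      # repeated sharpening passes
    processed = sharpen(processed)

# The standard deviation of the pure-noise frame grows with every pass:
# what started as faint random speckle now looks like structured "detail".
print(f"raw frame std:       {frame.std():.1f}")
print(f"after 5 sharpenings: {processed.std():.1f}")
```

Nothing real was added between the first print and the second; only the noise was amplified, which is exactly the trap the article describes.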

The issues with the lead image in this article of all the “satellites” — the structures and the different colors you see — are mainly caused by atmospheric turbulence and noise in the raw images, combined with effects from the color sensor in the camera.

Atmospheric Turbulence

Think of how, when you look at a star low on the horizon with the naked eye, you see twinkling and sometimes even changes in color; atmospheric turbulence can definitely have an effect on colors.

The star Vega again, in a series of images put together as an animation: it appears to be a satellite in flight, showing variations in size and apparent rotation. But it is not. Credit: Thierry Legault.

“When you observe a star through a telescope at high magnification, it can become even more distorted,” Legault said. “You have spikes, distortions and changes in shape, and a star that is supposed to be a point or a disk, unfortunately, by turbulence is transformed into something that is completely distorted and can take many shapes.”

Equipment issues

Additionally, Legault said, combining the distortions with an effect from the camera’s color sensor, which uses a Bayer filter mosaic, can cause additional issues.

“For the sensor, you have pixels in groups of four: one red, one blue and two green, in a square,” Legault said, “and you can easily imagine that if the object is very small, such as a very small star, the light can fall on a red pixel and then the image can become red. Then the image of the star is distorted and you have some spikes that fall on a different color pixel.”
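The Bayer effect Legault describes can be sketched in a few lines of numpy (a toy model, not real camera firmware): a one-pixel “star” that happens to land on a red-filtered pixel deposits all of its light in the red channel, so the demosaiced star comes out red even though the real star is white.

```python
import numpy as np

# RGGB Bayer tile: each 2x2 block of pixels is [R G / G B].
bayer = np.array([["R", "G"],
                  ["G", "B"]])

# A tiny scene: a single bright star pixel on a dark background.
scene = np.zeros((4, 4))
scene[0, 0] = 1000.0                     # the star's light falls on one pixel

# Simulate the mosaic: each pixel records light only through its own filter.
colors = {"R": np.zeros_like(scene),
          "G": np.zeros_like(scene),
          "B": np.zeros_like(scene)}
for y in range(scene.shape[0]):
    for x in range(scene.shape[1]):
        f = bayer[y % 2, x % 2]          # which filter covers this pixel
        colors[f][y, x] = scene[y, x]

# Pixel (0, 0) sits under a red filter, so all the star's light lands
# in the red channel -- the reconstructed star will look red.
print("R channel total:", colors["R"].sum())
print("G channel total:", colors["G"].sum())
print("B channel total:", colors["B"].sum())
```

A star one seeing-blur wider would spill onto neighboring green and blue pixels and come out closer to white, which is why the effect is worst for objects near the resolution limit.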

And then the processing does the rest, transforming turbulence and camera artifacts into details that may look real, Legault said.

Legault recalled an amateur who, a few years ago, published an image of Saturn’s moon Titan.

“The image contained surface details and a sharp disk edge,” he said, “and looked quite convincing. But we all know that Titan is covered with an opaque and uniform atmosphere, and surface details can’t be seen. The details were actually only artifacts created from noise or other image defects by over-processing a poor resolution image with multiple upsizing, downsizing, sharpening and smoothing operations.”

What’s an amateur astrophotographer to do?

So, with more and more people doing astrophotography these days, how can they make sure that what they think they are seeing is real?

“There are solutions like combining raw images,” Legault said. “When you combine 10 or 20 or 100 raw images, you can decrease the noise and the image is more reliable and less distorted by turbulence.”
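The benefit of combining frames follows from simple statistics: independent noise averages down roughly as 1/√N. Here is a minimal numpy illustration (simulated frames, not Legault’s data), comparing the residual noise in one exposure against a stack of 100.

```python
import numpy as np

rng = np.random.default_rng(1)

# True signal: a faint blob on a flat background.
yy, xx = np.mgrid[0:32, 0:32]
signal = 50.0 + 200.0 * np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 8.0)

def raw_frame():
    """One simulated exposure: the signal plus independent sensor noise."""
    return signal + rng.normal(scale=20.0, size=signal.shape)

single = raw_frame()
stack = np.mean([raw_frame() for _ in range(100)], axis=0)

# Residual noise (difference from the true signal) drops roughly as
# 1/sqrt(N): averaging 100 frames cuts it by about a factor of 10.
print(f"noise in 1 frame:      {np.std(single - signal):5.1f}")
print(f"noise in 100-frame stack: {np.std(stack - signal):5.1f}")
```

Crucially, real detail survives the averaging while turbulence artifacts, which differ from frame to frame, wash out — which is the test applied to the shuttle images below.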

For example, take a look at the images of the space shuttle Discovery below. The two left images are consecutive single frames, processed by smoothing (noise reduction), sharpening (wavelets) and 3x enlargement.

The space shuttle Discovery imaged in orbit. Credit and copyright: Thierry Legault

The first and second images, although blurry, seem to show lots of very small details. But when they are compared with each other, or with a combination of the 27 best images of the series (on the right), only the larger structures turn out to be common to all of them.

“The bright line marked A is not real, it is an artifact likely caused by turbulence,” Legault said, “and if it were an image of the space station taken during an EVA, I could perhaps claim that this detail is an astronaut, but I would be wrong. The double dark spot marked B could be taken for windows on top of the cockpit of Discovery. But it is not real; if it were an image of the Space Station, I could claim that it’s the windows of the Cupola, but again I would be wrong. In C, the two parallel lines of the payload bay door are common to both images, but a comparison with the right image, which contains only real details, shows that they are not real and that they are probably a processing artifact.”

One of the drawbacks of color sensors is that there is more noise in the image, so the image is less reliable than with black and white sensors. This is the reason that deep sky cameras often use black and white sensors. And so for imaging satellites like the International Space Station, Legault uses a black and white camera.

“It is more reliable, and you don’t need a color camera because the space station is colorless, except for the solar panels,” Legault said. “In addition, the monochrome sensor is much more sensitive to light, by 3 or 4 times. More sensitive means you have less noise.”
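The sensitivity-to-noise link is back-of-the-envelope photon statistics: when shot noise dominates, SNR scales as the square root of the photons collected, so a sensor gathering roughly 4x the light gains about 2x in signal-to-noise. A tiny sketch with illustrative photon counts (the 400-photon figure is an assumption, not from the article):

```python
import numpy as np

# Photon counting: shot noise follows Poisson statistics, so
# SNR = mean / sqrt(mean) = sqrt(mean photons collected).
photons_color = 400                   # photons per pixel, color sensor (illustrative)
photons_mono = 4 * photons_color      # ~4x more sensitive: no color-filter losses

snr_color = np.sqrt(photons_color)
snr_mono = np.sqrt(photons_mono)

print(f"color SNR: {snr_color:.0f}")
print(f"mono SNR:  {snr_mono:.0f}")   # 4x the light gives 2x the SNR
```

That factor of two in SNR, frame after frame, is why deep-sky and ISS imagers favor monochrome sensors and add color separately if at all.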

Logical advice

Legault’s main advice is just to be logical about what you are seeing in both raw and processed images.

“You need to look at the whole image, the consistency of the whole image, and not just one detail,” he said. “If I take an image that I say has detail on Jupiter’s satellites, and on the same image I cannot even see the Great Red Spot on Jupiter, it doesn’t work – that is not possible. The image must have an overall consistency and include details of an object larger than the one that we are interested in. So, if we see an image where someone is supposed to have an astronaut and a module of the space station, and a larger module is not visible or is completely distorted, there is a problem.”

On March 7, 2011, the robotic arm on space shuttle Discovery was used for a last inspection of the protective tiles before landing on the STS-133 mission. Image credit and copyright: Thierry Legault

Another piece of advice is to compare your image to another image taken by someone else — another amateur astrophotographer, a professional or even a space agency.

“If you have a photo of the space shuttle or the space station, for example, you can compare it to a real photo and see if all the details are there,” Legault said.

And if you still have questions about what you are seeing on your own images, Legault also suggests posting your images on astronomy forums so you can get the analysis and insights of other amateur astrophotographers.

“So, there are solutions to make sure that details are real details,” Legault said, “and as you get used to observing raw images and processed images, it will become easier to understand if everything is real, if just a part is real, or if almost nothing is real.”

But Legault’s main advice is not to over-process your images. “Many amateurs take amazing, sharp images using gentle and reasonable processing, so that there are no artifacts.”

For more information and advice from Thierry Legault, see his website, especially the technical pages. Legault has written a detailed article for the March issue of Sky & Telescope on how to image the International Space Station.

You can also read our article on Legault’s astrophotography, published on March 1, 2012.

4 Replies to “How to Avoid ‘Bad Astrophotography:’ Advice from Thierry Legault”

  1. “In addition, the color sensor is much more sensitive…”
    That should read: “…the monochrome sensor is much more sensitive…”


  3. It’s surprising how much people take for granted when it comes to astrophotography and it’s good to know how much effort Thierry puts into making sure his pictures match reality as closely as possible. Great work Thierry, and thanks for another great article Nancy!
