
AI Upscales Apollo Lunar Footage to 60 FPS

As exciting and thrilling as it is to watch the historic footage from the Apollo Moon landings, you have to admit the quality is sometimes not all that great. Even though NASA has worked on restoring and enhancing some of the most popular Apollo footage, some of it remains grainy or blurry — a reflection of the limits of the video technology available in the 1960s.

But now, new developments in artificial intelligence have come to the rescue, giving viewers a nearly brand-new way to experience historic Apollo video.

A photo and film restoration specialist, who goes by the name of DutchSteamMachine, has worked some AI magic to enhance original Apollo film, creating strikingly clear and vivid video clips and images.

“I really wanted to provide an experience on this old footage that has not been seen before,” he told Universe Today.

Take a look at this enhanced footage of an Apollo 16 lunar rover traverse with Charlie Duke and John Young, where footage originally shot at 12 frames per second (FPS) has been increased to 60 FPS:

Stunning, right? And I was blown away by the crisp view of the Moon’s surface in this enhanced view of Apollo 15’s landing site at Hadley Rille:

Or take a look at how clearly Neil Armstrong is visible in this enhanced version of the often-seen “first step” video from Apollo 11, shot by a 16mm film camera inside the Lunar Module:

Wow, just incredible!

The AI that DutchSteamMachine uses is called Depth-Aware video frame INterpolation, or DAIN for short. The software is open source, free, and constantly being developed and improved. Motion interpolation, or motion-compensated frame interpolation, is a form of video processing in which intermediate frames are generated between existing ones, making motion appear more fluid and smoothing out the choppiness of low-frame-rate footage.
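To make the idea of frame interpolation concrete, here is a minimal sketch in Python. It uses a naive linear cross-fade between two frames rather than DAIN's depth-aware neural network; the function name and dummy frames are illustrative only:

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, factor):
    """Generate (factor - 1) in-between frames for one pair of real frames.

    Naive linear cross-fade: DAIN instead estimates object motion and
    scene depth with neural networks, but the principle of synthesizing
    intermediate frames between two real ones is the same.
    """
    frames = []
    for i in range(1, factor):
        t = i / factor          # blend weight: 0 = frame_a, 1 = frame_b
        frames.append((1 - t) * frame_a + t * frame_b)
    return frames

# Two dummy 2x2 grayscale frames, pixel values 0 and 60
a = np.zeros((2, 2))
b = np.full((2, 2), 60.0)
mid = interpolate_frames(a, b, 2)[0]   # halfway blend, all pixels 30.0
```

A real interpolator must handle occlusion and fast motion, which is why depth-aware networks such as DAIN produce far better results than a simple blend.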

“People have used the same AI programs to bring old film recordings from the 1900s back to life, in high definition and colour,” he said. “This technique seemed like a great thing to apply to much newer footage.”

But you may not be able to try this at home: it takes a powerful, high-end GPU (with special cooling fans!). DutchSteamMachine said that a video of just five minutes can take anywhere from 6 to 20 hours to complete. But the results speak for themselves.

He explained how he does this work:

“First I set out to find the highest quality source videos, which I thankfully found as high-bitrate 720p video files,” he said. “So the quality problem was solved. It is important to start with the highest possible source and edit from there. However, most of the sequences shot were still very choppy. This is because to spare film and record for long periods of time, most of the rover footage was shot at 12, 6 or even 1 frame(s) per second. While people have previously tried to apply stabilization and/or types of frame-blending to ease this effect, I have never really been satisfied with it.”

DutchSteamMachine first tries to find the frame rate the footage was shot at, which can usually be found in NASA documents or, as in the case of the Apollo 16 footage above, heard when the astronauts announce it as they turn the camera on.

“Unfortunately sometimes the framerate seems to be off or fluctuating, not always working as intended,” he said. “So the best way to find the framerate is to listen to landmarks the astronauts are talking about and match the footage to that.”

Want more details of the process? He explains more:

“I split the source file up into individual PNG frames and input them to the AI, together with the input frame rate (1, 6, 12 or 24) and the desired output frame rate by rate of interpolation (2x, 4x, 8x). The AI starts using my GPU and looks at two real, consecutive frames. Using algorithms, it analyzes the movements of objects in the two frames and renders entirely new ones. With an interpolation rate of, for example, 5x, it renders four ‘fake’ frames between every two real frames. If footage was recorded at 12 FPS and the interpolation rate is set to 5x, the final frame rate will be 60, meaning that from just 12 real frames it made 48 ‘fake’ frames. Everything is then exported back to a video and played back at 60 FPS with both the real and fake frames.

“Finally, I apply colour correction, as often the source files have a blue or orange tint to them. I synchronize the footage with audio and, if possible, also with television footage and photos taken at the same time. Sometimes two 16mm cameras were running at the same time, so I can play those back next to each other.”
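The frame bookkeeping in that description can be sketched in a few lines of Python; the helper name `upscale_plan` is illustrative, not part of any tool he uses:

```python
def upscale_plan(source_fps, factor):
    """Frame bookkeeping for one second of footage.

    Interpolating at `factor`x multiplies the frame rate: each real
    frame gains (factor - 1) synthesized neighbours.
    """
    real = source_fps          # real frames per second
    total = real * factor      # frames per second after interpolation
    return {"output_fps": total,
            "real_frames": real,
            "fake_frames": total - real}

# The example from the text: 12 FPS footage at 5x interpolation
plan = upscale_plan(12, 5)
# {'output_fps': 60, 'real_frames': 12, 'fake_frames': 48}
```

The same arithmetic shows why the 1 FPS rover sequences are the hardest case: reaching 60 FPS there means 59 of every 60 output frames are synthesized.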

Here’s a video he shared of his studio and his specialized equipment:

DutchSteamMachine does this work in his spare time and posts it for free on his YouTube page. His tagline is “Preserving the past for the future,” and he also uses the same techniques to enhance old home videos, images and slides.

“It’s great to read people’s reactions to my footage,” he said. “When people post things like, ‘Wow! This is amazing! I have never seen this before!’, it keeps me going.”

If you’d like to support the amazing restoration and enhancement work that DutchSteamMachine is doing on the Apollo footage, here’s his Patreon page. By supporting his work, you’ll get extras, early access, previews of upcoming work, and a chance to ask questions about the process.

And he’s planning to keep it all coming.

“I plan to improve tons of Apollo footage like this,” he said. “A lot more space and history-related footage is going to be published on my YT channel continuously.” He also has a Flickr page with more enhanced imagery.

Thanks to DutchSteamMachine for sharing the details of his work!


Nancy Atkinson

Nancy has been with Universe Today since 2004, and has published over 6,000 articles on space exploration, astronomy, science and technology. She is the author of two books: "Eight Years to the Moon: The History of the Apollo Missions" (2019), which shares the stories of 60 engineers and scientists who worked behind the scenes to make landing on the Moon possible; and "Incredible Stories from Space: A Behind-the-Scenes Look at the Missions Changing Our View of the Cosmos" (2016), which tells the stories of those who work on NASA's robotic missions to explore the Solar System and beyond.
