
AI Upscales Apollo Lunar Footage to 60 FPS

As exciting and thrilling as it is to watch all the historic footage from the Apollo Moon landings, you have to admit, the quality is sometimes not all that great. Even though NASA has worked on restoring and enhancing some of the most popular Apollo footage, some of it is still grainy or blurry — a limitation of the film and video technology available in the 1960s.

But now, new developments in artificial intelligence have come to the rescue, giving viewers a nearly brand-new experience of watching historic Apollo video.

A photo and film restoration specialist, who goes by the name of DutchSteamMachine, has worked some AI magic to enhance original Apollo film, creating strikingly clear and vivid video clips and images.

“I really wanted to provide an experience on this old footage that has not been seen before,” he told Universe Today.

Take a look at this enhanced footage of an Apollo 16 lunar rover traverse with Charlie Duke and John Young, where footage originally shot at 12 frames per second (fps) has been interpolated up to 60 fps:

Stunning, right? And I was blown away by the crisp view of the Moon’s surface in this enhanced view of Apollo 15’s landing site at Hadley Rille:

Or take a look at how clearly Neil Armstrong is visible in this enhanced version of the often-seen “first step” video from Apollo 11, taken by a 16mm film camera mounted inside the Lunar Module:

Wow, just incredible!

The AI that DutchSteamMachine uses is called Depth-Aware video frame INterpolation, or DAIN for short. DAIN is open source, free, and constantly being developed and improved upon. Motion interpolation, or motion-compensated frame interpolation, is a form of video processing in which intermediate frames are generated between existing ones, in an attempt to make the video more fluid and to smooth out choppy, low-framerate motion.
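To make the idea of frame interpolation concrete, here is a minimal sketch of the simplest possible version: generating in-between frames by linearly blending two neighboring frames. To be clear, DAIN itself uses neural networks that estimate depth and motion to warp pixels along their actual trajectories; this naive cross-fade is only a toy illustration of what “synthesizing intermediate frames” means, not DutchSteamMachine’s method.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_intermediate):
    """Generate n_intermediate frames between two frames by linear
    blending (cross-fade). Real interpolators like DAIN instead track
    object motion and depth; this toy version only illustrates the
    concept of in-between frame synthesis."""
    frames = []
    for i in range(1, n_intermediate + 1):
        t = i / (n_intermediate + 1)          # blend weight, 0 < t < 1
        frames.append((1 - t) * frame_a + t * frame_b)
    return frames

# Two tiny 2x2 arrays standing in for consecutive video frames.
a = np.zeros((2, 2))
b = np.full((2, 2), 100.0)
mid = interpolate_frames(a, b, 4)   # 4 "fake" frames: 5x the frame count
```

With 4 synthesized frames between each real pair, 12 fps footage would play back at 60 fps — the same 5x ratio described below, just computed far more crudely than DAIN does.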

“People have used the same AI programs to bring old film recordings from the 1900s back to life, in high definition and colour,” he said. “This technique seemed like a great thing to apply to much newer footage.”

But you may not be able to try this at home: it takes a powerful, high-end GPU (with special cooling fans!). DutchSteamMachine said that a video of just five minutes can take anywhere from 6 to 20 hours to process. But the results speak for themselves.

He explained how he does this work:

“First I set out to find the highest quality source videos, which I thankfully found as high-bitrate 720p video files,” he said. “So the quality problem was solved. It is important to start with the highest possible source and edit from there. However, most of the sequences shot were still very choppy. This is because to spare film and record for long periods of time, most of the rover footage was shot at 12, 6 or even 1 frame(s) per second. While people have previously tried to apply stabilization and/or types of frame-blending to ease this effect, I have never really been satisfied with it.”

DutchSteamMachine first determines what framerate the footage was shot at, which can usually be found in NASA documents, or, as in the case of the Apollo 16 footage above, the astronauts announce it when they turn the camera on.

“Unfortunately sometimes the framerate seems to be off or fluctuating, not always working as intended,” he said. “So the best way to find the framerate is to listen to landmarks the astronauts are talking about and match the footage to that.”

Want more details of the process? He explains more:

I split the source file up into individual PNG frames and input them to the AI together with the input framerate (1, 6, 12 or 24) and the desired rate of interpolation (2x, 4x, 8x), which sets the output framerate. The AI starts using my GPU and looks at two real, consecutive frames. Using algorithms, it analyzes movements of objects in the two frames and renders entirely new ones. With an interpolation rate of, for example, 5x, it is able to render five ‘fake’ frames from just two real frames. If footage was recorded at 12 fps and the interpolation rate is set to 5x, the final framerate will be 60, meaning that from just 12 real frames it made 48 ‘fake’ frames. Everything is then exported back to a video and played back at 60 fps with both the real and fake frames.

Finally, I apply colour correction, as often the source files have a blue or orange tint to them. I synchronize the footage with audio and, if possible, also with television and photos taken at the same time. Sometimes two 16mm cameras were running at the same time, so I can play those back next to each other.
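The frame bookkeeping he describes is simple arithmetic, and a tiny helper makes it easy to check: at a 5x interpolation rate, each second of 12 fps footage keeps its 12 real frames and gains 48 synthesized ones, for 60 fps total. The function name and return format here are just illustrative choices, not anything from his workflow.

```python
def interpolation_summary(source_fps, factor, duration_s=1):
    """Frame counts for motion interpolation, per the description
    above: output framerate is source_fps * factor, and every frame
    beyond the real ones is a synthesized ('fake') in-between frame."""
    real = source_fps * duration_s
    total = real * factor
    return {
        "output_fps": source_fps * factor,
        "real_frames": real,
        "fake_frames": total - real,
    }

interpolation_summary(12, 5)
# {'output_fps': 60, 'real_frames': 12, 'fake_frames': 48}
```

The same arithmetic explains why the very slowest footage (1 fps, used to conserve film on long rover drives) needs aggressive interpolation factors to reach smooth playback rates.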

Here’s a video he shared of his studio and his specialized equipment:

DutchSteamMachine does this work in his spare time, and posts it for free on his YouTube channel. His tagline is “Preserving the past for the future,” and he also uses the same techniques to enhance old home video, images and slides.

“It’s great to read people’s reactions to my footage,” he said. “When people post things like, ‘Wow! This is amazing! I have never seen this before!’ it keeps me going.”

If you’d like to support the amazing restoration and enhancement work that DutchSteamMachine is doing on the Apollo footage, here’s his Patreon page. By supporting his work, you’ll get extras, early access and previews of upcoming work, and a chance to ask questions about the process.

And he’s planning to keep it all coming.

“I plan to improve tons of Apollo footage like this,” he said. “A lot more space and history-related footage is going to be published on my YT channel continuously.” He also has a Flickr page with more enhanced imagery.

Thanks to DutchSteamMachine for sharing the details of his work! More details at these links:

Patreon
YouTube
Flickr

Nancy Atkinson

Nancy has been with Universe Today since 2004, and has published over 6,000 articles on space exploration, astronomy, science and technology. She is the author of two books: "Eight Years to the Moon: the History of the Apollo Missions" (2019), which shares the stories of 60 engineers and scientists who worked behind the scenes to make landing on the Moon possible; and "Incredible Stories from Space: A Behind-the-Scenes Look at the Missions Changing Our View of the Cosmos" (2016), which tells the stories of those who work on NASA's robotic missions to explore the Solar System and beyond. Follow Nancy on Twitter at https://twitter.com/Nancy_A and Instagram at https://www.instagram.com/nancyatkinson_ut/
