
AI Upscales Apollo Lunar Footage to 60 FPS

As exciting and thrilling as it is to watch all the historic footage from the Apollo Moon landings, you have to admit, the quality is sometimes not all that great. Even though NASA has worked on restoring and enhancing some of the most popular Apollo footage, some of it is still grainy or blurry, a reflection of the video technology available in the 1960s.

But now, new developments in artificial intelligence have come to the rescue, providing viewers a nearly brand new experience in watching historic Apollo video.

A photo and film restoration specialist, who goes by the name of DutchSteamMachine, has worked some AI magic to enhance original Apollo film, creating strikingly clear and vivid video clips and images.

“I really wanted to provide an experience on this old footage that has not been seen before,” he told Universe Today.

Take a look at this enhanced footage from an Apollo 16 lunar rover traverse with Charlie Duke and John Young, where footage originally shot at 12 frames per second (FPS) has been increased to 60 FPS:

Stunning, right? And I was blown away by the crisp view of the Moon’s surface in this enhanced view of Apollo 15’s landing site at Hadley Rille:

Or take a look at how clearly Neil Armstrong is visible in this enhanced version of the often-seen “first step” video from Apollo 11, taken by a 16mm film camera mounted inside the Lunar Module:

Wow, just incredible!

The AI that DutchSteamMachine uses is called Depth-Aware video frame INterpolation, or DAIN for short. This AI is open source, free, and constantly being developed and improved. Motion interpolation, or motion-compensated frame interpolation, is a form of video processing in which intermediate frames are generated between existing ones, making motion appear more fluid.
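To make the idea of frame interpolation concrete, here is a minimal sketch in Python. It only cross-fades between two frames by linear blending; DAIN itself is far more sophisticated (it estimates optical flow and scene depth so objects actually move between frames), but the basic principle of synthesizing frames that never existed is the same. The function name and the toy 2x2 "frames" are illustrative, not part of DutchSteamMachine's pipeline.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new):
    """Generate n_new intermediate frames between frame_a and frame_b
    by simple linear blending (cross-fading). Real depth-aware tools
    like DAIN warp pixels along estimated motion paths instead."""
    blended = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)            # blend weight toward frame_b
        frame = (1 - t) * frame_a + t * frame_b
        blended.append(frame.astype(frame_a.dtype))
    return blended

# Two tiny 2x2 grayscale "frames": black and mid-gray
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 100, dtype=np.uint8)
mids = interpolate_frames(a, b, 3)
print([int(f[0, 0]) for f in mids])  # [25, 50, 75]
```

Inserting three blended frames between each real pair like this would turn 12 FPS footage into 48 FPS; depth-aware methods do the same insertion, but with motion-correct pixels rather than a fade.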

“People have used the same AI programs to bring old film recordings from the 1900s back to life, in high definition and colour,” he said. “This technique seemed like a great thing to apply to much newer footage.”

But you may not be able to try this at home: it takes a powerful, high-end GPU (with special cooling fans!). DutchSteamMachine said that a video of just five minutes can take anywhere from 6 to 20 hours to process. But the results speak for themselves.

He explained how he does this work:

“First I set out to find the highest quality source videos, which I thankfully found as high-bitrate 720p video files,” he said. “So the quality problem was solved. It is important to start with the highest possible source and edit from there. However, most of the sequences shot were still very choppy. This is because to spare film and record for long periods of time, most of the rover footage was shot at 12, 6 or even 1 frame(s) per second. While people have previously tried to apply stabilization and/or types of frame-blending to ease this effect, I have never really been satisfied with it.”

DutchSteamMachine first tracks down the framerate the footage was shot at, which can usually be found in NASA documents or, as in the case of the Apollo 16 footage above, is announced by the astronauts when they turn the camera on.

“Unfortunately sometimes the framerate seems to be off or fluctuating, not always working as intended,” he said. “So the best way to find the framerate is to listen to landmarks the astronauts are talking about and match the footage to that.”

Want more details of the process? He explains more:

“I split the source file up into individual PNG frames, and input them to the AI together with the input framerate (1, 6, 12 or 24) and the desired output framerate by rate of interpolation (2x, 4x, 8x). The AI starts using my GPU and looks at two real, consecutive frames. Using algorithms, it analyzes movements of objects in the two frames and renders entirely new ones. With an interpolation rate of, for example, 5x, it renders 4 ‘fake’ frames between every 2 real frames. If footage was recorded at 12fps and the interpolation rate is set to 5x, the final framerate will be 60, meaning that with just 12 real frames it made 48 ‘fake’ frames. Everything is then exported back to a video and played back at 60fps with both the real and fake frames.

“Finally, I apply colour correction, as often the source files have a blue or orange tint to them. I synchronize the footage with audio and, if possible, also television and photos taken at the same time. Sometimes two 16mm cameras were running at the same time, so I can play those back next to each other.”
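The arithmetic in that explanation is easy to check. Here is a small sketch that, given a source framerate and an interpolation multiplier, reports the output framerate and how many synthetic frames the AI has to render per second of footage. The helper name is hypothetical, not something from DutchSteamMachine's actual pipeline.

```python
def interpolation_plan(in_fps, rate):
    """For a source framerate and an interpolation multiplier, return
    the output framerate and the number of AI-rendered ('fake') frames
    needed per second of footage (everything beyond the real frames)."""
    out_fps = in_fps * rate
    fake_per_second = out_fps - in_fps
    return out_fps, fake_per_second

# Cases like those mentioned in the article:
print(interpolation_plan(12, 5))   # (60, 48): 12 real + 48 fake = 60 fps
print(interpolation_plan(6, 10))   # 6 fps rover film pushed to 60 fps
print(interpolation_plan(1, 60))   # even 1 fps footage can reach 60 fps
```

This also shows why the slowest source footage is the hardest case: at 1 FPS, 59 of every 60 output frames are synthetic, so the AI has very little real motion to work from.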

Here’s a video he shared of his studio and his specialized equipment:

DutchSteamMachine does this work in his spare time, and posts it for free on his YouTube page. His tagline is “Preserving the past for the future,” and he also uses the same techniques to enhance old home video, images and slides.

“It’s great to read people’s reactions to my footage,” he said. “When people post things like ‘Wow! This is amazing! I have never seen this before!’, it keeps me going.”

If you’d like to support the amazing restoration and enhancement work that DutchSteamMachine is doing on the Apollo footage, here’s his Patreon page. By supporting his work, you’ll get extras, early access to and previews of upcoming work, and a chance to ask questions about the process.

And he’s planning to keep it all coming.

“I plan to improve tons of Apollo footage like this,” he said. “A lot more space and history-related footage is going to be published on my YT channel continuously.” He also has a Flickr page with more enhanced imagery.

Thanks to DutchSteamMachine for sharing the details of his work! More details at these links:

Patreon
YouTube
Flickr

Nancy Atkinson

Nancy has been with Universe Today since 2004. She is the author of a new book on the Apollo program, "Eight Years to the Moon," which shares the stories of 60 engineers and scientists who worked behind the scenes to make landing on the Moon possible. Her first book, "Incredible Stories from Space: A Behind-the-Scenes Look at the Missions Changing Our View of the Cosmos" tells the stories of those who work on NASA's robotic missions to explore the Solar System and beyond.
