How Artificial Intelligence Can Find the Source of Gamma-Ray Bursts

Gamma-ray bursts (GRBs) are powerful flashes of energetic gamma rays lasting from less than a second to several minutes. They release a tremendous amount of energy in this short time, making them the most powerful events in the Universe. They are thought to be mostly associated with the explosion of stars that collapse into black holes. In the explosion, two jets of very fast-moving material are ejected, as depicted in this artist’s illustration. If a jet happens to be aimed at Earth, we see a brief but powerful gamma-ray burst. Credit: ESO/A. Roquette

Gamma-ray bursts come in two main flavors, short and long. While astronomers believe that they understand what causes these two kinds of bursts, there is still significant overlap between them. A team of researchers has proposed a new way to classify gamma-ray bursts with the aid of machine-learning algorithms. This new classification scheme will help astronomers better understand these enigmatic explosions.
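The article doesn't say which algorithm the team used, but a classic starting point for machine classification of GRBs is clustering their durations: short and long bursts form two overlapping populations in log duration (the so-called T90). Here is a minimal sketch of that idea using synthetic durations and scikit-learn's Gaussian mixture model — all numbers are illustrative, not real catalog values:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic log10(T90) durations: short bursts (~0.3 s) and long bursts (~30 s)
short = rng.normal(loc=-0.5, scale=0.4, size=200)
long_ = rng.normal(loc=1.5, scale=0.4, size=600)
log_t90 = np.concatenate([short, long_]).reshape(-1, 1)

# Fit a two-component Gaussian mixture to recover the short/long classes
gmm = GaussianMixture(n_components=2, random_state=0).fit(log_t90)
labels = gmm.predict(log_t90)

print("mean log10(T90) per class:", sorted(gmm.means_.ravel()))
```

With real catalogs, adding a second feature such as spectral hardness tends to separate the two classes more cleanly than duration alone — which is exactly where more sophisticated machine-learning schemes come in.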

Continue reading “How Artificial Intelligence Can Find the Source of Gamma-Ray Bursts”

A Computer Algorithm is 88% Accurate in Finding Gravitational Lenses

Pictures of gravitational lenses from the AGEL survey. Credit: ARC Centre of Excellence for All Sky Astrophysics in 3-Dimensions (ASTRO3D) and the University of NSW (UNSW).

Astronomers have been assessing a new machine-learning algorithm to determine how reliable it is at finding gravitational lenses hidden in images from all-sky surveys. This type of AI was used to find about 5,000 potential gravitational lenses, which then needed to be confirmed. Using spectroscopy for confirmation, the international team has now determined the technique has a whopping 88% success rate, which means this new tool could be used to find thousands more of these magical quirks of physics.

Continue reading “A Computer Algorithm is 88% Accurate in Finding Gravitational Lenses”

A Machine-Learning Algorithm Just Found 301 Additional Planets in Kepler Data

Artist's concept of the Kepler mission with Earth in the background. Credit: NASA/JPL-Caltech

Looking to the future, astronomers are excited to see how machine learning – a.k.a. deep learning and artificial intelligence (AI) – will enhance surveys. One field that is already benefiting is the search for extrasolar planets, where researchers rely on machine-learning algorithms to distinguish faint signals from background noise. As this field continues to transition from discovery to characterization, the role of machine intelligence is likely to become even more critical.

Take the Kepler Space Telescope, which accounted for 2,879 confirmed discoveries (out of the 4,575 exoplanets confirmed to date) during its nearly ten years of service. After examining the data collected by Kepler using a new deep-learning neural network called ExoMiner, a research team at NASA’s Ames Research Center was able to detect 301 more planetary signals and add them to the growing census of exoplanets.
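ExoMiner itself is a deep neural network trained on vetted Kepler data, but the underlying problem — is a periodic dip in brightness real, or just noise? — can be illustrated far more simply. The sketch below injects a box-shaped transit into a synthetic light curve and measures its signal-to-noise ratio; every number here is made up for illustration and has nothing to do with ExoMiner's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
time = np.linspace(0.0, 50.0, n)           # days of observation
flux = 1.0 + rng.normal(0.0, 2e-4, n)      # noisy flat baseline

# Inject a box-shaped transit: period 5 d, duration 0.2 d, depth 500 ppm
period, duration, depth = 5.0, 0.2, 5e-4
phase = time % period
in_transit = phase < duration
flux[in_transit] -= depth

# Detect: compare mean in-transit flux to out-of-transit, in units of noise
signal = flux[~in_transit].mean() - flux[in_transit].mean()
noise = flux[~in_transit].std() / np.sqrt(in_transit.sum())
snr = signal / noise
print(f"measured depth ~{signal * 1e6:.0f} ppm, SNR ~{snr:.1f}")
```

A real vetting pipeline has to reject the many astrophysical and instrumental false positives that mimic this signature — distinguishing those cases is what the neural network is for.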

Continue reading “A Machine-Learning Algorithm Just Found 301 Additional Planets in Kepler Data”

NASA’s Perseverance Rover: The Most Ambitious Space Mission Ever?

Artist's impression of the Perseverance rover on Mars. Credit: NASA-JPL

When it comes to Mars exploration, NASA has had more success than any other agency. This week, they’ll attempt to land another sophisticated rover on the Martian surface to continue the search for evidence of ancient life. The Mars Perseverance rover will land on Mars on Thursday, February 18th, and it’s bringing some very ambitious technologies with it.

Continue reading “NASA’s Perseverance Rover: The Most Ambitious Space Mission Ever?”

Machine Learning Software is Now Doing the Exhausting Task of Counting Craters On Mars

The tiny black speck in the lower left corner of this image within the red circle is a cluster of recently formed craters spotted on Mars using a new machine-learning algorithm. This image was taken by the Context Camera aboard NASA's Mars Reconnaissance Orbiter in a region called Noctis Fossae, located at latitude -3.213, longitude 259.415. Image Credit: NASA/JPL-Caltech/MSSS

Does the life of an astronomer or planetary scientist seem exciting?

Sitting in an observatory, sipping warm cocoa, with high-tech tools at your disposal as you work diligently, surfing along on the wavefront of human knowledge, surrounded by fine, bright people. Then one day—Eureka!—all your hard work and the work of your colleagues pays off, and you deliver to humanity a critical piece of knowledge. A chunk of knowledge that settles a scientific debate, or that ties a nice bow on a burgeoning theory, bringing it all together. Conferences…tenure…Nobel Prize?

Well, maybe in your first year of university you might imagine something like that. But science is work. And as we all know, not every minute of one’s working life is super-exciting and gratifying.

Sometimes it can be dull and repetitious.

Continue reading “Machine Learning Software is Now Doing the Exhausting Task of Counting Craters On Mars”

AI Upscales Apollo Lunar Footage to 60 FPS

Mosaic of 16mm footage from the Apollo 16 mission. Original footage credit: NASA. Mosaic by Niels/ DutchSteamMachine.

As exciting and thrilling as it is to watch all the historic footage from the Apollo Moon landings, you have to admit, the quality is sometimes not all that great. Even though NASA has worked on restoring and enhancing some of the most popular Apollo footage, some of it is still grainy or blurry — which is indicative of the video technology available in the 1960s.

But now, new developments in artificial intelligence have come to the rescue, providing viewers a nearly brand new experience in watching historic Apollo video.

A photo and film restoration specialist, who goes by the name of DutchSteamMachine, has worked some AI magic to enhance original Apollo film, creating strikingly clear and vivid video clips and images.
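The "AI magic" here is frame interpolation: a model synthesizes brand-new in-between frames so that 12 fps film plays smoothly at 60 fps. Learned interpolators estimate per-pixel motion between frames; the naive alternative, shown below purely to make the idea concrete, simply cross-fades between neighboring frames — which is why it produces ghosting on anything that moves, and why the learned approach is such an improvement:

```python
import numpy as np

def interpolate_frames(frames, factor=2):
    """Naively raise the frame rate by linearly blending consecutive
    frames. Learned interpolators instead estimate per-pixel motion,
    which avoids the ghosting this cross-fade creates on moving objects."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        for k in range(factor):
            t = k / factor
            out.append((1 - t) * a + t * b)  # cross-fade between a and b
    out.append(frames[-1])
    return out

# Two tiny grayscale "frames": doubling the frame rate of a 2-frame clip
f0 = np.zeros((4, 4))
f1 = np.ones((4, 4))
result = interpolate_frames([f0, f1], factor=2)
print(len(result), result[1][0, 0])  # the new middle frame is a 50/50 blend
```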

Continue reading “AI Upscales Apollo Lunar Footage to 60 FPS”

Rovers Will be Starting to Make Their Own Decisions About Where to Search for Life

An artist's illustration of the ExoMars/Rosalind Franklin rover on Mars. Image Credit: ESA/ATG medialab

We all know how exploration by rover works. The rover is directed to a location and told to take a sample. Then it subjects that sample to analysis and sends home the results. It’s been remarkably effective.

But it’s expensive and time-consuming to send all this data home. Will this way of doing things still work? Or can it be automated?

Continue reading “Rovers Will be Starting to Make Their Own Decisions About Where to Search for Life”

NASA Tests Water Powered Spacecraft in Orbit

Two cubesats communicated and then maneuvered towards one another in a recent technology demonstration. Image Credit: NASA

Picture two tissue-box-sized spacecraft orbiting Earth.

Then picture them communicating, and using a water-powered thruster to approach each other. If you can do that, then you’re up to speed on one of the activities of NASA’s Small Spacecraft Technology Program (SSTP). It’s all part of NASA’s effort to develop small spacecraft to serve its space exploration, science, space operations, and aeronautics endeavors.

Continue reading “NASA Tests Water Powered Spacecraft in Orbit”

Scientists are Using Artificial Intelligence to See Inside Stars Using Sound Waves

NASA's Solar Dynamics Observatory has captured images of a growing dark region on the surface of the Sun. Called a coronal hole, it produces high-speed solar winds that can disrupt satellite communications. Image: Solar Dynamics Observatory / NASA

How in the world could you possibly look inside a star? You could break out the scalpels and other tools of the surgical trade, but good luck getting within a few million kilometers of the surface before your skin melts off. The stars of our universe hide their secrets very well, but astronomers can outmatch their cleverness and have found ways to peer into their hearts using, of all things, sound waves.

Continue reading “Scientists are Using Artificial Intelligence to See Inside Stars Using Sound Waves”
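The technique is asteroseismology: stars ring with pressure waves, and the frequencies of those oscillations — visible as tiny periodic brightness changes — encode the density and structure of the interior. As a toy illustration (synthetic data, not a real analysis pipeline), here is how a single solar-like oscillation frequency can be pulled out of noisy photometry with a Fourier transform:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic brightness series: a 3 mHz oscillation (solar-like p-mode)
# buried in noise, sampled once per minute for one day
dt = 60.0                                  # sampling interval, seconds
t = np.arange(0, 86400, dt)
f_true = 3e-3                              # oscillation frequency, Hz
series = 2e-4 * np.sin(2 * np.pi * f_true * t) + rng.normal(0, 5e-4, t.size)

# In the power spectrum the oscillation stands out as a sharp peak
power = np.abs(np.fft.rfft(series)) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)
peak = freqs[np.argmax(power[1:]) + 1]     # skip the zero-frequency bin
print(f"recovered frequency: {peak * 1e3:.2f} mHz")
```

Real asteroseismic analysis fits whole forests of such peaks at once; the AI angle in the article is about inferring interior structure from those measured frequencies.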

Astronaut Scott Tingle Was Able To Control A Ground-Based Robot… From Space.

The artificially intelligent robot Justin cleans the solar panels in the simulated Martian landscape after being instructed to do so by American astronaut Scott Tingle aboard the ISS. Image: (DLR) German Aerospace Center (CC-BY 3.0)

If something called “Project METERON” sounds to you like a sinister project involving astronauts, robots, the International Space Station, and artificial intelligence, I don’t blame you. Because that’s what it is (except for the sinister part). In fact, the METERON project (Multi-Purpose End-to-End Robotic Operation Network) is not sinister at all, but a friendly collaboration between the European Space Agency (ESA) and the German Aerospace Center (DLR).

The idea behind the project is to place an artificially intelligent robot here on Earth under the direct control of an astronaut 400 km above the Earth, and to get the two to work together.

“Artificial intelligence allows the robot to perform many tasks independently, making us less susceptible to communication delays that would make continuous control more difficult at such a great distance.” – Neil Lii, DLR Project Manager.

On March 2nd, engineers at the DLR Institute of Robotics and Mechatronics set up the robot, called Justin, in a simulated Martian environment. Justin was given a simulated task to carry out, with as few instructions as possible. The maintenance of solar panels was the chosen task, since they’re common on landers and rovers, and since Mars can get kind of dusty.

Justin is a pretty cool looking robot. Image: (DLR) German Aerospace Center (CC-BY 3.0)

The first test of the METERON Project was done in August. But this latest test was more demanding for both the robot and the astronaut issuing the commands. The pair had worked together before, but since then, Justin was programmed with more abstract commands that the operator could choose from.

American astronaut Scott Tingle issued commands to Justin from a tablet aboard the ISS, and the same tablet also displayed what Justin was seeing. The human-robot team had practiced together before, but this test was designed to push the pair into more challenging tasks. Tingle had no advance knowledge of the tasks in the test, and no advance knowledge of Justin’s new capabilities. On board the ISS, Tingle quickly realized that the panels in the simulation down on Earth were dusty. They were also not pointed in the optimal direction.

This was a new situation for Tingle and for Justin, and Tingle had to choose from a range of commands on the tablet. The team on the ground monitored his choices. The level of complexity meant that Justin couldn’t just perform the task and report it completed; Tingle and the robot also had to estimate how clean the panels were after the cleaning.
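DLR hasn't published the actual command set, so the following is a purely hypothetical sketch of the "supervised autonomy" pattern the test describes: the operator picks an abstract task, the robot expands it into concrete steps on its own, and it reports an estimate (here, a dust level) back for the human to review. Every task name and number below is invented for illustration:

```python
# Hypothetical supervised-autonomy sketch: abstract tasks map to step
# sequences the robot executes without step-by-step teleoperation.
TASKS = {
    "inspect_panel":  ["navigate_to_panel", "capture_image", "assess_dust"],
    "clean_panel":    ["navigate_to_panel", "wipe_surface", "assess_dust"],
    "reorient_panel": ["navigate_to_panel", "adjust_tilt", "verify_angle"],
}

def execute(task, dust_before=0.6):
    """Run one abstract task and return a status report, including a
    rough cleanliness estimate the operator can review afterwards."""
    steps = TASKS[task]
    dust_after = dust_before * (0.2 if "wipe_surface" in steps else 1.0)
    return {"task": task, "steps_run": steps, "dust_level": dust_after}

report = execute("clean_panel")
print(report["steps_run"], report["dust_level"])
```

The design point is the division of labor: the human supplies judgment (which task, and whether the result is good enough), while the robot supplies the low-level execution — exactly what makes the scheme robust to long communication delays.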

“Our team closely observed how the astronaut accomplished these tasks, without being aware of these problems in advance and without any knowledge of the robot’s new capabilities,” says DLR engineer Daniel Leidner.

Streaks of dust or sand on NASA’s Mars rover Opportunity show what can happen to solar panels on the red planet. For any more permanent structures that we may put on Mars, an artificially intelligent maintenance robot under the control of an astronaut in orbit could be the perfect solution to the maintenance of solar panels. Credit: NASA/JPL-Caltech

The next test will take place in summer 2018 and will push the system even further. Justin will have an even more complex task before it, in this case selecting a component on behalf of the astronaut and installing it on the solar panels. The German ESA astronaut Alexander Gerst will be the operator.

If the whole point of this is not immediately clear to you, think Mars exploration. We have rovers and landers working on the surface of Mars to study the planet in increasing detail. And one day, humans will visit the planet. But right now, we’re restricted to surface craft being controlled from Earth.

What METERON and other endeavours like it are doing is developing robots that can do our work for us. But they’ll be smart robots that don’t need to be told every little thing. They are simply given a task and they go about doing it. And the humans issuing the commands could be in orbit around Mars, rather than being exposed to all the risks on the surface.

“Artificial intelligence allows the robot to perform many tasks independently, making us less susceptible to communication delays that would make continuous control more difficult at such a great distance,” explained Neil Lii, DLR Project Manager. “And we also reduce the workload of the astronaut, who can transfer tasks to the robot.” To do this, however, astronauts and robots must cooperate seamlessly and also complement one another.

These two images from the camera on NASA’s Mars Global Surveyor show the effect that a global dust storm has on Mars. On the left is a normal view of Mars, on the right is Mars obscured by the haze from a dust storm. Image: NASA/JPL/MSSS

That’s why these tests are important. Getting the astronaut and the robot to perform well together is critical.

“This is a significant step closer to a manned planetary mission with robotic support,” says Alin Albu-Schäffer, head of the DLR Institute of Robotics and Mechatronics. It’s expensive and risky to maintain a human presence on the surface of Mars. Why risk human life to perform tasks like cleaning solar panels?

“The astronaut would therefore not be exposed to the risk of landing, and we could use more robotic assistants to build and maintain infrastructure, for example, with limited human resources.” In this scenario, the robot would no longer simply be the extended arm of the astronaut: “It would be more like a partner on the ground.”