I Can’t Stop Watching This Amazing Animation from Comet 67P

A single frame from the animation created by Twitter user landru79. The images were taken by the Rosetta spacecraft at comet 67P on June 1st, 2016. Credit: European Space Agency (ESA)

The European Space Agency’s Rosetta mission was an ambitious one. As the first-ever space probe to rendezvous with and then orbit a comet, Rosetta and its lander (Philae) revealed a great deal about comet 67P/Churyumov-Gerasimenko. In addition to learning about the comet’s shape, composition and tail, the mission also captured some incredible images of the comet’s surface before it ended.

For instance, Rosetta took a series of images on June 1st, 2016, that show what looks like a blizzard on the comet’s surface. Using these raw images (which were posted on March 22nd, 2018), Twitter user landru79 created an eye-popping video that shows just what it would be like to stand on the comet’s surface. As you can see, it’s like standing in a blizzard on Earth, though scientists have indicated that it’s a little more complicated than that.

The video, which consists of 25 minutes worth of images taken by Rosetta’s Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS), was posted by landru79 on April 23rd, 2018. It shows the surface of 67P/Churyumov-Gerasimenko on a loop, which lends it the appearance of panning across the surface in the middle of a snowstorm.

However, according to the ESA, the effect is likely caused by three separate phenomena. For instance, the snow-like particles seen in the video are thought to be a combination of dust from the comet itself and high-energy particles striking the camera. Because OSIRIS’ charge-coupled device (CCD) is sensitive to radiation, even particles that would otherwise be invisible appear as bright streaks when they pass in front of it.

As for the white specks in the background, those are stars belonging to the Canis Major constellation (according to ESA senior advisor Mark McCaughrean). Since originally posting the video, landru79 has posted another GIF on Twitter (see below) that freezes the starfield in place. This makes it clearer that the comet is moving while the stars remain still (at least, relative to the camera’s point of view).

And of course, the entire video has been sped up considerably for dramatic effect. According to a follow-up tweet posted by landru79, the first image was shot on June 1st, 2016 at 3.981 seconds past 17:00 (UTC) while the last one was shot at 170.17 seconds past 17:25.

Still, one cannot deny that the video is captivating, and it draws attention to what the Rosetta mission accomplished. The mission launched in 2004 and reached 67P/Churyumov-Gerasimenko in 2014. After two years of gathering data, it was deliberately crashed onto the comet’s surface in 2016. And yet, years later, what it revealed is still captivating people all over the world.

Further Reading: Live Science, Gizmodo

Facial Recognition Deep Learning Software is Surprisingly Good at Identifying Galaxies Too

Evolution diagram of a galaxy. First the galaxy is dominated by the disk component (left) but active star formation occurs in the huge dust and gas cloud at the center of the galaxy (center). Then the galaxy is dominated by the stellar bulge and becomes an elliptical or lenticular galaxy. Credit: NAOJ

A lot of attention has been dedicated to the machine learning technique known as “deep learning”, where computers are capable of discerning patterns in data without being specifically programmed to do so. In recent years, this technique has been applied to a number of applications, which include voice and facial recognition for social media platforms like Facebook.

However, astronomers are also benefiting from deep learning, which is helping them to analyze images of galaxies and understand how they form and evolve. In a new study, a team of international researchers used a deep learning algorithm to analyze images of galaxies from the Hubble Space Telescope. This method proved effective at classifying these galaxies based on what stage they were in their evolution.

The study, titled “Deep Learning Identifies High-z Galaxies in a Central Blue Nugget Phase in a Characteristic Mass Range”, recently appeared online and has been accepted for publication in the Astrophysical Journal. The study was led by Marc Huertas-Company of the University Paris Diderot and included members from the University of California Santa Cruz (UCSC), the Hebrew University, the Space Telescope Science Institute, the University of Pennsylvania, MINES ParisTech and Shanghai Normal University.

A ‘deep learning’ algorithm trained on images from cosmological simulations is surprisingly successful at classifying real galaxies in Hubble images. Credit: HST/CANDELS

Marc Huertas-Company had previously applied deep learning methods to Hubble data for the sake of galaxy classification. In collaboration with David Koo and Joel Primack, both professors emeriti at UC Santa Cruz (and with support from Google), Huertas-Company and the team spent the past two summers developing a neural network that could identify galaxies at different stages in their evolution.

“This project was just one of several ideas we had,” said Koo in a recent UCSC press release. “We wanted to pick a process that theorists can define clearly based on the simulations, and that has something to do with how a galaxy looks, then have the deep learning algorithm look for it in the observations. We’re just beginning to explore this new way of doing research. It’s a new way of melding theory and observations.”

For the sake of their study, the researchers used computer simulations to generate mock images of galaxies as they would look in observations by the Hubble Space Telescope. The mock images were used to train the deep learning neural network to recognize three key phases of galaxy evolution that had been previously identified in the simulations. The researchers then used the network to analyze a large set of actual Hubble images.
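The general recipe – train a convolutional neural network on labeled mock images, then apply it to real ones – can be sketched in a few lines of Python. The snippet below uses TensorFlow/Keras with made-up layer sizes, image dimensions, and random stand-in data; it illustrates the workflow, not the team’s actual architecture:

```python
import numpy as np
import tensorflow as tf

# Stand-in training set: 64x64 single-band mock galaxy images, each labeled
# with one of three evolutionary phases (in the study, these would be
# simulated Hubble-like images drawn from the VELA simulations).
x_train = np.random.rand(1000, 64, 64, 1).astype("float32")
y_train = np.random.randint(0, 3, size=1000)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # three evolutionary phases
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train on the mock images, then classify real Hubble cutouts the same way:
model.fit(x_train, y_train, epochs=5, batch_size=32)
# phases = model.predict(real_hubble_cutouts)
```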

As with previous images analyzed by Huertas-Company, these images were part of Hubble’s Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) – the largest project in the history of the Hubble Space Telescope. What they found was that the neural network’s classifications of simulated and real galaxies were remarkably consistent. As Joel Primack explained:

“We were not expecting it to be all that successful. I’m amazed at how powerful this is. We know the simulations have limitations, so we don’t want to make too strong a claim. But we don’t think this is just a lucky fluke.”

A spiral galaxy ablaze in the blue light of young stars from ongoing star formation (left) and an elliptical galaxy bathed in the red light of old stars (right). Credit: SDSS


The research team was especially interested in galaxies that have a small, dense, star-forming region known as a “blue nugget”. These regions occur early in the evolution of gas-rich galaxies, when big flows of gas into the center of a galaxy cause the formation of young stars that emit blue light. To simulate these and other types of galaxies, the team relied on state-of-the-art VELA simulations developed by Primack and an international team of collaborators.

In both the simulated and observational data, the computer program found that the “blue nugget” phase occurs only in galaxies with masses within a certain range. This was followed by star formation ending in the central region, leading to the compact “red nugget” phase, where the stars in the central region exit their main sequence phase and become red giants.

The consistency of the mass range was exciting because it indicated that the neural network was identifying a pattern that results from a key physical process in real galaxies – and without having to be specifically told to do so. As Koo indicated, this study is a big step forward for astronomy and AI, but a lot of research still needs to be done:

“The VELA simulations have had a lot of success in terms of helping us understand the CANDELS observations. Nobody has perfect simulations, though. As we continue this work, we will keep developing better simulations.”

Artist’s representation of an active galactic nucleus (AGN) at the center of a galaxy. Credit: NASA/CXC/M.Weiss

For instance, the team’s simulations did not include the role played by Active Galactic Nuclei (AGN). In larger galaxies, gas and dust are accreted onto a central Supermassive Black Hole (SMBH) at the core, which causes gas and radiation to be ejected in huge jets. Some recent studies have indicated how this may have an arresting effect on star formation in galaxies.

However, observations of distant, younger galaxies have shown evidence of the phenomenon observed in the team’s simulations, where gas-rich cores lead to the blue nugget phase. According to Koo, using deep learning to study galactic evolution has the potential to reveal previously undetected aspects of observational data. Instead of observing galaxies as snapshots in time, astronomers will be able to simulate how they evolve over billions of years.

“Deep learning looks for patterns, and the machine can see patterns that are so complex that we humans don’t see them,” he said. “We want to do a lot more testing of this approach, but in this proof-of-concept study, the machine seemed to successfully find in the data the different stages of galaxy evolution identified in the simulations.”

In the future, astronomers will have more observation data to analyze thanks to the deployment of next-generation telescopes like the Large Synoptic Survey Telescope (LSST), the James Webb Space Telescope (JWST), and the Wide-Field Infrared Survey Telescope (WFIRST). These telescopes will provide even more massive datasets, which can then be analyzed by machine learning methods to determine what patterns exist.

Astronomy and artificial intelligence, working together to better our understanding of the Universe. I wonder if we should put it on the task of finding a Theory of Everything (ToE) too!

Further Reading: UCSC, Astrophysical Journal

If We’re Searching for Earth 2.0, Would We Know It When We Find It?

Artist’s impression of how an Earth-like exoplanet might look. Credit: ESO

In the past few decades, there has been an explosion in the number of extra-solar planets that have been discovered. As of April 1st, 2018, a total of 3,758 exoplanets have been confirmed in 2,808 systems, with 627 systems having more than one planet. In addition to expanding our knowledge of the Universe, the purpose of this search has been to find evidence of life beyond our Solar System.

In the course of looking for habitable planets, astronomers have used Earth as a guiding example. But would we recognize a truly “Earth-like” planet if we saw one? This question was addressed in a recent paper by two professors, one of whom is an exoplanet-hunter and the other, an Earth science and astrobiology expert. Together, they consider what advances (past and future) will be key to the search for Earth 2.0.

The paper, titled “Earth as an Exoplanet”, recently appeared online. The study was conducted by Tyler D. Robinson, a former NASA Postdoctoral Fellow and an assistant professor at Northern Arizona University, and Christopher T. Reinhard – an assistant professor at the Georgia Institute of Technology’s School of Earth and Atmospheric Sciences.

Thanks to advances in technology and detection methods, astronomers have detected multiple Earth-like planets in our galaxy. Credit: NASA/JPL

For the sake of their study, Robinson and Reinhard focus on how the hunt for habitable and inhabited planets beyond our Solar System commonly focuses on Earth analogs. This is to be expected, since Earth is the only planet that we know of that can support life. As Professor Robinson told Universe Today via email:

“Earth is – currently! – our only example of a habitable and an inhabited world. Thus, when someone asks, “What will a habitable exoplanet look like?” or “What will a life-bearing exoplanet look like?”, our best option is to point to Earth and say, “Maybe it will look a lot like this.” While many studies have hypothesized other habitable planets (e.g., water-covered super-Earths), our leading example of a fully-functioning habitable planet will always be Earth.”

The authors therefore consider how observations made by spacecraft of the Solar System have led to the development of approaches for detecting signatures of habitability and life on other worlds. These include the Pioneer 10 and 11 missions and Voyager 1 and 2 spacecraft, which conducted flybys of many Solar System bodies during the 1970s.

These missions, which studied the planets and moons of the Solar System using photometry and spectroscopy, allowed scientists to learn a great deal about these bodies’ atmospheric chemistry, composition and meteorological patterns. Subsequent missions have added to this by revealing key details about the surface features and geological evolution of the Solar System’s planets and moons.

The “pale blue dot” of Earth captured by the Voyager 1 spacecraft on Feb 14th, 1990. Credit: NASA/JPL

In addition, the Galileo probe conducted flybys of Earth in December of 1990 and 1992, which provided planetary scientists with the first opportunity to analyze our planet using the same tools and techniques that had previously been applied throughout the Solar System. It was also the Voyager 1 probe that took a distant image of Earth, which Carl Sagan referred to as the “Pale Blue Dot” photo.

However, they also note that Earth’s atmosphere and surface environment have evolved considerably over the past 4.5 billion years. In fact, according to various atmospheric and geological models, Earth has passed through many phases that would be considered quite “alien” by today’s standards. These include Earth’s many ice ages and its earliest epochs, when Earth’s primordial atmosphere was the product of volcanic outgassing.

As Professor Robinson explained, this presents some complications when it comes to finding other examples of “Pale Blue Dots”:

“The key complication is being careful to not fall into the trap of thinking that Earth has always appeared the way it does today. So, our planet actually presents a huge array of options for what a habitable and/or inhabited planet might look like.”

In other words, our hunt for Earth analogs could reveal a plethora of worlds which are “Earth-like”, in the sense that they resemble a previous (or future) geological period of Earth. These include “Snowball Earths”, which would be covered by glacial sheets (but could still be life-bearing), or even what Earth looked like during the Hadean or Archean Eons, when oxygenic photosynthesis had not yet arisen.

Ice ages are characterized by a drop in average global temperatures, resulting in the expansion of ice sheets globally. Credit: NASA

This would also have implications when it comes to what kinds of life would be able to exist there. For instance, if the planet is still young and its atmosphere is still in its primordial state, life could be strictly microbial. However, if the planet is billions of years old and in an interglacial period, more complex life forms may have evolved and be roaming its surface.

Robinson and Reinhard go on to consider what future developments will aid in the spotting of “Pale Blue Dots”. These include next-generation telescopes like the James Webb Space Telescope (JWST) – scheduled for deployment in 2020 – and the Wide-Field Infrared Survey Telescope (WFIRST), which is currently under development. Other technologies include concepts like Starshade, which is intended to eliminate the glare of stars so that exoplanets can be directly imaged.

“Spotting true Pale Blue Dots – water-covered terrestrial worlds in the habitable zone of Sun-like stars – will require advancements in our ability to “directly image” exoplanets,” said Robinson. “Here, you use either optics inside the telescope or a futuristic-sounding “starshade” flying beyond the telescope to cancel out the light of a bright star thereby enabling you to see a faint planet orbiting that star. A number of different research groups, including some at NASA centers, are working to perfect these technologies.”

Once astronomers are able to image rocky exoplanets directly, they will at last be able to study their atmospheres in detail and place more accurate constraints on their potential habitability. Beyond that, there may come a day when we will be able to image the surfaces of these planets, either through extremely sensitive telescopes or spacecraft missions (such as Project Starshot).

Whether or not we find another “Pale Blue Dot” remains to be seen. But in the coming years, we may finally get a good idea of just how common (or rare) our world truly is.

Further Reading: arXiv

The DARKNESS Instrument Will Block Stars and Reveal Their Planets. 100 Million Times Fainter than the Star

The new DARKNESS camera developed by an international team of researchers will allow astronomers to directly study nearby exoplanets. Credit: Stanford/SRL

The hunt for planets beyond our Solar System has led to the discovery of thousands of candidates in the past few decades. Most of these have been gas giants, ranging in size from Super-Jupiters to Neptune-sized planets. However, several have also been determined to be “Earth-like” in nature, meaning that they are rocky and orbit within their stars’ respective habitable zones.

Unfortunately, determining what conditions might be like on their surfaces is difficult, since astronomers are unable to study these planets directly. Luckily, an international team led by UC Santa Barbara physicist Benjamin Mazin has developed a new instrument known as DARKNESS – the world’s largest and most sophisticated superconducting camera – which will allow astronomers to detect planets around nearby stars.

The team’s study, which details their instrument, is titled “DARKNESS: A Microwave Kinetic Inductance Detector Integral Field Spectrograph for High-contrast Astronomy” and recently appeared in the Publications of the Astronomical Society of the Pacific. The team was led by Benjamin Mazin, the Worster Chair in Experimental Physics at UCSB, and also includes members from NASA’s Jet Propulsion Laboratory, the California Institute of Technology, the Fermi National Accelerator Laboratory, and multiple universities.

The DARKNESS instrument is the world’s most advanced superconducting camera and will enable the detection of planets around the nearest stars. Credit: UCSB

Essentially, it is extremely difficult for scientists to study exoplanets directly because of the interference caused by their stars. As Mazin explained in a recent UCSB press release, “Taking a picture of an exoplanet is extremely challenging because the star is much brighter than the planet, and the planet is very close to the star.” As such, astronomers are often unable to analyze the light being reflected off of a planet’s atmosphere to determine its composition.

These studies would help place additional constraints on whether or not a planet is potentially habitable. At present, scientists are forced to determine if a planet could support life based on its size, mass, and distance from its star. In addition, studies have been conducted that have determined whether or not water exists on a planet’s surface based on how its atmosphere loses hydrogen to space.

The DARK-speckle Near-infrared Energy-resolved Superconducting Spectrophotometer (aka. DARKNESS), the first 10,000-pixel integral field spectrograph, seeks to correct this. In conjunction with a large telescope and adaptive optics, it uses Microwave Kinetic Inductance Detectors (MKIDs) to quickly measure the light coming from a distant star, then sends a signal back to a “rubber mirror” that can take on a new shape 2,000 times a second.
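In outline this is a high-speed closed feedback loop: read out the detector, estimate the residual wavefront error, and reshape the deformable (“rubber”) mirror, roughly every half millisecond. The toy Python sketch below shows the loop structure only; the sensor model, gain, and actuator count are invented placeholders, not the actual DARKNESS or PALM-3000 software:

```python
import numpy as np

N_ACTUATORS = 64     # placeholder number of deformable-mirror actuators
LOOP_RATE_HZ = 2000  # the mirror takes a new shape ~2,000 times per second
GAIN = 0.4           # placeholder integrator gain

mirror_shape = np.zeros(N_ACTUATORS)

def read_wavefront_error() -> np.ndarray:
    """Stand-in for a fast measurement of residual starlight speckles."""
    return np.random.normal(0.0, 1.0, N_ACTUATORS)

for step in range(LOOP_RATE_HZ):    # one simulated second of operation
    error = read_wavefront_error()  # measure the light from the star
    mirror_shape -= GAIN * error    # integrator: nudge the mirror to cancel it
    # apply_to_mirror(mirror_shape) # hardware call, omitted in this sketch
```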

MKIDs allow astronomers to determine the energy and arrival time of individual photons, which is important when it comes to distinguishing a planet from scattered or refracted light. This process also eliminates read noise and dark current – the primary sources of error in other instruments – and cleans up the atmospheric distortion by suppressing the starlight.

UCSB physicist Ben Mazin, who led the development of the DARKNESS camera. Credit: Sonia Fernandez

Mazin and his colleagues have been exploring MKIDs technology for years through the Mazin Lab, which is part of the UCSB’s Department of Physics. As Mazin explained:

“This technology will lower the contrast floor so that we can detect fainter planets. We hope to approach the photon noise limit, which will give us contrast ratios close to 10^-8, allowing us to see planets 100 million times fainter than the star. At those contrast levels, we can see some planets in reflected light, which opens up a whole new domain of planets to explore. The really exciting thing is that this is a technology pathfinder for the next generation of telescopes.”

DARKNESS is now operational on the 200-inch Hale Telescope at the Palomar Observatory near San Diego, California, where it is part of the PALM-3000 extreme adaptive optics system and the Stellar Double Coronagraph. During the past year and a half, the team has conducted four runs with the DARKNESS camera to test its contrast ratio and make sure it is working properly.

In May, the team will return to gather more data on nearby planets and demonstrate their progress. If all goes well, DARKNESS will become the first of many cameras designed to image planets around nearby M-type (red dwarf) stars, where many rocky planets have been discovered in recent years. The most notable example is Proxima b, which orbits the nearest star system to our own (Proxima Centauri, roughly 4.25 light years away).

The Palomar Observatory, where the DARKNESS camera is currently installed. Credit: IPTF/Palomar Observatory

“Our hope is that one day we will be able to build an instrument for the Thirty Meter Telescope planned for Mauna Kea on the island of Hawaii or La Palma,” Mazin said. “With that, we’ll be able to take pictures of planets in the habitable zones of nearby low mass stars and look for life in their atmospheres. That’s the long-term goal and this is an important step toward that.”

In addition to the study of nearby rocky planets, this technology will also allow astronomers to study pulsars in greater detail and determine the redshift of billions of galaxies, allowing for more accurate measurements of how fast the Universe is expanding. This, in turn, will allow for more detailed studies of how our Universe has evolved over time and the role played by Dark Energy.

These and other technologies, such as NASA’s proposed Starshade spacecraft and Stanford’s mDot occulter, will revolutionize exoplanet studies in the coming years. Paired with next-generation telescopes – such as the James Webb Space Telescope and the Transiting Exoplanet Survey Satellite (TESS), which recently launched – astronomers will not only be able to discover more in the way of exoplanets, but will be able to characterize them like never before.

Further Reading: UC Santa Barbara, Publications of the Astronomical Society of the Pacific

Did You Know the Earth Has a Second Magnetic Field? Its Oceans

The magnetic field and electric currents in and around Earth generate complex forces that have immeasurable impact on everyday life. Credit: ESA/ATG medialab

Earth’s magnetic field is one of the most mysterious features of our planet. It is also essential to life as we know it, ensuring that our atmosphere is not stripped away by solar wind and shielding life on Earth from harmful radiation. For some time, scientists have theorized that it is the result of a dynamo action in our core, where the liquid outer core revolves around the solid inner core and in the opposite direction of the Earth’s rotation.

In addition, Earth’s magnetic field is affected by other factors, such as magnetized rocks in the crust and the flow of the ocean. For this reason, the European Space Agency’s (ESA) Swarm satellites, which have been continually monitoring Earth’s magnetic field since their deployment, recently began monitoring Earth’s oceans – the first results of which were presented at this year’s European Geosciences Union meeting in Vienna, Austria.

The Swarm mission, which consists of three Earth-observation satellites, was launched in 2013 for the sake of providing high-precision and high-resolution measurements of Earth’s magnetic field. The purpose of this mission is not only to determine how Earth’s magnetic field is generated and changing, but also to allow us to learn more about Earth’s composition and interior processes.

Artist’s impression of the ESA’s Swarm satellites, which are designed to measure the magnetic signals from Earth’s core, mantle, crust, oceans, ionosphere and magnetosphere. Credit: ESA/AOES Medialab

Beyond this, another aim of the mission is to increase our knowledge of atmospheric processes and ocean circulation patterns that affect climate and weather. The ocean is also an important subject of study to the Swarm mission because of the small ways in which it contributes to Earth’s magnetic field. Basically, as the ocean’s salty water flows through Earth’s magnetic field, it generates an electric current that induces a magnetic signal.
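The mechanism is motional electromagnetic induction: seawater is conductive, so as the tides move it through the main geomagnetic field, electric currents flow and generate a small secondary field. In textbook notation (a standard sketch, not an equation quoted from the Swarm team):

$$ \mathbf{J} = \sigma\,(\mathbf{v} \times \mathbf{B}) $$

where J is the induced current density, σ is the conductivity of seawater, v is the tidal flow velocity, and B is the ambient geomagnetic field; those currents are the source of the faint tidal signal that Swarm measures.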

Because this field is so small, it is extremely difficult to measure. However, the Swarm mission has managed to do just that in remarkable detail. These results, which were presented at the EGU 2018 meeting, were turned into an animation (shown below), which shows how the tidal magnetic signal changes over a 24 hour period.

As you can see, the animation shows how the magnetic signal generated by Earth’s oceans changes over the course of the day, shifting from north to south and from deeper waters to shallower, coastal regions. These changes have a minute effect on Earth’s magnetic field, ranging from about 2.5 to -2.5 nanotesla. As Nils Olsen, from the Technical University of Denmark, explained in an ESA press release:

“We have used Swarm to measure the magnetic signals of tides from the ocean surface to the seabed, which gives us a truly global picture of how the ocean flows at all depths – and this is new. Since oceans absorb heat from the air, tracking how this heat is being distributed and stored, particularly at depth, is important for understanding our changing climate. In addition, because this tidal magnetic signal also induces a weak magnetic response deep under the seabed, these results will be used to learn more about the electrical properties of Earth’s lithosphere and upper mantle.”

By learning more about Earth’s magnetic field, scientists will be able to learn more about Earth’s internal processes, which are essential to life as we know it. This, in turn, will allow us to learn more about the kinds of geological processes that have shaped other planets, and to determine whether other planets could be capable of supporting life.

Be sure to check out this comic that explains how the Swarm mission works, courtesy of the ESA.

Further Reading: ESA

The Challenges of an Alien Spaceflight Program: Escaping Super Earths and Red Dwarf Stars

In a series of papers, Professor Loeb and Michael Hippke indicate that conventional rockets would have a hard time escaping from certain kinds of extra-solar planets. Credit: NASA/Tim Pyle

Since the beginning of the Space Age, humans have relied on chemical rockets to get into space. While this method is certainly effective, it is also very expensive and requires a considerable amount of resources. As we look to more efficient means of getting out into space, one has to wonder if similarly-advanced species on other planets (where conditions would be different) would rely on similar methods.

Harvard Professor Abraham Loeb and Michael Hippke, an independent researcher affiliated with the Sonneberg Observatory, both addressed this question in two recently released papers. Whereas Prof. Loeb looks at the challenges extra-terrestrials would face launching rockets from Proxima b, Hippke considers whether aliens living on a Super-Earth would be able to get into space.

The papers, titled “Interstellar Escape from Proxima b is Barely Possible with Chemical Rockets” and “Spaceflight from Super-Earths is difficult”, recently appeared online, and were authored by Prof. Loeb and Hippke, respectively. Whereas Loeb addresses the challenges of chemical rockets escaping Proxima b, Hippke considers whether the same rockets would be able to achieve escape velocity from a Super-Earth at all.

Artist’s impression of Proxima b, which was discovered using the Radial Velocity method. Credit: ESO/M. Kornmesser

For the sake of his study, Loeb considered how we humans are fortunate enough to live on a planet that is well-suited to space launches. Essentially, if a rocket is to escape from the Earth’s surface and reach space, it needs to achieve an escape velocity of 11.186 km/s (40,270 km/h; 25,020 mph). Similarly, the escape velocity needed to get away from Earth’s location in orbit around the Sun is about 42 km/s (151,200 km/h; 93,951 mph).
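Both of those numbers follow from the standard escape-velocity formula, shown here as a quick back-of-the-envelope check (the input masses and distances are textbook values, not figures from Loeb’s paper):

$$ v_{\mathrm{esc}} = \sqrt{\frac{2GM}{r}} $$

Plugging in Earth’s mass and radius gives roughly 11.2 km/s at the surface; using the Sun’s mass with r = 1 AU gives roughly 42 km/s for escape from Earth’s orbit.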

As Prof. Loeb told Universe Today via email:

“Chemical propulsion requires a fuel mass that grows exponentially with terminal speed. By a fortunate coincidence the escape speed from the orbit of the Earth around the Sun is at the limit of attainable speed by chemical rockets. But the habitable zone around fainter stars is closer in, making it much more challenging for chemical rockets to escape from the deeper gravitational pit there.”

As Loeb indicates in his essay, the escape speed scales as the square root of the stellar mass over the distance from the star, which implies that the escape speed from the habitable zone scales inversely with stellar mass to the power of one quarter. For planets like Earth, orbiting within the habitable zone of a G-type (yellow dwarf) star like our Sun, this works out quite well.
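That one-quarter power comes from combining two scalings, sketched below. The luminosity relation L ∝ M³ is a rough main-sequence approximation we are assuming for the illustration, not a figure quoted from the essay:

$$ v_{\mathrm{esc}} \propto \sqrt{\frac{M_*}{d}}, \qquad d_{\mathrm{HZ}} \propto \sqrt{L_*} \propto M_*^{3/2} \quad\Rightarrow\quad v_{\mathrm{esc}} \propto \left(\frac{M_*}{M_*^{3/2}}\right)^{1/2} = M_*^{-1/4} $$

For Proxima b (roughly 0.12 solar masses, orbiting about 20 times closer than Earth does the Sun), the first relation gives √(0.12 × 20) ≈ 1.55 – the roughly 50% increase over the 42 km/s figure that Loeb cites below.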

This infographic compares the orbit of the planet around Proxima Centauri (Proxima b) with the same region of the Solar System. Credit: Pale Red Dot

Unfortunately, this does not work well for terrestrial planets that orbit lower-mass M-type (red dwarf) stars. These stars are the most common type in the Universe, accounting for 75% of stars in the Milky Way Galaxy alone. In addition, recent exoplanet surveys have discovered a plethora of rocky planets orbiting red dwarf star systems, with some scientists venturing that they are the most likely place to find potentially-habitable rocky planets.

Using the nearest star to our own as an example (Proxima Centauri), Loeb explains how a rocket using chemical propellant would have a much harder time achieving escape velocity from a planet located within its habitable zone.

“The nearest star to the Sun, Proxima Centauri, is an example for a faint star with only 12% of the mass of the Sun,” he said. “A couple of years ago, it was discovered that this star has an Earth-size planet, Proxima b, in its habitable zone, which is 20 times closer than the separation of the Earth from the Sun. At that location, the escape speed is 50% larger than from the orbit of the Earth around the Sun. A civilization on Proxima b will find it difficult to escape from their location to interstellar space with chemical rockets.”

Hippke’s paper, on the other hand, begins by considering that Earth may in fact not be the most habitable type of planet in our Universe. For instance, planets that are more massive than Earth would have higher surface gravity, which means they would be able to hold onto a thicker atmosphere, which would provide greater shielding against harmful cosmic rays and solar radiation.

Artist’s impression of a Super-Earth, a class of planet that has many times the mass of Earth, but less than a Uranus- or Neptune-sized planet. Credit: NASA/Ames/JPL-Caltech

In addition, a planet with higher gravity would have a flatter topography, resulting in archipelagos instead of continents and shallower oceans – an ideal situation where biodiversity is concerned. However, when it comes to rocket launches, increased surface gravity would also mean a higher escape velocity. As Hippke indicated in his study:

“Rockets suffer from the Tsiolkovsky (1903) equation: if a rocket carries its own fuel, the ratio of total rocket mass versus final velocity is an exponential function, making high speeds (or heavy payloads) increasingly expensive.”
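In its usual form, the rocket equation relates the velocity change Δv to the exhaust velocity v_e and the ratio of initial (fueled) to final (dry) mass:

$$ \Delta v = v_e \ln\frac{m_0}{m_1} \quad\Longleftrightarrow\quad \frac{m_0}{m_1} = e^{\Delta v / v_e} $$

Because the mass ratio grows exponentially with Δv, a roughly 2.4-fold increase in required escape velocity becomes a roughly hundredfold increase in fuel, as the Kepler-20 b comparison below illustrates.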

For comparison, Hippke uses Kepler-20 b, a Super-Earth located 950 light years away that is 1.6 times Earth’s radius and 9.7 times its mass. Whereas escape velocity from Earth is roughly 11 km/s, a rocket attempting to leave a Super-Earth similar to Kepler-20 b would need to achieve an escape velocity of ~27.1 km/s. As a result, a single-stage rocket on Kepler-20 b would have to burn 104 times as much fuel as a rocket on Earth to get into orbit.
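Hippke’s factor of 104 can be reproduced from the rocket equation above. The short Python sketch below assumes an exhaust velocity of 3.43 km/s (a specific impulse near 350 s, typical of high-performing chemical propellants); that value is our assumption for illustration, not a number taken from the paper:

```python
import math

V_E = 3.43  # assumed chemical exhaust velocity in km/s (~350 s specific impulse)

def fuel_per_payload(delta_v_km_s: float) -> float:
    """Fuel mass per unit payload mass, from the Tsiolkovsky rocket equation."""
    return math.exp(delta_v_km_s / V_E) - 1.0

earth = fuel_per_payload(11.2)  # escape velocity from Earth's surface
k20b = fuel_per_payload(27.1)   # escape velocity from a Kepler-20 b analog

print(f"Fuel per tonne of payload, Earth:       {earth:7.1f} t")
print(f"Fuel per tonne of payload, Kepler-20 b: {k20b:7.1f} t")
print(f"Ratio: ~{k20b / earth:.0f}x as much fuel")  # on the order of 100x
```

The exact ratio shifts with the assumed exhaust velocity, but any realistic chemical value lands near the factor of ~100 that Hippke reports.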

To put it into perspective, Hippke considers specific payloads being launched from Earth. “To lift a more useful payload of 6.2 t as required for the James Webb Space Telescope on Kepler-20 b, the fuel mass would increase to 55,000 t, about the mass of the largest ocean battleships,” he writes. “For a classical Apollo moon mission (45 t), the rocket would need to be considerably larger, ~400,000 t.”

Project Starshot, an initiative sponsored by the Breakthrough Foundation, is intended to be humanity’s first interstellar voyage. Credit: breakthroughinitiatives.org

While Hippke’s analysis concludes that chemical rockets would still allow for escape velocities on Super-Earths up to 10 Earth masses, the amount of propellant needed makes this method impractical. As Hippke pointed out, this could have a serious effect on an alien civilization’s development.

“I am surprised to see how close we as humans are to end up on a planet which is still reasonably lightweight to conduct space flight,” he said. “Other civilizations, if they exist, might not be as lucky. On more massive planets, space flight would be exponentially more expensive. Such civilizations would not have satellite TV, a moon mission, or a Hubble Space Telescope. This should alter their way of development in certain ways we can now analyze in more detail.”

Both of these papers present some clear implications when it comes to the search for extra-terrestrial intelligence (SETI). For starters, it means that civilizations on planets that orbit red dwarf stars or Super-Earths are less likely to be space-faring, which would make detecting them more difficult. It also indicates that when it comes to the kinds of propulsion humanity is familiar with, we may be in the minority.

“The above results imply that chemical propulsion has a limited utility, so it would make sense to search for signals associated with lightsails or nuclear engines, especially near dwarf stars,” said Loeb. “But there are also interesting implications for the future of our own civilization.”

Artist’s concept of a bimodal nuclear rocket making the journey to the Moon, Mars, and other destinations in the Solar System. Credit: NASA

“One consequence of the paper is for space colonization and SETI,” added Hippke. “Civs from Super-Earths are much less likely to explore the stars. Instead, they would be (to some extent) “arrested” on their home planet, and e.g. make more use of lasers or radio telescopes for interstellar communication instead of sending probes or spaceships.”

However, both Loeb and Hippke also note that extra-terrestrial civilizations could address these challenges by adopting other methods of propulsion. In the end, chemical propulsion may be something that few technologically-advanced species would adopt because it is simply not practical for them. As Loeb explained:

“An advanced extraterrestrial civilization could use other propulsion methods, such as nuclear engines or lightsails which are not constrained by the same limitations as chemical propulsion and can reach speeds as high as a tenth of the speed of light. Our civilization is currently developing these alternative propulsion technologies but these efforts are still at their infancy.”

One such example is Breakthrough Starshot, which is currently being developed by the Breakthrough Prize Foundation (of which Loeb is the chair of the Advisory Committee). This initiative aims to use a laser-driven lightsail to accelerate a nanocraft up to 20% of the speed of light, which would allow it to travel to Proxima Centauri in just 20 years’ time.

Artist’s impression of rocky exoplanets orbiting Gliese 832, a red dwarf star just 16 light-years from Earth. Credit: ESO/M. Kornmesser/N. Risinger (skysurvey.org).

Hippke similarly considers nuclear rockets as a viable possibility, since increased surface gravity would also mean that space elevators would be impractical. Loeb also indicated that the limitations imposed by planets around low mass stars could have repercussions for when humans try to colonize the known Universe:

“When the sun will heat up enough to boil all water off the face of the Earth, we could relocate to a new home by then. Some of the most desirable destinations would be systems of multiple planets around low mass stars, such as the nearby dwarf star TRAPPIST-1 which weighs 9% of a solar mass and hosts seven Earth-size planets. Once we get to the habitable zone of TRAPPIST-1, however, there would be no rush to escape. Such stars burn hydrogen so slowly that they could keep us warm for ten trillion years, about a thousand times longer than the lifetime of the sun.”

But in the meantime, we can rest easy in the knowledge that we live on a habitable planet around a yellow dwarf star, which affords us not only life, but the ability to get out into space and explore. As always, when it comes to searching for signs of extra-terrestrial life in our Universe, we humans are forced to take the “low hanging fruit approach”.

Basically, the only planet we know of that supports life is Earth, and the only means of space exploration we know how to look for are the ones we ourselves have tried and tested. As a result, we are somewhat limited when it comes to looking for biosignatures (i.e. planets with liquid water, oxygen and nitrogen atmospheres, etc.) or technosignatures (i.e. radio transmissions, chemical rockets, etc.).

As our understanding of what conditions life can emerge under increases, and our own technology advances, we’ll have more to be on the lookout for. And hopefully, despite the additional challenges it may be facing, extra-terrestrial life will be looking for us!

Professor Loeb’s essay was also recently published in Scientific American.

Further Reading: arXiv, arXiv (2), Scientific American

How to Listen to the Background Hum of Gravitational Waves From all the Black Holes Colliding into Each Other

Artist's impression of two merging black holes. Credit: Bohn, Throwe, Hébert, Henriksson, Bunandar, Taylor, Scheel/SXS

The first-ever detection of gravitational waves (which took place in September of 2015) triggered a revolution in astronomy. Not only did this event confirm a phenomenon predicted by Einstein’s Theory of General Relativity a century before, it also ushered in a new era where the mergers of distant black holes, supernovae, and neutron stars could be studied by examining the gravitational waves they produce.

In addition, scientists have theorized that black hole mergers could actually be a lot more common than previously thought. According to a new study conducted by a pair of researchers from Monash University, these mergers happen once every few minutes. By listening to the background noise of the Universe, they claim, we could find evidence of thousands of previously undetected events.

Their study, titled “Optimal Search for an Astrophysical Gravitational-Wave Background”, recently appeared in the journal Physical Review X. The study was conducted by Rory Smith and Eric Thrane, a senior lecturer and a research fellow at Monash University, respectively. Both researchers are also members of the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav).

Drs. Eric Thrane and Rory Smith. Credit: Monash University

As they state in their study, every 2 to 10 minutes, a pair of stellar-mass black holes merge somewhere in the Universe. A small fraction of these are large enough that the resulting gravitational wave event can be detected by advanced instruments like the Laser Interferometer Gravitational-wave Observatory (LIGO) and the Virgo observatory. The rest, however, contribute to a sort of stochastic background noise.

By measuring this noise, scientists may be able to study much more in the way of events and learn a great deal more about gravitational waves. As Dr Thrane explained in a Monash University press statement:

“Measuring the gravitational-wave background will allow us to study populations of black holes at vast distances. Someday, the technique may enable us to see gravitational waves from the Big Bang, hidden behind gravitational waves from black holes and neutron stars.”

Drs Smith and Thrane are no amateurs when it comes to the study of gravitational waves. Last year, they were both involved in a major breakthrough, where researchers from the LIGO Scientific Collaboration (LSC) and the Virgo Collaboration measured gravitational waves from a pair of merging neutron stars. This was the first time that a neutron star merger (aka. a kilonova) was observed in both gravitational waves and visible light.

The pair were also part of the Advanced LIGO team that made the first detection of gravitational waves in September 2015. To date, six gravitational wave events have been confirmed by the LIGO and Virgo Collaborations. But according to Drs Thrane and Smith, there could be as many as 100,000 events happening every year that are simply too faint for these detectors to pick out individually.

Artist’s impression of merging binary black holes. Credit: LIGO/A. Simonnet.

These waves are what combine to create the gravitational wave background; and while the individual events are too subtle to be detected, researchers have been attempting for years to develop a method for detecting this general noise. Relying on a combination of computer simulations of faint black hole signals and masses of data from known events, Drs. Thrane and Smith claim to have done just that.

From this, the pair were able to detect a signal within the simulated data that they believe is evidence of faint black hole mergers. Looking ahead, Drs Thrane and Smith hope to apply their new method to real data, and are optimistic it will yield results. The researchers will also have access to the new OzSTAR supercomputer, which was installed last month at the Swinburne University of Technology to help scientists look for gravitational waves in LIGO data.
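The underlying statistical trick can be illustrated with a toy model: a common signal far too weak to see in either of two detectors becomes measurable when their data streams are cross-correlated, because the shared background is correlated between detectors while each instrument’s noise is not. The Python sketch below is a schematic illustration of that principle only, not the optimal Bayesian search the authors actually developed:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000  # number of samples per detector

# Weak common "background" (10% of the noise amplitude) buried in
# independent instrumental noise in each of two detectors.
background = 0.1 * rng.normal(size=n)
det1 = background + rng.normal(size=n)
det2 = background + rng.normal(size=n)

# The background is invisible in a single stream...
print("std of detector 1:", det1.std())  # ~1.005, indistinguishable from noise

# ...but cross-correlating the two streams isolates the shared component.
cross = np.mean(det1 * det2)  # converges to the background variance (0.01)
print(f"cross-correlation: {cross:.4f} (expected ~0.0100)")
```

With a million samples, the correlated term stands out at roughly ten times its statistical uncertainty, even though it shifts each detector’s own variance by only about one percent.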

OzSTAR is different from the machines used by the LIGO community, which include the supercomputers at Caltech and MIT. Rather than relying on more traditional central processing units (CPUs), it uses graphics processing units (GPUs), which can be hundreds of times faster for some applications. According to Professor Matthew Bailes, the Director of OzGrav:

“It is 125,000 times more powerful than the first supercomputer I built at the institution in 1998… By harnessing the power of GPUs, OzStar has the potential to make big discoveries in gravitational-wave astronomy.”

What has been especially impressive about the study of gravitational waves is how it has progressed so quickly. From the initial detection in 2015, scientists from Advanced LIGO and Virgo have now confirmed six different events and anticipate detecting many more. On top of that, astrophysicists are even coming up with ways to use gravitational waves to learn more about the astronomical phenomena that cause them.

All of this was made possible thanks to improvements in instrumentation and growing collaboration between observatories. And with more sophisticated methods designed to sift through archival data for additional signals and background noise, we stand to learn a great deal more about this mysterious cosmic force.

Further Reading: Monash, Physical Review X

The NewSpace Revolution is About to Bring us Tiny Space Telescopes we can all Control

Space Fab's Waypoint Space Telescope will be the first space telescope available to the general public. Credit: SpaceFab.US

One of the defining characteristics of the modern era of space exploration is the way the public and private aerospace companies – colloquially referred to as the NewSpace industry – are taking part like never before. Thanks to cheaper launch services and the development of small satellites that can be built using off-the-shelf electronics (aka. CubeSats and microsats), universities and research institutions are also able to conduct research in space.

Looking to the future, there are those who want to take public involvement in space exploration to a whole new level. This includes the California-based aerospace company Space Fab, which wants to make space accessible to everyone through the development of the Waypoint Space Telescope – the first space telescope that people will be able to access through their smartphones to take pictures of Earth and space.

The company was founded in 2016 by Randy Chung and Sean League with the vision of creating a future where anything could be manufactured in space. Chung began his career developing communications satellites and has a background in integrated circuit design, digital signal processing, CMOS imager design, and software development. He holds sixteen patents in the fields of computer peripherals, imagers, and digital communications.

League, meanwhile, is an astrophysicist who has spent the past few decades developing optics, building and designing remote telescopes and solid-state lasers, and who has extensive experience with startups, fundraising, computer-aided design (CAD) and machining. Between the two of them, they are ideally suited to creating a new generation of publicly-accessible telescopes. As League told Universe Today via email:

“We have studied over 200 papers on the design of small satellite structures, electronics, navigation, and attitude control. We are rethinking satellite design, not tied down by legacy approaches. That fresh approach leads us to use a Corrected Dall-Kirkham telescope design, rather than the standard Ritchey-Chrétien design, an extending secondary mirror, rather than a fixed telescope structure, and a multi-purpose and multi-directional telescope, not a single purpose telescope just for Earth observation or just for astronomy.”

Together, League and Chung launched Space Fab in the hopes of spurring the development of the space industry, where asteroid mining and space manufacturing will provide cheap and abundant resources for all and allow for further exploration of our Solar System. The first step in this long-term plan is to build a profitable space telescope business by creating the first commercial, multipurpose space telescope industry.

“SpaceFab’s primary long term objective is to accelerate man’s access to space and to make the human race a multi-planet species,” said League. “This not only safeguards the human race, but all life that is brought along. We intend to make space resources readily available and dramatically less expensive than today, without environmental impact on Earth.”

What makes the Waypoint Space Telescope especially unique is the way it combines off-the-shelf components with revolutionary instruments. The design is based on a standard 12U CubeSat satellite, which contains the Waypoint telescope. This telescope has extendable optics that consist of a 21 cm silicon carbide primary mirror, a deployable secondary mirror, a 48 Megapixel imager for visible and near-infrared wavelengths, an 8 Megapixel image-intensified camera for ultraviolet and visible wavelengths, and a 150-band hyperspectral imager.

“Waypoint’s astronomical capabilities are impressive,” says League. “Without the distorting effects of Earth’s atmosphere, our 48 megapixel imager can take perfect high resolution images every time. We can reach the maximum theoretical resolution for our main mirror at .6 arc seconds per pixel on a single image, and higher resolution is possible through multiple exposures. Contrast will be fantastic, with the blackness of background space not being washed out by Earth’s atmosphere, clouds, moisture, city lights, or the day/night cycle. The Waypoint satellite also includes a complete set of astronomical and earth observations filters.”
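That quoted pixel scale is consistent with the diffraction limit of a 21 cm aperture. As a rough check, assuming a mid-visible wavelength of 550 nm (our assumption, not a SpaceFab specification):

$$ \theta \approx 1.22\,\frac{\lambda}{D} = 1.22 \times \frac{550 \times 10^{-9}\,\mathrm{m}}{0.21\,\mathrm{m}} \approx 3.2\ \mu\mathrm{rad} \approx 0.66'' $$

so roughly 0.6 arcseconds per pixel is close to the sharpest image the mirror can physically deliver.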

The Waypoint Space Telescope will be ready to launch as a secondary payload by the end of 2019 on a rocket like the SpaceX Falcon 9. The company has also completed its first seed round of investment and is currently crowdfunding through a Kickstarter campaign.

Those who pledge their money will have the honor of getting a “space selfie”, where a favorite photo will be paired with a backdrop of Earth, pictured from orbit. In addition, Space Fab is building its own custom laser communications systems for the telescope optimized for low power, small size, and high speed.

Once deployed, this communication system will allow the telescope to download data back to Earth twice a day using optical ground stations. These images will then be available for upload via smartphone, tablet, computer or other devices. Chung and League’s efforts to create the first publicly-accessible space telescope are already drawing their share of acolytes. One such person is Dustin Gibson, one of the owners of OPT Telescopes. As he told Universe Today via email:

“So far, the company is on the fast track to success with its first round of investing completed and over target, and the second round just getting started. It looks like this thing is going to fly in 2019! For an astrophotography lover like myself, I can’t think of anything more ground breaking than a consumer controlled space telescope.

“What Space Fab is doing is rewriting not just how we think about ways in which to do land surveys or deep space imaging, but actually redefining the way we are able to interact with satellites by giving the common user a level of control over the movements and functionality of the unit itself with something as simple as a cell phone.”

Looking ahead, Space Fab is also busy developing the technology that will allow them to mine asteroids and tap the abundant resources of the Solar System. The company recently filed a patent for their ion accelerator, which is designed to augment the thrust from existing cubesat-sized ion engines.

The company is also focused on creating advanced robotic arms that will be able to wrestle with space debris and repair themselves in the event of mechanical failure or damage. In the meantime, the Waypoint is the first of several space telescopes that Space Fab hopes to deploy in order to generate revenue for these ventures.

“Our space telescopes will be open to everyone, so that is the beginning,” said League. “The revenue these satellites will generate provides us with the funds and knowledge base to conduct metal asteroid mining and manufacturing on a large scale. This will allow the manufacture of large structures, spacecraft, tools or anything else that is needed in space. With these available resources, our hope is to accelerate the space economy and colonization.”

In this respect, Space Fab is in good company when it comes to the age of NewSpace. Alongside big-names like SpaceX, Blue Origin, Planetary Resources, and Deep Space Industries, they are part of a constellation of companies that are looking to make space accessible and usher in an age of post-scarcity. And with the help of the general public, they just might succeed!

Further Reading: SpaceFab

Could We Detect an Ancient Industrial Civilization in the Geological Record?

Human activity is a major cause of air pollution, much of which results from industrial processes. Credit: cherwell.org

As a species, we humans tend to take it for granted that we are the only ones that live in sedentary communities, use tools, and alter our landscape to meet our needs. It is also a foregone conclusion that in the history of planet Earth, humans are the only species to develop machinery, automation, electricity, and mass communications – the hallmarks of industrial civilization.

But what if another industrial civilization existed on Earth millions of years ago? Would we be able to find evidence of it within the geological record today? By examining the impact human industrial civilization has had on Earth, a pair of researchers conducted a study that considers how such a civilization could be found and how this could have implications in the search for extra-terrestrial life.

The study, which recently appeared online under the title “The Silurian Hypothesis: Would it be possible to detect an industrial civilization in the geological record?”, was conducted by Gavin A. Schmidt and Adam Frank – a climatologist with the NASA Goddard Institute for Space Studies (NASA GISS) and an astronomer from the University of Rochester, respectively.

Carbon dioxide in Earth’s atmosphere if half of global-warming emissions are not absorbed. Credit: NASA/JPL/GSFC

As they indicate in their study, the search for life on other planets has often involved looking to Earth analogues to see what kinds of conditions life could exist under. However, this pursuit also entails the search for extra-terrestrial intelligence (SETI) that would be capable of communicating with us. Naturally, it is assumed that any such civilization would need to develop an industrial base first.

This, in turn, raises the question of how often an industrial civilization might develop – what Schmidt and Frank refer to as the “Silurian Hypothesis”. Naturally, this raises some complications since humanity is the only example of an industrialized species that we know of. In addition, humanity has only been an industrial civilization for the past few centuries – a mere fraction of its existence as a species and a tiny fraction of the time that complex life has existed on Earth.

For the sake of their study, the team first noted the importance of this question to the Drake Equation. To recap, this equation states that the number of civilizations (N) in our galaxy that we might be able to communicate with is the product of the average rate of star formation (R*), the fraction of those stars that have planets (fp), the number of planets per star that can support life (ne), the fraction of those planets that will develop life (fl), the fraction that will develop intelligent life (fi), the fraction of civilizations that will develop transmission technologies (fc), and the length of time these civilizations will have to transmit signals into space (L).

This can be expressed mathematically as: N = R* × fp × ne × fl × fi × fc × L
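Since the equation is a simple product, it is easy to see how the “fc greater than one” argument quoted below plays out numerically. In the Python sketch that follows, every parameter value is an illustrative placeholder (not an estimate from the paper); the point is only that N scales linearly with fc:

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake Equation: N = R* x fp x ne x fl x fi x fc x L."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Placeholder values, chosen only for illustration:
params = dict(r_star=1.5, f_p=1.0, n_e=0.2, f_l=0.1, f_i=0.01, lifetime=1000)

baseline = drake(f_c=0.1, **params)  # one shot at a technological civilization
silurian = drake(f_c=2.0, **params)  # multiple industrial civilizations per planet

print(baseline, silurian)  # N grows in direct proportion to fc
```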

The Drake Equation, a mathematical formula for the probability of finding life or advanced civilizations in the universe. Credit: University of Rochester

As they indicate in their study, the parameters of this equation may change thanks to the addition of the Silurian Hypothesis, as well as recent exoplanets surveys:

“If over the course of a planet’s existence, multiple industrial civilizations can arise over the span of time that life exists at all, the value of fc may in fact be greater than one. This is a particularly cogent issue in light of recent developments in astrobiology in which the first three terms, which all involve purely astronomical observations, have now been fully determined. It is now apparent that most stars harbor families of planets. Indeed, many of those planets will be in the star’s habitable zones.”

In short, thanks to improvements in instrumentation and methodology, scientists have been able to determine the rate at which stars form in our galaxy. Furthermore, recent surveys for extra-solar planets have led some astronomers to estimate that our galaxy could contain as many as 100 billion potentially-habitable planets. If evidence could be found of another civilization in Earth’s history, it would further constrain the Drake Equation.

The team then addressed the likely geologic consequences of human industrial civilization and compared that fingerprint to potentially similar events in the geologic record. These include isotope anomalies in carbon, oxygen, hydrogen and nitrogen, which result from greenhouse gas emissions and the use of nitrogen fertilizers. As they indicate in their study:

“Since the mid-18th Century, humans have released over 0.5 trillion tons of fossil carbon via the burning of coal, oil and natural gas, at a rate orders of magnitude faster than natural long-term sources or sinks. In addition, there has been widespread deforestation and addition of carbon dioxide into the air via biomass burning.”

Based on fossil records, 250 million years ago over 90% of all species on Earth died out, effectively resetting evolution. Credit: Lunar and Planetary Institute

They also consider increased rates of sediment flow in rivers and its deposition in coastal environments as a result of agricultural processes, deforestation, and the digging of canals. The spread of domesticated animals, rodents and other small animals is also considered – as is the extinction of certain species – as a direct result of industrialization and the growth of cities.

The presence of synthetic materials, plastics, and radioactive elements (produced by nuclear power or nuclear testing) will also leave a mark on the geological record – in the case of radioactive isotopes, sometimes for millions of years. Finally, they examine past extinction-level events to determine how a hypothetical collapse of human civilization would compare. As they state:

“The clearest class of event with such similarities are the hyperthermals, most notably the Paleocene-Eocene Thermal Maximum (56 Ma), but this also includes smaller hyperthermal events, ocean anoxic events in the Cretaceous and Jurassic, and significant (if less well characterized) events of the Paleozoic.”

These events were specifically considered because they coincided with rises in temperature, anomalies in carbon and oxygen isotope ratios, increased sedimentation, and depletions of oceanic oxygen. Events that had a very clear and distinct cause, such as the Cretaceous-Paleogene extinction event (caused by an asteroid impact and massive volcanism) or the Eocene-Oligocene boundary (the onset of Antarctic glaciation), were not considered.

Artistic rendition of the Chicxulub impactor striking ancient Earth, with a Pterosaur observing. Credit: NASA

According to the team, the events they did consider – the hyperthermals – show similarities to the Anthropocene fingerprint they identified. In particular, according to research cited by the authors, the Paleocene-Eocene Thermal Maximum (PETM) shows signs that could be consistent with anthropogenic climate change. These include:

“[A] fascinating sequence of events lasting 100–200 kyr and involving a rapid input (in perhaps less than 5 kyr) of exogenous carbon into the system, possibly related to the intrusion of the North Atlantic Igneous Province into organic sediments. Temperatures rose 5–7°C (derived from multiple proxies), and there was a negative spike in carbon isotopes (>3‰), and decreased ocean carbonate preservation in the upper ocean.”

Finally, the team addressed possible research directions that might improve the constraints on this question. They recommend that a “deeper exploration of elemental and compositional anomalies in extant sediments spanning previous events be performed”. In other words, the geological record for these extinction events should be examined more closely for anomalies that could be associated with industrial civilization.

If any anomalies are found, they further recommend that the fossil record be examined for candidate species, which would raise questions about their ultimate fate. Of course, they also acknowledge that more evidence is necessary before the Silurian Hypothesis can be considered viable. For one, many past events involving abrupt climate change have been linked to changes in volcanic and tectonic activity.

Scientists were able to gauge the rate of water loss on Mars by measuring the ratio of water and HDO from today and 4.3 billion years ago. Credit: Kevin Gill

Second, current changes in our climate are happening faster than during any past geological period. However, this is difficult to say for certain, since there are limits to the temporal resolution of the geological record. In the end, more research will be necessary to determine how long previous extinction events (those not caused by impacts) took to unfold.

Beyond Earth, this study may also have implications for the study of past life on planets like Mars and Venus. Here too, the authors suggest that explorations of both planets could reveal the existence of past civilizations, and perhaps even bolster the case for finding evidence of past civilizations on Earth.

“We note here that abundant evidence exists of surface water in ancient Martian climates (3.8 Ga), and speculation that early Venus (2 Ga to 0.7 Ga) was habitable (due to a dimmer sun and lower CO2 atmosphere) has been supported by recent modeling studies,” they state. “Conceivably, deep drilling operations could be carried out on either planet in future to assess their geological history. This would constrain consideration of what the fingerprint might be of life, and even organized civilization.”

Two key aspects of the Drake Equation, which addresses the probability of finding life elsewhere in the galaxy, are the sheer number of stars and planets out there and the amount of time life has had to evolve. Until now, it has been assumed that one planet would give rise to one intelligent species capable of advanced technology and communications. But if this number should prove to be higher, we may find a galaxy filled with civilizations, both past and present. And who knows? The remains of a once great and advanced non-human civilization may very well be right beneath us!

Further Reading: arXiv

Pluto’s Charon Gets Mountains Named After Sci-Fi Authors Octavia Butler and Arthur C. Clarke, as Well as Many Others From History and Legend. I Approve!

Map projection of Charon, the largest of Pluto’s five moons, annotated with its first set of official feature names. With a diameter of about 1215 km, the France-sized moon is one of the largest known objects in the Kuiper Belt, the region of icy, rocky bodies beyond Neptune. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute

In 2015, the New Horizons mission made history by being the first spacecraft to conduct a flyby of Pluto. In addition to revealing a great deal about the dwarf planet’s atmosphere, geology and system of moons, the probe also provided the first clear images of the surfaces of Pluto and its largest moon, Charon. Because of this, scientists are now able to study their many curious surface features and learn more about their evolution.

Another interesting result of this surface imaging has been the ability to name these features. Recently, the IAU Working Group for Planetary System Nomenclature officially approved a dozen names that had been proposed by NASA’s New Horizons team. These names honor legendary explorers and visionaries, both real and fictional, and include science fiction authors Octavia Butler and Arthur C. Clarke.

Aside from being Pluto’s largest moon, Charon is also one of the larger bodies in the Kuiper Belt. Because it is so massive relative to Pluto, Charon does not orbit Pluto in the strictest sense. In truth, the barycenter of the Pluto-Charon system lies outside Pluto, meaning the two bodies effectively orbit each other – as the rough calculation below illustrates. The moon also has a wealth of features, including valleys, crevices, and craters similar to those seen on other moons.
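
To see why, it helps to check where the system’s center of mass actually falls. The short Python sketch below uses approximate published values for the two bodies’ masses, their separation, and Pluto’s radius, so treat the result as a rough illustration rather than a precise figure.

```python
# Rough check of where the Pluto-Charon barycenter lies.
# All values are approximate published figures.
m_pluto  = 1.30e22    # kg, mass of Pluto
m_charon = 1.59e21    # kg, mass of Charon
a        = 19_600e3   # m, mean Pluto-Charon separation
r_pluto  = 1_188e3    # m, Pluto's mean radius

# Distance from Pluto's center to the two-body center of mass:
d = a * m_charon / (m_pluto + m_charon)

print(f"Barycenter sits about {d / 1e3:,.0f} km from Pluto's center")
print(f"Pluto's radius is about {r_pluto / 1e3:,.0f} km")
print("=> the barycenter lies outside Pluto" if d > r_pluto
      else "=> the barycenter lies inside Pluto")
```

With these numbers the barycenter falls roughly 2,100 km from Pluto’s center – well beyond its ~1,188 km radius – which is why the pair is often described as a binary system.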

Artist’s impression of New Horizons’ close encounter with the Pluto–Charon system. Credit: NASA/JHU APL/SwRI/Steve Gribben

For some time, the New Horizons team had been using a series of informal names to describe Charon’s many features. Most of these were gathered during the online public naming campaign the team hosted in 2015. Known as “Our Pluto”, the campaign invited people from all over the world to contribute suggestions for naming features on Pluto and Charon.

The New Horizons team also contributed their own suggestions and (according to the IAU) was instrumental in moving the new names through approval. As Dr. Alan Stern, the New Horizons team leader, told Universe Today via email: “We conducted a public feature name bank process in 2015 before flyby. Once flyby was complete, our science team created a naming proposal for specific features and sent it to IAU.”

A similar process took place last year, when the IAU officially adopted 14 place names suggested by the New Horizons team – many of which were the result of the online naming campaign. Here too, the names were those the team had been using informally to describe the many regions, mountain ranges, plains, valleys and craters discovered during the spacecraft’s flyby.

The names that were ultimately selected honor the spirit of epic exploration, which the New Horizons mission demonstrated by being the first probe to reach Pluto. As such, they commemorate travelers, explorers, scientists, pioneering journeys, and mysterious destinations. For example, Butler Mons honors Octavia E. Butler, the celebrated author and the first science fiction writer to win a MacArthur Fellowship.

Global map of Pluto’s moon Charon pieced together from images taken at different resolutions. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute

Similarly, Clarke Montes honors Sir Arthur C. Clarke, the prolific writer and futurist who co-wrote the screenplay for 2001: A Space Odyssey (which he later turned into a series of novels). Stanley Kubrick, who produced and directed 2001: A Space Odyssey, was also honored with the feature Kubrick Mons. Meanwhile, several craters were named in honor of fictional characters from famous stories and folklore.

The Revati Crater is named after the main character in the Hindu epic narrative Mahabharata, while the Nasreddin Crater is named for the protagonist of thousands of folktales told throughout the Middle East, southern Europe and parts of Asia. Nemo Crater honors the captain of the Nautilus in Jules Verne’s novels Twenty Thousand Leagues Under the Sea (1870) and The Mysterious Island (1874).

The Pirx Crater is named after the main character in a series of short stories by Polish sci-fi author Stanislaw Lem, while the Dorothy Crater takes its name from the protagonist of The Wizard of Oz, one of several children’s stories by L. Frank Baum set in that magical land.

As Rita Schulz, chair of the IAU Working Group for Planetary System Nomenclature, commented, “I am pleased that the features on Charon have been named with international spirit.” Dr. Alan Stern expressed similar sentiments. When asked if he was happy with the new names that have been approved, he said simply, “Very.”

Artist’s impression of NASA’s New Horizons spacecraft encountering 2014 MU69 (Ultima Thule), a Kuiper Belt object that orbits 1.6 billion km (1 billion mi) beyond Pluto, on Jan. 1st, 2019. Credits: NASA/JHUAPL/SwRI/Steve Gribben

Even though the encounter with the Pluto system happened almost three years ago, scientists are still busy studying all the information gathered during the historic flyby. In addition, the New Horizons spacecraft will be making history again in the not-too-distant future. At present, the spacecraft is making its way farther into the outer Solar System with the intention of rendezvousing with two Kuiper Belt Objects.

On Jan. 1st, 2019, it will rendezvous with its first destination, the KBO known as 2014 MU69 (aka. “Ultima Thule”). This will be the most primitive object ever visited by a spacecraft, and the encounter will be the farthest ever achieved in space exploration. Before this intrepid mission is complete, we can expect that a lot more of the outer Solar System will be mapped and named.

Further Reading: IAU