Debris Disks Around Stars Could Point the Way to Giant Exoplanets

This artist's rendering shows a large exoplanet causing small bodies to collide in a disk of dust. Credit: NASA/JPL-Caltech

According to current estimates, there could be as many as 100 billion planets in the Milky Way Galaxy alone. Unfortunately, finding evidence of these planets is tough, time-consuming work. For the most part, astronomers are forced to rely on indirect methods that measure dips in a star’s brightness (the Transit Method) or Doppler measurements of the star’s own motion (the Radial Velocity Method).

Direct imaging is very difficult because stars vastly outshine any planets orbiting them, making those planets hard to spot. Luckily, a new study led by the Infrared Processing and Analysis Center (IPAC) at Caltech has determined that there may be a shortcut to finding exoplanets using direct imaging. The solution, they claim, is to look for systems with a circumstellar debris disk, since such systems are far more likely to host at least one giant planet.

The study, titled “A Direct Imaging Survey of Spitzer Detected Debris Disks: Occurrence of Giant Planets in Dusty Systems”, recently appeared in The Astronomical Journal. Tiffany Meshkat, an assistant research scientist at IPAC/Caltech, was the lead author on the study, which she performed while working at NASA’s Jet Propulsion Laboratory as a postdoctoral researcher.

Artist’s impression of a circumstellar disk of debris around a distant star. Credit: NASA/JPL

For the sake of this study, Dr. Meshkat and her colleagues examined data on 130 different single-star systems with debris disks, which they then compared to 277 stars that do not appear to host disks. These stars were all observed by NASA’s Spitzer Space Telescope and were all relatively young (less than 1 billion years old). Of these 130 systems, 100 had previously been studied for the sake of finding exoplanets.

Dr. Meshkat and her team then followed up on the remaining 30 systems using data from the W.M. Keck Observatory in Hawaii and the European Southern Observatory’s (ESO) Very Large Telescope (VLT) in Chile. While they did not detect any new planets in these systems, their examinations helped characterize the abundance of planets in systems that had disks.

What they found was that young stars with debris disks are more likely to also have giant exoplanets with wide orbits than those that do not. These planets were also likely to have five times the mass of Jupiter, thus making them “Super-Jupiters”. As Dr. Meshkat explained in a recent NASA press release, this study will be of assistance when it comes time for exoplanet-hunters to select their targets:

“Our research is important for how future missions will plan which stars to observe. Many planets that have been found through direct imaging have been in systems that had debris disks, and now we know the dust could be indicators of undiscovered worlds.”

This artist’s conception shows how collisions between planetesimals can create additional debris. Credit: NASA/JPL-Caltech

This study, which was the largest examination of stars with dusty debris disks to date, also provided the best evidence so far that giant planets are responsible for keeping debris disks in check. While the research did not directly resolve why the presence of a giant planet and a debris disk are linked, the authors indicate that their results are consistent with predictions that debris disks are the products of giant planets stirring up dust and causing collisions.

In other words, they believe that the gravity of a giant planet would cause planetesimals to collide, thus preventing them from forming additional planets. As study co-author Dimitri Mawet, who is also a JPL senior research scientist, explained:

“It’s possible we don’t find small planets in these systems because, early on, these massive bodies destroyed the building blocks of rocky planets, sending them smashing into each other at high speeds instead of gently combining.”

Within the Solar System, the giant planets create debris belts of sorts. For example, between Mars and Jupiter, you have the Main Asteroid Belt, while beyond Neptune lies the Kuiper Belt. Many of the systems examined in this study also have two belts, though they are significantly younger than the Solar System’s own belts – roughly 1 billion years old compared to 4.5 billion years old.

Artist’s impression of Beta Pictoris b. Credit: ESO/L. Calçada/N. Risinger (skysurvey.org)

One of the systems examined in the study was Beta Pictoris, a system that has a debris disk, comets, and one confirmed exoplanet. This planet, designated Beta Pictoris b, has 7 Jupiter masses and orbits the star at a distance of 9 AU – i.e. nine times the distance between the Earth and the Sun. This system has been directly imaged by astronomers in the past using ground-based telescopes.

Interestingly enough, astronomers predicted the existence of this exoplanet well before it was confirmed, based on the presence and structure of the system’s debris disk. Another system that was studied was HR8799, a system with a debris disk that has two prominent dust belts. In these sorts of systems, the presence of more giant planets is inferred based on the need for these dust belts to be maintained.

This is believed to be the case for our own Solar System, where 4 billion years ago, the giant planets diverted passing comets towards the Sun. This resulted in the Late Heavy Bombardment, a period in which the inner planets were subject to countless impacts whose scars are still visible today. Scientists also believe that it was during this period that the migrations of Jupiter, Saturn, Uranus and Neptune deflected dust and small bodies to form the Kuiper Belt and Asteroid Belt.

Dr. Meshkat and her team also noted that the systems they examined contained much more dust than our Solar System, which could be attributable to their differences in age. In the case of systems that are around 1 billion years old, the increased presence of dust could be the result of small bodies that have not yet formed larger bodies colliding. From this, it can be inferred that our Solar System was once much dustier as well.

Artist’s concept of the multi-planet system around HR 8799, initially discovered with Gemini North adaptive optics images. Credit: Gemini Observatory/Lynette Cook

However, the authors note it is also possible that the systems they observed – which have one giant planet and a debris disk – may contain more planets that simply have not been discovered yet. In the end, they concede that more data is needed before these results can be considered conclusive. But in the meantime, this study could serve as a guide as to where exoplanets might be found.

As Karl Stapelfeldt, the chief scientist of NASA’s Exoplanet Exploration Program Office and a co-author on the study, stated:

“By showing astronomers where future missions such as NASA’s James Webb Space Telescope have their best chance to find giant exoplanets, this research paves the way to future discoveries.”

In addition, this study could help inform our own understanding of how the Solar System evolved over the course of billions of years. For some time, astronomers have been debating whether or not planets like Jupiter migrated to their current positions, and how this affected the Solar System’s evolution. And there continues to be debate about how the Main Belt formed (i.e. whether it began empty or full).

Last, but not least, it could inform future surveys, letting astronomers know which star systems are developing along the same lines as our own did, billions of years ago. Wherever star systems have debris disks, astronomers can infer the presence of a particularly massive gas giant. And where a system has a disk with two prominent dust belts, they can infer that it too may become a system containing many planets and two belts.

Further Reading: NASA, The Astronomical Journal

Ancient Hydrothermal Vents Found on Mars, Could Have Been a Cradle for Life

MOLA topographic data, colorized to show the maximum (1,100 m) and minimum (700 m) level of an ancient sea. Credit: NASA/Joseph R. Michalski (et al.)/Nature Communications

It is now a well-understood fact that Mars once had quite a bit of liquid water on its surface. In fact, according to a recent estimate, a large sea in Mars’ southern hemisphere once held almost 10 times as much water as all of North America’s Great Lakes combined. This sea existed roughly 3.7 billion years ago, and was located in the region known today as the Eridania basin.

However, a new study based on data from NASA’s Mars Reconnaissance Orbiter (MRO) detected vast mineral deposits at the bottom of this basin, which could be seen as evidence of ancient hot springs. Since this type of hydrothermal activity is believed to be responsible for the emergence of life on Earth, these results could indicate that this basin once hosted life as well.

The study, titled “Ancient Hydrothermal Seafloor Deposits in Eridania Basin on Mars”, recently appeared in the scientific journal Nature Communications. The study was led by Joseph Michalski of the Department of Earth Sciences and Laboratory for Space Research at the University of Hong Kong, along with researchers from the Planetary Science Institute, the Natural History Museum in London, and NASA’s Johnson Space Center.

 

The Eridania basin of southern Mars is believed to have held a sea about 3.7 billion years ago, with seafloor deposits likely resulting from underwater hydrothermal activity. Credit: NASA

Together, this international team used data obtained by the MRO’s Compact Reconnaissance Spectrometer for Mars (CRISM). Since the MRO reached Mars in 2006, this instrument has been used extensively to search for evidence of mineral residues that form in the presence of water. In this respect, CRISM was essential for documenting how lakes, ponds and rivers once existed on the surface of Mars.

In this case, it identified massive mineral deposits within Mars’ Eridania basin, which lies in a region that has some of the Red Planet’s most ancient exposed crust. The discovery is expected to be a major focal point for scientists seeking to characterize Mars’ once-warm and wet environment. As Paul Niles of NASA’s Johnson Space Center said in a recent NASA press statement:

“Even if we never find evidence that there’s been life on Mars, this site can tell us about the type of environment where life may have begun on Earth. Volcanic activity combined with standing water provided conditions that were likely similar to conditions that existed on Earth at about the same time — when early life was evolving here.”

Today, Mars is a cold, dry place that experiences no volcanic activity. But roughly 3.7 billion years ago, the situation was vastly different. At that time, Mars boasted both flowing and standing bodies of water, as evidenced by vast fluvial deposits and sedimentary basins. The Gale Crater is a perfect example of this, since it was once a major lake bed, which is why it was selected as the landing site for the Curiosity rover in 2012.

Illustration showing the origin of some deposits in the Eridania basin of southern Mars resulting from seafloor hydrothermal activity more than 3 billion years ago. Credit: NASA

Since Mars had both surface water and volcanic activity during this time, it would have also experienced hydrothermal activity. This occurs when volcanic vents open into standing bodies of water, filling them with hydrated minerals and heat. On Earth, which still has an active crust, evidence of past hydrothermal activity is quickly erased by plate tectonics and erosion. But on Mars, where the crust is static and erosion is minimal, the evidence has been preserved.

“This site gives us a compelling story for a deep, long-lived sea and a deep-sea hydrothermal environment,” Niles said. “It is evocative of the deep-sea hydrothermal environments on Earth, similar to environments where life might be found on other worlds — life that doesn’t need a nice atmosphere or temperate surface, but just rocks, heat and water.”

Based on their study, the researchers estimate that the Eridania basin once held about 210,000 cubic km (50,000 cubic mi) of water. Not only is this nine times more water than all of the Great Lakes combined, it is as much as all the other lakes and seas on ancient Mars combined. In addition, the region also experienced lava flows that occurred after the sea is believed to have disappeared.
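As a sanity check on those figures, the conversion from cubic kilometres to cubic miles and the comparison with the Great Lakes can be done in a few lines of Python. The Great Lakes total of roughly 22,700 cubic km is a commonly cited estimate assumed here, not a value from the study:

```python
# Rough sanity check of the volumes quoted above.
KM3_TO_MI3 = 0.621371 ** 3   # cubic kilometres to cubic miles

eridania_km3 = 210_000       # estimated ancient sea volume (from the study)
great_lakes_km3 = 22_700     # commonly cited Great Lakes total (assumption)

eridania_mi3 = eridania_km3 * KM3_TO_MI3
ratio = eridania_km3 / great_lakes_km3

print(f"{eridania_mi3:,.0f} cubic mi")    # ~50,000 cubic mi
print(f"{ratio:.1f}x the Great Lakes")    # ~9x
```

Both quoted figures check out to within rounding.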

From CRISM’s spectral data, the team identified deposits of serpentine, talc and carbonate. Combined with the shape and texture of the bedrock layers, they concluded that the sea floor was open to volcanic fissures. Beyond indicating that this region could have once hosted life, this study also adds to the diversity of wet environments believed to have once existed on Mars.

A scale model compares the volume of water contained in lakes and seas on the Earth and Mars to the estimated volume of water contained in the ancient Eridania sea. Credit: Joseph R. Michalski (et al.)/Nature Communications

Between evidence of ancient lakes, rivers, groundwater, deltas, seas, and volcanic eruptions beneath ice, scientists now have evidence of volcanic activity that occurred beneath a standing body of water (aka. hot springs) on Mars. This also represents a new category for astrobiological research, and a possible destination for future missions to the Martian surface.

The study of hydrothermal activity is also significant as far as finding sources of extra-terrestrial life is concerned, such as on moons like Europa, Enceladus, and Titan. In the future, robotic missions are expected to travel to these worlds in order to peek beneath their icy surfaces, investigate their plumes, or venture into their seas (in Titan’s case) to look for the telltale traces of basic life forms.

The study also has significance beyond Mars and could aid in the study of how life began here on Earth. At present, the earliest evidence of terrestrial life comes from seafloor deposits that are similar in origin and age to those found in the Eridania basin. But since the geological record of this period on Earth is poorly preserved, it has been impossible to determine exactly what conditions were like at this time.

Given Mars’ similarities with Earth, and the fact that its geological record has been well-preserved over the past 3 billion years, scientists can look to mineral deposits and other evidence to gauge how natural processes here on Earth allowed for life to form and evolve over time. It could also advance our understanding of how all the terrestrial planets of the Solar System evolved over billions of years.

Further Reading: NASA

This Meteorite Came From a Volcano on Mars

A sample of nakhlite, a type of volcanic terrain that came to Earth as a Martian meteorite. Credit: University of Glasgow

Today, it is well understood that Mars is a cold, dry, and geologically dead planet. However, billions of years ago when it was still young, the planet boasted a denser atmosphere and had liquid water on its surface. It also experienced a significant amount of volcanic activity, which resulted in the formation of its massive features – like Olympus Mons, the largest volcano in the Solar System.

Scientists have long understood that Martian volcanic activity is driven by sources other than tectonic movement, which the planet has been devoid of for billions of years. However, after conducting a study of Martian rock samples, a team of researchers from the UK and United States concluded that eons ago, Mars was more volcanically active than previously thought.

Their study, titled “Taking the Pulse of Mars via Dating of a Plume-fed Volcano”, recently appeared in the scientific journal Nature Communications. Led by Benjamin Cohen, a researcher with the Scottish Universities Environmental Research Center (SUERC) and the School of Geographical and Earth Sciences at the University of Glasgow, the team conducted an analysis of Mars’ volcanic past using samples of Martian meteorites.

Asteroid impacts on Mars have sent samples of Martian rock to Earth in the form of meteorites. Credit: geol.umd.edu

On Earth, the majority of volcanism occurs as a result of plate tectonics, which are driven by convection in the Earth’s mantle. But on Mars, the majority of volcanic activity is the result of mantle plumes, which are highly-localized upwellings of magma that rise from deep within the mantle. This is due to the fact that Mars’ surface has remained static and cool for the past few billion years.

Because of this, Martian volcanoes (though similar in morphology to shield volcanoes on Earth) grow to much larger sizes than those on Earth. Olympus Mons, for example, is not only the largest shield volcano on Mars, but the largest in the Solar System. Whereas the tallest mountain on Earth – Mt. Everest – is 8,848 m (29,029 ft) in height, Olympus Mons stands some 22 km (13.6 mi or 72,000 ft) tall.
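Those height figures are easy to verify with a quick unit conversion, sketched here in Python:

```python
# Unit-conversion check of the heights quoted above.
M_TO_FT = 3.28084    # metres to feet
KM_TO_MI = 0.621371  # kilometres to miles

everest_m = 8_848    # height of Mt. Everest
olympus_km = 22      # approximate height of Olympus Mons

print(f"Everest: {everest_m * M_TO_FT:,.0f} ft")            # ~29,029 ft
print(f"Olympus Mons: {olympus_km * KM_TO_MI:.1f} mi, "
      f"{olympus_km * 1000 * M_TO_FT:,.0f} ft")             # ~13.7 mi, ~72,000 ft
```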

For the sake of their study, Dr. Cohen and his colleagues used radiometric dating techniques, which are commonly used to determine the age and eruption rate of volcanoes on Earth. However, such techniques had not previously been applied to shield volcanoes on Mars. As a result, the team’s study of Martian meteorite samples was the first detailed analysis of growth rates in Martian volcanoes.

The six samples they examined are known as nakhlites, a class of Martian meteorite that formed from basaltic magma roughly 1.3 billion years ago. These came to Earth roughly 11 million years ago after being blasted from the face of Mars by an impact event. By conducting an analysis of these meteorites, the team was able to uncover about 90 million years’ worth of new information about Mars’ volcanic past.

Color mosaic of Mars’ greatest mountain, Olympus Mons, viewed from orbit. Credit: NASA/JPL

As Dr. Cohen explained in a University of Glasgow press release:

“We know from previous studies that the nakhlite meteorites are volcanic rocks, and the development of age-dating techniques in recent years made the nakhlites perfect candidates to help us learn more about volcanoes on Mars.”

The first step was to demonstrate that the rock samples were indeed Martian in origin, which the team confirmed by measuring their exposure to cosmogenic radiation. From this, they determined that the rocks were expelled from the Martian surface 11 million years ago, most likely as a result of an impact event. They then applied a high-precision radiometric technique known as 40Ar/39Ar dating.

This consisted of using a noble gas mass spectrometer to measure the amount of argon built up in the samples, which is the result of the natural radioactive decay of potassium. From this, they were able to obtain 90 million years’ worth of new information about the Martian surface. The results of their analysis indicated that there are significant differences in volcanic history between the Earth and Mars. As Dr. Cohen explained:
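The arithmetic behind this kind of dating can be sketched with the standard 40Ar/39Ar age equation, t = (1/λ) · ln(1 + J·R), where R is the measured radiogenic 40Ar/39Ar ratio and J is an irradiation parameter calibrated against a standard of known age. The decay constant below is the accepted total value for 40K, but the ratio and J are purely illustrative, chosen so the result lands near the nakhlites’ ~1.3-billion-year formation age, and are not values from the study:

```python
import math

# Total decay constant of 40K (per year), the accepted value.
LAMBDA_K40 = 5.543e-10

def ar_ar_age(R, J):
    """Apparent 40Ar/39Ar age in years for ratio R and irradiation parameter J."""
    return math.log(1 + J * R) / LAMBDA_K40

# Hypothetical inputs for illustration only.
print(f"{ar_ar_age(R=53.0, J=0.02):.3g} years")  # ~1.3e9 years
```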

“We found that the nakhlites formed from at least four eruptions over the course of 90 million years. This is a very long time for a volcano, and much longer than the duration of terrestrial volcanoes, which are typically only active for a few million years. And this is only scratching the surface of the volcano, as only a very small amount of rock would have been ejected by the impact crater – so the volcano must have been active for much longer.”

A triple crater in Elysium Planitia on Mars. Credit: NASA/JPL/University of Arizona

In addition, the team was also able to narrow down which volcanoes their rock samples came from. Previous studies conducted by NASA revealed several candidates for the possible nakhlite source crater. However, only one of the locations matched their results in terms of the age of the volcanic eruptions and the impact that would have ejected the samples into space.

This particular crater (which is currently unnamed) is located in the volcanic plains known as Elysium Planitia, roughly 900 km (560 mi) away from the summit of the Elysium Mons volcano – which stands 12.6 km (7.8 mi) tall. It is also located about 2,000 km (1,243 mi) north of where the NASA Curiosity rover currently is. As Cohen explained, NASA has some wonderfully detailed satellite images of this particular crater.

“It is 6.5 km wide, and has preserved ejecta rays of debris,” he said. “And we were able to see multiple horizontal bands on the crater walls – which indicates the rocks form layers, with each layer interpreted as a separate lava flow. This study has been able to provide a clearer picture into the history of the nakhlite meteorites, and in turn the largest volcanoes in the solar system.”

In the future, sample return and crewed missions to Mars are sure to clear up this picture even further. Given that Mars, like Earth, is a terrestrial planet, knowing all we can about its geological history will ultimately improve our understanding of how the rocky planets of the Solar System formed. In short, the more we know about Mars’ volcanic history, the more we will be able to learn about the Solar System’s formation and evolution.

Further Reading: University of Glasgow, Nature Communications

 

Scientists Find Treasure Trove of Giant Black Hole Pairs

Artist's impression of merging binary black holes. Credit: LIGO/A. Simonnet.

For decades, astronomers have known that Supermassive Black Holes (SMBHs) reside at the center of most massive galaxies. These black holes, which range from being hundreds of thousands to billions of Solar masses, exert a powerful influence on surrounding matter and are believed to be the cause of Active Galactic Nuclei (AGN). For as long as astronomers have known about them, they have sought to understand how SMBHs form and evolve.

In two recently published studies, two international teams of researchers report the discovery of five new black hole pairs at the centers of distant galaxies. This discovery could help astronomers shed new light on how SMBHs form and grow over time, not to mention how black hole mergers produce the strongest gravitational waves in the Universe.

The first four dual black hole candidates were reported in a study titled “Buried AGNs in Advanced Mergers: Mid-Infrared Color Selection as a Dual AGN Finder”, which was led by Shobita Satyapal, a professor of astrophysics at George Mason University. This study was accepted for publication in The Astrophysical Journal and recently appeared online.

Optical and x-ray data on two of the new black hole pairs discovered. Credit: NASA/CXC/Univ. of Victoria/S.Ellison et al./George Mason Univ./S.Satyapal et al./SDSS

The second study, which reported the fifth dual black hole candidate, was led by Sarah Ellison – an astrophysics professor at the University of Victoria. It was recently published in the Monthly Notices of the Royal Astronomical Society under the title “Discovery of a Dual Active Galactic Nucleus with ~8 kpc Separation”. The discovery of these five black hole pairs was very fortuitous, given that pairs are a very rare find.

As Shobita Satyapal explained in a Chandra press statement:

“Astronomers find single supermassive black holes all over the universe. But even though we’ve predicted they grow rapidly when they are interacting, growing dual supermassive black holes have been difficult to find.”

The black hole pairs were discovered by combining data from a number of different ground-based and space-based instruments. This included optical data from the Sloan Digital Sky Survey (SDSS) and the ground-based Large Binocular Telescope (LBT) in Arizona, infrared data from the Wide-Field Infrared Survey Explorer (WISE), and x-ray data from NASA’s Chandra X-ray Observatory.

For the sake of their studies, Satyapal, Ellison, and their respective teams sought to detect dual AGNs, which are believed to be a consequence of galactic mergers. They began by consulting optical data from the SDSS to identify galaxies that appeared to be in the process of merging. Data from the all-sky WISE survey was then used to identify those galaxies that displayed the most powerful AGNs.

Illustration of a pair of black holes. Credit: NASA/CXC/A.Hobart

They then consulted data from the Chandra’s Advanced CCD Imaging Spectrometer (ACIS) and the LBT to identify seven galaxies that appeared to be in an advanced stage of merger. The study led by Ellison also relied on optical data provided by the Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey to pinpoint one of the new black hole pairs.

From the combined data, they found that five out of the seven merging galaxies hosted possible dual AGNs, which were separated by less than 10 kiloparsecs (roughly 30,000 light years). This was evidenced by the infrared data provided by WISE, which was consistent with what is predicted of rapidly growing supermassive black holes.
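The parsec-to-light-year conversion behind that separation figure is straightforward:

```python
# One parsec is approximately 3.2616 light-years, so an upper bound of
# 10 kiloparsecs corresponds to roughly 33,000 light-years.
PC_TO_LY = 3.2616

sep_kpc = 10  # upper bound on the pair separations reported in the study
print(f"{sep_kpc * 1000 * PC_TO_LY:,.0f} light-years")  # ~32,600 ly
```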

In addition, the Chandra data showed closely-separated pairs of x-ray sources, which is also consistent with black holes that have matter slowly being accreted onto them. This infrared and x-ray data also suggested that the supermassive black holes are buried in large amounts of dust and gas. As Ellison indicated, these findings were the result of painstaking work that consisted of sorting through multiple wavelengths of data:

“Our work shows that combining the infrared selection with X-ray follow-up is a very effective way to find these black hole pairs. X-rays and infrared radiation are able to penetrate the obscuring clouds of gas and dust surrounding these black hole pairs, and Chandra’s sharp vision is needed to separate them”.

Artist’s impression of binary black hole system in the process of merging. Credit: Bohn et al.

Before this study, fewer than ten pairs of growing black holes had been confirmed based on X-ray studies, and these were mostly found by chance. This latest work, which detected five black hole pairs using combined data, was therefore both fortunate and significant. Aside from bolstering the hypothesis that supermassive black holes form from the merger of smaller black holes, these studies also have serious implications for gravitational wave research.

“It is important to understand how common supermassive black hole pairs are, to help in predicting the signals for gravitational wave observatories,” said Satyapal. “With experiments already in place and future ones coming online, this is an exciting time to be researching merging black holes. We are in the early stages of a new era in exploring the universe.”

Since 2016, a total of four instances of gravitational waves have been detected by instruments like the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo Observatory. However, these detections all came from mergers of much smaller black holes – between eight and 36 Solar masses.

Supermassive Black Holes, on the other hand, are much more massive and will likely produce a much larger gravitational wave signature as they continue to draw closer together. And in a few hundred million years, when these pairs eventually do merge, the resulting energy produced by mass being converted into gravitational waves will be incredible.
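To put a rough number on “incredible”, E = mc² gives the energy released when mass is radiated away as gravitational waves. The three-solar-mass figure below echoes the amount converted in LIGO's first stellar-mass detection and is used only for scale; a supermassive merger would convert far more:

```python
# Back-of-the-envelope estimate of gravitational-wave energy via E = m * c^2.
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def gw_energy_joules(solar_masses):
    """Energy (J) released if the given mass is radiated as gravitational waves."""
    return solar_masses * M_SUN * C**2

print(f"{gw_energy_joules(3):.2e} J")  # ~5e47 J
```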

Artist’s conception of two merging black holes, similar to those detected by LIGO on January 4th, 2017. Credit: LIGO/Caltech

At present, detectors like LIGO and Virgo are not able to detect the gravitational waves created by Supermassive Black Hole pairs. This work is being done by arrays like the North American Nanohertz Observatory for Gravitational Waves (NANOGrav), which relies on high-precision millisecond pulsars to measure the influence of gravitational waves on space-time.

The proposed Laser Interferometer Space Antenna (LISA), which will be the first dedicated space-based gravitational wave detector, is also expected to help in the search. In the meantime, gravitational wave research has already benefited immensely from collaborative efforts like the one that exists between Advanced LIGO and Advanced Virgo.

In the future, scientists also anticipate that they will be able to study the interiors of supernovae through gravitational wave research. This is likely to reveal a great deal about the mechanisms behind black hole formation. Between all of these ongoing efforts and future developments, we can expect to “hear” a great deal more of the Universe and the most powerful forces at work within it.

Be sure to check out this animation that shows what the eventual merger of two of these black hole pairs will look like, courtesy of the Chandra X-ray Observatory:

Further Reading: Chandra (Harvard), arXiv, MNRAS

New Clues Emerge for the Existence of Planet 9

Artist's impression of Planet Nine, blocking out the Milky Way. The Sun is in the distance, with the orbit of Neptune shown as a ring. Credit: ESO/Tomruen/nagualdesign

Planet 9 cannot hide forever, and new research has narrowed the range of possible locations further! In January of 2016, astronomers Mike Brown and Konstantin Batygin published the first evidence that there might be another planet in our Solar System. Known as “Planet 9” (“Planet X” to some), this hypothetical body was believed to orbit at an extreme distance from our Sun, as evidenced by the orbits of certain extreme Kuiper Belt Objects (eKBOs).

Since that time, multiple studies have been published attempting to place constraints on Planet 9’s location. The latest once again comes from Brown and Batygin, who conducted an analytical assessment of all the processes that have indicated the presence of Planet 9 so far. Taken together, these indications show that the existence of this body is not only likely, but also essential to the Solar System as we know it.

The study, titled “Dynamical Evolution Induced by Planet Nine”, recently appeared online and has been accepted for publication in The Astronomical Journal. Whereas previous studies have pointed to the behavior of various populations of KBOs as proof of Planet 9, Brown and Batygin sought to provide a coherent theoretical description of the dynamical mechanisms responsible for these effects.

In the end, they concluded that it would be more difficult to imagine a Solar System without a Planet 9 than with one. As Konstantin Batygin explained in a recent NASA press statement:

“There are now five different lines of observational evidence pointing to the existence of Planet Nine. If you were to remove this explanation and imagine Planet Nine does not exist, then you generate more problems than you solve. All of a sudden, you have five different puzzles, and you must come up with five different theories to explain them.”

In 2016, Brown and Batygin described the first three lines of observational evidence for Planet 9. The first is the six extreme Kuiper Belt Objects that follow highly elliptical paths around the Sun, which are indicative of an unseen mechanism affecting their orbits. The second is the fact that the orbits of these bodies are all tilted the same way – about 30° “downward” relative to the plane of the Kuiper Belt.

The third hint came in the form of computer simulations that included Planet 9 as part of the Solar System. Based on these simulations, it was apparent that more objects should be tilted with respect to the Solar plane, on the order of about 90 degrees. Brown and Batygin found five such objects that happened to fit this orbital pattern, and suspected that more existed.

Caltech professor Mike Brown and assistant professor Konstantin Batygin have been working together to investigate Planet Nine. Credit: Lance Hayashida/Caltech

Since the publication of the original paper, two more indications have emerged for the existence of Planet 9. One involves the unexplained orbits of additional Kuiper Belt Objects, which were found to be orbiting in the opposite direction from everything else in the Solar System. This was a telltale indication that a relatively close body with a powerful gravitational influence was affecting their orbits.

And then there was the argument presented in a second paper by the team – this one led by Elizabeth Bailey, Batygin’s graduate student. This study argued that Planet 9 is responsible for tilting the orbits of the Solar planets over the past 4.5 billion years. This not only provided additional evidence for Planet 9, but also answered a long-standing mystery in astrophysics – why the planets are tilted 6 degrees relative to the Sun’s equator.

As Batygin indicated, all of this adds up to a solid case for the existence of a yet-to-be-discovered massive planet in the outer Solar System:

“No other model can explain the weirdness of these high-inclination orbits. It turns out that Planet Nine provides a natural avenue for their generation. These things have been twisted out of the solar system plane with help from Planet Nine and then scattered inward by Neptune.”

A predicted consequence of Planet Nine is that a second set of confined objects (represented in blue) should also exist. Credit: Caltech/R. Hurt (IPAC)

Recent studies have also shed some light on how and where Planet 9 originated. Whereas some have suggested that the planet moved to the edge of the Solar System after forming closer to the Sun, others have proposed that it might be an exoplanet captured early in the Solar System’s history. At present, the favored theory appears to be that it formed closer to the Sun and migrated outward over time.

Granted, there is not yet a scientific consensus when it comes to Planet 9, and other astronomers have offered alternative explanations for the evidence cited by Batygin and Brown. For instance, a recent analysis based on the Outer Solar System Origins Survey – which discovered more than 800 new Trans-Neptunian Objects (TNOs) – suggests that the evidence could also be consistent with a random distribution of such objects.

In the meantime, all that remains is to find direct evidence of the planet. At present, Batygin and Brown are attempting to do just that, using the Subaru Telescope at the Mauna Kea Observatory in Hawaii. Detecting the planet would not only settle the question of whether it exists at all, it would also help resolve a mystery that has emerged in recent years thanks to the discovery of thousands of extra-solar planets.

In short, thanks to the discovery of 3,529 confirmed exoplanets in 2,633 solar systems, astronomers have noticed that statistically, the most common types of planets are “Super-Earths” and “mini-Neptunes” – i.e. planets more massive than Earth but not more than about 10 Earth masses. If Planet 9, which is estimated to have 10 times the mass of Earth, is confirmed to exist, it could help explain why our own Solar System seems to lack such a planet.

Planet 9, we know you’re out there and we will find you! Unless you’re not, in which case, disregard this message!

Further Reading: NASA

Not an Alien Megastructure, a Cloud of Dust on a 700-Day Orbit

This illustration depicts a hypothetical uneven ring of dust orbiting KIC 8462852, also known as Boyajian's Star or Tabby's Star. Credit: NASA/JPL-Caltech

The mystery of KIC 8462852 (aka. Boyajian’s Star or Tabby’s Star) continues to excite and intrigue! Ever since it was first seen to be undergoing strange and sudden dips in brightness (back in October of 2015), astronomers have been speculating as to what could be causing this. Since that time, various explanations have been offered, including large asteroids, a large planet, a debris disk, or even an alien megastructure.

Many studies have been produced that have sought to assign some other natural explanation to the star’s behavior. The latest comes from an international team of scientists – which included Tabetha Boyajian, the lead author on the original 2016 paper. According to this latest study, which was recently published in The Astrophysical Journal, the star’s long-term dimming patterns are likely the result of an uneven dust cloud moving around the star.

VP Mike Pence Lays Out Administration’s Plan to go Back to the Moon

Vice President Mike Pence delivers opening remarks during the National Space Council's first meeting, Thursday, Oct. 5, 2017 at the Smithsonian National Air and Space Museum's Steven F. Udvar-Hazy Center in Chantilly, Va. Credits: NASA

Looking to the future of space exploration, NASA’s priorities are sometimes subject to change. In 2004, the Bush administration released its “Vision for Space Exploration“, which called for the development of rockets that would return astronauts to the Moon. This policy was later replaced by the NASA Authorization Act of 2010, which outlined plans to send humans to an asteroid by 2025 and to Mars in the 2030s.

Earlier today, on Thursday, October 5th, Vice President Mike Pence and several members of the Trump administration announced that their priorities have shifted once again. Instead of proceeding with NASA’s proposed “Journey to Mars“, the administration has set its sights on once again mounting crewed missions to the Moon and establishing a permanent presence on the lunar surface.

The announcement came during the inaugural meeting of the National Space Council, the newly-reestablished executive group that will be guiding US space policy in the coming years. Originally established in 1989 by then-president George H.W. Bush (and disbanded in 1993 by the Clinton administration), this council served the same purpose as the National Aeronautics and Space Council – which oversaw space policy between 1958 and 1973.

Acting NASA Administrator Robert Lightfoot, center, along with Deputy Chief Technology Officer of the United States Michael Kratsios, left, and Director of National Intelligence Daniel Coats, right. Credits: NASA

The meeting, titled “Leading the Next Frontier: An Event with the National Space Council“, took place at the Smithsonian National Air and Space Museum’s (NASM) Steven F. Udvar-Hazy Center in Chantilly, Virginia. The meeting was chaired by Vice President Mike Pence with the participation of NASA Administrator Robert Lightfoot, and was attended by Trump Administration cabinet members, senior officials, and aerospace industry leaders.

During the course of the meeting, which was live-streamed, Vice President Mike Pence laid out the administration’s plans for returning astronauts to the Moon. Emphasizing the need to restore NASA and America’s leadership in space, Pence compared the current situation to the early years of the Space Race and the crowning achievement that was the Apollo 11 mission. As he said:

It is altogether fitting that we chose this week for the first meeting of the National Space Council. Yesterday marked the 60th anniversary of Sputnik, that 184-pound satellite that changed the course of history. On that day, six decades ago yesterday, the race for space began and the then-Soviet Union took an early lead. But the sight of that light blinking across that October sky spurred America to action. We refused to accept a future in space written by the enemies of freedom, and so the United States of America vowed to claim our rightful place as the undisputed leader in the exploration of the heavens. And twelve years later, with “one giant leap for mankind”, America led in space.

Moving to the present, Pence indicated that the reestablishment of the National Space Council would put an end to the ways in which space exploration has stalled in recent decades. He also indicated how a return to the Moon – a goal which diminished in importance in the post-Apollo era – would recapture the spirit of the past and reinvigorate modern space exploration.

Lunar footprint left by the Apollo astronauts. Credit: NASA

As he expressed during the course of the meeting, the way space exploration has stalled is in part due to the way in which the Moon (as a destination) has diminished in importance:

“Our struggle to define the direction and purpose of America’s space program dates back decades to the post-Apollo period. We had just won the race to the Moon and suddenly the question became, ‘What should we do? Where should we go next?’ In the debate that followed, sending Americans to the Moon was treated as a triumph to be remembered, but not repeated. Every passing year that the Moon remained squarely in the rear-view mirror further eroded our ability to return to the lunar domain and made it more likely that we would forget why we ever wanted to go in the first place.”

A renewed mission to the Moon, claimed Pence, will put an end to decades in which not a single NASA astronaut has ventured beyond Low Earth Orbit. He further noted that since the retirement of the Space Shuttle Program, the US has been dependent on Russia to ferry astronauts to the International Space Station. He also voiced criticism of the Obama administration, claiming that it chose “capitulation” when it came to the space race.

While this new policy technically represents a break from the policy of the Obama administration, and a return to the policy of the Bush administration, Pence emphasized that returning to the Moon would be a stepping stone towards an eventual crewed mission to the Red Planet. This announcement also put an end to months of ambiguity regarding the Trump administration’s space policy.

Members of the National Space Council seen during the council’s first meeting on Thursday, Oct. 5, 2017 at the Smithsonian National Air and Space Museum’s Steven F. Udvar-Hazy Center in Chantilly, Va. Credit: NASA/Joel Kowsky

In the past, VP Pence has spoken about the need to return to the Moon and put boots on Mars, but nothing definitive was said. This ambiguity, it is worth noting, has also been a source of anxiety at NASA, where staff were unsure about the future budget environment. And while this meeting did indicate that the Trump administration has a policy, many aspects of it were already in place before the administration took office.

After the meeting concluded, acting NASA Administrator Robert Lightfoot spoke of the results in a NASA press statement. In reference to the direction VP Pence had indicated for the agency, he said the following:

“Specifically, NASA has been directed to develop a plan for an innovative and sustainable program of exploration with commercial and international partners to enable human expansion across the solar system, returning humans to the Moon for long-term exploration and utilization, followed by human missions to Mars and other destinations.”

Many of the details discussed at the meeting were already established as early as last September, when the NASA Transition Authorization Act of 2016 was passed – a provisional measure that guaranteed short-term stability for the agency by allocating $19.5 billion in funding for NASA for fiscal year 2017. Intrinsic to the Act was the cancellation of NASA’s Asteroid Robotic Redirect Mission (ARRM) in favor of a more cost-effective alternative.

Artist’s illustration of a possible astronaut mission to an asteroid. Credit: NASA Human Exploration Framework Team

As Lightfoot indicated, this would still be the case under the current administration’s plan:

“The recommendation to the president would modify the existing National Space Policy to provide focus and direction to some of NASA’s current activities and plans, and remove a previous guideline that NASA should undertake a human mission to an asteroid as the next human spaceflight milestone beyond low-Earth orbit.”

Lightfoot also reiterated what Pence said during the meeting: that renewed missions to the Moon would ultimately assist NASA’s efforts to mount crewed missions to Mars. This includes the importance of cis-lunar space to the exploration of both the Moon and Mars, as well as its use as a proving ground for future missions to Mars and beyond in the Solar System.

“Based on a number of conversations I’ve had with the council,” he said, “we have highlighted a number of initiatives underway in this important area, including a study of an orbital gateway or outpost that could support a sustained cadence of robotic and human missions, as well as ensuing human missions to the lunar and Mars surfaces, and other destinations.”

Artist illustration of the habitation module aboard the Deep Space Gateway. Credit: Lockheed Martin

While this latest announcement does confirm what many have suspected for some time – that the Trump administration would prioritize lunar exploration – much ambiguity remains. While Pence emphasized that the re-establishment of the NSC was intrinsic to restoring American leadership in space, very little appears to have changed since the NASA Transition Authorization Act of 2016.

What’s more, despite Pence’s claims of “capitulation” on behalf of the Obama administration, much of the current administration’s policy represents a continuation of the NASA Authorization Act of 2010. These include the use of the Space Launch System (SLS), the Orion spacecraft, and the restoration of domestic launch capability. In short, much of the Trump administration’s plans to restore American leadership in space are piggybacking on the accomplishments of the Obama administration.

Beyond that, the creation of the Deep Space Gateway appears unaffected, since it is central both to renewed missions to the Moon and to crewed missions to Mars. And the long-term plan for the exploration of Mars appears to still be intact. So in many ways, this latest announcement is less news than a confirmation of existing plans – which is arguably good news.

When it comes to organizations like NASA and space exploration in general, continuity is not only preferable, but necessary.

Further Reading: NASA, NASA (2)

New Study Proposes a Giant, Space-Based Solar Flare Shield for Earth

A massive prominence erupts from the surface of the sun. Credit: NASA Goddard Space Flight Center

In today’s modern, fast-paced world, human activity is very much reliant on electrical infrastructure. If the power grids go down, our climate control systems will shut off, our computers will die, and all electronic forms of commerce and communication will cease. But in addition to that, human activity in the 21st century is also becoming increasingly dependent upon the infrastructure located in Low Earth Orbit (LEO).

Aside from the many telecommunications satellites that are currently in space, there’s also the International Space Station and a fleet of GPS satellites. It is for this reason that solar flare activity is considered a serious hazard, and mitigation of it a priority. Looking to address that, a team of scientists from Harvard University recently released a study that proposes a bold solution – placing a giant magnetic shield in orbit.

The study – which was the work of Dr. Manasvi Lingam and Professor Abraham Loeb of the Harvard-Smithsonian Center for Astrophysics (CfA) – recently appeared online under the title “Impact and Mitigation Strategy for Future Solar Flares“. As they explain, solar flares pose a particularly grave risk in today’s world, and will become an even greater threat due to humanity’s growing presence in LEO.

Solar flares have been an ongoing concern for over 150 years, ever since the famous Carrington Event of 1859. Since that time, a great deal of effort has been dedicated to the study of solar flares from both a theoretical and observational standpoint. And thanks to the advances made since then in astronomy and space exploration, much has been learned about the phenomenon known as “space weather”.

At the same time, humanity’s increased reliance on electricity and space-based infrastructure has also made us more vulnerable to extreme space weather events. In fact, if the Carrington Event were to take place today, it is estimated that it would cause global damage to electric power grids, satellite communications, and global supply chains.

The cumulative worldwide economic losses, according to a 2009 report by the Space Studies Board (“Severe Space Weather Events–Understanding Societal and Economic Impacts”), would be $10 trillion, and recovery would take several years. And yet, as Professor Loeb explained to Universe Today via email, this threat from space has received far less attention than other possible threats.

“In terms of risk from the sky, most of the attention in the past was dedicated to asteroids,” said Loeb. “They killed the dinosaurs and their physical impact in the past was the same as it will be in the future, unless their orbits are deflected. However, solar flares have little biological impact and their main impact is on technology. But a century ago, there was not much technological infrastructure around, and technology is growing exponentially. Therefore, the damage is highly asymmetric between the past and future.”

Artist’s concept of a large asteroid passing by the Earth-Moon system. Credit: A combination of ESO/NASA images courtesy of Jason Major/Lights in the Dark.

To address this, Lingam and Loeb developed a simple mathematical model to assess the economic losses caused by solar flare activity over time. This model considered the increasing risk of damage to technological infrastructure based on two factors. For one, they considered the fact that the maximum energy of the flares expected over a given period increases with the length of that period, then coupled this with the exponential growth of technology and GDP.

What they determined was that on longer time scales, the rare types of solar flares that are very powerful become much more likely. Coupled with humanity’s growing presence in, and dependence on, spacecraft and satellites in LEO, this adds up to a dangerous conjunction somewhere down the road. Or as Loeb explained:

“We predict that within ~150 years, there will be an event that causes damage comparable to the current US GDP of ~20 trillion dollars, and the damage will increase exponentially at later times until technological development will saturate. Such a forecast was never attempted before. We also suggest a novel idea for how to reduce the damage from energetic particles by a magnetic shield. This was my idea and was not proposed before.”
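The shape of this forecast can be illustrated with a toy calculation. To be clear, this is a simplified sketch, not the actual model from the Lingam and Loeb paper; the scaling exponent and growth rate below are assumed values chosen only to show how the two trends compound:

```python
import math

def worst_flare_energy(years, e0=1.0, alpha=0.3):
    """Maximum flare energy expected within a window of `years` (arbitrary
    units). Rare, powerful flares become more likely over longer windows,
    so the expected maximum grows slowly (a power law; alpha is assumed)."""
    return e0 * years ** alpha

def economic_exposure(years, d0=0.1, growth=0.03):
    """Damage per unit flare energy, growing ~3%/yr with technology and GDP
    (an assumed rate)."""
    return d0 * math.exp(growth * years)

def expected_damage(years):
    """Expected worst-case damage: flare energy times economic exposure."""
    return worst_flare_energy(years) * economic_exposure(years)

for t in (10, 50, 100, 150):
    print(f"t = {t:3d} yr: relative damage = {expected_damage(t):8.2f}")
```

Because the slow power-law growth of flare energy is multiplied by exponentially growing exposure, the expected damage climbs steeply with time, which is the qualitative behavior the study describes.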

To address this growing risk, Lingam and Loeb also considered the possibility of placing a magnetic shield between Earth and the Sun. This shield would be placed at the Earth-Sun Lagrange Point 1, where it would be able to deflect charged particles and create an artificial bow shock around Earth. In this sense, the shield would protect Earth in a way similar to what its magnetic field already does, but to greater effect.
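The basic physics of magnetic deflection can be sketched with a gyroradius estimate: a charged particle in a magnetic field circles with radius r = mv/(qB), so a shield only needs a field region comparable in size to that radius to bend incoming protons away. The field strength and particle energy below are illustrative assumptions, not figures from the Lingam and Loeb paper:

```python
import math

PROTON_MASS = 1.673e-27   # kg
ELEM_CHARGE = 1.602e-19   # C

def gyroradius(kinetic_energy_MeV, B_tesla):
    """Larmor radius of a proton, r = m*v / (q*B), using the
    non-relativistic approximation (adequate at ~10 MeV, where the
    kinetic energy is small compared to the 938 MeV rest energy)."""
    energy_joules = kinetic_energy_MeV * 1e6 * ELEM_CHARGE
    v = math.sqrt(2 * energy_joules / PROTON_MASS)
    return PROTON_MASS * v / (ELEM_CHARGE * B_tesla)

# A 10 MeV solar-flare proton in an assumed 1e-4 T (1 gauss) shield field:
r = gyroradius(10, 1e-4)
print(f"gyroradius ≈ {r / 1000:.1f} km")
```

The answer comes out to a few kilometers, which is why a structure of modest (if still enormous by engineering standards) size can, in principle, divert energetic particles that would otherwise strike Earth's vicinity.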

Illustration of the proposed magnetic deflector placed at the Earth-Sun L1 Lagrange Point. Credit: Lingam and Loeb, 2017

Based on their assessment, Lingam and Loeb indicate that such a shield is technically feasible in terms of its basic physical parameters. They were also able to provide a rudimentary timeline for its construction, along with some rough cost estimates. As Loeb indicated, such a shield could be built before this century is over, and at a fraction of the cost of the damage that solar flares would otherwise incur.

“The engineering project associated with the magnetic shield that we propose could take a few decades to construct in space,” he said. “The cost for lifting the needed infrastructure to space (weighing 100,000 tons) will likely be of order 100 billion dollars, much less than the expected damage over a century.”

Interestingly enough, the idea of using a magnetic shield to protect planets has been proposed before. For example, this type of shield was also the subject of a presentation at this year’s “Planetary Science Vision 2050 Workshop“, which was hosted by NASA’s Planetary Science Division (PSD). There, the shield was recommended as a means of enhancing Mars’ atmosphere and facilitating crewed missions to its surface in the future.

During the course of the presentation, titled “A Future Mars Environment for Science and Exploration“, Jim Green, the director of NASA’s Planetary Science Division, discussed how a magnetic shield could protect Mars’ tenuous atmosphere from the solar wind. This would allow the atmosphere to replenish over time, which would have the added benefit of warming Mars up and allowing liquid water to flow on its surface again. If this sounds similar to proposals for terraforming Mars, that’s because it is!

Artist’s impression of a flaring red dwarf star, orbited by an exoplanet. Credit: NASA, ESA, and G. Bacon (STScI)

Beyond Earth and the Solar System, the implications of this study are quite far-reaching. In recent years, many terrestrial planets have been found orbiting within nearby M-type (aka. red dwarf) star systems. Because these planets orbit close to their respective suns, and because of the variable and unstable nature of M-type stars, scientists have expressed doubts about whether or not they could actually be habitable.

In short, scientists have ventured that over the course of billions of years, rocky planets that orbit close to their suns, are tidally-locked with them, and are subject to regular solar flares would lose their atmospheres. In this respect, magnetic shields could be a possible solution to creating extra-solar colonies. Place a large shield in orbit at the L1 Lagrange point, and you never have to worry again about powerful magnetic storms ravaging the planet!

On top of that, this study offers a possible resolution to the Fermi Paradox. When looking for signs of Extra-Terrestrial Intelligence (ETI), it might make sense to monitor distant stars for signs of an orbiting magnetic shield. As Prof. Loeb explained, such structures may have already been detected around distant stars, and could explain some of the unusual observations astronomers have made:

“The imprint of a shield built by another civilization could involve the changes it induces in the brightness of the host star due to occultation (similar behavior to Tabby’s star) if the structure is big enough. The situation could be similar to Dyson’s spheres, but instead of harvesting the energy of the star the purpose of the infrastructure is to protect a technological civilization on a planet from the flares of its host star.”

It is a foregone conclusion that as time and technology progress, humanity’s presence in (and reliance on) space will increase. As such, preparing for the most drastic space weather events the Solar System can throw at us just makes sense. And when it comes to the big questions like “are we alone in the Universe?”, it also makes sense to take our boldest concepts and proposals and consider how they might point the way towards extra-terrestrial intelligence.

Further Reading: arXiv

LIGO Scientists who Detected Gravitational Waves Awarded Nobel Prize in Physics

Barry C. Barish and Kip S. Thorne, two of the recipients for the 2017 Nobel Prize in physics for their work with gravitational wave research. Credit: Caltech

In February of 2016, scientists working for the Laser Interferometer Gravitational-Wave Observatory (LIGO) made history when they announced the first-ever detection of gravitational waves. Since that time, multiple detections have taken place and scientific collaborations between observatories – like Advanced LIGO and Advanced Virgo – are allowing for unprecedented levels of sensitivity and data sharing.

Not only was the first-time detection of gravitational waves an historic accomplishment, it ushered in a new era of astrophysics. It is little wonder then why the three researchers who were central to the first detection have been awarded the 2017 Nobel Prize in Physics. The prize was awarded jointly to Caltech professors emeritus Kip S. Thorne and Barry C. Barish, along with MIT professor emeritus Rainer Weiss.

To put it simply, gravitational waves are ripples in space-time that are formed by major astronomical events – such as the merger of a binary black hole pair. They were first predicted over a century ago by Einstein’s Theory of General Relativity, which indicated that massive perturbations would alter the structure of space-time. However, it was not until recent years that evidence of these waves was observed for the first time.

The first signal was detected by LIGO’s twin observatories – in Hanford, Washington, and Livingston, Louisiana, respectively – and traced to a black hole merger 1.3 billion light-years away. To date, four detections have been made, all resulting from the mergers of black-hole pairs. The three subsequent detections took place on December 26, 2015, January 4, 2017, and August 14, 2017, the last being detected jointly by LIGO and the European Virgo gravitational-wave detector.

For the role they played in this accomplishment, one half of the prize was awarded jointly to Caltech’s Barry C. Barish – the Ronald and Maxine Linde Professor of Physics, Emeritus – and Kip S. Thorne, the Richard P. Feynman Professor of Theoretical Physics, Emeritus. The other half was awarded to Rainer Weiss, Professor of Physics, Emeritus, at the Massachusetts Institute of Technology (MIT).

As Caltech president Thomas F. Rosenbaum – the Sonja and William Davidow Presidential Chair and Professor of Physics – said in a recent Caltech press statement:

“I am delighted and honored to congratulate Kip and Barry, as well as Rai Weiss of MIT, on the award this morning of the 2017 Nobel Prize in Physics. The first direct observation of gravitational waves by LIGO is an extraordinary demonstration of scientific vision and persistence. Through four decades of development of exquisitely sensitive instrumentation—pushing the capacity of our imaginations—we are now able to glimpse cosmic processes that were previously undetectable. It is truly the start of a new era in astrophysics.”

This accomplishment was all the more impressive considering that Albert Einstein, who first predicted their existence, believed gravitational waves would be too weak to study. However, by the 1960s, advances in laser technology and new insights into possible astrophysical sources led scientists to conclude that these waves might actually be detectable.

The first gravitational-wave detectors were built by Joseph Weber, an astrophysicist from the University of Maryland. His detectors, built in the 1960s, consisted of large aluminum cylinders that would be driven to vibrate by passing gravitational waves. Other attempts followed, but all proved unsuccessful, prompting a shift towards a new type of detector based on interferometry.

One such instrument was developed by Weiss at MIT, which relied on the technique known as laser interferometry. In this kind of instrument, gravitational waves are measured using widely spaced and separated mirrors that reflect lasers over long distances. When gravitational waves cause space to stretch and squeeze by infinitesimal amounts, it causes the reflected light inside the detector to shift minutely.
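Just how minute those shifts are can be shown with some back-of-the-envelope arithmetic. Assuming a strain of order 10⁻²¹ – roughly the order of the first detected signals – the change in a LIGO-scale 4 km arm works out to:

```python
# Order-of-magnitude sketch: arm-length change produced by a gravitational
# wave of strain h in an interferometer arm of length L (delta_L = h * L).
arm_length_m = 4_000        # LIGO arm length, about 4 km
strain = 1e-21              # assumed strain, order of the first detections

delta_L = strain * arm_length_m
proton_diameter_m = 1.7e-15  # approximate proton diameter for comparison

print(f"arm-length change: {delta_L:.1e} m")
print(f"fraction of a proton diameter: {delta_L / proton_diameter_m:.1e}")
```

The result, around 10⁻¹⁸ m, is a small fraction of the diameter of a proton, which is why decades of instrument development were needed before a detection became possible.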

At the same time, Thorne – along with his students and postdocs at Caltech – began working to improve the theory of gravitational waves. This included new estimates of the strength and frequency of waves produced by objects like black holes, neutron stars, and supernovae. The work culminated in a 1972 paper that Thorne co-published with his student Bill Press, summarizing their vision of how gravitational waves could be studied.

That same year, Weiss also published a detailed analysis of interferometers and their potential for astrophysical research. In this paper, he stated that larger-scale operations – measuring several km or more in size – might have a shot at detecting gravitational waves. He also identified the major challenges to detection (such as vibrations from the Earth) and proposed possible solutions for countering them.

Barry C. Barish and Kip S. Thorne, two of three recipients of the 2017 Nobel Prize in Physics. Credit: Caltech

In 1975, Weiss invited Thorne to speak at a NASA committee meeting in Washington, D.C., and the two spent an entire night talking about gravitational experiments. As a result of their conversation, Thorne went back to Caltech and proposed creating an experimental gravity group, which would work on interferometers in parallel with researchers at MIT, the University of Glasgow, and the University of Garching (where similar experiments were being conducted).

Development on the first interferometer began shortly thereafter at Caltech, which led to the creation of a 40-meter (130-foot) prototype to test Weiss’ theories about gravitational waves. In 1984, all of the work being conducted by these respective institutions came together. Caltech and MIT, with the support of the National Science Foundation (NSF), formed the LIGO collaboration and began work on its two interferometers in Hanford and Livingston.

The construction of LIGO was a major challenge, both logistically and technically. However, things were helped immensely when Barry Barish (then a Caltech particle physicist) became the Principal Investigator (PI) of LIGO in 1994. After a decade of stalled attempts, he was also made the director of LIGO and put its construction back on track. He also expanded the research team and developed a detailed work plan for the NSF.

As Barish indicated, the work he did with LIGO was something of a dream come true:

“I always wanted to be an experimental physicist and was attracted to the idea of using continuing advances in technology to carry out fundamental science experiments that could not be done otherwise. LIGO is a prime example of what couldn’t be done before. Although it was a very large-scale project, the challenges were very different from the way we build a bridge or carry out other large engineering projects. For LIGO, the challenge was and is how to develop and design advanced instrumentation on a large scale, even as the project evolves.”

LIGO’s two facilities, located in Livingston, Louisiana, and Hanford, Washington. Credit: ligo.caltech.edu

By 1999, construction had wrapped up on the LIGO observatories, and by 2002, LIGO began to take data. In 2008, work began on improving the original detectors, a program known as the Advanced LIGO Project. Scaling up from the 40-m prototype to LIGO’s current 4-km (2.5 mi) interferometers was a massive undertaking, and therefore needed to be broken down into steps.

The first step took place between 2002 and 2010, when the team built and tested the initial interferometers. While this did not result in any detections, it did demonstrate the observatory’s basic concepts and solved many of the technical obstacles. The next phase – called Advanced LIGO, which took place between 2010 and 2015 – allowed the detectors to achieve new levels of sensitivity.

These upgrades, which also happened under Barish’s leadership, allowed for the development of several key technologies which ultimately made the first detection possible. As Barish explained:

“In the initial phase of LIGO, in order to isolate the detectors from the earth’s motion, we used a suspension system that consisted of test-mass mirrors hung by piano wire and used a multiple-stage set of passive shock absorbers, similar to those in your car. We knew this probably would not be good enough to detect gravitational waves, so we, in the LIGO Laboratory, developed an ambitious program for Advanced LIGO that incorporated a new suspension system to stabilize the mirrors and an active seismic isolation system to sense and correct for ground motions.”

Rainer Weiss, famed MIT physicist and partial winner of the 2017 Nobel Prize in Physics. Credit: MIT/Bryce Vickmark

Given how central Thorne, Weiss and Barish were to the study of gravitational waves, all three were rightly recognized as this year’s recipients of the Nobel Prize in Physics. Both Thorne and Barish were notified that they had won in the early morning hours of October 3rd, 2017. In response to the news, both scientists were sure to acknowledge the ongoing efforts of LIGO, the science teams that have contributed to it, and the efforts of Caltech and MIT in creating and maintaining the observatories.

“The prize rightfully belongs to the hundreds of LIGO scientists and engineers who built and perfected our complex gravitational-wave interferometers, and the hundreds of LIGO and Virgo scientists who found the gravitational-wave signals in LIGO’s noisy data and extracted the waves’ information,” said Thorne. “It is unfortunate that, due to the statutes of the Nobel Foundation, the prize has to go to no more than three people, when our marvelous discovery is the work of more than a thousand.”

“I am humbled and honored to receive this award,” said Barish. “The detection of gravitational waves is truly a triumph of modern large-scale experimental physics. Over several decades, our teams at Caltech and MIT developed LIGO into the incredibly sensitive device that made the discovery. When the signal reached LIGO from a collision of two stellar black holes that occurred 1.3 billion years ago, the 1,000-scientist-strong LIGO Scientific Collaboration was able to both identify the candidate event within minutes and perform the detailed analysis that convincingly demonstrated that gravitational waves exist.”

Looking ahead, it is also pretty clear that Advanced LIGO, Advanced Virgo and other gravitational wave observatories around the world are just getting started. In addition to having detected four separate events, recent studies have indicated that gravitational wave detection could also open up new frontiers for astronomical and cosmological research.

For instance, a recent study by a team of researchers from the Monash Center for Astrophysics proposed a theoretical concept known as ‘orphan memory’. According to their research, gravitational waves do not merely pass through space-time; they leave permanent ripples in its structure. By studying the “orphans” of past events, gravitational waves could be studied both as they reach Earth and long after they have passed.

In addition, a study was released in August by a team of astronomers from the Center for Cosmology at the University of California Irvine that indicated that black hole mergers are far more common than we thought. After conducting a survey of the cosmos intended to calculate and categorize black holes, the UCI team determined that there could be as many as 100 million black holes in the galaxy.

Another recent study indicated that the Advanced LIGO, GEO 600, and Virgo gravitational-wave detector network could also be used to detect the gravitational waves created by supernovae. By detecting the waves created by stars that explode near the end of their lifespans, astronomers may be able to see inside the hearts of collapsing stars for the first time and probe the mechanics of black hole formation.

The Nobel Prize in Physics is one of the highest honors that can be bestowed upon a scientist. But even greater than that is the knowledge that great things resulted from one’s own work. Decades after Thorne, Weiss and Barish began proposing gravitational wave studies and working towards the creation of detectors, scientists from all over the world are making profound discoveries that are revolutionizing the way we think of the Universe.

And as these scientists will surely attest, what we’ve seen so far is just the tip of the iceberg. One can imagine that somewhere, Einstein is also beaming with pride. As with other research pertaining to his theory of General Relativity, the study of gravitational waves is demonstrating that even after a century, his predictions are still bang on!

And be sure to check out this video of the Caltech Press Conference where Barish and Thorne were honored for their accomplishments:

Further Reading: NASA, Caltech

Determining the Mass of the Milky Way Using Hypervelocity Stars

An artist's conception of a hypervelocity star that has escaped the Milky Way. Credit: NASA

For centuries, astronomers have been looking beyond our Solar System to learn more about the Milky Way Galaxy. And yet, there are still many things about it that elude us, such as knowing its precise mass. Determining this is important to understanding the history of galaxy formation and the evolution of our Universe. As such, astronomers have attempted various techniques for measuring the true mass of the Milky Way.

So far, none of these methods have been particularly successful. However, a new study by a team of researchers from the Harvard-Smithsonian Center for Astrophysics proposed a new and interesting way to determine how much mass is in the Milky Way. By using hypervelocity stars (HVSs) that have been ejected from the center of the galaxy as a reference point, they claim that we can constrain the mass of our galaxy.

Their study, titled “Constraining Milky Way Mass with Hypervelocity Stars“, was recently published in the journal Astronomy and Astrophysics. The study was produced by Dr. Giacomo Fragione, an astrophysicist at the University of Rome, and Professor Abraham Loeb – the Frank B. Baird, Jr. Professor of Science, the Chair of the Astronomy Department, and the Director of the Institute for Theory and Computation at Harvard University.

Stars speeding through the Galaxy. Credit: ESA

To be clear, determining the mass of the Milky Way Galaxy is no simple task. On the one hand, observations are difficult because the Solar System lies deep within the disk of the galaxy itself. But at the same time, there’s also the mass of our galaxy’s dark matter halo, which is difficult to measure since it is not “luminous”, and therefore invisible to conventional methods of detection.

Current estimates of the galaxy’s total mass are based on the motions of tidal streamers of gas and globular clusters, which are both influenced by the gravitational mass of the galaxy. But so far, these measurements have produced mass estimates that range from one to several trillion solar-masses. As Professor Loeb explained to Universe Today via email, precisely measuring the mass of the Milky Way is of great importance to astronomers:

“The Milky Way provides a laboratory for testing the standard cosmological model. This model predicts that the number of satellite galaxies of the Milky Way depends sensitively on its mass. When comparing the predictions to the census of known satellite galaxies, it is essential to know the Milky Way mass. Moreover, the total mass calibrates the amount of invisible (dark) matter and sets the depth of the gravitational potential well and implies how fast should stars move for them to escape to intergalactic space.”

For the sake of their study, Prof. Loeb and Dr. Fragione therefore chose to take a novel approach, which involved modeling the motions of HVSs to determine the mass of our galaxy. More than 20 HVSs have been discovered within our galaxy so far, which travel at speeds of up to 700 km/s (435 mi/s) and are located at distances of about 100 to 50,000 light-years from the galactic center.

Artist’s conception of a hypervelocity star heading out from a spiral galaxy (similar to the Milky Way) and moving into dark matter nearby. Credit: Ben Bromley, University of Utah

These stars are thought to have been ejected from the center of our galaxy thanks to the interactions of binary stars with the supermassive black hole (SMBH) at the center of our galaxy – aka. Sagittarius A*. While their exact cause is still the subject of debate, the orbits of HVSs can be calculated since they are completely determined by the gravitational field of the galaxy.

As they explain in their study, the researchers used the asymmetry in the radial velocity distribution of stars in the galactic halo to determine the galaxy’s gravitational potential. This asymmetry depends on the galaxy’s escape speed, provided that the time it takes an HVS to complete a single orbit is shorter than the lifetime of the halo stars.

From this, they were able to discriminate between different models for the Milky Way and the gravitational force it exerts. By adopting the nominal travel time of these observed HVSs – which they calculated to be about 330 million years, about the same as the average lifetime of halo stars – they were able to derive gravitational estimates for the Milky Way which allowed for estimates of its overall mass.
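To give a sense of how an escape-speed calibration maps onto a mass estimate, the point-mass relation v_esc = sqrt(2GM/r) can be sketched in a few lines of Python. This is only a toy model, not the method used in the study: the Milky Way’s mass is extended rather than concentrated at a point, so these values overestimate the true escape speed, and the 15-kiloparsec distance is an assumed, illustrative HVS distance.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.0857e19      # one kiloparsec, in meters

def escape_speed_kms(mass_msun, radius_kpc):
    """Point-mass escape speed v_esc = sqrt(2GM/r), returned in km/s."""
    m = mass_msun * M_SUN
    r = radius_kpc * KPC
    return math.sqrt(2 * G * m / r) / 1e3

# The study's mass range, evaluated at an assumed 15 kpc distance
for mass in (1.2e12, 1.9e12):
    print(f"M = {mass:.1e} Msun -> v_esc ~ {escape_speed_kms(mass, 15):.0f} km/s")
```

Even in this crude approximation, moving between the low and high ends of the mass range shifts the escape speed by a couple of hundred km/s, which hints at why calibrating the minimum speed of unbound stars is such a sensitive probe of the total mass.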

“By calibrating the minimum speed of unbound stars, we find that the Milky Way mass is in the range of 1.2-1.9 trillion solar masses,” said Loeb. While still subject to a range, this latest estimate is a significant improvement over previous estimates. What’s more, these estimates are consistent with the current cosmological model that attempts to account for all visible matter in the Universe, as well as dark matter and dark energy – the Lambda-CDM model.

Distribution of dark matter when the Universe was about 3 billion years old, obtained from a numerical simulation of galaxy formation. Credit: VIRGO Consortium/Alexandre Amblard/ESA

“The inferred Milky Way mass is in the range expected within the standard cosmological model,” said Loeb, “where the amount of dark matter is about five times larger than that of ordinary (luminous) matter.”

Based on this breakdown, it can be said that normal matter in our galaxy – i.e. stars, planets, dust and gas – accounts for between 240 and 380 billion Solar Masses. So not only does this latest study provide more precise mass constraints for our galaxy, it could also help us to determine exactly how many star systems are out there – current estimates say that the Milky Way has between 200 and 400 billion stars and 100 billion planets.
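The breakdown above is simple arithmetic: the quoted 240 to 380 billion solar masses of normal matter correspond to roughly one-fifth of the 1.2 to 1.9 trillion solar-mass total. A quick sketch, noting that the one-fifth baryon fraction here is read off from the article’s own figures rather than taken from the study:

```python
# Total-mass range from the study, in solar masses
TOTAL_MASS_RANGE = (1.2e12, 1.9e12)

# Ordinary (luminous) matter fraction implied by the quoted
# 240-380 billion solar-mass figures: roughly one-fifth of the total.
BARYON_FRACTION = 0.2

for total in TOTAL_MASS_RANGE:
    ordinary = total * BARYON_FRACTION
    print(f"Total {total:.1e} Msun -> ordinary matter ~ {ordinary / 1e9:.0f} billion Msun")
```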

Beyond that, this study is also significant to the study of cosmic formation and evolution. By placing more precise estimates on our galaxy’s mass, ones which are consistent with the current breakdown of normal matter and dark matter, cosmologists will be able to construct more accurate accounts of how our Universe came to be. One step closer to understanding the Universe on the grandest of scales!

Further Reading: Harvard Smithsonian CfA, Astronomy and Astrophysics