Black holes have been an endless source of fascination ever since Einstein’s Theory of General Relativity predicted their existence. In the past 100 years, the study of black holes has advanced considerably, but the awe and mystery of these objects remain. For instance, scientists have noted that in some cases, black holes have massive jets of charged particles emanating from them that extend for millions of light years.
These “relativistic jets” – so-named because they propel charged particles at a fraction of the speed of light – have puzzled astronomers for years. But thanks to a recent study conducted by an international team of researchers, new insight has been gained into these jets. Consistent with General Relativity, the researchers showed that these jets gradually precess (i.e. change direction) as a result of space-time being dragged around by the black hole’s rotation – an effect known as “frame dragging”.
For the sake of their study, the team conducted simulations using the Blue Waters supercomputer at the University of Illinois. The simulations they conducted were the first ever to model the behavior of relativistic jets coming from Supermassive Black Holes (SMBHs). With close to a billion computational cells, it was also the highest-resolution simulation of an accreting black hole ever achieved.
As Alexander Tchekhovskoy, one of the researchers behind the study, explained:

“Understanding how rotating black holes drag the space-time around them and how this process affects what we see through the telescopes remains a crucial, difficult-to-crack puzzle. Fortunately, the breakthroughs in code development and leaps in supercomputer architecture are bringing us ever closer to finding the answers.”
Like all black holes, Supermassive Black Holes regularly engulf (i.e. accrete) matter. However, rapidly spinning SMBHs are also known for the way they emit energy in the form of relativistic jets. The matter that feeds these black holes forms a rotating disk around them – an accretion disk – which is characterized by hot, energized gas and magnetic field lines.
It is the presence of these field lines that allows black holes to propel energy outward in the form of jets. Because these jets are so large, they are easier to study than the black holes themselves. By observing them, astronomers are able to measure how quickly the direction of these jets changes, which reveals things about the rotation of the black holes themselves – such as the orientation and size of their accretion disks.
Advanced computer simulations are necessary when it comes to the study of black holes, largely because black holes themselves emit no light and are typically very far away. For instance, the closest SMBH to Earth is Sagittarius A*, which is located about 26,000 light-years away at the center of our galaxy. As such, simulations are the only way to determine how a highly complex system like a black hole operates.
In previous simulations, scientists operated under the assumption that black hole accretion disks were aligned with their black holes’ rotation. However, most SMBHs have been found to have tilted disks – i.e. disks that rotate around a different axis than the black hole itself. This study broke new ground by showing how such disks can change direction relative to their black hole, leading to precessing jets that periodically change their direction.
This was previously unknown because of the incredible amount of computing power needed to construct 3-D simulations of the region surrounding a rapidly spinning black hole. With the support of a National Science Foundation (NSF) grant, the team was able to achieve this by using Blue Waters, one of the largest supercomputers in the world.
With this supercomputer at their disposal, the team was able to construct the first black hole simulation code to be accelerated using graphical processing units (GPUs). Thanks to this combination, the team was able to carry out simulations with the highest level of resolution ever achieved – i.e. close to a billion computational cells. As Tchekhovskoy explained:
“The high resolution allowed us, for the first time, to ensure that small-scale turbulent disk motions are accurately captured in our models. To our surprise, these motions turned out to be so strong that they caused the disk to fatten up and the disk precession to stop. This suggests that precession can come about in bursts.”
The precession of relativistic jets could explain why fluctuations of light have been observed coming from around black holes in the past – phenomena known as quasi-periodic oscillations (QPOs). These oscillations, which were first discovered by Michiel van der Klis (one of the co-authors on the study), operate in much the same way as a pulsar’s beams, which appear to have a strobing effect.
This study is one of many being conducted on rotating black holes around the world, the purpose of which is to gain a better understanding of recent discoveries like gravitational waves, which are caused by the merger of black holes. These studies are also being applied to observations from the Event Horizon Telescope, which captured the first images of Sagittarius A*’s shadow. What they will reveal is sure to excite and amaze, and potentially deepen the mystery of black holes.
In the past century, the study of black holes has advanced considerably – from the purely theoretical, to indirect studies of the effects they have on surrounding matter, to the study of gravitational waves themselves. Perhaps one day, we might actually be able to study them directly or (if it’s not too much to hope for) peer directly inside them!
Welcome back to our series on Exoplanet-Hunting methods! Today, we look at the curious and unique method known as Gravitational Microlensing.
The hunt for extra-solar planets sure has heated up in the past decade. Thanks to improvements made in technology and methodology, the number of exoplanets that have been observed (as of December 1st, 2017) has reached 3,710 planets in 2,780 star systems, with 621 systems boasting multiple planets. Unfortunately, due to various limits astronomers are forced to contend with, the vast majority have been discovered using indirect methods.
One of the more commonly-used methods for indirectly detecting exoplanets is known as Gravitational Microlensing. Essentially, this method relies on the gravitational force of an intervening object to bend and focus the light coming from a more distant star, causing it to brighten temporarily. If the foreground (lensing) star hosts a planet, the planet’s own gravity adds a brief, measurable anomaly to this brightening, which can then be used to determine the presence of a planet.
In this respect, Gravitational Microlensing is a scaled-down version of Gravitational Lensing, where an intervening object (like a galaxy cluster) is used to focus light coming from a galaxy or other object located beyond it. It also incorporates a key element of the highly-effective Transit Method, where stars are monitored for dips in brightness to indicate the presence of an exoplanet.
In accordance with Einstein’s Theory of General Relativity, gravity causes the fabric of spacetime to bend, which distorts the path of light passing near a massive object. The object can also act as a lens, focusing the light and making distant objects (like stars) appear brighter to an observer. This effect occurs only when the two stars are almost exactly aligned relative to the observer (i.e. one positioned in front of the other).
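For a single point-like lens, this brightening follows a simple analytic light curve. The sketch below illustrates the standard point-source, point-lens magnification; the impact parameter `u0` and Einstein-crossing time `tE` are illustrative assumptions, not values from any particular survey:

```python
import math

def magnification(u):
    """Magnification of a point source by a point lens, where u is the
    lens-source separation in units of the Einstein radius."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

def u_of_t(t, t0=0.0, u0=0.1, tE=20.0):
    """Lens-source separation vs. time (days) for impact parameter u0
    and Einstein-crossing time tE -- both illustrative here."""
    return math.sqrt(u0 ** 2 + ((t - t0) / tE) ** 2)

# Peak brightening at closest approach (t = t0), fading back to
# normal brightness long before/after the event:
peak = magnification(u_of_t(0.0))    # u = u0 = 0.1 -> strong brightening
far = magnification(u_of_t(100.0))   # well past the event -> ~no effect
print(peak, far)
```

The smaller the impact parameter `u0`, the sharper and brighter the peak – which is why near-perfect alignments, though rare, produce such striking events.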
These “lensing events” are brief, but plentiful, as Earth and stars in our galaxy are always moving relative to each other. In the past decade, over one thousand such events have been observed, typically lasting for a few days or weeks at a time. In fact, this lensing effect was used by Sir Arthur Eddington in 1919 to provide the first empirical evidence for General Relativity.
This took place during the solar eclipse of May 29th, 1919, where Eddington and a scientific expedition traveled to the island of Principe off the coast of West Africa to take pictures of the stars that were now visible in the region around the Sun. The pictures confirmed Einstein’s prediction by showing how light from these stars was shifted slightly in response to the Sun’s gravitational field.
The technique was originally proposed by astronomers Shude Mao and Bohdan Paczynski in 1991 as a means of looking for binary companions to stars. Their proposal was refined by Andy Gould and Abraham Loeb in 1992 as a method of detecting exoplanets. This method is most effective when looking for planets towards the center of the galaxy, as the galactic bulge provides a large number of background stars.
Microlensing is the only known method capable of discovering planets at truly great distances from the Earth and is capable of finding the smallest of exoplanets. Whereas the Radial Velocity Method is effective when looking for planets up to 100 light years from Earth and Transit Photometry can detect planets hundreds of light-years away, microlensing can find planets that are thousands of light-years away.
While most other methods have a detection bias towards larger planets, the microlensing method is the most sensitive means of detecting planets that orbit around 1-10 astronomical units (AU) from Sun-like stars. Microlensing is also the only proven means of detecting low-mass planets in wider orbits, where both the transit method and radial velocity are ineffective.
Taken together, these benefits make microlensing the most effective method for finding Earth-like planets around Sun-like stars. In addition, microlensing surveys can be effectively mounted using ground-based facilities. Like Transit Photometry, the Microlensing Method benefits from the fact that it can be used to survey tens of thousands of stars simultaneously.
Because microlensing events are unique and not subject to repeat, any planets detected using this method will not be observable again. In addition, those planets that are detected tend to be very far away, which makes follow-up investigations virtually impossible. Luckily, microlensing detections generally do not require follow-up surveys since they have a very high signal-to-noise ratio.
While confirmation is not strictly necessary, some planetary microlensing events have been confirmed – for instance, the planetary signal for event OGLE-2005-BLG-169 was confirmed by HST and Keck observations (Bennett et al. 2015; Batista et al. 2015). On the downside, microlensing surveys can only produce rough estimates of a planet’s distance, leaving significant margins for error.
Microlensing is also unable to yield accurate estimates of a planet’s orbital properties, since the only orbital characteristic that can be directly determined with this method is the planet’s current semi-major axis. As such, planets with eccentric orbits will only be detectable for a tiny portion of their orbits (when they are far away from their stars).
Finally, microlensing is dependent on rare and random events – the passage of one star precisely in front of another, as seen from Earth – which makes detections both rare and unpredictable.
Examples of Gravitational Microlensing Surveys:
Surveys that rely on the Microlensing Method include the Optical Gravitational Lensing Experiment (OGLE) at the University of Warsaw. Led by Andrzej Udalski, the director of the University’s Astronomical Observatory, this international project uses the 1.3 meter “Warsaw” telescope at Las Campanas, Chile, to search for microlensing events across dense fields of millions of stars around the galactic bulge.
There is also the Microlensing Observations in Astrophysics (MOA) group, a collaborative effort between researchers in New Zealand and Japan. Led by Professor Yasushi Muraki of Nagoya University, this group uses the Microlensing Method to conduct surveys for dark matter, extra-solar planets, and stellar atmospheres from the southern hemisphere.
And then there’s the Probing Lensing Anomalies NETwork (PLANET), which consists of five 1-meter telescopes distributed around the southern hemisphere. In collaboration with RoboNet, this project is able to provide near-continuous observations for microlensing events caused by planets with masses as low as Earth’s.
The most sensitive survey to date is the Korean Microlensing Telescope Network (KMTNet), a project initiated by the Korea Astronomy and Space Science Institute (KASI) in 2009. KMTNet relies on the instruments at three southern observatories to provide 24-hour continuous monitoring of the galactic bulge, searching for microlensing events that will point the way towards Earth-mass planets orbiting within their stars’ habitable zones.
Not only was the first-time detection of gravitational waves an historic accomplishment, it ushered in a new era of astrophysics. It is little wonder, then, why the three researchers who were central to the first detection have been awarded the 2017 Nobel Prize in Physics. The prize was awarded jointly to Caltech professors emeritus Kip S. Thorne and Barry C. Barish, along with MIT professor emeritus Rainer Weiss.
To put it simply, gravitational waves are ripples in space-time that are formed by major astronomical events – such as the merger of a binary black hole pair. They were first predicted over a century ago by Einstein’s Theory of General Relativity, which indicated that massive perturbations would alter the structure of space-time. However, it was not until recent years that evidence of these waves was observed for the first time.
The first signal was detected by LIGO’s twin observatories – in Hanford, Washington, and Livingston, Louisiana, respectively – and traced to a black hole merger 1.3 billion light-years away. To date, four detections have been made, all of which were due to the mergers of black-hole pairs. The subsequent three took place on December 26, 2015, January 4, 2017, and August 14, 2017, the last being detected by both LIGO and the European Virgo gravitational-wave detector.
For the role they played in this accomplishment, one half of the prize was awarded jointly to Caltech’s Barry C. Barish – the Ronald and Maxine Linde Professor of Physics, Emeritus – and Kip S. Thorne, the Richard P. Feynman Professor of Theoretical Physics, Emeritus. The other half was awarded to Rainer Weiss, Professor of Physics, Emeritus, at the Massachusetts Institute of Technology (MIT).
As Caltech president Thomas F. Rosenbaum – the Sonja and William Davidow Presidential Chair and Professor of Physics – said in a recent Caltech press statement:
“I am delighted and honored to congratulate Kip and Barry, as well as Rai Weiss of MIT, on the award this morning of the 2017 Nobel Prize in Physics. The first direct observation of gravitational waves by LIGO is an extraordinary demonstration of scientific vision and persistence. Through four decades of development of exquisitely sensitive instrumentation—pushing the capacity of our imaginations—we are now able to glimpse cosmic processes that were previously undetectable. It is truly the start of a new era in astrophysics.”
This accomplishment was all the more impressive considering that Albert Einstein, who first predicted their existence, believed gravitational waves would be too weak to study. However, by the 1960s, advances in laser technology and new insights into possible astrophysical sources led scientists to conclude that these waves might actually be detectable.
The first gravitational-wave detectors were built by Joseph Weber, an astrophysicist at the University of Maryland. His detectors, which were built in the 1960s, consisted of large aluminum cylinders that would be driven to vibrate by passing gravitational waves. Other attempts followed, but all proved unsuccessful, prompting a shift towards a new type of detector involving interferometry.
One such instrument was developed by Weiss at MIT, which relied on a technique known as laser interferometry. In this kind of instrument, gravitational waves are measured using widely separated mirrors that reflect lasers over long distances. When gravitational waves cause space to stretch and squeeze by infinitesimal amounts, the reflected light inside the detector shifts minutely.
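To get a sense of just how infinitesimal these shifts are: the change in arm length equals the strain times the arm length. The strain value below is a typical order of magnitude for the events LIGO has detected, used here purely as an illustrative assumption:

```python
# Order-of-magnitude check on interferometer sensitivity.
# A gravitational wave of strain h changes an arm of length L
# by dL = h * L.

ARM_LENGTH_M = 4_000.0   # LIGO arm length, meters
strain = 1e-21           # typical strain amplitude (assumed for illustration)

delta_L = strain * ARM_LENGTH_M
print(delta_L)           # ~4e-18 m -- thousands of times smaller than a proton
```

Measuring a length change this small over a 4-km baseline is what made the decades of instrumentation work described below necessary.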
At the same time, Thorne – along with his students and postdocs at Caltech – began working to improve the theory of gravitational waves. This included new estimates on the strength and frequency of waves produced by objects like black holes, neutron stars and supernovae. This culminated in a 1972 paper that Thorne co-published with his student, Bill Press, which summarized their vision of how gravitational waves could be studied.
That same year, Weiss also published a detailed analysis of interferometers and their potential for astrophysical research. In this paper, he stated that larger-scale operations – measuring several km or more in size – might have a shot at detecting gravitational waves. He also identified the major challenges to detection (such as vibrations from the Earth) and proposed possible solutions for countering them.
In 1975, Weiss invited Thorne to speak at a NASA committee meeting in Washington, D.C., and the two spent an entire night talking about gravitational experiments. As a result of their conversation, Thorne went back to Caltech and proposed creating an experimental gravity group, which would work on interferometers in parallel with researchers at MIT, the University of Glasgow and the University of Garching (where similar experiments were being conducted).
Development on the first interferometer began shortly thereafter at Caltech, which led to the creation of a 40-meter (130-foot) prototype to test Weiss’ theories about gravitational waves. In 1984, all of the work being conducted by these respective institutions came together. Caltech and MIT, with the support of the National Science Foundation (NSF), formed the LIGO collaboration and began work on its two interferometers in Hanford and Livingston.
The construction of LIGO was a major challenge, both logistically and technically. However, things were helped immensely when Barry Barish (then a Caltech particle physicist) became the Principal Investigator (PI) of LIGO in 1994. After a decade of stalled attempts, he was also made the director of LIGO and put its construction back on track. He also expanded the research team and developed a detailed work plan for the NSF.
As Barish indicated, the work he did with LIGO was something of a dream come true:
“I always wanted to be an experimental physicist and was attracted to the idea of using continuing advances in technology to carry out fundamental science experiments that could not be done otherwise. LIGO is a prime example of what couldn’t be done before. Although it was a very large-scale project, the challenges were very different from the way we build a bridge or carry out other large engineering projects. For LIGO, the challenge was and is how to develop and design advanced instrumentation on a large scale, even as the project evolves.”
By 1999, construction had wrapped up on the LIGO observatories, and by 2002, LIGO began to obtain data. In 2008, work began on improving its original detectors, known as the Advanced LIGO Project. Scaling up from the 40-m prototype to LIGO’s current 4-km (2.5 mi) interferometers was a massive undertaking, and therefore needed to be broken down into steps.
The first step took place between 2002 and 2010, when the team built and tested the initial interferometers. While this did not result in any detections, it did demonstrate the observatory’s basic concepts and solved many of the technical obstacles. The next phase – called Advanced LIGO, which took place between 2010 and 2015 – allowed the detectors to achieve new levels of sensitivity.
These upgrades, which also happened under Barish’s leadership, allowed for the development of several key technologies which ultimately made the first detection possible. As Barish explained:
“In the initial phase of LIGO, in order to isolate the detectors from the earth’s motion, we used a suspension system that consisted of test-mass mirrors hung by piano wire and used a multiple-stage set of passive shock absorbers, similar to those in your car. We knew this probably would not be good enough to detect gravitational waves, so we, in the LIGO Laboratory, developed an ambitious program for Advanced LIGO that incorporated a new suspension system to stabilize the mirrors and an active seismic isolation system to sense and correct for ground motions.”
Given how central Thorne, Weiss and Barish were to the study of gravitational waves, all three were rightly-recognized as this year’s recipients of the Nobel Prize in Physics. Both Thorne and Barish were notified that they had won in the early morning hours on October 3rd, 2017. In response to the news, both scientists were sure to acknowledge the ongoing efforts of LIGO, the science teams that have contributed to it, and the efforts of Caltech and MIT in creating and maintaining the observatories.
“The prize rightfully belongs to the hundreds of LIGO scientists and engineers who built and perfected our complex gravitational-wave interferometers, and the hundreds of LIGO and Virgo scientists who found the gravitational-wave signals in LIGO’s noisy data and extracted the waves’ information,” said Thorne. “It is unfortunate that, due to the statutes of the Nobel Foundation, the prize has to go to no more than three people, when our marvelous discovery is the work of more than a thousand.”
“I am humbled and honored to receive this award,” said Barish. “The detection of gravitational waves is truly a triumph of modern large-scale experimental physics. Over several decades, our teams at Caltech and MIT developed LIGO into the incredibly sensitive device that made the discovery. When the signal reached LIGO from a collision of two stellar black holes that occurred 1.3 billion years ago, the 1,000-scientist-strong LIGO Scientific Collaboration was able to both identify the candidate event within minutes and perform the detailed analysis that convincingly demonstrated that gravitational waves exist.”
Looking ahead, it is also pretty clear that Advanced LIGO, Advanced Virgo and other gravitational wave observatories around the world are just getting started. In addition to having detected four separate events, recent studies have indicated that gravitational wave detection could also open up new frontiers for astronomical and cosmological research.
For instance, a recent study by a team of researchers from the Monash Center for Astrophysics proposed a theoretical concept known as ‘orphan memory’. According to their research, gravitational waves not only cause waves in space-time, but leave permanent ripples in its structure. By studying the “orphans” of past events, gravitational waves can be studied both as they reach Earth and long after they pass.
In addition, a study was released in August by a team of astronomers from the Center for Cosmology at the University of California, Irvine, which indicated that black hole mergers are far more common than we thought. After conducting a survey of the cosmos intended to calculate and categorize black holes, the UCI team determined that there could be as many as 100 million black holes in the galaxy.
Another recent study indicated that the Advanced LIGO, GEO 600, and Virgo gravitational-wave detector network could also be used to detect the gravitational waves created by supernovae. By detecting the waves created by stars that explode near the end of their lifespans, astronomers may be able to see inside the hearts of collapsing stars for the first time and probe the mechanics of black hole formation.
The Nobel Prize in Physics is one of the highest honors that can be bestowed upon a scientist. But even greater than that is the knowledge that great things resulted from one’s own work. Decades after Thorne, Weiss and Barish began proposing gravitational wave studies and working towards the creation of detectors, scientists from all over the world are making profound discoveries that are revolutionizing the way we think of the Universe.
And as these scientists will surely attest, what we’ve seen so far is just the tip of the iceberg. One can imagine that somewhere, Einstein is also beaming with pride. As with other research pertaining to his Theory of General Relativity, the study of gravitational waves is demonstrating that even after a century, his predictions are still bang on!
And be sure to check out this video of the Caltech Press Conference where Barish and Thorne were honored for their accomplishments:
The latest detection took place on August 14th, 2017, when three observatories – the two Advanced LIGO detectors and the Advanced Virgo detector – simultaneously detected the gravitational waves created by merging black holes. This was the first time that gravitational waves were detected by three different facilities around the world, thus ushering in a new era of globally-networked research into this cosmic phenomenon.
Though not the first instance of gravitational waves being detected, this was the first time that an event was detected by three observatories simultaneously. As France Córdova, the director of the NSF, said in a recent LIGO press release:
“Little more than a year and a half ago, NSF announced that its Laser Interferometer Gravitational Wave Observatory had made the first-ever detection of gravitational waves, which resulted from the collision of two black holes in a galaxy a billion light-years away. Today, we are delighted to announce the first discovery made in partnership between the Virgo gravitational-wave observatory and the LIGO Scientific Collaboration, the first time a gravitational wave detection was observed by these observatories, located thousands of miles apart. This is an exciting milestone in the growing international scientific effort to unlock the extraordinary mysteries of our universe.”
Based on the waves detected, the LIGO Scientific Collaboration (LSC) and Virgo collaboration were able to determine the type of event, as well as the mass of the objects involved. According to their study, the event was triggered by the merger of two black holes – which were 31 and 25 Solar Masses, respectively. The event took place about 1.8 billion light years from Earth, and resulted in the formation of a spinning black hole with about 53 Solar Masses.
What this means is that about three Solar Masses were converted into gravitational-wave energy during the merger, which was then detected by LIGO and Virgo. While impressive on its own, this latest detection is merely a taste of what gravitational wave detectors like the LIGO and Virgo collaborations can do now that they have entered their advanced stages, and into cooperation with each other.
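The three Solar Masses quoted above can be sanity-checked with E = mc², using the masses reported for the merger (31 + 25 Solar Masses in, ~53 out). A minimal sketch:

```python
# Back-of-the-envelope check of the energy radiated in GW170814:
# the mass deficit between the merging pair and the final black hole
# was carried away as gravitational waves, with E = (delta m) * c^2.

MSUN = 1.989e30    # solar mass, kg
c = 2.998e8        # speed of light, m/s

delta_m = (31 + 25 - 53) * MSUN   # ~3 solar masses radiated away
energy_joules = delta_m * c ** 2
print(energy_joules)              # ~5.4e47 J
```

For comparison, that is vastly more energy than every star in the observable universe radiates as light over the same fraction of a second – a sense of why these events are detectable across billions of light-years.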
Both Advanced LIGO and Advanced Virgo are second-generation gravitational-wave detectors that have taken over from their predecessors. The LIGO facilities, which were conceived, built, and are operated by Caltech and MIT, collected data between 2002 and 2010 without making a detection. However, as of September of 2015, Advanced LIGO went online and began conducting its two observing runs – O1 and O2.
Meanwhile, the original Virgo detector conducted observations between 2003 and October of 2011, once again without success. By February of 2017, the integration of the Advanced Virgo detector began, and the instruments went online by the following April. In 2007, Virgo and LIGO also partnered to share and jointly analyze the data recorded by their respective detectors.
In August of 2017, the Virgo detector joined the O2 run, and the first-ever simultaneous detection took place on August 14th, with data being gathered by all three LIGO and Virgo instruments. As LSC spokesperson David Shoemaker – a researcher with the Massachusetts Institute of Technology (MIT) – indicated, this detection is just the first of many anticipated events.
“This is just the beginning of observations with the network enabled by Virgo and LIGO working together,” he said. “With the next observing run planned for fall 2018, we can expect such detections weekly or even more often.”
Not only will this mean that scientists have a better shot of detecting future events, but they will also be able to pinpoint them with far greater accuracy. In fact, the transition from a two- to a three-detector network is expected to increase the likelihood of pinpointing the source of GW170814 by a factor of 20. The sky region for GW170814 is just 60 square degrees – more than 10 times smaller than with data from LIGO’s interferometers alone.
In addition, the accuracy with which the distance to the source is measured has also benefited from this partnership. As Laura Cadonati, a Georgia Tech professor and the deputy spokesperson of the LSC, explained:
“This increased precision will allow the entire astrophysical community to eventually make even more exciting discoveries, including multi-messenger observations. A smaller search area enables follow-up observations with telescopes and satellites for cosmic events that produce gravitational waves and emissions of light, such as the collision of neutron stars.”
In the end, bringing more detectors into the gravitational-wave network will also allow for more detailed tests of Einstein’s Theory of General Relativity. Caltech’s David H. Reitze, the executive director of the LIGO Laboratory, also praised the new partnership and what it will allow for.
“With this first joint detection by the Advanced LIGO and Virgo detectors, we have taken one step further into the gravitational-wave cosmos,” he said. “Virgo brings a powerful new capability to detect and better locate gravitational-wave sources, one that will undoubtedly lead to exciting and unanticipated results in the future.”
The study of gravitational waves is a testament to the growing capability of the world’s science teams and the science of interferometry. For decades, the existence of gravitational waves was merely a theory; and by the turn of the century, all attempts to detect them had yielded nothing. But in just the past eighteen months, multiple detections have been made, and dozens more are expected in the coming years.
What’s more, thanks to the new global network and the improved instruments and methods, these events are sure to tell us volumes about our Universe and the physics that govern it.
However, according to a team of astronomers from Glasgow and Arizona, astronomers need not limit themselves to detecting waves caused by massive gravitational mergers. According to a study they recently produced, the Advanced LIGO, GEO 600, and Virgo gravitational-wave detector network could also detect the gravitational waves created by supernovae. In so doing, astronomers will be able to see inside the hearts of collapsing stars for the first time.
Known as core-collapse supernovae (CCSNe), or Type II supernovae, these events occur when a massive star reaches the end of its lifespan and experiences rapid collapse. This triggers a massive explosion that blows off the outer layers of the star, leaving behind a remnant neutron star that may eventually become a black hole. In order for a star to undergo such collapse, it must be at least 8 times (but no more than 40 to 50 times) the mass of the Sun.
When these types of supernovae take place, it is believed that neutrinos produced in the core transfer gravitational energy released by core collapse to the cooler outer regions of the star. Dr. Powell and her colleagues believe that this gravitational energy could be detected using current and future instruments. As they explain in their study:
“Although no CCSNe have currently been detected by gravitational-wave detectors, previous studies indicate that an advanced detector network may be sensitive to these sources out to the Large Magellanic Cloud (LMC). A CCSN would be an ideal multi-messenger source for aLIGO and AdV, as neutrino and electromagnetic counterparts to the signal would be expected. The gravitational waves are emitted from deep inside the core of CCSNe, which may allow astrophysical parameters, such as the equation of state (EOS), to be measured from the reconstruction of the gravitational-wave signal.”
Dr. Powell and her colleagues also outline a procedure in their study that could be implemented using the Supernova Model Evidence Extractor (SMEE). The team then conducted simulations using the latest three-dimensional gravitational-wave models of core-collapse supernovae to determine whether background noise could be eliminated and proper detection of CCSNe signals made.
As Dr. Powell explained to Universe Today via email:
“The Supernova Model Evidence Extractor (SMEE) is an algorithm that we use to determine how supernovae get the huge amount of energy they need to explode. It uses Bayesian statistics to distinguish between different possible explosion models. The first model we consider in the paper is that the explosion energy comes from the neutrinos emitted by the star. In the second model the explosion energy comes from rapid rotation and extremely strong magnetic fields.”
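The model comparison Dr. Powell describes can be boiled down to a Bayes factor: the ratio of the evidence for one explosion model over another. The sketch below is purely illustrative of that idea (the log-evidence values are hypothetical numbers, not outputs of SMEE):

```python
def log_bayes_factor(log_evidence_a, log_evidence_b):
    """Log Bayes factor of model A over model B.

    A large positive value means the data strongly favor model A
    (e.g. neutrino-driven) over model B (e.g. magnetorotational).
    """
    return log_evidence_a - log_evidence_b

# Hypothetical log evidences for the two explosion models
ln_bf = log_bayes_factor(-120.4, -135.9)
print(round(ln_bf, 1))  # ≈ 15.5: data strongly favor model A
```

In practice SMEE computes these evidences with nested sampling over signal models embedded in detector noise; the subtraction above is just the final comparison step.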
From this, the team concluded that with a three-detector network, researchers could correctly determine the explosion mechanism of rapidly-rotating supernovae, depending on their distance. At a distance of 10 kiloparsecs (32,615 light-years), they would be able to detect the signals of CCSNe with 100% accuracy, and signals at 2 kiloparsecs (6,523 light-years) with 95% accuracy.
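The distance conversions quoted above follow directly from the IAU value of the parsec (1 pc ≈ 3.26156 light-years); a quick sanity check:

```python
# Sanity check of the kiloparsec-to-light-year conversions above,
# using 1 parsec ≈ 3.26156 light-years (IAU value).
PC_TO_LY = 3.26156

def kpc_to_ly(kpc):
    """Convert kiloparsecs to light-years."""
    return kpc * 1000 * PC_TO_LY

print(round(kpc_to_ly(10)))  # ≈ 32,616 ly (the article's ~32,615)
print(round(kpc_to_ly(2)))   # ≈ 6,523 ly
```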
In other words, if and when a supernova takes place in the local galaxy, the global network formed by the Advanced LIGO, Virgo and GEO 600 gravitational wave detectors would have an excellent chance of picking up on it. The detection of these signals would also allow for some groundbreaking science, enabling scientists to “see” inside of exploding stars for the first time. As Dr. Powell explained:
“The gravitational waves are emitted from deep inside the core of the star where no electromagnetic radiation can escape. This allows a gravitational wave detection to tell us information about the explosion mechanism that can not be determined with other methods. We may also be able to determine other parameters such as how rapidly the star is rotating.”
Dr. Powell, having recently completed her PhD, will also be taking up a postdoc position with the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav), the gravitational-wave program hosted by Swinburne University of Technology in Australia. In the meantime, she and her colleagues will be conducting targeted searches for supernovae that occurred during the first and second advanced detector observing runs.
While there are no guarantees at this point that they will find the sought-after signals that would demonstrate that supernovae are detectable, the team has high hopes. And given the possibilities that this research holds for astrophysics and astronomy, they are hardly alone!
According to a new study by a team of astronomers from the Center for Cosmology at the University of California, Irvine, such mergers are far more common than we thought. After conducting a survey of the cosmos intended to calculate and categorize black holes, the UCI team determined that there could be as many as 100 million black holes in the galaxy, a finding which has significant implications for the study of gravitational waves.
Their study began roughly a year and a half ago, shortly after LIGO announced the first detection of gravitational waves. These waves were created by the merger of two distant black holes, each of which was equivalent in mass to about 30 Suns. As James Bullock, a professor of physics and astronomy at UC Irvine and a co-author on the paper, explained in a UCI press release:
“Fundamentally, the detection of gravitational waves was a huge deal, as it was a confirmation of a key prediction of Einstein’s general theory of relativity. But then we looked closer at the astrophysics of the actual result, a merger of two 30-solar-mass black holes. That was simply astounding and had us asking, ‘How common are black holes of this size, and how often do they merge?’”
Traditionally, astronomers have been of the opinion that black holes would typically be about the same mass as our Sun. As such, they sought to interpret the multiple gravitational wave detections made by LIGO in terms of what is known about galaxy formation. Beyond this, they also sought to create a framework for predicting future black hole mergers.
From this, they concluded that the Milky Way Galaxy would be home to up to 100 million black holes, 10 million of which would have an estimated mass of about 30 Solar masses – i.e. similar to those that merged and created the first gravitational waves detected by LIGO (announced in 2016). Meanwhile, dwarf galaxies – like the Draco Dwarf, which orbits at a distance of about 250,000 ly from the center of our galaxy – would host about 100 black holes.
They further determined that today, most low-mass black holes (~10 Solar masses) reside within galaxies of 1 trillion Solar masses (massive galaxies) while massive black holes (~50 Solar masses) reside within galaxies that have about 10 billion Solar masses (i.e. dwarf galaxies). After considering the relationship between galaxy mass and stellar metallicity, they interpreted a galaxy’s black hole count as a function of its stellar mass.
In addition, they also sought to determine how often black holes occur in pairs, how often they merge and how long this would take. Their analysis indicated that only a tiny fraction of black holes would need to be involved in mergers to accommodate what LIGO observed. It also offered predictions that showed how even larger black holes could be merging within the next decade.
As Manoj Kaplinghat, also a UCI professor of physics and astronomy and the second co-author on the study, explained:
“We show that only 0.1 to 1 percent of the black holes formed have to merge to explain what LIGO saw. Of course, the black holes have to get close enough to merge in a reasonable time, which is an open problem… If the current ideas about stellar evolution are right, then our calculations indicate that mergers of even 50-solar-mass black holes will be detected in a few years.”
In other words, our galaxy could be teeming with black holes, and mergers could be happening on a regular basis (relative to cosmological timescales). As such, we can expect that many more gravitational-wave detections will be possible in the coming years. This should come as no surprise, seeing as how LIGO has made two additional detections since the winter of 2016.
With many more expected to come, astronomers will have many opportunities to study black hole mergers, not to mention the physics that drive them!
At the center of our galaxy, roughly 26,000 light years from Earth, lies the Supermassive Black Hole (SMBH) known as Sagittarius A*. Measuring 44 million km across, this object is roughly 4 million times as massive as our Sun and exerts a tremendous gravitational pull. Since astronomers cannot detect black holes directly, its existence has been determined largely from the effect it has on the small group of stars orbiting it.
In this respect, scientists have found that observing Sagittarius A* is an effective way of testing the physics of gravity. For instance, in the course of observing these stars, a team of German and Czech astronomers noted subtle effects caused by the black hole’s gravity. In so doing, they were able to yet again confirm some of the predictions made by Einstein’s famous Theory of General Relativity.
From this, they measured the orbits of the stars that orbit Sagittarius A* to test predictions made by classical Newtonian physics (i.e. Universal Gravitation), as well as predictions based on general relativity. What they found was that one of the stars (S2) showed deviations in its orbit that defied the former, but were consistent with the latter.
This star, which has 15 times the mass of our Sun, follows an elliptical orbit around the SMBH, completing a single orbit in about 15.6 years. At its closest, it gets to within 17 light hours of the black hole, which is the equivalent of 120 times the distance between the Sun and the Earth (120 AU). Essentially, the research team noted that S2 had the most elliptical orbit of any star orbiting the Supermassive Black Hole.
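The equivalence between 17 light-hours and roughly 120 AU can be checked from the speed of light and the definition of the astronomical unit:

```python
# Checking the article's figure that S2's closest approach of
# 17 light-hours corresponds to roughly 120 AU.
C_KM_S = 299_792.458    # speed of light, km/s
AU_KM = 149_597_870.7   # one astronomical unit, km

light_hours = 17
distance_km = light_hours * C_KM_S * 3600  # 17 hours of light travel
print(round(distance_km / AU_KM, 1))       # ≈ 122.6 AU, i.e. roughly 120 AU
```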
They also noted a slight change in its orbit – a few percent in the shape and about one-sixth of a degree in orientation. This could only be explained as being due to the relativistic effects caused by Sagittarius A*'s intense gravity, which cause a precession in its orbit. What this means is, the elliptical loop of S2’s orbit rotates around the SMBH over time, with its perihelion point aimed in different directions.
Interestingly enough, this is similar to the effect that was observed in Mercury’s orbit – aka. the “perihelion precession of Mercury” – during the late 19th century. This observation challenged classical Newtonian mechanics and led scientists to conclude that Newton’s theory of gravity was incomplete. It is also what prompted Einstein to develop his theory of General Relativity, which offered a satisfactory explanation for the issue.
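The Mercury case mentioned above has a famous closed-form answer in General Relativity: the perihelion advances by Δφ = 6πGM/(a(1 − e²)c²) per orbit, which for Mercury works out to the celebrated ~43 arcseconds per century. A quick numerical check (orbital parameters are standard published values, not taken from this article):

```python
import math

# Relativistic perihelion advance per orbit: Δφ = 6πGM / (a(1 − e²)c²)
GM_SUN = 1.32712440018e20   # Sun's gravitational parameter, m^3/s^2
C = 299_792_458.0           # speed of light, m/s
a = 5.7909e10               # Mercury's semi-major axis, m
e = 0.2056                  # Mercury's orbital eccentricity
period_days = 87.969        # Mercury's orbital period

dphi = 6 * math.pi * GM_SUN / (a * (1 - e**2) * C**2)  # radians per orbit
orbits_per_century = 36525 / period_days
arcsec = dphi * orbits_per_century * (180 / math.pi) * 3600
print(round(arcsec, 1))  # ≈ 43.0 arcseconds per century
```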
Should the results of their study be confirmed, this will be the first time that the effects of general relativity have been precisely calculated using the stars that orbit a Supermassive Black Hole. Marzieh Parsa – a PhD student at the University of Cologne, Germany and lead author of the paper – was understandably excited with these results. As she stated in an ESO press statement:
“The Galactic Center really is the best laboratory to study the motion of stars in a relativistic environment. I was amazed how well we could apply the methods we developed with simulated stars to the high-precision data for the innermost high-velocity stars close to the supermassive black hole.”
This study was made possible thanks to the high accuracy of the VLT’s instruments; in particular, the adaptive optics on the NACO camera and the SINFONI near-infrared spectrometer. These instruments were vital in tracking the star’s close approach and retreat from the black hole, which allowed the team to precisely determine the shape of its orbit and thus determine the relativistic effects on the star.
In addition to the more precise information about S2’s orbit, the team’s analysis also provided new and more accurate estimates of Sagittarius A*'s mass, as well as its distance from Earth. This could open up new avenues of research for this and other Supermassive Black Holes, as well as additional experiments that could help scientists to learn more about the physics of gravity.
The results also provided a preview of the measurements and tests that will be taking place next year. In 2018, the star S2 will be making a very close approach to Sagittarius A*. Scientists from around the world will be using this opportunity to test the GRAVITY instrument, a second-generation instrument that was recently installed on the Very Large Telescope Interferometer (VLTI).
Developed by an international consortium led by the Max Planck Institute for Extraterrestrial Physics, this instrument has been conducting observations of the Galactic Center since 2016. In 2018, it will be used to measure the orbit of S2 with even greater precision, which is expected to be most revealing. At this time, astrophysicists will be seeking to make additional measurements of the SMBH’s general relativistic effects.
Beyond that, they also hope to detect additional deviations in the star’s orbit that could hint at the existence of new physics! With the right tools trained on the right place, and at the right time, scientists just might find that even Einstein’s theories of gravity were not entirely complete. But in the meantime, it looks like the late and great theoretical physicist was right again!
And be sure to check out this video of the recent study, courtesy of the ESO:
It’s been over a century since Einstein first proposed his Theory of General Relativity, his groundbreaking proposal for how gravity works on large scales throughout the cosmos. And yet, after all that time, experiments are still being conducted that show that Einstein’s field equations were right on the money. And in some cases, old experiments are finding new uses, helping astronomers to unlock other astronomical mysteries.
Case in point: using the Hubble Space Telescope, NASA astronomers have repeated a century-old test of General Relativity to determine the mass of a white dwarf star. Where this test was originally used to measure how a massive object deflects light from a background star, in this case it was used to provide new insights into theories about the structure and composition of the burned-out remnants of stars.
White dwarfs are what become of a star after it has exited the Main Sequence of its lifespan, having exhausted its nuclear fuel. At this point, the star expels most of its outer material, forming a planetary nebula. What is left behind is a small and extremely dense remnant (second only to a neutron star in density) which exerts an incredible gravitational force.
This attribute is what makes white dwarfs a good means for testing General Relativity. By measuring how much they deflect the light from a background star, astronomers are able to see the effect gravity has on the curvature of spacetime. This is essentially the same test that British astronomer Sir Arthur Eddington performed in 1919, when he led an expedition to determine how much the Sun’s gravity deflected the light of a background star during a solar eclipse.
Known as gravitational microlensing, this same experiment was repeated by the NASA team. Using the Hubble Space Telescope, they observed Stein 2051B – a white dwarf located just 17 light-years from Earth – on seven different occasions during a two-year period. During this period, it passed in front of a background star located about 5000 light-years distant, which produced a visible deviation in the path of the star’s light.
The resulting deviation was incredibly small – only 2 milliarcseconds from its actual position – and was only discernible thanks to the optical resolution of Hubble’s Wide Field Camera 3 (WFC3). Such a deviation would have been impossible to detect using instruments that predate Hubble. And more importantly, the results were consistent with what Einstein predicted a century ago.
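The deflection angle in these experiments follows Einstein's formula θ = 4GM/(c²b), where M is the lensing mass and b the impact parameter. Evaluated at the Sun's limb, it reproduces the 1.75 arcseconds Eddington set out to measure; for a white dwarf like Stein 2051B, the much smaller mass yields the milliarcsecond-scale shift Hubble detected. A sketch of the solar case (constants are standard values, not from the article):

```python
# Einstein's light-deflection formula: θ = 4GM / (c²b).
# For light grazing the Sun's limb (b = solar radius), this gives the
# famous 1.75 arcseconds that Eddington tested in 1919.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m

theta = 4 * G * M_SUN / (C**2 * R_SUN)  # deflection in radians
print(round(theta * 206_265, 2))        # ≈ 1.75 arcseconds
```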
As Kailash Sahu, an astronomer at the Space Telescope Science Institute (STScI) and the lead researcher on the project, explained in a NASA press release, this method is also an effective way to test a star’s mass. “This microlensing method is a very independent and direct way to determine the mass of a star,” he said. “It’s like placing the star on a scale: the deflection is analogous to the movement of the needle on the scale.”
The deflection measurement yielded highly-accurate results concerning the mass of the white dwarf star – roughly 68 percent of the Sun’s mass (aka. 0.68 Solar masses) – which was also consistent with theoretical predictions. This is highly significant, in that it opens the door to a new and interesting method for determining the mass of distant stars that do not have companions.
In the past, astronomers have typically determined the mass of stars by observing binary pairs and calculating their orbital motions. Much in the same way that radial velocity measurements are used by astronomers to determine if a star has a system of exoplanets, measuring the influence two stars have on each other is used to determine how much mass each possesses.
This was how astronomers determined the mass of the Sirius star system, which is located about 8.6 light years from Earth. This binary star system consists of a white main-sequence star (Sirius A) and a white dwarf companion (Sirius B) which orbit each other with a radial velocity of 5.5 km/s. These measurements helped astronomers determine that Sirius A has a mass of about 2.02 Solar masses while Sirius B weighs in at 0.978 Solar masses.
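The binary method rests on Kepler's third law: in solar units (semi-major axis in AU, period in years), the combined mass of the pair is simply a³/P². Plugging in published values for the Sirius orbit (a ≈ 19.8 AU, P ≈ 50.1 years, both figures assumed here rather than quoted in the article) recovers a total close to the 2.02 + 0.978 Solar masses cited above:

```python
# Kepler's third law in solar units: M1 + M2 = a³ / P²
# (a in AU, P in years, masses in Solar masses).
# Orbital values for Sirius are assumed published figures,
# not taken from the article itself.
a_au = 19.8       # semi-major axis of the Sirius A/B orbit
period_yr = 50.1  # orbital period

total_mass = a_au**3 / period_yr**2
print(round(total_mass, 2))  # ≈ 3.09 Solar masses (vs. 2.02 + 0.978 ≈ 3.0)
```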
And while Stein 2051B has a companion (a bright red dwarf), astronomers cannot accurately measure its mass because the stars are too far apart – at least 8 billion km (5 billion mi). Hence, this method could be used in the future wherever companion stars are unavailable or too distant. The Hubble observations also helped the team to independently verify the theory that a white dwarf’s radius can be determined by its mass.
This theory was first proposed in 1935 by Subrahmanyan Chandrasekhar, the Indian-American astronomer whose theoretical work on the evolution of stars (and black holes) earned him the Nobel Prize for Physics in 1983. Such measurements could also help astronomers to learn more about the internal composition of white dwarfs. But even with an instrument as sophisticated as the WFC3, obtaining these measurements was not without its share of difficulties.
As Jay Anderson, an astronomer with the STScI who led the analysis to precisely measure the positions of stars in the Hubble images, explained:
“Stein 2051B appears 400 times brighter than the distant background star. So measuring the extremely small deflection is like trying to see a firefly move next to a light bulb. The movement of the insect is very small, and the glow of the light bulb makes it difficult to see the insect moving.”
Dr. Sahu presented his team’s findings yesterday (June 7th) at the American Astronomical Society meeting in Austin, Texas. The team’s result will also appear in the journal Science on June 9th. And in the future, the researchers plan to use Hubble to conduct a similar microlensing study on Proxima Centauri, our solar system’s closest stellar neighbor and home to the closest exoplanet to Earth (Proxima b).
It is important to note that this is by no means the only modern experiment that has validated Einstein’s theories. In recent years, General Relativity has been confirmed through observations of rapidly spinning pulsars, 3D simulations of cosmic evolution, and (most importantly) the discovery of gravitational waves. Even in death, Einstein is still making valued contributions to astrophysics!
And now, a little over a year later, a team of researchers from the Monash Center for Astrophysics has announced another potential revelation. Based on their ongoing studies of gravitational waves, the team recently proposed a theoretical concept known as ‘orphan memory’. If true, this concept could revolutionize the way we think about gravitational waves and spacetime.
Researchers from Monash Center for Astrophysics are part of what is known as the LIGO Scientific Collaboration (LSC) – a group of scientists dedicated to developing the hardware and software needed to study gravitational waves. In addition to creating a system for vetting detections, the team played a key role in data analysis – observing and interpreting the data that was gathered – and were also instrumental in the design of the LIGO mirrors.
Looking beyond what LIGO and other experiments (like the Virgo Interferometer) observed, the research team sought to address how these detectors’ capabilities could be extended further by finding the “memory” of gravitational waves. The study that describes this theory was recently published in the Physical Review Letters under the title “Detecting Gravitational Wave Memory without Parent Signals“.
According to their new theory, spacetime does not return to its normal state after a cataclysmic event generates gravitational waves that cause it to stretch out. Instead, it remains stretched, which they refer to as “orphan memory” – the word “orphan” alluding to the fact the “parent wave” is not directly detectable. While this effect has yet to be observed, it could open up some very interesting opportunities for gravitational wave research.
At present, detectors like LIGO and Virgo are only able to discern the presence of gravitational waves at certain frequencies. As such, researchers are only able to study waves generated by specific types of events and trace them back to their source. As Lucy McNeill, a researcher from the Monash Center for Astrophysics and the lead author on the paper, said in a recent University press statement:
“If there are exotic sources of gravitational waves out there, for example, from micro black holes, LIGO would not hear them because they are too high-frequency. But this study shows LIGO can be used to probe the universe for gravitational waves that were once thought to be invisible to it.”
As they indicate in their study, high-frequency gravitational-wave bursts (i.e. ones above the kilohertz range) would produce orphan memory that the LIGO and Virgo detectors would be able to pick up. This would not only increase the bandwidth of these detectors exponentially, but open up the possibility of finding evidence of gravitational-wave bursts in previous searches that went unnoticed.
Dr. Eric Thrane, a lecturer at the Monash School of Physics and Astronomy and a member of the LSC team, was also one of the co-authors of the new study. As he stated, “These waves could open the way for studying physics currently inaccessible to our technology.”
But as they admit in their study, such sources might not even exist and more research is needed to confirm that “orphan memory” is in fact real. Nevertheless, they maintain that searching for high-frequency sources is a useful way to probe for new physics, and it just might reveal things we weren’t expecting to find.
“A dedicated gravitational-wave memory search is desirable. It will have enhanced sensitivity compared to current burst searches,” they state. “Further, a dedicated search can be used to determine whether a detection candidate is consistent with a memory burst by checking to see if the residuals (following signal subtraction) are consistent with Gaussian noise.”
Alas, such searches may have to wait upon the proposed successors to the Advanced LIGO experiment. These include the Einstein Telescope and Cosmic Explorer, two proposed third-generation gravitational wave detectors. Depending on what future surveys find, we may discover that spacetime not only stretches from the creation of gravitational waves, but also bears the “stretch marks” to prove it!
Dark Matter has been something of a mystery ever since it was first proposed. In addition to trying to find some direct evidence of its existence, scientists have also spent the past few decades developing theoretical models to explain how it works. In recent years, the popular conception has been that Dark Matter is “cold”, and distributed in clumps throughout the Universe, an observation supported by the Planck mission data.
However, a new study produced by an international team of researchers paints a different picture. Using data from the Kilo Degree Survey (KiDS), these researchers studied how the light coming from millions of distant galaxies was affected by the gravitational influence of matter on the largest of scales. What they found was that Dark Matter appears to be more smoothly distributed throughout space than previously thought.
For the past five years, the KiDS survey has been using the VLT Survey Telescope (VST) – the largest telescope at the ESO’s La Silla Paranal Observatory in Chile – to survey 1500 square degrees of the southern night sky. This volume of space has been monitored in four bands (UV, IR, green and red) using weak gravitational lensing and photometric redshift measurements.
Consistent with Einstein’s Theory of General Relativity, gravitational lensing involves studying how the gravitational field of a massive object will bend light. Meanwhile, redshift attempts to gauge the speed at which other galaxies are moving away from ours by measuring the extent to which their light is shifted towards the red end of the spectrum (i.e. its wavelength becomes longer the faster the source is moving away).
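The redshift measurement described above reduces to a simple ratio: z = (λ_observed − λ_rest)/λ_rest, and for small z the recession velocity is approximately v ≈ c·z. A minimal sketch (the wavelengths below are illustrative numbers for the Hα line, not values from the KiDS study):

```python
# Redshift: z = (λ_observed − λ_rest) / λ_rest; for small z, v ≈ c·z.
# Wavelengths here are illustrative, not taken from the study.
C_KM_S = 299_792.458  # speed of light, km/s

def redshift(lam_obs, lam_rest):
    """Dimensionless redshift from observed and rest wavelengths."""
    return (lam_obs - lam_rest) / lam_rest

z = redshift(662.8, 656.3)   # Hα line shifted from 656.3 nm to 662.8 nm
print(round(z, 4))           # ≈ 0.0099
print(round(C_KM_S * z))     # ≈ 2969 km/s recession velocity
```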
Gravitational lensing is especially useful when it comes to determining how the Universe came to be. Our current cosmological model, known as the Lambda Cold Dark Matter (Lambda CDM) model, states that Dark Energy is responsible for the late-time acceleration in the expansion of the Universe, and that Dark Matter is made up of massive particles that are responsible for cosmological structure formation.
Using a slight variation on this technique known as cosmic shear, the research team studied light from distant galaxies to determine how it is warped by the presence of the largest structures in the Universe (such as superclusters and filaments). As Dr. Hendrik Hildebrandt – an astronomer from the Argelander Institute for Astronomy (AIfA) and the lead author of the paper – told Universe Today via email:
“Usually one thinks of one big mass like a galaxy cluster that causes this light deflection. But there is also matter all throughout the Universe. The light from distant galaxies gets continuously deflected by this so-called large-scale structure. This results in galaxies that are close on the sky to be “pointing” in the same direction. It’s a tiny effect but it can be measured with statistical methods from large samples of galaxies. When we have measured how strongly galaxies are “pointing” in the same direction we can infer from this the statistical properties of the large-scale structure, e.g. the mean matter density and how strongly the matter is clumped/clustered.”
Using this technique, the research team conducted an analysis of 450 square degrees of KiDS data, which corresponds to about 1% of the entire sky. Within this volume of space, they observed how the light coming from about 15 million galaxies interacted with all the matter that lies between them and Earth.
Combining the extremely sharp images obtained by VST with advanced computer software, the team was able to carry out one of the most precise measurements ever made of cosmic shear. Interestingly enough, the results were not consistent with those produced by the ESA’s Planck mission, which has been the most comprehensive mapper of the Universe to date.
The Planck mission has provided some wonderfully detailed and accurate information about the Cosmic Microwave Background (CMB). This has helped astronomers to map the early Universe, as well as develop theories of how matter was distributed during this period. As Hildebrandt explained:
“Planck measures many cosmological parameters with exquisite precision from the temperature fluctuations of the cosmic microwave background, i.e. physical processes that happened 400,000 years after the Big Bang. Two of those parameters are the mean matter density of the Universe and a measure of how strongly this matter is clumped. With cosmic shear, we also measure these two parameters but at much later cosmic times (a few billion years ago or ~10 billion years after the Big Bang), i.e. in our more recent past.”
However, Hildebrandt and his team found values for these parameters that were significantly lower than those found by Planck. Basically, their cosmic shear results suggest that there is less matter in the Universe and that it is less clustered than what the Planck results predicted. These results are likely to have an impact on cosmological studies and theoretical physics in the coming years.
As it stands, Dark Matter remains undetectable using standard methods. Like black holes, its existence can only be inferred from the observable gravitational effects it has on visible matter. In this case, its presence and fundamental nature are measured by how it has affected the evolution of the Universe over the past 13.8 billion years. But since the results appear to be conflicting, astronomers may now have to reconsider some of their previously held notions.
“There are several options: because we do not understand the dominant ingredients of the Universe (dark matter and dark energy) we can play with the properties of both,” said Hildebrandt. “For example, different forms of dark energy (more complex than the simplest possibility, which is Einstein’s “cosmological constant”) could explain our measurements. Another exciting possibility is that this is a sign that the laws of gravity on the scale of the Universe are different from General Relativity. All we can say for now is that something appears to be not quite right!”