In April of 2019, the Event Horizon Telescope collaboration made history when it released the first-ever image of a black hole. This accomplishment was decades in the making and triggered an international media circus. The picture was the result of a technique known as interferometry, in which observatories across the world combined light from their telescopes to create a composite image.
This image confirmed what astrophysicists had long predicted: that extreme gravitational bending causes photons to fall in around the event horizon, contributing to the bright ring that surrounds it. Last week, on March 18th, a team of researchers from the Harvard-Smithsonian Center for Astrophysics (CfA) announced new research showing how images of black holes could reveal an intricate substructure within them.
Since the 1960s, astronomers have theorized that all the visible matter in the Universe (aka. baryonic or “luminous” matter) constitutes just a small fraction of what’s actually there. In order for the predominant and time-tested theory of gravity (as defined by General Relativity) to work, scientists have had to postulate that roughly 85% of the mass in the Universe consists of “Dark Matter”.
Despite many decades of study, scientists have yet to find any direct evidence of Dark Matter, and its constituent particle and origins remain a mystery. However, a team of physicists from the University of York in the UK has proposed a new candidate: a recently-discovered particle known as the d-star hexaquark, which could have formed the “Dark Matter” in the Universe during the Big Bang.
Einstein’s Theory of General Relativity predicted that black holes would form and eventually collide. It also predicted the creation of gravitational waves from the collision. But how often does this happen, and can we calculate how many stars this will happen to?
A new study from a physicist at Vanderbilt University sought to answer these questions.
The theory of general relativity is packed with strange predictions about how space and time are affected by massive bodies: everything from gravitational waves to the lensing of light by dark matter. But one of its oddest predictions is an effect known as frame-dragging. The effect is so subtle it was first measured just a decade ago. Now astronomers have measured the effect around a white dwarf, and it tells us how some supernovae occur.
At the center of our galaxy lies a region where roughly 10 million stars are packed into just 1 parsec (3.26 light-years) of space. At its heart lies the supermassive black hole (SMBH) known as Sagittarius A*, which has a mass of over 4 million Suns. For decades, astronomers have been trying to get a better look at this region in the hopes of understanding the incredible forces at work and how they have affected the evolution of our galaxy.
What they’ve found includes a series of stars that orbit very closely to Sagittarius A* (like S1 and S2), which have been used to test Einstein’s Theory of General Relativity. And recently, a team from UCLA’s Galactic Center Orbits Initiative detected a series of compact objects that also orbit the SMBH. These objects look like clouds of gas but behave like stars, depending on how close they are in their orbits to Sagittarius A*.
Black holes are one of the most awesome and mysterious forces in the Universe. Originally predicted by Einstein’s Theory of General Relativity, these regions of spacetime are formed when massive stars undergo gravitational collapse at the end of their lives. Despite decades of study and observation, there is still much we don’t know about this phenomenon.
For example, scientists are still largely in the dark about how accretion disks – the matter that falls into orbit around a black hole and is gradually fed onto it – behave. Thanks to a recent study, in which an international team of researchers conducted the most detailed simulations of a black hole to date, a number of theoretical predictions regarding accretion disks have finally been validated.
Special Relativity. It’s been the bane of space explorers, futurists and science fiction authors since Albert Einstein first proposed it in 1905. For those of us who dream of humans one day becoming an interstellar species, this scientific fact is like a wet blanket. Luckily, a few theoretical concepts have been proposed that indicate that Faster-Than-Light (FTL) travel might still be possible someday.
A popular example is the idea of a wormhole: a speculative structure that links two distant points in spacetime and would enable interstellar space travel. Recently, a team of Ivy League scientists conducted a study indicating how “traversable wormholes” could actually be a reality. The bad news is that their results suggest these wormholes aren’t exactly shortcuts, and could be the cosmic equivalent of “taking the long way”!
At the center of our galaxy resides a Supermassive Black Hole (SMBH) known as Sagittarius A* (Sgr A*). Based on ongoing observations, astronomers have determined that this SMBH measures 44 million km (27.34 million mi) in diameter and has an estimated mass of 4.31 million Solar Masses. On occasion, a star will wander too close to Sgr A* and be torn apart in a violent process known as a tidal disruption event (TDE).
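Rough numbers make clear why such events are observable. The following is a back-of-the-envelope sketch using standard constants and a Sun-like victim star; the figures are illustrative, not taken from the study:

```python
# Back-of-the-envelope numbers (standard constants; illustrative, not from the study)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m

M_bh = 4.31e6 * M_sun    # estimated mass of Sagittarius A*

# Schwarzschild radius: the horizon scale for a non-rotating black hole
r_s = 2 * G * M_bh / c**2                  # ~1.3e10 m

# Tidal disruption radius for a Sun-like star: roughly where the black
# hole's tides overwhelm the star's self-gravity
r_t = R_sun * (M_bh / M_sun) ** (1 / 3)    # ~1.1e11 m (~0.76 AU)
```

Because the tidal radius lies nearly an order of magnitude outside the horizon, a Sun-like star is shredded in full view of distant observers rather than swallowed whole, which is why these flares are visible at all.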
These events cause the release of bright flares of radiation, which let astronomers know that a star has been consumed. Unfortunately, for decades, astronomers have been unable to distinguish these events from other galactic phenomena. But thanks to a new study by an international team of astrophysicists, astronomers now have a unified model that explains recent observations of these extreme events.
As Enrico Ramirez-Ruiz – the professor and chair of astronomy and astrophysics at UC Santa Cruz, the Niels Bohr Professor at the University of Copenhagen, and a co-author on the paper – explained in a UCSC press release:
“Only in the last decade or so have we been able to distinguish TDEs from other galactic phenomena, and the new model will provide us with the basic framework for understanding these rare events.”
In most galaxies, SMBHs do not actively consume any material and therefore do not emit any light, which distinguishes them from galaxies that have Active Galactic Nuclei (AGNs). Tidal disruption events are therefore rare, occurring only about once every 10,000 years in a typical galaxy. However, when a star does get torn apart, it results in the release of an intense amount of radiation. As the study’s lead author, Dr. Dai, explained:
“It is interesting to see how materials get their way into the black hole under such extreme conditions. As the black hole is eating the stellar gas, a vast amount of radiation is emitted. The radiation is what we can observe, and using it we can understand the physics and calculate the black hole properties. This makes it extremely interesting to go hunting for tidal disruption events.”
In the past few years, a few dozen candidates for tidal disruption events (TDEs) have been detected using wide-field optical and UV transient surveys as well as X-ray telescopes. While the physics is expected to be the same for all TDEs, astronomers have noted that a few distinct classes appear to exist: some emit mostly X-rays, while others emit mostly visible and ultraviolet light.
As a result, theorists have struggled to understand the diverse properties observed and create a coherent model that can explain them all. For the sake of their model, Dr. Dai and her colleagues combined elements from general relativity, magnetic fields, radiation, and gas hydrodynamics. The team also relied on state-of-the-art computational tools and some recently-acquired large computer clusters funded by the Villum Foundation for Jens Hjorth (head of DARK Cosmology Center), the U.S. National Science Foundation and NASA.
Using the model that resulted, the team concluded that it is the viewing angle of the observer that accounts for the differences in observation. Essentially, different galaxies are oriented randomly with respect to observers on Earth, who see different aspects of TDEs depending on their orientation. As Ramirez-Ruiz explained:
“It is like there is a veil that covers part of a beast. From some angles we see an exposed beast, but from other angles we see a covered beast. The beast is the same, but our perceptions are different.”
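The heart of the viewing-angle idea can be caricatured in a few lines of code. This is only a toy sketch of the concept; the funnel half-angle below is an assumed illustrative value, not a parameter from the paper:

```python
def tde_appearance(inclination_deg, funnel_half_angle_deg=30.0):
    """Toy sketch of the viewing-angle idea: X-rays escape along the
    low-density funnel near the disk axis, while gas at higher
    inclinations reprocesses them into optical/UV light.
    The 30-degree funnel half-angle is an assumed illustrative value."""
    if inclination_deg <= funnel_half_angle_deg:
        return "X-ray bright"       # looking down the funnel: the exposed beast
    return "optical/UV bright"      # looking through the veil of gas
```

Viewed nearly pole-on (`tde_appearance(10)`), the event looks X-ray bright; viewed closer to edge-on (`tde_appearance(70)`), the very same event appears bright in visible and ultraviolet light instead.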
In the coming years, a number of planned survey projects are expected to provide much more data on TDEs, which will help expand research into this phenomenon. These include the Young Supernova Experiment (YSE) transient survey, which will be led by the DARK Cosmology Center at the Niels Bohr Institute and UC Santa Cruz, and the Large Synoptic Survey Telescope (LSST) being built in Chile.
According to Dr. Dai, this new model shows what astronomers can expect to see when viewing TDEs from different angles and will allow them to fit different events into a coherent framework. “We will observe hundreds to thousands of tidal disruption events in a few years,” she said. “This will give us a lot of ‘laboratories’ to test our model and use it to understand more about black holes.”
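Those “hundreds to thousands” follow from simple arithmetic once a survey monitors enough galaxies. The galaxy count and survey duration below are assumptions for illustration, not LSST specifications:

```python
# Rough event-count estimate; the survey size and duration are assumptions,
# not LSST specifications
rate_per_galaxy = 1.0 / 10_000      # TDEs per galaxy per year (from the article)
galaxies_monitored = 10_000_000     # assumed number of galaxies in the survey
years = 3                           # assumed survey duration
expected_tdes = rate_per_galaxy * galaxies_monitored * years   # ~3000 events
```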
This improved understanding of how black holes occasionally consume stars will also provide additional tests for general relativity, gravitational wave research, and help astronomers to learn more about the evolution of galaxies.
In the early 1960s, scientists developed the gravity-assist method, where a spacecraft conducts a flyby of a major body in order to increase its speed. Many notable missions have used this technique, including the Pioneer, Voyager, Galileo, Cassini, and New Horizons missions. In the course of many of these flybys, scientists have noted an anomaly where the increase in the spacecraft’s speed did not accord with orbital models.
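The mechanics of a gravity assist can be sketched with a simple two-dimensional example. In the planet's rest frame a flyby only rotates the spacecraft's velocity vector; transforming back to the Sun's frame changes its speed. All numbers below are made up for illustration:

```python
import math

# Toy 2-D gravity assist (patched-conic picture; all numbers are made up).
u = 13.1                  # planet's heliocentric speed, km/s (roughly Jupiter)
v_in = (-5.0, 3.0)        # incoming velocity in the planet's frame, km/s
turn = math.radians(90)   # deflection angle produced by the hyperbolic flyby

cos_t, sin_t = math.cos(turn), math.sin(turn)
v_out = (cos_t * v_in[0] - sin_t * v_in[1],
         sin_t * v_in[0] + cos_t * v_in[1])

speed_planet_in = math.hypot(*v_in)      # unchanged by the flyby
speed_planet_out = math.hypot(*v_out)
speed_sun_in = math.hypot(v_in[0] + u, v_in[1])
speed_sun_out = math.hypot(v_out[0] + u, v_out[1])
# the Sun-frame speed increases by ~2.6 km/s in this example
```

It is against models of exactly this kind of maneuver (with the full gravitational dynamics included) that the measured speeds are compared, and it is the tiny leftover mismatch that has resisted explanation.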
This has come to be known as the “flyby anomaly”, which has endured despite decades of study and resisted all previous attempts at explanation. To address this, a team of researchers from the University Institute of Multidisciplinary Mathematics at the Universitat Politecnica de Valencia have developed a new orbital model based on the maneuvers conducted by the Juno probe.
The study, which recently appeared online under the title “A Possible Flyby Anomaly for Juno at Jupiter”, was conducted by Luis Acedo, Pedro Piqueras and Jose A. Morano. Together, they examined the possible causes of the so-called “flyby anomaly” using the perijove orbit of the Juno probe. Based on Juno’s many pole-to-pole orbits, they not only determined that it too experienced an anomaly, but offered a possible explanation.
To break it down, the speed of a spacecraft is determined by measuring the Doppler shift of radio signals sent from the spacecraft to the antennas of the Deep Space Network (DSN). During the 1970s, when the Pioneer 10 and 11 probes were launched – visiting Jupiter and Saturn before heading off towards the edge of the Solar System – both probes experienced something strange as they passed between 20 and 70 AU (Uranus to the Kuiper Belt) from the Sun.
Basically, the probes were both 386,000 km (240,000 mi) farther from where existing models predicted they would be. This came to be known as the “Pioneer anomaly”, which became common lore within the space physics community. While the Pioneer anomaly was eventually resolved, the same phenomenon has occurred many times since then with subsequent missions. As Dr. Acedo told Universe Today via email:
“The ‘flyby anomaly’ is a problem in astrodynamics discovered by a JPL team of researchers led by John Anderson in the early 90s. When they tried to fit the whole trajectory of the Galileo spacecraft as it approached the Earth on December 8th, 1990, they found that this could only be done by considering that the ingoing and outgoing pieces of the trajectory correspond to asymptotic velocities that differ by 3.92 mm/s from what is expected in theory.
“The effect appears both in the Doppler data and in the ranging data, so it is not a consequence of the measurement technique. Later on, it has also been found in several flybys performed by Galileo again in 1992, the NEAR [Near Earth Asteroid Rendezvous mission] in 1998, Cassini in 1999 or Rosetta and Messenger in 2005. The largest discrepancy was found for the NEAR (around 13 mm/s) and this is attributed to the very close distance of 532 Km to the surface of the Earth at the perigee.”
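To get a sense of scale for these measurements: a two-way Doppler link shifts the returned frequency by roughly Δf = 2·Δv·f/c. Assuming an S-band carrier near 2.3 GHz (a typical DSN frequency, not a figure from the study), the 3.92 mm/s Galileo anomaly corresponds to a shift of only a few hundredths of a hertz:

```python
# Sense of scale for the measurement: a two-way Doppler link shifts the
# returned frequency by df = 2 * dv * f / c. The S-band carrier frequency
# is a typical DSN value, assumed here for illustration.
c = 2.998e8          # speed of light, m/s
f_carrier = 2.3e9    # assumed S-band downlink frequency, Hz
dv = 3.92e-3         # Galileo's anomalous velocity change, m/s

df = 2 * dv * f_carrier / c    # ~0.06 Hz
```

A shift that small is nonetheless well within the precision of DSN tracking, which is why the anomaly could be detected at all.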
Another mystery is that while in some cases the anomaly was clear, in others it was on the threshold of detectability or simply absent – as was the case with Juno’s flyby of Earth in October of 2013. The absence of any convincing explanation has led to a number of proposals, ranging from the influence of dark matter and tidal effects to extensions of General Relativity and the existence of new physics.
However, none of these have produced a substantive explanation that could account for flyby anomalies. To address this, Acedo and his colleagues sought to create a model that was optimized for the Juno mission while at perijove – i.e. the point in the probe’s orbit where it is closest to Jupiter’s center. As Acedo explained:
“After the arrival of Juno at Jupiter on July, 4th, 2016, we had the idea of developing our independent orbital model to compare with the fitted trajectories that were being calculated by the JPL team at NASA. After all, Juno is performing very close flybys of Jupiter because the altitude over the top clouds (around 4000 km) is a small fraction of the planet’s radius. So, we expected to find the anomaly here. This would be an interesting addition to our knowledge of this effect because it would prove that it is not only a particular problem with Earth flybys but that it is universal.”
Their model took into account the tidal forces exerted by the Sun and by Jupiter’s larger satellites – Io, Europa, Ganymede and Callisto – as well as the contributions of the known zonal harmonics. They also accounted for Jupiter’s multipolar fields, which are the result of the planet’s oblate shape, since these play a far more important role than tidal forces as Juno reaches perijove.
In the end, they determined that an anomaly could also be present during the Juno flybys of Jupiter. They also noted a significant radial component in this anomaly, one which decayed the farther the probe got from the center of Jupiter. As Acedo explained:
“Our conclusion is that an anomalous acceleration is also acting upon the Juno spacecraft in the vicinity of the perijove (in this case, the asymptotic velocity is not a useful concept because the trajectory is closed). This acceleration is almost one hundred times larger than the typical anomalous accelerations responsible for the anomaly in the case of the Earth flybys. This was already expected in connection with Anderson et al.’s initial intuition that the effect increases with the angular rotational velocity of the planet (a period of 9.8 hours for Jupiter vs the 24 hours of the Earth), the radius of the planet and probably its mass.”
They also determined that this anomaly appears to depend on the ratio between the spacecraft’s radial velocity and the speed of light, and that it decreases very quickly as the craft’s altitude above Jupiter’s clouds increases. Neither behavior is predicted by General Relativity, so there is a chance that flyby anomalies are the result of novel gravitational phenomena – or perhaps a more conventional effect that has been overlooked.
In the end, the model that resulted from their calculations accorded closely with telemetry data provided by the Juno mission, though questions remain. “Further research is necessary because the pattern of the anomaly seems very complex and a single orbit (or a sequence of similar orbits as in the case of Juno) cannot map the whole field,” said Acedo. “A dedicated mission is required, but financial cuts and limited interest in experimental gravity may prevent us from seeing this mission in the near future.”
It is a testament to the complexities of physics that even after sixty years of space exploration – and one hundred years since General Relativity was first proposed – that we are still refining our models. Perhaps someday we will find there are no mysteries left to solve, and the Universe will make perfect sense to us. What a terrible day that will be!
Since the late 1920s, astronomers have been aware that the Universe is in a state of expansion. First predicted as a consequence of Einstein’s Theory of General Relativity, this expansion has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.
This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter, which was proposed to explain the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.
The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.
The team began by addressing the inconsistencies arising out of the two main theories that together explain all natural phenomena in the Universe. These theories are none other than General Relativity and quantum mechanics, which effectively explain how the Universe behaves on the largest of scales (i.e. stars, galaxies, clusters) and the smallest (subatomic particles).
Unfortunately, these two theories are not consistent when it comes to a little matter known as gravity, which scientists are still unable to explain in terms of quantum mechanics. The existence of Dark Energy and the expansion of the Universe are another point of disagreement. For starters, candidate theories like vacuum energy – which is one of the most popular explanations for Dark Energy – present serious incongruities.
According to quantum mechanics, vacuum energy would have an incredibly large energy density to it. But if this is true, then General Relativity predicts that this energy would have an incredibly strong gravitational effect, one which would be powerful enough to cause the Universe to explode in size. As Prof. Unruh shared with Universe Today via email:
“The problem is that any naive calculation of the vacuum energy gives huge values. If one assumes that there is some sort of cutoff so one cannot get energy densities much greater than the Planck energy density (or about 10⁹⁵ Joules/meter³) then one finds that one gets a Hubble constant – the time scale on which the Universe roughly doubles in size – of the order of 10⁻⁴⁴ sec. So, the usual approach is to say that somehow something reduces that down so that one gets the actual expansion rate of about 10 billion years instead. But that ‘somehow’ is pretty mysterious and no one has come up with an even half convincing mechanism.”
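Unruh's order of magnitude can be reproduced from the flat-space Friedmann equation, H² = 8πGρ/(3c²), if the vacuum energy density is cut off at the conventional Planck energy density (that choice of cutoff is this sketch's assumption):

```python
import math

# Order-of-magnitude sketch of Unruh's point (standard constants, SI units)
G = 6.674e-11        # gravitational constant
c = 2.998e8          # speed of light
hbar = 1.0546e-34    # reduced Planck constant

# Planck energy density, the natural quantum-gravity cutoff (~4.6e113 J/m^3)
rho_planck = c**7 / (hbar * G**2)

# Flat-space Friedmann equation, H^2 = 8*pi*G*rho / (3*c^2), treating the
# vacuum energy as the only source
H = math.sqrt(8 * math.pi * G * rho_planck / (3 * c**2))
hubble_time = 1 / H    # ~1.9e-44 s: the doubling timescale of the Universe,
                       # instead of the observed ~10 billion years
```

The roughly sixty orders of magnitude between this timescale and the observed one is the famous cosmological constant problem that the quote alludes to.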
Whereas other scientists have sought to modify the theories of General Relativity and quantum mechanics in order to resolve these inconsistencies, Wang and his colleagues sought a different approach. As Wang explained to Universe Today via email:
“Previous studies are either trying to modify quantum mechanics in some way to make vacuum energy small or trying to modify General Relativity in some way to make gravity numb for vacuum energy. However, quantum mechanics and General Relativity are the two most successful theories that explain how our Universe works…Instead of trying to modify quantum mechanics or General Relativity, we believe that we should first understand them better. We take the large vacuum energy density predicted by quantum mechanics seriously and just let it gravitate according to General Relativity without modifying either of them.”
For the sake of their study, Wang and his colleagues performed new sets of calculations on vacuum energy that took its predicted high energy density into account. They then considered the possibility that on the tiniest of scales – billions of times smaller than electrons – the fabric of spacetime is subject to wild fluctuations, oscillating at every point between expansion and contraction.
As it swings back and forth, the result of these oscillations is a net effect where the Universe expands slowly, but at an accelerating rate. After performing their calculations, they noted that such an explanation was consistent with both the existence of quantum vacuum energy density and General Relativity. On top of that, it is also consistent with what scientists have been observing in our Universe for almost a century. As Unruh described it:
“Our calculations showed that one could consistently regard [that] the Universe on the tiniest scales is actually expanding and contracting at an absurdly fast rate; but that on a large scale, because of an averaging over those tiny scales, physics would not notice that ‘quantum foam’. It has a tiny residual effect in giving an effective cosmological constant (dark energy type effect). In some ways it is like waves on the ocean which travel as if the ocean were perfectly smooth but really we know that there is this incredible dance of the atoms that make up the water, and waves average over those fluctuations, and act as if the surface was smooth.”
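The averaging in that analogy can be illustrated numerically. The following is only a toy model of the idea, not the paper's calculation: a "Hubble rate" that swings violently between expansion and contraction, plus a tiny built-in bias, produces slow net growth of the scale factor:

```python
import math

# Toy model of the averaging only (parameters are arbitrary, not from the
# paper): the instantaneous "Hubble rate" oscillates violently between
# expansion and contraction, with a tiny built-in bias.
omega = 100.0      # fast oscillation frequency (arbitrary units)
bias = 0.05        # tiny residual expansion rate
dt = 1e-4
steps = 1_000_000  # integrate to t = 100

ln_a = 0.0         # log of the scale factor
for n in range(steps):
    t = n * dt
    ln_a += (omega * math.cos(omega * t) + bias) * dt

# The instantaneous rate reaches |H| ~ 100, yet ln(a) ends up near
# bias * t = 5: the wild swings average away, leaving slow net expansion.
```

Coarse-grained over many oscillation periods, only the tiny bias survives, which is the role the residual cosmological-constant-like effect plays in the analogy.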
In contrast to conflicting theories of a Universe where the various forces that govern it cannot be resolved and must cancel each other out, Wang and his colleagues present a picture where the Universe is constantly in motion. In this scenario, the effects of vacuum energy are actually self-cancelling, and also give rise to the expansion and acceleration we have been observing all this time.
While it may be too soon to tell, this image of a Universe that is highly-dynamic (even on the tiniest scales) could revolutionize our understanding of spacetime. At the very least, these theoretical findings are sure to stimulate debate within the scientific community, as well as experiments designed to offer direct evidence. And that, as we know, is the only way we can advance our understanding of this thing known as the Universe.