At the center of our galaxy lies a region where roughly 10 million stars are packed into just 1 parsec (3.26 light-years) of space. At the heart of this region lies the supermassive black hole (SMBH) known as Sagittarius A*, which has a mass of over 4 million Suns. For decades, astronomers have been trying to get a better look at this region in the hopes of understanding the incredible forces at work and how they have affected the evolution of our galaxy.
What they’ve found includes a series of stars that orbit very closely to Sagittarius A* (like S1 and S2), which have been used to test Einstein’s Theory of General Relativity. And recently, a team from UCLA’s Galactic Center Orbits Initiative detected a series of compact objects that also orbit the SMBH. These objects look like clouds of gas but behave like stars, depending on how close they are in their orbits to Sagittarius A*.
Black holes are one of the most awesome and mysterious forces in the Universe. Originally predicted by Einstein’s Theory of General Relativity, these points in spacetime are formed when massive stars undergo gravitational collapse at the end of their lives. Despite decades of study and observation, there is still much we don’t know about this phenomenon.
For example, scientists are still largely in the dark about how the matter that falls into orbit around a black hole and is gradually fed onto it (the accretion disk) behaves. Thanks to a recent study, in which an international team of researchers conducted the most detailed simulations of a black hole to date, a number of theoretical predictions regarding accretion disks have finally been validated.
Special Relativity. It’s been the bane of space explorers, futurists and science fiction authors since Albert Einstein first proposed it in 1905. For those of us who dream of humans one day becoming an interstellar species, this scientific fact is like a wet blanket. Luckily, a few theoretical concepts have been proposed that indicate Faster-Than-Light (FTL) travel might still be possible someday.
A popular example is the idea of a wormhole: a speculative structure that links two distant points in spacetime and would enable interstellar space travel. Recently, a team of Ivy League scientists conducted a study indicating how “traversable wormholes” could actually be a reality. The bad news is that their results suggest these wormholes aren’t exactly shortcuts, and could be the cosmic equivalent of “taking the long way”!
At the center of our galaxy resides a Supermassive Black Hole (SMBH) known as Sagittarius A*. Based on ongoing observations, astronomers have determined that this SMBH measures 44 million km (27.34 million mi) in diameter and has an estimated mass of 4.31 million Solar Masses. On occasion, a star will wander too close to Sgr A* and be torn apart in a violent process known as a tidal disruption event (TDE).
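As a rough back-of-envelope check (our own addition, not part of any study cited here), the two length scales that govern a TDE follow from the black hole's mass alone: the Schwarzschild radius r_s = 2GM/c², and the tidal radius at which a Sun-like star is torn apart:

```python
# Back-of-envelope estimate: the Schwarzschild radius implied by Sgr A*'s
# mass, and the tidal disruption radius for a Sun-like star.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m

M_bh = 4.31e6 * M_SUN                 # mass quoted in the article

# Schwarzschild radius: r_s = 2GM/c^2
r_s = 2 * G * M_bh / c**2

# Tidal disruption radius for a Sun-like star: r_t ~ R_star * (M_bh/M_star)^(1/3)
r_t = R_SUN * (M_bh / M_SUN) ** (1 / 3)

print(f"Schwarzschild radius: {r_s/1e9:.1f} million km")   # ~12.7 million km
print(f"Tidal radius:         {r_t/1e9:.1f} million km")   # ~113 million km
print(f"r_t / r_s = {r_t/r_s:.1f}")
```

The implied horizon diameter (~25 million km) is somewhat smaller than the 44 million km figure above, which refers to the measured size of the emitting region. And since the tidal radius lies well outside the horizon, the disruption happens where we can see it, rather than being swallowed silently.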
These events cause the release of bright flares of radiation, which let astronomers know that a star has been consumed. Unfortunately, for decades, astronomers have been unable to distinguish these events from other galactic phenomena. But thanks to a new study by an international team of astrophysicists, astronomers now have a unified model that explains recent observations of these extreme events.
As Enrico Ramirez-Ruiz – the professor and chair of astronomy and astrophysics at UC Santa Cruz, the Niels Bohr Professor at the University of Copenhagen, and a co-author on the paper – explained in a UCSC press release:
“Only in the last decade or so have we been able to distinguish TDEs from other galactic phenomena, and the new model will provide us with the basic framework for understanding these rare events.”
In most galaxies, SMBHs do not actively consume any material and therefore do not emit any light, which distinguishes these galaxies from those that host Active Galactic Nuclei (AGNs). Tidal disruption events are therefore rare, occurring only about once every 10,000 years in a typical galaxy. However, when a star does get torn apart, it results in the release of an intense amount of radiation. As study lead author Dr. Dai explained:
“It is interesting to see how materials get their way into the black hole under such extreme conditions. As the black hole is eating the stellar gas, a vast amount of radiation is emitted. The radiation is what we can observe, and using it we can understand the physics and calculate the black hole properties. This makes it extremely interesting to go hunting for tidal disruption events.”
In the past few years, a few dozen candidates for tidal disruption events (TDEs) have been detected using wide-field optical and UV transient surveys as well as X-ray telescopes. While the underlying physics is expected to be the same for all TDEs, astronomers have noted that a few distinct classes appear to exist: some emit mostly X-rays, while others emit mostly visible and ultraviolet light.
As a result, theorists have struggled to understand the diverse properties observed and create a coherent model that can explain them all. For the sake of their model, Dr. Dai and her colleagues combined elements from general relativity, magnetic fields, radiation, and gas hydrodynamics. The team also relied on state-of-the-art computational tools and recently-acquired large computer clusters funded by the Villum Foundation grant of Jens Hjorth (head of the DARK Cosmology Center), as well as by the U.S. National Science Foundation and NASA.
Using the model that resulted, the team concluded that it is the viewing angle of the observer that accounts for the differences in observation. Essentially, different galaxies are oriented randomly with respect to observers on Earth, who see different aspects of TDEs depending on their orientation. As Ramirez-Ruiz explained:
“It is like there is a veil that covers part of a beast. From some angles we see an exposed beast, but from other angles we see a covered beast. The beast is the same, but our perceptions are different.”
In the coming years, a number of planned survey projects are expected to provide much more data on TDEs, which will help expand this field of research. These include the Young Supernova Experiment (YSE) transient survey, which will be led by the DARK Cosmology Center at the Niels Bohr Institute and UC Santa Cruz, and the Large Synoptic Survey Telescope (LSST) being built in Chile.
According to Dr. Dai, this new model shows what astronomers can expect to see when viewing TDEs from different angles and will allow them to fit different events into a coherent framework. “We will observe hundreds to thousands of tidal disruption events in a few years,” she said. “This will give us a lot of ‘laboratories’ to test our model and use it to understand more about black holes.”
This improved understanding of how black holes occasionally consume stars will also provide additional tests for general relativity, gravitational wave research, and help astronomers to learn more about the evolution of galaxies.
In the early 1960s, scientists developed the gravity-assist method, where a spacecraft would conduct a flyby of a major body in order to increase its speed. Many notable missions have used this technique, including the Pioneer, Voyager, Galileo, Cassini, and New Horizons missions. In the course of many of these flybys, scientists have noted an anomaly where the increase in the spacecraft’s speed did not accord with orbital models.
This has come to be known as the “flyby anomaly”, which has endured despite decades of study and resisted all previous attempts at explanation. To address this, a team of researchers from the University Institute of Multidisciplinary Mathematics at the Universitat Politecnica de Valencia has developed a new orbital model based on the maneuvers conducted by the Juno probe.
The study, which recently appeared online under the title “A Possible Flyby Anomaly for Juno at Jupiter”, was conducted by Luis Acedo, Pedro Piqueras and Jose A. Morano. Together, they examined the possible causes of the so-called “flyby anomaly” using the perijove orbit of the Juno probe. Based on Juno’s many pole-to-pole orbits, they not only determined that it too experienced an anomaly, but offered a possible explanation for it.
To break it down, the speed of a spacecraft is determined by measuring the Doppler shift of radio signals sent between the spacecraft and the antennas of the Deep Space Network (DSN). During the 1970s, when the Pioneer 10 and 11 probes were launched, visiting Jupiter and Saturn before heading off towards the edge of the Solar System, both probes experienced something strange as they passed between 20 and 70 AU (Uranus to the Kuiper Belt) from the Sun.
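The measurement principle can be sketched in a few lines. In two-way tracking the signal is Doppler-shifted twice (uplink and downlink), so the fractional frequency shift is roughly 2v/c for speeds far below light speed. The carrier frequency and shift below are hypothetical round numbers, not values from any real tracking pass:

```python
# Illustrative sketch: recovering a spacecraft's line-of-sight velocity
# from the two-way Doppler shift of its radio carrier.
C = 2.998e8  # speed of light, m/s

def los_velocity(f_transmit_hz, f_received_hz):
    """Line-of-sight velocity from a two-way Doppler measurement.
    Positive = receding from the antenna."""
    df = f_transmit_hz - f_received_hz
    return C * df / (2.0 * f_transmit_hz)

# Example: an X-band carrier at 8.4 GHz received 560 Hz low
v = los_velocity(8.4e9, 8.4e9 - 560.0)
print(f"{v:.1f} m/s")  # ~10 m/s along the line of sight
```

Anomalies of a few mm/s, as discussed below, are thus tiny fractional shifts in the received frequency, which is why they only show up in very careful trajectory fits.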
Basically, the probes were both 386,000 km (240,000 mi) closer to the Sun than existing models predicted they would be. This came to be known as the “Pioneer anomaly”, which became common lore within the space physics community. While the Pioneer anomaly was resolved, the same phenomenon has occurred many times since then with subsequent missions. As Dr. Acedo told Universe Today via email:
“The ‘flyby anomaly’ is a problem in astrodynamics discovered by a JPL team of researchers led by John Anderson in the early 90s. When they tried to fit the whole trajectory of the Galileo spacecraft as it approached the Earth on December 8th, 1990, they found that this can only be done by considering that the ingoing and outgoing pieces of the trajectory correspond to asymptotic velocities that differ by 3.92 mm/s from what is expected in theory.
“The effect appears both in the Doppler data and in the ranging data, so it is not a consequence of the measurement technique. Later on, it has also been found in several flybys performed by Galileo again in 1992, NEAR [the Near Earth Asteroid Rendezvous mission] in 1998, Cassini in 1999, and Rosetta and Messenger in 2005. The largest discrepancy was found for NEAR (around 13 mm/s), and this is attributed to the very close distance of 532 km to the surface of the Earth at perigee.”
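Anderson and colleagues later distilled the Earth-flyby data into an empirical formula tying the anomaly to Earth's rotation rate and the declinations of the spacecraft's incoming and outgoing asymptotic velocity vectors. The sketch below uses NEAR geometry values as reported in the literature, and should be treated as approximate:

```python
import math

# Anderson et al.'s empirical flyby-anomaly formula:
#   dV / V = (2 * omega_E * R_E / c) * (cos(delta_in) - cos(delta_out))
# where delta_in/out are the declinations of the incoming and outgoing
# asymptotic velocity vectors.
OMEGA_E = 7.292e-5   # Earth's rotation rate, rad/s
R_E = 6.371e6        # Earth's mean radius, m
C = 2.998e8          # speed of light, m/s

def anomalous_dv(v_inf, delta_in_deg, delta_out_deg):
    """Predicted anomalous velocity change (m/s) for an Earth flyby."""
    K = 2.0 * OMEGA_E * R_E / C
    return K * (math.cos(math.radians(delta_in_deg))
                - math.cos(math.radians(delta_out_deg))) * v_inf

# NEAR's 1998 Earth flyby: v_inf ~ 6.851 km/s,
# delta_in ~ -20.76 deg, delta_out ~ -71.96 deg (published values)
dv = anomalous_dv(6851.0, -20.76, -71.96)
print(f"{dv*1000:.1f} mm/s")  # close to the ~13 mm/s discrepancy quoted above
```

The formula is purely empirical, but its dependence on the planet's rotation rate and radius is exactly the scaling Acedo invokes below when extrapolating from Earth to Jupiter.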
Another mystery is that while in some cases the anomaly was clear, in others it was on the threshold of detectability or simply absent – as was the case with Juno‘s flyby of Earth in October of 2013. The absence of any convincing explanation has led to a number of proposals, ranging from the influence of dark matter and tidal effects to extensions of General Relativity and the existence of new physics.
However, none of these have produced a substantive explanation that could account for flyby anomalies. To address this, Acedo and his colleagues sought to create a model that was optimized for the Juno mission while at perijove – i.e. the point in the probe’s orbit where it is closest to Jupiter’s center. As Acedo explained:
“After the arrival of Juno at Jupiter on July, 4th, 2016, we had the idea of developing our independent orbital model to compare with the fitted trajectories that were being calculated by the JPL team at NASA. After all, Juno is performing very close flybys of Jupiter because the altitude over the top clouds (around 4000 km) is a small fraction of the planet’s radius. So, we expected to find the anomaly here. This would be an interesting addition to our knowledge of this effect because it would prove that it is not only a particular problem with Earth flybys but that it is universal.”
Their model took into account the tidal forces exerted by the Sun and by Jupiter’s larger satellites – Io, Europa, Ganymede and Callisto – as well as the contributions of the known zonal harmonics. They also accounted for Jupiter’s multipolar fields, which are the result of the planet’s oblate shape, since these play a far more important role than tidal forces as Juno reaches perijove.
In the end, they determined that an anomaly could also be present during Juno’s flybys of Jupiter. They also noted a significant radial component in this anomaly, one which decayed the farther the probe got from the center of Jupiter. As Acedo explained:
“Our conclusion is that an anomalous acceleration is also acting upon the Juno spacecraft in the vicinity of the perijove (in this case, the asymptotic velocity is not a useful concept because the trajectory is closed). This acceleration is almost one hundred times larger than the typical anomalous accelerations responsible for the anomaly in the case of the Earth flybys. This was already expected in connection with Anderson et al.’s initial intuition that the effect increases with the angular rotational velocity of the planet (a period of 9.8 hours for Jupiter vs the 24 hours of the Earth), the radius of the planet and probably its mass.”
They also determined that this anomaly appears to depend on the ratio between the spacecraft’s radial velocity and the speed of light, and that it falls off very quickly as the craft’s altitude above Jupiter’s clouds increases. Neither behavior is predicted by General Relativity, so there is a chance that flyby anomalies are the result of novel gravitational phenomena – or perhaps a more conventional effect that has been overlooked.
In the end, the model that resulted from their calculations accorded closely with telemetry data provided by the Juno mission, though questions remain. “Further research is necessary because the pattern of the anomaly seems very complex and a single orbit (or a sequence of similar orbits as in the case of Juno) cannot map the whole field,” said Acedo. “A dedicated mission is required, but financial cuts and limited interest in experimental gravity may prevent us from seeing this mission in the near future.”
It is a testament to the complexities of physics that even after sixty years of space exploration – and one hundred years since General Relativity was first proposed – we are still refining our models. Perhaps someday we will find there are no mysteries left to solve, and the Universe will make perfect sense to us. What a terrible day that will be!
Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this expansion has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.
This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter, which was invoked to explain the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.
The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.
The team began by addressing the inconsistencies arising out of the two main theories that together explain all natural phenomena in the Universe. These theories are none other than General Relativity and quantum mechanics, which effectively explain how the Universe behaves on the largest of scales (i.e. stars, galaxies, clusters) and the smallest (subatomic particles).
Unfortunately, these two theories are not consistent when it comes to a little matter known as gravity, which scientists are still unable to explain in terms of quantum mechanics. The existence of Dark Energy and the expansion of the Universe are another point of disagreement. For starters, candidate theories like vacuum energy – which is one of the most popular explanations for Dark Energy – present serious incongruities.
According to quantum mechanics, vacuum energy would have an incredibly large energy density. But if this is true, then General Relativity predicts that this energy would have an incredibly strong gravitational effect, one powerful enough to cause the Universe to explode in size. As Prof. Unruh shared with Universe Today via email:
“The problem is that any naive calculation of the vacuum energy gives huge values. If one assumes that there is some sort of cutoff so one cannot get energy densities much greater than the Planck energy density (or about 10⁹⁵ Joules/meter³) then one finds that one gets a Hubble constant – the time scale on which the Universe roughly doubles in size – of the order of 10⁻⁴⁴ sec. So, the usual approach is to say that somehow something reduces that down so that one gets the actual expansion rate of about 10 billion years instead. But that ‘somehow’ is pretty mysterious and no one has come up with an even half convincing mechanism.”
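The arithmetic behind this quote can be sketched by feeding a vacuum energy density into the Friedmann equation H² = (8πG/3)(u/c²). The exact cutoff density varies between accounts; using the standard Planck energy density c⁷/ħG² (our choice here) reproduces the ~10⁻⁴⁴ second timescale Unruh mentions:

```python
import math

# Sketch of the arithmetic in the quote above: feed a vacuum energy
# density u into the Friedmann equation H^2 = (8*pi*G/3) * (u / c^2)
# and compare the expansion timescale 1/H with the observed ~10 Gyr.
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.0546e-34  # J s

u_planck = c**7 / (hbar * G**2)  # Planck energy density, J/m^3
H = math.sqrt(8 * math.pi * G * u_planck / (3 * c**2))

print(f"u_planck ~ {u_planck:.1e} J/m^3")  # ~4.6e113 J/m^3
print(f"1/H ~ {1/H:.1e} s")                # ~1.9e-44 s, vs ~10 billion years observed
```

The gap between 10⁻⁴⁴ seconds and 10 billion years, roughly 120 orders of magnitude, is the famous cosmological constant problem the quote alludes to.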
Whereas other scientists have sought to modify the theories of General Relativity and quantum mechanics in order to resolve these inconsistencies, Wang and his colleagues sought a different approach. As Wang explained to Universe Today via email:
“Previous studies either try to modify quantum mechanics in some way to make vacuum energy small, or try to modify General Relativity in some way to make gravity numb for vacuum energy. However, quantum mechanics and General Relativity are the two most successful theories that explain how our Universe works… Instead of trying to modify quantum mechanics or General Relativity, we believe that we should first understand them better. We take the large vacuum energy density predicted by quantum mechanics seriously and just let it gravitate according to General Relativity, without modifying either of them.”
For the sake of their study, Wang and his colleagues performed new sets of calculations on vacuum energy that took its predicted high energy density into account. They then considered the possibility that on the tiniest of scales – billions of times smaller than electrons – the fabric of spacetime is subject to wild fluctuations, oscillating at every point between expansion and contraction.
As spacetime swings back and forth, the net effect of these oscillations is that the Universe expands slowly, but at an accelerating rate. After performing their calculations, the team noted that such an explanation was consistent with both the existence of quantum vacuum energy density and General Relativity. On top of that, it is also consistent with what scientists have been observing in our Universe for almost a century. As Unruh described it:
“Our calculations showed that one could consistently regard [that] the Universe on the tiniest scales is actually expanding and contracting at an absurdly fast rate; but that on a large scale, because of an averaging over those tiny scales, physics would not notice that ‘quantum foam’. It has a tiny residual effect in giving an effective cosmological constant (dark energy type effect). In some ways it is like waves on the ocean which travel as if the ocean were perfectly smooth but really we know that there is this incredible dance of the atoms that make up the water, and waves average over those fluctuations, and act as if the surface was smooth.”
In contrast to theories of a Universe in which the various forces that govern it cannot be resolved and must cancel each other out, Wang and his colleagues present a picture where the Universe is constantly in motion. In this scenario, the effects of vacuum energy are actually self-cancelling, yet also give rise to the expansion and acceleration we have been observing all this time.
While it may be too soon to tell, this image of a Universe that is highly-dynamic (even on the tiniest scales) could revolutionize our understanding of spacetime. At the very least, these theoretical findings are sure to stimulate debate within the scientific community, as well as experiments designed to offer direct evidence. And that, as we know, is the only way we can advance our understanding of this thing known as the Universe.
Update: This year’s Nobel Prize in Physics has been awarded to David J. Thouless (University of Washington), F. Duncan M. Haldane (Princeton University), and J. Michael Kosterlitz of Brown University for “theoretical discoveries of topological phase transitions and topological phases of matter”. One half of the prize was awarded to Thouless while the other half was jointly awarded to Haldane and Kosterlitz.
The Nobel Prize in physics is a coveted award. Every year, the prize is bestowed upon the individual who is deemed to have made the greatest contribution to the field of physics during the preceding year. And this year, the groundbreaking discovery of gravitational waves is anticipated to be the main focus.
This discovery, which was announced on February 11th, 2016, was made possible thanks to the development of the Laser Interferometer Gravitational-Wave Observatory (LIGO). As such, it is expected that the three scientists that are most responsible for the invention of the technology will receive the Nobel Prize for their work. However, there are those in the scientific community who feel that another scientist – Barry Barish – should also be recognized.
But first, some background is needed to help put all this into perspective. For starters, gravitational waves are ripples in the curvature of spacetime that are generated by certain gravitational interactions and which propagate at the speed of light. The existence of such waves has been postulated since the late 19th century.
However, it was not until the 20th century, thanks in large part to Einstein and his theory of General Relativity, that gravitational-wave research began to emerge as a branch of astronomy. Since the 1960s, various gravitational-wave detectors have been built, including the LIGO observatory.
Founded as a Caltech/MIT project, LIGO was officially approved by the National Science Foundation (NSF) in 1984. A decade later, construction began on the facility’s two locations – in Hanford, Washington and Livingston, Louisiana. By 2002, it began to obtain data, and work began on improving its original detectors in 2008 (known as the Advanced LIGO Project).
The credit for the creation of LIGO goes to three scientists: Rainer Weiss, a professor of physics emeritus at the Massachusetts Institute of Technology (MIT); Ronald Drever, an experimental physicist who was professor emeritus at the California Institute of Technology and a professor at Glasgow University; and Kip Thorne, the Feynman Professor of Theoretical Physics at Caltech.
In 1967 and ’68, Weiss and Thorne initiated efforts to construct prototype detectors, and produced theoretical work to prove that gravitational waves could be successfully analyzed. By the 1970s, using different methods, Weiss and Drever had both succeeded in building detectors. In the coming years, all three men remained pivotal and influential, helping to make gravitational astronomy a legitimate field of research.
However, it has been argued that without Barish – a particle physicist at Caltech – the discovery would never have been made. Having become the Principal Investigator of LIGO in 1994, he inherited the project at a very crucial time. It had begun receiving funding a decade prior, but coordinating the work of Weiss, Thorne and Drever (at MIT, Caltech and the University of Glasgow, respectively) proved difficult.
As such, it was decided that a single director was needed. Between 1987 and 1994, Rochus Vogt – a professor emeritus of Physics at Caltech – was appointed by the NSF to fill this role. While Vogt brought the initial team together and helped get the project’s construction approved, he proved difficult when it came to dealing with bureaucracy and documenting his researchers’ progress.
As a result, from 1989 through 1994, LIGO failed to progress technically and organizationally, and had trouble acquiring funding as well. By 1994, Caltech had eased Vogt out of his position and appointed Barish as director. Barish got to work quickly, making significant changes to the way LIGO was administered, expanding the research team, and developing a detailed work plan for the NSF.
By 1999, construction had wrapped up on the LIGO observatories, and by 2002, they began taking their first bits of data. By 2004, the funding and groundwork was laid for the next phase of LIGO development, which involved a multi-year shut-down while the detectors were replaced with improved “Advanced LIGO” versions.
All of this was made possible by Barish, who stepped down in 2005 to head up other projects. Thanks to his sweeping reforms, LIGO recovered from its abortive start, began to produce data, and procured funding and crucial partnerships; it now has more than 1,000 collaborators worldwide, thanks to the LIGO Scientific Collaboration (LSC) he established.
Little wonder, then, that some scientists think the Nobel Prize should be split four ways, awarding the three scientists who conceived of LIGO and the one scientist who made it happen. As Barish himself was quoted as saying by Science:
“I think there’s a bit of truth that LIGO wouldn’t be here if I didn’t do it, so I don’t think I’m undeserving. If they wait a year and give it to these three guys, at least I’ll feel that they thought about it,” he says. “If they decide [to give it to them] this October, I’ll have more bad feelings because they won’t have done their homework.”
What’s more, in the past, the Nobel Prize in physics has tended to be awarded to those responsible for the intellectual contributions leading to a major breakthrough, rather than to those who did the leg work. Out of the last six Prizes issued (between 2010 and 2015), five have been awarded for the development of experimental methods, observational studies, and theoretical discoveries.
Only one award was given for a technical development. This was the case in 2014 where the award was given jointly to Isamu Akasaki, Hiroshi Amano and Shuji Nakamura for “the invention of efficient blue light-emitting diodes which has enabled bright and energy-saving white light sources”.
Basically, the Nobel Prize is a complicated matter. Every year, it is awarded to those who made a considerable contribution to science, or were responsible for a major breakthrough. But contributions and breakthroughs are perhaps a bit relative. Whom we choose to honor, and for what, can also be seen as an indication of what is valued most in the scientific community.
In the end, this year’s award may serve to highlight how significant contributions do not just entail the development of new ideas and methods, but also in bringing them to fruition.
One of the most interesting topics in the field of science is the concept of Relativity. You know, this idea that strange things happen as you near the speed of light. There are strange changes to the length of things, bizarre shifting of wavelengths. And most puzzling of all, there’s the concept of time dilation: how you can literally experience more or less time based on how fast you’re traveling compared to someone else.
And even stranger than that? As we saw in the movie Interstellar, just spending time near a very massive object, like a black hole, can cause these same relativistic effects. Because gravity and acceleration are sort of the same thing?
Honestly, it’s enough to give you a massive headache.
But just because I find the concept baffling, I’m still going to keep chipping away, trying to understand more about it and help you wrap your brain around it too. For my own benefit, for your benefit, but mostly for my benefit.
There’s a great anecdote in the history of physics – it’s probably not what actually happened, but I still love it.
One of the most famous astronomers of the 20th century was Sir Arthur Eddington, played by a dashing David Tennant in the 2008 movie, Einstein and Eddington. Which, you should really see, if you haven’t already.
So anyway, Doctor Who, I mean Eddington, had worked out how stars generate energy (through fusion) and personally confirmed that Einstein’s predictions of General Relativity were correct when he observed a total Solar Eclipse in 1919.
Apparently during a lecture by Sir Arthur Eddington, someone asked, “Professor Eddington, you must be one of the three people in the world who understands General Relativity.” He paused for a moment, and then said, “yes, but I’m trying to think of who the third person is.”
It’s definitely not me, but I know someone who does have a handle on General Relativity, and that’s Dr. Brian Koberlein, an astrophysics professor at the Rochester Institute of Technology. He covers this topic all the time on his blog, One Universe At A Time, which you should totally visit and read at briankoberlein.com.
In fact, just to demonstrate how this works, Brian has conveniently pushed his RIT office to nearly light speed, and is hurtling towards us right now.
Dr. Brian Koberlein:
Hi Fraser, thanks for having me. If you can hang on one second, I just have to slow down.
What just happened there? Why were you all slowed down?
It’s actually an interesting effect known as time dilation. One of the things about light is that no matter what frame of reference you’re in, no matter how you’re moving through the Universe, you’ll always measure the speed of light in a vacuum to be the same. About 300,000 kilometres per second.
And in order to do that, if you are moving relative to me, or if I’m moving relative to you, our references for time and space have to shift to keep the speed of light constant. As I move faster away from you, my time, according to you, has to appear to slow down. By the same token, your time will appear to slow down relative to me.
And that time dilation effect is necessary to keep the speed of light constant.
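[Editor's note: for readers who want numbers, the slowdown Brian describes is quantified by the Lorentz factor. A quick illustrative sketch:]

```python
import math

# Time dilation from relative motion: a moving clock runs slow by the
# Lorentz factor gamma = 1 / sqrt(1 - v^2/c^2).
C = 299_792_458.0  # speed of light, m/s

def gamma(v):
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

for frac in (0.1, 0.5, 0.9, 0.99):
    g = gamma(frac * C)
    print(f"v = {frac:.2f}c -> 1 s for the traveller = {g:.3f} s for us")
```

The effect is negligible at everyday speeds (at 10% of light speed the factor is only about 1.005) but grows without bound as v approaches c.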
Does this only happen when you’re moving?
Time dilation doesn’t just occur because of relative motion, it can also occur because of gravity. Einstein’s theory of relativity says that gravity is a property of the warping of space and time. So when you have a mass like Earth, it actually warps space and time.
If you’re standing on the Earth, your time appears to move a little bit more slowly than someone up in space, because of the difference in gravity.
Now, for Earth, that doesn’t really matter that much, but for something like a black hole, it could matter a great deal. As you get closer and closer to a black hole, your time will appear to slow down more and more and more.
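[Editor's note: the gravitational version has an equally compact form. Outside a non-rotating mass, a clock at radius r ticks at sqrt(1 − 2GM/rc²) times the rate of a faraway clock. An illustrative sketch; the 10-solar-mass black hole is just an example:]

```python
import math

# Gravitational time dilation outside a non-rotating mass (Schwarzschild):
#   d(tau)/dt = sqrt(1 - 2GM / (r c^2))
# i.e. clocks deep in a gravity well tick slower than clocks far away.
G = 6.674e-11
C = 2.998e8
M_SUN = 1.989e30

def clock_rate(mass_kg, r_m):
    """Seconds ticked by a local clock per second ticked far away."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (r_m * C**2))

# On Earth's surface the effect is tiny (parts per billion)...
print(clock_rate(5.972e24, 6.371e6))

# ...but hovering at twice the Schwarzschild radius of a 10-solar-mass
# black hole, a clock runs at only ~71% of the far-away rate.
M_bh = 10 * M_SUN
r_s = 2 * G * M_bh / C**2
print(clock_rate(M_bh, 2 * r_s))  # ~0.707
```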
What would this mean for space travel?
In many times in science fiction, you’ll see the idea of a rocket moving very close to the speed of light, and using time dilation to travel to distant stars.
But you could actually do the same thing with gravity. If you had a black hole that was traveling out to another star or another galaxy, you could actually take your spaceship and orbit very close to it. And your time would seem to slow down. While you’re orbiting the black hole, the black hole would take its time to get to another star or another galaxy, but for you it would seem really quick.
So that’s another way that you could use time dilation to travel to the stars, at least in science fiction.
All right Brian, I’ve got one final question for you. If you get more massive as you get closer to the speed of light, could you get so much mass that you turn into a black hole? I’d like you to answer this question in the form of a blog post on briankoberlein.com and on the Google+ post we’re going to link right here.
Thanks Fraser, I’ll have that answer up on my website.
Once again, we visited the baffling realm of time dilation, and returned relatively unscathed. It doesn’t mean that I understand it any better, but I hope you do, anyway. Once again, a big thanks to Dr. Koberlein for taking a few minutes out of his relativistic travel to answer our questions. Make sure you visit his blog and read his answer to my question.
Since it was first discovered in 1974, astronomers have been dying to get a better look at the supermassive black hole (SMBH) at the center of our galaxy. Known as Sagittarius A*, it has so far been characterized only indirectly: scientists gauge its position and mass by measuring the effect it has on the stars that orbit it. More detailed observations have eluded them, thanks in part to all the gas and dust that obscures it.
Luckily, the European Southern Observatory (ESO) recently began work with the GRAVITY interferometer, the latest component in their Very Large Telescope (VLT). Using this instrument, which combines near-infrared imaging, adaptive optics, and vastly improved resolution and accuracy, they have managed to capture images of the stars orbiting Sagittarius A*. And what they observed was quite fascinating.
One of the primary purposes of GRAVITY is to study the gravitational field around Sagittarius A* in order to make precise measurements of the stars that orbit it. In so doing, the GRAVITY team – which consists of astronomers from the ESO, the Max Planck Institute, and multiple European research institutes – will be able to test Einstein’s theory of General Relativity like never before.
In what was the first observation conducted using the new instrument, the GRAVITY team used its powerful interferometric imaging capabilities to study S2, a faint star which orbits Sagittarius A* with a period of only 16 years. This test demonstrated the effectiveness of the GRAVITY instrument – which is 15 times more sensitive than the individual 8.2-metre Unit Telescopes the VLT currently relies on.
This was an historic accomplishment, as a clear view of the center of our galaxy is something that has eluded astronomers in the past. As GRAVITY’s lead scientist, Frank Eisenhauer – from the Max Planck Institute for Extraterrestrial Physics in Garching, Germany – explained to Universe Today via email:
“First, the Galactic Center is hidden behind a huge amount of interstellar dust, and it is practically invisible at optical wavelengths. The stars are only observable in the infrared, so we first had to develop the necessary technology and instruments for that. Second, there are so many stars concentrated in the Galactic Center that a normal telescope is not sharp enough to resolve them. It was only in the late 1990s and in the beginning of this century when we learned to sharpen the images with the help of speckle interferometry and adaptive optics to see the stars and observe their dance around the central black hole.”
But more than that, the observation of S2 was very well timed. In 2018, the star will be at the closest point in its orbit to Sagittarius A* – just 17 light-hours from it. As you can see from the video below, it is at this point that S2 will be moving much faster than at any other point in its orbit (the orbit of S2 is highlighted in red and the position of the central black hole is marked with a red cross).
When it makes its closest approach, S2 will accelerate to speeds of almost 30 million km per hour, about 2.5% of the speed of light. Another opportunity to view this star reach such high speeds will not come again for another 16 years – in 2034. And having shown just how sensitive the instrument is already, the GRAVITY team expects to be able to make very precise measurements of the star’s position.
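As a sanity check on those figures (pure unit conversion, nothing from the instrument itself): the rounded "30 million km/h" works out to roughly 2.8% of the speed of light, so the quoted 2.5% corresponds to a slightly lower, unrounded pericenter speed:

```python
# Convert the quoted pericenter speed of S2 into a fraction of c.
v_kmh = 30_000_000          # "almost 30 million km per hour" (rounded)
v_ms = v_kmh * 1000 / 3600  # km/h -> m/s
c = 299_792_458             # speed of light, m/s
print(f"{v_ms:.3e} m/s = {v_ms / c:.1%} of c")
```

Either way, it is by far the fastest stellar orbit ever tracked this close to a black hole, which is what makes it such a clean relativity test.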
In fact, they anticipate that the level of accuracy will be comparable to that of measuring the positions of objects on the surface of the Moon, right down to the centimeter scale. As such, they will be able to determine whether the motion of the star as it orbits the black hole is consistent with Einstein’s theory of general relativity.
“[I]t is not the speed itself to cause the general relativistic effects,” explained Eisenhauer, “but the strong gravitation around the black hole. But the very high orbital speed is a direct consequence and measure of the gravitation, so we refer to it in the press release because the comparison with the speed of light and the ISS illustrates so nicely the extreme conditions.”
As recent simulations of the expanding Universe have shown, Einstein’s theories are still holding up after many decades. However, these tests will offer hard evidence, obtained through direct observation. A star traveling at a fraction of the speed of light around a supermassive black hole at the center of our galaxy will certainly prove to be a fitting test.
And Eisenhauer and his colleagues expect to see some very interesting things. “We hope to see a ‘kick’ in the orbit,” he said. “The general relativistic effects increase very strongly when you approach the black hole, and when the star swings by, these effects will slightly change the direction of the orbit.”
While those of us here at Earth will not be able to “star gaze” on this occasion and see S2 whipping past Sagittarius A*, we will still be privy to all the results. And then, we just might see if Einstein really was correct when he proposed what is still the predominant theory of gravitation in physics, over a century later.
On June 30th, 1905, Albert Einstein started a revolution with the publication of his theory of Special Relativity. This theory, among other things, stated that the speed of light in a vacuum is the same for all observers, regardless of the source. In 1915, he followed this up with the publication of his theory of General Relativity, which asserted that gravity has a warping effect on space-time. For over a century, these theories have been an essential tool in astrophysics, explaining the behavior of the Universe on the large scale.
However, since the 1990s, astronomers have been aware of the fact that the Universe is expanding at an accelerated rate. In an effort to explain the mechanics behind this, suggestions have ranged from the possible existence of an invisible energy (i.e. Dark Energy) to the possibility that Einstein’s field equations of General Relativity could be breaking down. But thanks to the recent work of an international research team, it is now known that Einstein had it right all along.