In April of 2019, the Event Horizon Telescope collaboration made history when it released the first image of a black hole ever taken. This accomplishment was decades in the making and triggered an international media circus. The picture was the result of a technique known as interferometry, in which observatories across the world combined light from their telescopes to create a composite image.
This image showed what astrophysicists had long predicted: that extreme gravitational bending causes photons to fall in around the event horizon, contributing to the bright ring that surrounds it. Last week, on March 18th, a team of researchers from the Harvard-Smithsonian Center for Astrophysics (CfA) announced new research showing how black hole images could reveal an intricate substructure within these rings.
The field of exoplanet research continues to grow by leaps and bounds. Thanks to missions like the Kepler Space Telescope, over 4,000 planets have been discovered beyond our Solar System, with more being confirmed all the time. Thanks to these discoveries and all that we’ve learned from them, the focus has begun to transition from the process of discovery to characterization.
For instance, a group of astronomers was able to image the surface of a planet orbiting a red dwarf star for the first time. Using data from the NASA Spitzer Space Telescope, the team was able to provide a rare glimpse at the conditions on the planet’s surface. And while those conditions were rather inhospitable – akin to something like Hades, but with less air to breathe – this represents a major breakthrough in the study of exoplanets.
As of March 1st, 2018, 3,741 exoplanets have been confirmed in 2,794 systems, with 622 systems having more than one planet. Most of the credit for these discoveries goes to the Kepler Space Telescope, which has discovered roughly 3,500 planets and 4,500 planetary candidates. In the wake of all these discoveries, the focus has shifted from pure discovery to research and characterization.
In this respect, planets detected using the Transit Method are especially valuable since they allow for the study of these planets in detail. For example, a team of astronomers recently discovered three Super-Earths orbiting a star known as GJ 9827, which is located just 100 light-years (30 parsecs) from Earth. The proximity of the star, and the fact that it is orbited by multiple Super-Earths, makes this system ideal for detailed exoplanet studies.
As with all Kepler discoveries, these planets were discovered using the Transit Method (aka. Transit Photometry), where stars are monitored for periodic dips in brightness. These dips are the result of exoplanets passing in front of the star (i.e. transiting) relative to the observer. While this method is ideal for placing constraints on the size and orbital period of a planet, it can also allow for exoplanet characterization.
Basically, scientists are able to learn things about their atmospheres by measuring the spectra produced by the star’s light as it passes through the planet’s atmosphere. Combined with radial velocity measurements of the star, scientists can also place constraints on the planet’s mass and radius and can determine things about the planet’s interior structure.
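The size constraint mentioned above follows from simple geometry: the fractional dip in brightness is roughly the square of the planet-to-star radius ratio. Here is a minimal sketch of that relationship, with illustrative numbers that are not taken from the study:

```python
import math

def planet_radius_from_depth(depth, r_star_earth_radii):
    """A transit dims the star by roughly (Rp / Rs)**2, so the planet's
    radius follows from the dip depth and the stellar radius.

    depth              -- fractional dip in brightness (0.001 = 0.1%)
    r_star_earth_radii -- stellar radius expressed in Earth radii
    """
    return r_star_earth_radii * math.sqrt(depth)

# Illustrative numbers (not from the study): a Sun-like star is about
# 109 Earth radii across, so a 0.1% dip implies a ~3.4 Earth-radius planet.
print(round(planet_radius_from_depth(0.001, 109.0), 1))  # → 3.4
```

In practice, the dip depth only yields the planet's size relative to its star, which is why the stellar radius must be pinned down independently before a planet's radius can be quoted in Earth units.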
For the sake of their study, the team analyzed data obtained by the K2 mission, which showed the presence of three Super-Earths around the star GJ 9827 (GJ 9827 b, c, and d). Since they initially submitted their research paper back in September of 2017, the presence of these planets has been confirmed by another team of astronomers. As Dr. Rodriguez told Universe Today via email:
“We detected three super-Earth sized planets orbiting in a very compact configuration. Specifically, the three planets have radii of 1.6, 1.2, and 2.1 times the radius of Earth and all orbit their host star within 6.2 days. We note that this system was independently discovered (simultaneously) by another team from Wesleyan University (Niraula et al. 2017).”
These three exoplanets are especially interesting because the two larger ones have radii that place them in the transitional range between rocky and gaseous planets. Few such exoplanets have been discovered so far, which makes these three prime targets for research. As Dr. Rodriguez explained:
“Super Earth sized planets are the most common type of planet we know of but we do not have one in our own solar system, limiting our ability to understand them. They are especially important because their radii span the rock to gas transition (as I discuss below in one of the other responses). Essentially, planets larger than 1.6 times the radius of the Earth are less dense and have thick hydrogen/helium atmospheres while planets smaller are very dense with little to no atmosphere.”
Another interesting thing about these super-Earths is how their short orbital periods – which are 1.2, 3.6 and 6.2 days, respectively – would result in fairly hot temperatures. In short, the team estimates that the three super-Earths experience surface temperatures of 1172 K (899 °C; 1650 °F), 811 K (538 °C; 1000 °F), and 680 K (407 °C; 764 °F), respectively.
By comparison, Venus – the hottest planet in the Solar System – experiences surface temperatures of 735 K (462 °C; 863 °F). So while temperatures on Venus are hot enough to melt lead, conditions on GJ 9827 b are almost hot enough to melt bronze.
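For readers curious where such temperature estimates come from, a rough equilibrium-temperature calculation can be sketched as follows. The stellar and orbital parameters below are assumed, illustrative values for a K-dwarf broadly resembling GJ 9827, not the team's actual inputs:

```python
import math

def equilibrium_temp(t_star_k, r_star_m, a_m, albedo=0.0):
    """Equilibrium temperature of a planet (no atmosphere, full heat
    redistribution): T_eq = T_star * sqrt(R_star / (2 a)) * (1 - A)**0.25
    """
    return t_star_k * math.sqrt(r_star_m / (2.0 * a_m)) * (1.0 - albedo) ** 0.25

R_SUN = 6.957e8   # solar radius in meters
AU = 1.496e11     # astronomical unit in meters

# Hypothetical parameters: a ~4250 K star of 0.65 solar radii, with a
# planet orbiting at 0.019 AU.
t = equilibrium_temp(4250.0, 0.65 * R_SUN, 0.019 * AU)
```

With these assumed inputs the formula lands near ~1200 K, the same ballpark as the 1172 K quoted for the innermost planet; the published figures come from detailed modeling of the measured stellar parameters.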
However, the most significant thing about this discovery is the opportunities it could provide for exoplanet characterization. At just 100 light-years from Earth, it will be relatively easy for the next-generation telescopes (such as the James Webb Space Telescope) to conduct studies of their atmospheres and provide a more detailed picture of this system of planets.
In addition, these three strange planets are all in the same system, which makes conducting observation campaigns that much easier. As Rodriguez concluded:
“The GJ 9827 system is unique because one planet is smaller than this cutoff, one planet is larger, and the third planet has a radius of ~1.6 times the radius of the Earth, right on that border. So in one system, we have planets that span this rock to gas transition. This is important because we can study the atmospheres of these planets, look for differences in the composition of their atmospheres and begin to understand why this transition occurs at 1.6 times the radius of the Earth. Since all three planets orbit the same star, the effect of the host star is kept constant in this “experiment”. Therefore, if these three planets in GJ 9827 were instead orbiting three separate stars, we would have to worry about how the host star is influencing or affecting the planets’ atmospheres. In the GJ 9827 system, we do not have to worry about this since they orbit the same star.”
On October 19th, 2017, the Panoramic Survey Telescope and Rapid Response System-1 (Pan-STARRS-1) in Hawaii announced the first-ever detection of an interstellar asteroid, named 1I/2017 U1 (aka. ‘Oumuamua). Originally thought to be a comet, this interstellar visitor quickly became the focus of follow-up studies that sought to determine its origin, structure, composition, and rule out the possibility that it was an alien spacecraft!
While ‘Oumuamua is the first known example of an interstellar asteroid reaching our Solar System, scientists have long suspected that such visitors are a regular occurrence. Aiming to determine just how common such visitors are, a team of researchers from Harvard University conducted a study to measure the capture rate of interstellar asteroids and comets, and what role they may play in the spread of life throughout the Universe.
For the sake of their study, Lingam and Loeb constructed a three-body gravitational model, where the physics of three bodies are used to compute their respective trajectories and interactions with one another. In Lingam and Loeb’s model, Jupiter and the Sun served as the two massive bodies while a far less massive interstellar object served as the third. As Dr. Loeb explained to Universe Today via email:
“The combined gravity of the Sun and Jupiter acts as a ‘fishing net’. We suggest a new approach to searching for life, which is to examine the interstellar objects captured by this fishing net instead of the traditional approach of looking through telescopes or traveling with spacecraft to distant environments to do the same.”
Using this model, the pair then began calculating the rate at which objects comparable in size to ‘Oumuamua would be captured by the Solar System, and how often such objects would collide with the Earth over the course of its entire history. They also considered the Alpha Centauri system as a separate case for the sake of comparison. In this binary system, Alpha Centauri A and B serve as the two massive bodies and an interstellar asteroid as the third.
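A minimal version of such a three-body setup can be sketched numerically. The sketch below is not the authors' actual code: it keeps the Sun fixed at the origin (a simplifying assumption), puts Jupiter on a prescribed circular orbit, and integrates a massless test body with a leapfrog scheme. A body counts as "captured" once its orbital energy with respect to the Sun turns negative. All numbers are illustrative.

```python
import math

GM_SUN = 4.0 * math.pi ** 2           # AU^3 / yr^2 (so a 1 AU orbit takes 1 yr)
GM_JUP = GM_SUN / 1047.0              # Jupiter-to-Sun mass ratio ~ 1/1047
A_JUP = 5.2                           # Jupiter's orbital radius in AU
OMEGA = math.sqrt(GM_SUN / A_JUP**3)  # Jupiter's angular rate, rad/yr

def accel(x, y, t):
    """Acceleration on a massless test body from the Sun (fixed at the
    origin) and Jupiter (on a prescribed circular orbit)."""
    jx, jy = A_JUP * math.cos(OMEGA * t), A_JUP * math.sin(OMEGA * t)
    r3 = (x * x + y * y) ** 1.5
    dx, dy = x - jx, y - jy
    d3 = (dx * dx + dy * dy) ** 1.5
    return (-GM_SUN * x / r3 - GM_JUP * dx / d3,
            -GM_SUN * y / r3 - GM_JUP * dy / d3)

def is_bound(x, y, vx, vy):
    """Capture test: negative specific orbital energy w.r.t. the Sun."""
    return 0.5 * (vx * vx + vy * vy) - GM_SUN / math.hypot(x, y) < 0.0

def integrate(x, y, vx, vy, dt=0.001, years=50.0):
    """Leapfrog (kick-drift-kick) integration of the test body."""
    t = 0.0
    ax, ay = accel(x, y, t)
    for _ in range(int(years / dt)):
        vx += 0.5 * dt * ax
        vy += 0.5 * dt * ay
        x += dt * vx
        y += dt * vy
        t += dt
        ax, ay = accel(x, y, t)
        vx += 0.5 * dt * ax
        vy += 0.5 * dt * ay
    return x, y, vx, vy

# An object falling in from interstellar space on an unbound trajectory
# (hypothetical values):
state = (-60.0, 1.5, 1.5, 0.0)        # x, y in AU; vx, vy in AU/yr
print(is_bound(*state))               # → False: it arrives unbound
```

A capture survey would integrate many such bodies with randomized approach geometries and count how often a Jupiter encounter flips `is_bound` to true; the published model treats this statistically rather than by brute-force integration, but the mechanics are the same.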
As Dr. Lingam indicated:
“The frequency of these objects is determined from the number density of such objects, which has been recently updated based on the discovery of ‘Oumuamua. The size distribution of these objects is unknown (and serves as a free parameter in our model), but for the sake of obtaining quantitative results, we assumed that it was similar to that of comets within our Solar System.”
In the end, they determined that a few thousand captured objects might be found within the Solar System at any time – the largest of which would be tens of kilometers in radius. For the Alpha Centauri system, the results were even more interesting. Based on the likely rate of capture, and the maximum size of a captured object, they determined that even Earth-sized objects could have been captured over the course of the system’s history.
In other words, Alpha Centauri may have picked up some rogue planets over time, which would have had a drastic impact on the evolution of the system. In this vein, the authors also explored how objects like ‘Oumuamua could have played a role in the distribution of life throughout the Universe via rocky bodies. This is a variation on the theory of lithopanspermia, where microbial life is shared between planets thanks to asteroids, comets and meteors.
In this scenario, interstellar asteroids, which originate in distant star systems, would be the carriers of microbial life from one system to another. If such asteroids collided with Earth in the past, they could be responsible for seeding our planet and leading to the emergence of life as we know it. As Lingam explained:
“These interstellar objects could either crash directly into a planet and thus seed it with life, or be captured into the planetary system and undergo further collisions within that system to yield interplanetary panspermia (the second scenario is more likely when the captured object is large, for e.g. a fraction of the Earth’s radius).”
In addition, Lingam and Loeb offered suggestions on how future visitors to our Solar System could be studied. As Lingam summarized, the key would be to look for specific kinds of spectra from objects in our Solar System:
“It may be possible to look for interstellar objects (captured/unbound) in our Solar system by looking at their trajectories in detail. Alternatively, since many objects within the Solar system have similar ratios of oxygen isotopes, finding objects with very different isotopic ratios could indicate their interstellar origin. The isotope ratios can be determined through high-resolution spectroscopy if and when interstellar comets approach close to the Sun.”
“The simplest way to single out the objects that originated outside the Solar System is to examine the abundance ratio of oxygen isotopes in the water vapor that makes up their cometary tails,” added Loeb. “This can be done through high resolution spectroscopy. After identifying a trapped interstellar object, we could launch a probe that will search on its surface for signatures of primitive life or artifacts of a technological civilization.”
It would be no exaggeration to say that the discovery of ‘Oumuamua has set off something of a revolution in astronomy. In addition to validating something astronomers have long suspected, it has also provided new opportunities for research and the testing of scientific theories (such as lithopanspermia).
In the future, with any luck, robotic missions will be dispatched to these bodies to conduct direct studies and maybe even sample-return missions. What these reveal about our Universe, and maybe even the spread of life throughout it, is sure to be very illuminating!
The core of the Milky Way Galaxy has always been a source of mystery and fascination to astronomers. This is due in part to the fact that our Solar System is embedded within the disk of the Milky Way – the flattened region that extends outwards from the core. This has made seeing into the bulge at the center of our galaxy rather difficult. Nevertheless, what we’ve been able to learn over the years has proven to be immensely interesting.
For instance, in the 1970s, astronomers became aware of the Supermassive Black Hole (SMBH) at the center of our galaxy, known as Sagittarius A* (Sgr A*). In 2016, astronomers also noticed a curved filament that appeared to be extending from Sgr A*. Using a pioneering technique, a team of astronomers from the Harvard-Smithsonian Center for Astrophysics (CfA) recently produced the highest-quality images of this structure to date.
As Mark Morris – a professor of astronomy at UCLA and the lead author of the study – explained in a CfA press release:
“With our improved image, we can now follow this filament much closer to the Galaxy’s central black hole, and it is now close enough to indicate to us that it must originate there. However, we still have more work to do to find out what the true nature of this filament is.”
After examining the filament, the research team came up with three possible explanations for its existence. The first is that the filament is the result of inflowing gas, which would produce a rotating, vertical tower of magnetic field as it approaches and threads Sgr A*’s event horizon. Within this tower, particles would produce radio emissions as they are accelerated and spiral in around magnetic field lines extending from the black hole.
The second possibility is that the filament is a theoretical object known as a cosmic string. These are long, extremely thin cosmic structures that carry mass and electric current, and are hypothesized to migrate from the centers of galaxies. In this case, the string could have been captured by Sgr A* once it came too close and a portion crossed its event horizon.
The third and final possibility is that there is no real association between the filament and Sgr A*, and that its position and direction are merely coincidental. This would imply that there are many such filaments in the Universe and this one just happened to be found near the center of our galaxy. However, the team is confident that such a coincidence is highly unlikely.
As Jun-Hui Zhao of the Harvard-Smithsonian Center for Astrophysics in Cambridge, and a co-author on the paper, said:
“Part of the thrill of science is stumbling across a mystery that is not easy to solve. While we don’t have the answer yet, the path to finding it is fascinating. This result is motivating astronomers to build next generation radio telescopes with cutting edge technology.”
All of these scenarios are currently being investigated, and each poses its own share of implications. If the first possibility is true – in which the filament is caused by particles being ejected by Sgr A* – then astronomers would be able to glean vital information about how magnetic fields operate in such an environment. In short, it could show that near an SMBH, magnetic fields are orderly rather than chaotic.
This could be proven by examining particles farther away from Sgr A* to see if they are less energetic than those that are closer to it. The second possibility, the cosmic string theory, could be tested by conducting follow-up observations with the VLA to determine if the position of the filament is shifting and its particles are moving at a fraction of the speed of light.
If the latter should prove to be the case, it would constitute the first evidence that theoretical cosmic strings actually exist. It would also allow astronomers to conduct further tests of General Relativity, examining how gravity works under such conditions and how space-time is affected. The team also noted that, even if the filament is not physically connected to Sgr A*, the bend in the filament is still rather telling.
In short, the bend appears to coincide with a shock wave, the kind that would be caused by an exploding star. This could mean that one of the massive stars that surround Sgr A* exploded in proximity to the filament in the past, producing the shock wave that altered the course of the inflowing gas and its magnetic field. All of these mysteries will be the subject of follow-up surveys conducted with the VLA.
As Miller Goss from the National Radio Astronomy Observatory in New Mexico, a co-author on the study, said, “We will keep hunting until we have a solid explanation for this object. And we are aiming to next produce even better, more revealing images.”
In the hunt for extra-terrestrial life, scientists tend to take what is known as the “low-hanging fruit” approach. This consists of looking for conditions similar to what we experience here on Earth, which include oxygen, organic molecules, and plenty of liquid water. Interestingly enough, some of the places where these ingredients are present in abundance include the interiors of icy moons like Europa, Ganymede, Enceladus and Titan.
Whereas there is only one terrestrial planet in our Solar System that is capable of supporting life (Earth), there are multiple “Ocean Worlds” like these moons. Taking this a step further, a team of researchers from the Harvard-Smithsonian Center for Astrophysics (CfA) conducted a study showing that potentially-habitable icy moons with interior oceans may be far more common in the Universe than habitable terrestrial planets.
To begin, Lingam and Loeb address the tendency to confuse habitable zones (HZs) with habitability, or to treat the two concepts as interchangeable. For instance, planets that are located within an HZ are not necessarily capable of supporting life – in this respect, Mars and Venus are perfect examples. Whereas Mars is too cold and its atmosphere too thin to support life, Venus suffered a runaway greenhouse effect that caused it to become a hot, hellish place.
On the other hand, bodies that are located beyond HZs have been found to be capable of having liquid water and the necessary ingredients to give rise to life. In this case, moons like Europa, Ganymede, Enceladus, Dione, and Titan serve as perfect examples. Thanks to the prevalence of water and internal heating caused by tidal forces, these moons all have interior oceans that could very well support life.
As Lingam, a post-doctoral researcher at the ITC and CfA and the lead author on the study, told Universe Today via email:
“The conventional notion of planetary habitability is the habitable zone (HZ), namely the concept that the “planet” must be situated at the right distance from the star such that it may be capable of having liquid water on its surface. However, this definition assumes that life is: (a) surface-based, (b) on a planet orbiting a star, and (c) based on liquid water (as the solvent) and carbon compounds. In contrast, our work relaxes assumptions (a) and (b), although we still retain (c).”
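The distance scaling implicit in the HZ definition is simple: the flux a planet receives falls off as the square of distance, so HZ boundaries scale with the square root of stellar luminosity. A quick sketch follows; the solar-edge values of 0.95 and 1.67 AU are commonly cited estimates (exact boundaries depend on the climate model), and the red-dwarf luminosity is hypothetical:

```python
import math

def habitable_zone_au(l_star_over_lsun, inner_sun=0.95, outer_sun=1.67):
    """Scale the Sun's HZ edges by stellar luminosity: d ∝ sqrt(L).
    The solar-edge defaults are commonly cited estimates; exact
    boundaries depend on the climate model used."""
    s = math.sqrt(l_star_over_lsun)
    return inner_sun * s, outer_sun * s

# A red dwarf with 2% of the Sun's luminosity has a HZ that hugs the star:
inner, outer = habitable_zone_au(0.02)
print(round(inner, 3), round(outer, 3))  # → 0.134 0.236
```

This is precisely the definition Lingam and Loeb relax: a tidally heated moon's habitability does not depend on where it sits relative to these boundaries.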
As such, Lingam and Loeb widen their consideration of habitability to include worlds that could have subsurface biospheres. Such environments go beyond icy moons such as Europa and Enceladus and could include many other types of deep subterranean environments. On top of that, it has also been speculated that life could exist in Titan’s methane lakes (i.e. methanogenic organisms). However, Lingam and Loeb chose to focus on icy moons instead.
“Even though we consider life in subsurface oceans under ice/rock envelopes, life could also exist in hydrated rocks (i.e. with water) beneath the surface; the latter is sometimes referred to as subterranean life,” said Lingam. “We did not delve into the second possibility since many of the conclusions (but not all of them) for subsurface oceans are also applicable to these worlds. Similarly, as noted above, we do not consider lifeforms based on exotic chemistries and solvents, since it is not easy to predict their properties.”
Ultimately, Lingam and Loeb chose to focus on worlds that would orbit stars and likely contain subsurface life humanity would be capable of recognizing. They then went about assessing the likelihood that such bodies are habitable, what advantages and challenges life will have to deal with in these environments, and the likelihood of such worlds existing beyond our Solar System (compared to potentially-habitable terrestrial planets).
For starters, “Ocean Worlds” have several advantages when it comes to supporting life. Within the Jovian system (Jupiter and its moons), radiation is a major problem, which is the result of charged particles becoming trapped in the gas giant’s powerful magnetic field. Between that and the moons’ tenuous atmospheres, life would have a very hard time surviving on the surface, but life dwelling beneath the ice would fare far better.
“One major advantage that icy worlds have is that the subsurface oceans are mostly sealed off from the surface,” said Lingam. “Hence, UV radiation and cosmic rays (energetic particles), which are typically detrimental to surface-based life in high doses, are unlikely to affect putative life in these subsurface oceans.”
“On the negative side,” he continued, “the absence of sunlight as a plentiful energy source could lead to a biosphere that has far less organisms (per unit volume) than Earth. In addition, most organisms in these biospheres are likely to be microbial, and the probability of complex life evolving may be low compared to Earth. Another issue is the potential availability of nutrients (e.g. phosphorus) necessary for life; we suggest that these nutrients might be available only in lower concentrations than Earth on these worlds.”
In the end, Lingam and Loeb determined that worlds with ice shells of moderate thickness may exist in a wide range of habitats throughout the cosmos. Based on how statistically likely such worlds are, they concluded that “Ocean Worlds” like Europa, Enceladus, and others like them are about 1000 times more common than rocky planets that exist within the HZs of stars.
These findings have some drastic implications for the search for extra-terrestrial and extra-solar life. They also have significant implications for how life may be distributed throughout the Universe. As Lingam summarized:
“We conclude that life on these worlds will undoubtedly face noteworthy challenges. However, on the other hand, there is no definitive factor that prevents life (especially microbial life) from evolving on these planets and moons. In terms of panspermia, we considered the possibility that a free-floating planet containing subsurface exolife could be temporarily “captured” by a star, and that it may perhaps seed other planets (orbiting that star) with life. As there are many variables involved, not all of them can be quantified accurately.”
Professor Loeb – the Frank B. Baird Jr. Professor of Science at Harvard University, the director of the ITC, and the study’s co-author – added that finding examples of this life presents its own share of challenges. As he told Universe Today via email:
“It is very difficult to detect sub-surface life remotely (from a large distance) using telescopes. One could search for excess heat but that can result from natural sources, such as volcanos. The most reliable way to find sub-surface life is to land on such a planet or moon and drill through the surface ice sheet. This is the approach contemplated for a future NASA mission to Europa in the solar system.”
Exploring the implications for panspermia further, Lingam and Loeb also considered what might happen if a planet like Earth were ever ejected from the Solar System. As they note in their study, previous research has indicated how planets with thick atmospheres or subsurface oceans could still support life while floating in interstellar space. As Loeb explained, they also considered what would happen if Earth suffered this fate someday:
“An interesting question is what would happen to the Earth if it was ejected from the solar system into cold space without being warmed by the Sun. We have found that the oceans would freeze down to a depth of 4.4 kilometers but pockets of liquid water would survive in the deepest regions of the Earth’s ocean, such as the Mariana Trench, and life could survive in these remaining sub-surface lakes. This implies that sub-surface life could be transferred between planetary systems.”
This study also serves as a reminder that as humanity explores more of the Solar System (largely for the sake of finding extra-terrestrial life), what we find also has implications in the hunt for life in the rest of the Universe. This is one of the benefits of the “low-hanging fruit” approach. What we don’t know is informed by what we do, and what we find helps inform our expectations of what else we might find.
And of course, it’s a very vast Universe out there. What we may find is likely to go far beyond what we are currently capable of recognizing!
Since the 18th century, astronomers have been aware that our Solar System is embedded in a vast disk of stars and gas known as the Milky Way Galaxy. Since that time, the greatest scientific minds have been attempting to obtain accurate distance measurements in order to determine just how large the Milky Way is. This has been no easy task, since the fact that we are embedded in our galaxy’s disk means that we cannot view it head-on.
To do this, the team relied on a technique first applied by Friedrich Wilhelm Bessel in 1838 to measure the distance to the star 61 Cygni. Known as trigonometric parallax, this technique involves viewing an object from opposite sides of the Earth’s orbit around the Sun, and then measuring the angle of the object’s apparent shift in position. In this way, astronomers are able to use simple trigonometry to calculate the distance to that object.
In short, the smaller the measured angle, the greater the distance to the object. These measurements were performed using data from the Bar and Spiral Structure Legacy (BeSSeL) Survey, which was named in honor of Friedrich Wilhelm Bessel. But whereas Bessel and his contemporaries were forced to measure parallax using basic instruments, the VLBA has ten dish antennas distributed across North America, Hawaii, and the Caribbean.
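The arithmetic behind trigonometric parallax is refreshingly simple: a parallax angle of one arcsecond corresponds to a distance of one parsec, so the distance in parsecs is just the reciprocal of the angle in arcseconds. A short sketch; the 0.049-milliarcsecond figure is an illustrative value consistent with a distance of roughly 66,000 light-years:

```python
LY_PER_PARSEC = 3.2616

def distance_from_parallax(parallax_mas):
    """Distance in parsecs from a parallax angle in milliarcseconds:
    d [pc] = 1000 / p [mas]."""
    return 1000.0 / parallax_mas

# A parallax of ~0.049 mas (49 microarcseconds) -- an angle far below
# anything Bessel-era optical instruments could resolve:
d_pc = distance_from_parallax(0.049)
print(round(d_pc * LY_PER_PARSEC))  # → 66563, i.e. roughly 66,000 ly
```

The reciprocal relationship is also why errors explode at large distances: halving the measurable angle doubles the distance you can reach, which is what the VLBA's thousand-fold accuracy gain buys.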
With such an array at its disposal, the VLBA is capable of measuring parallaxes with one thousand times the accuracy of those performed by astronomers in Bessel’s time. And rather than being confined to nearby star systems, the VLBA is capable of measuring the minuscule angles associated with vast distances across the galaxy. As Sanna explained in a recent MPIfR press release:
“Using the VLBA, we now can accurately map the whole extent of our Galaxy. Most of the stars and gas in our Galaxy are within this newly-measured distance from the Sun. With the VLBA, we now have the capability to measure enough distances to accurately trace the Galaxy’s spiral arms and learn their true shapes.”
The VLBA observations, which were conducted in 2014 and 2015, measured the distance to the star-forming region known as G007.47+00.05. Like all star-forming regions, this one contains molecules of water and methanol, which act as natural amplifiers of radio signals. This results in masers (the radio-wave equivalent of lasers), an effect that makes the radio signals appear bright and readily observable with radio telescopes.
This particular region is located over 66,000 light-years from Earth, on the opposite side of the Milky Way relative to our Solar System. The previous record for a parallax measurement was about 36,000 light-years, roughly 11,000 light-years farther than the distance between our Solar System and the center of our galaxy. As Sanna explained, this accomplishment in radio astronomy will enable surveys that reach much farther than previous ones have.
Hundreds of star-forming regions exist within the Milky Way. But as Karl Menten – a member of the MPIfR and a co-author on the study – explained, this study was significant because of where this one is located. “So we have plenty of ‘mileposts’ to use for our mapping project,” he said. “But this one is special: Looking all the way through the Milky Way, past its center, way out into the other side.”
In the coming years, Sanna and his colleagues hope to conduct additional observations of G007.47+00.05 and other distant star-forming regions of the Milky Way. Ultimately, the goal is to gain a complete understanding of our galaxy, one that is so accurate that scientists will be able to finally place precise constraints on its size, mass, and its total number of stars.
With the necessary tools now in hand, Sanna and his team estimate that a complete picture of the Milky Way could be available in about ten years’ time. Imagine that! Future generations will be able to study the Milky Way with the same ease as a galaxy located nearby, which they can view from the outside. At long last, all those artist’s impressions of our Milky Way will be to scale!
In today’s modern, fast-paced world, human activity is very much reliant on electrical infrastructure. If the power grids go down, our climate control systems will shut off, our computers will die, and all electronic forms of commerce and communication will cease. But in addition to that, human activity in the 21st century is also becoming increasingly dependent upon the infrastructure located in Low Earth Orbit (LEO).
Aside from the many telecommunications satellites that are currently in space, there’s also the International Space Station and a fleet of GPS satellites. It is for this reason that solar flare activity is considered a serious hazard, and mitigation of it a priority. Looking to address that, a team of scientists from Harvard University recently released a study that proposes a bold solution – placing a giant magnetic shield in orbit.
The study – which was the work of Doctor Manasavi Lingam and Professor Abraham Loeb from the Harvard-Smithsonian Center for Astrophysics (CfA) – recently appeared online under the title “Impact and Mitigation Strategy for Future Solar Flares”. As they explain, solar flares pose a particularly grave risk in today’s world, and will become an even greater threat due to humanity’s growing presence in LEO.
Solar flares have been an ongoing concern for over 150 years, ever since the famous Carrington Event of 1859. Since that time, a great deal of effort has been dedicated to the study of solar flares from both a theoretical and observational standpoint. And thanks to the advances made since then in astronomy and space exploration, much has been learned about the phenomenon known as “space weather”.
At the same time, humanity’s increased reliance on electricity and space-based infrastructure has also made us more vulnerable to extreme space weather events. In fact, if the Carrington Event were to take place today, it is estimated that it would cause global damage to electric power grids, satellite communications, and global supply chains.
The cumulative worldwide economic losses, according to a 2009 report by the Space Studies Board (“Severe Space Weather Events–Understanding Societal and Economic Impacts”), would be $10 trillion, and recovery would take several years. And yet, as Professor Loeb explained to Universe Today via email, this threat from space has received far less attention than other possible threats.
“In terms of risk from the sky, most of the attention in the past was dedicated to asteroids,” said Loeb. “They killed the dinosaurs and their physical impact in the past was the same as it will be in the future, unless their orbits are deflected. However, solar flares have little biological impact and their main impact is on technology. But a century ago, there was not much technological infrastructure around, and technology is growing exponentially. Therefore, the damage is highly asymmetric between the past and future.”
To address this, Lingam and Loeb developed a simple mathematical model to assess the economic losses caused by solar flare activity over time. This model considered the increasing risk of damage to technological infrastructure based on two factors. For one, they considered the fact that the maximum energy of solar flares increases with time, then coupled this with the exponential growth of technology and GDP.
What they determined was that on longer time scales, the rare types of solar flares that are very powerful become much more likely. Coupled with humanity’s growing presence and dependence on spacecraft and satellites in LEO, this will add up to a dangerous conjunction somewhere down the road. Or as Loeb explained:
“We predict that within ~150 years, there will be an event that causes damage comparable to the current US GDP of ~20 trillion dollars, and the damage will increase exponentially at later times until technological development will saturate. Such a forecast was never attempted before. We also suggest a novel idea for how to reduce the damage from energetic particles by a magnetic shield. This was my idea and was not proposed before.”
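Loeb’s ~150-year forecast can be loosely illustrated with a toy exponential-growth model. The base-loss and growth-rate numbers below are illustrative assumptions, not the paper’s fitted values; the point is only that exponentially growing infrastructure value pushes expected damage to GDP scale on roughly this timescale.

```python
import math

def expected_loss(years_from_now, base_loss=0.1, growth_rate=0.03):
    """Toy model: worst-case flare damage (trillions of USD) tracks the
    exponentially growing value of technological infrastructure.
    base_loss and growth_rate are illustrative assumptions."""
    return base_loss * math.exp(growth_rate * years_from_now)

# Years until expected damage reaches ~$20 trillion (current US GDP):
years = math.log(20 / 0.1) / 0.03
print(round(years))  # ~177 with these assumed numbers, close to the paper's ~150
```

With different (equally plausible) starting losses or growth rates the crossover shifts by a few decades, which is why the paper’s figure is quoted with a tilde.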
To address this growing risk, Lingam and Loeb also considered the possibility of placing a magnetic shield between Earth and the Sun. This shield would be placed at the Earth-Sun Lagrange Point 1, where it would be able to deflect charged particles and create an artificial bow shock around Earth. In this sense, the shield would protect Earth in a way similar to what its magnetic field already does, but to greater effect.
Based on their assessment, Lingam and Loeb indicate that such a shield is technically feasible in terms of its basic physical parameters. They were also able to provide a rudimentary timeline for the construction of this shield, not to mention some rough cost assessments. As Loeb indicated, such a shield could be built before this century is over, and at a fraction of the cost of what would be incurred from solar flare damage.
“The engineering project associated with the magnetic shield that we propose could take a few decades to construct in space,” he said. “The cost for lifting the needed infrastructure to space (weighting 100,000 tons) will likely be of order 100 billions of dollars, much less than the expected damage over a century.”
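Loeb’s cost figure is consistent with simple launch-cost arithmetic, assuming a round-number launch price of roughly $1 million per ton (our assumption for illustration, not a figure from the paper):

```python
mass_tons = 100_000              # infrastructure mass cited by Loeb
cost_per_ton_usd = 1_000_000     # assumed launch cost per ton to orbit
total_usd = mass_tons * cost_per_ton_usd

print(total_usd / 1e9)  # 100.0 (billions of dollars)
```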
Interestingly enough, the idea of using a magnetic shield to protect planets has been proposed before. For example, this type of shield was also the subject of a presentation at this year’s “Planetary Science Vision 2050 Workshop”, which was hosted by NASA’s Planetary Science Division (PSD). This shield was recommended as a means of enhancing Mars’ atmosphere and facilitating crewed missions to its surface in the future.
During the course of the presentation, titled “A Future Mars Environment for Science and Exploration”, Jim Green – the Director of NASA’s Planetary Science Division – discussed how a magnetic shield could protect Mars’ tenuous atmosphere from solar wind. This would allow it to replenish over time, which would have the added benefit of warming Mars up and allowing liquid water to again flow on its surface. If this sounds similar to proposals for terraforming Mars, that’s because it is!
Beyond Earth and the Solar System, the implications of this study are profound. In recent years, many terrestrial planets have been found orbiting within nearby M-type (aka. red dwarf) star systems. Because these planets orbit so close to their respective suns, and because M-type stars are variable and unstable, scientists have expressed doubts about whether or not these planets could actually be habitable.
In short, scientists have ventured that over the course of billions of years, rocky planets that orbit close to their suns, are tidally-locked with them, and are subject to regular solar flares would lose their atmospheres. In this respect, magnetic shields could be a possible solution to creating extra-solar colonies. Place a large shield in orbit at the L1 Lagrange point, and you never have to worry again about powerful magnetic storms ravaging the planet!
On top of that, this study offers a possible resolution to the Fermi Paradox. When looking for signs of Extra-Terrestrial Intelligence (ETI), it might make sense to monitor distant stars for signs of an orbiting magnetic shield. As Prof. Loeb explained, such structures may have already been detected around distant stars, and could explain some of the unusual observations astronomers have made:
“The imprint of a shield built by another civilization could involve the changes it induces in the brightness of the host star due to occultation (similar behavior to Tabby’s star) if the structure is big enough. The situation could be similar to Dyson’s spheres, but instead of harvesting the energy of the star the purpose of the infrastructure is to protect a technological civilization on a planet from the flares of its host star.”
It is a foregone conclusion that as time and technology progress, humanity’s presence in (and reliance on) space will increase. As such, preparing for the most drastic space weather events the Solar System can throw at us just makes sense. And when it comes to the big questions like “are we alone in the Universe?”, it also makes sense to take our boldest concepts and proposals and consider how they might point the way towards extra-terrestrial intelligence.
For centuries, astronomers have been looking beyond our Solar System to learn more about the Milky Way Galaxy. And yet, there are still many things about it that elude us, such as knowing its precise mass. Determining this is important to understanding the history of galaxy formation and the evolution of our Universe. As such, astronomers have attempted various techniques for measuring the true mass of the Milky Way.
So far, none of these methods have been particularly successful. However, a new study by a team of researchers from the Harvard-Smithsonian Center for Astrophysics proposed a new and interesting way to determine how much mass is in the Milky Way. By using hypervelocity stars (HVSs) that have been ejected from the center of the galaxy as a reference point, they claim that we can constrain the mass of our galaxy.
To be clear, determining the mass of the Milky Way Galaxy is no simple task. On the one hand, observations are difficult because the Solar System lies deep within the disk of the galaxy itself. But at the same time, there’s also the mass of our galaxy’s dark matter halo, which is difficult to measure since it is not “luminous”, and therefore invisible to conventional methods of detection.
Current estimates of the galaxy’s total mass are based on the motions of tidal streamers of gas and globular clusters, which are both influenced by the gravitational mass of the galaxy. But so far, these measurements have produced mass estimates that range from one to several trillion solar-masses. As Professor Loeb explained to Universe Today via email, precisely measuring the mass of the Milky Way is of great importance to astronomers:
“The Milky Way provides a laboratory for testing the standard cosmological model. This model predicts that the number of satellite galaxies of the Milky Way depends sensitively on its mass. When comparing the predictions to the census of known satellite galaxies, it is essential to know the Milky Way mass. Moreover, the total mass calibrates the amount of invisible (dark) matter and sets the depth of the gravitational potential well and implies how fast should stars move for them to escape to intergalactic space.”
For the sake of their study, Prof. Loeb and Dr. Fragione therefore chose to take a novel approach, which involved modeling the motions of HVSs to determine the mass of our galaxy. More than 20 HVSs have been discovered within our galaxy so far, which travel at speeds of up to 700 km/s (435 mi/s) and are located at distances of about 100 to 50,000 light-years from the galactic center.
These stars are thought to have been ejected from the center of our galaxy thanks to the interactions of binary stars with the supermassive black hole (SMBH) at the center of our galaxy – aka. Sagittarius A*. While their exact cause is still the subject of debate, the orbits of HVSs can be calculated since they are completely determined by the gravitational field of the galaxy.
As they explain in their study, the researchers used the asymmetry in the radial velocity distribution of stars in the galactic halo to determine the galaxy’s gravitational potential. The velocity of these halo stars depends on the escape speed set by that potential, provided that the time it takes for the HVSs to complete a single orbit is shorter than the lifetime of the halo stars.
From this, they were able to discriminate between different models for the Milky Way and the gravitational force it exerts. By adopting the nominal travel time of these observed HVSs – which they calculated to about 330 million years, about the same as the average lifetime of halo stars – they were able to derive gravitational estimates for the Milky Way which allowed for estimates on its overall mass.
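The link between escape speed and mass can be illustrated with a deliberately oversimplified point-mass sketch. The escape-speed and radius values below are assumed round numbers, and a point-mass potential only reflects mass interior to the chosen radius; the study itself models the galaxy’s extended dark matter halo, which is why its totals come out much larger than this back-of-the-envelope figure.

```python
# Point-mass relation: v_esc = sqrt(2*G*M / r)  =>  M = v_esc^2 * r / (2*G)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m

v_esc = 550e3        # m/s, assumed local escape speed near the Sun
r = 8 * KPC          # assumed galactocentric radius of the Sun

mass_solar = v_esc**2 * r / (2 * G) / M_SUN
print(f"{mass_solar:.1e}")  # ~2.8e+11 solar masses with these assumed inputs
```

Going from this enclosed-mass scale to the full galaxy requires modeling how the halo’s mass is distributed out to hundreds of kiloparsecs, which is exactly what the HVS travel-time calibration constrains.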
“By calibrating the minimum speed of unbound stars, we find that the Milky Way mass is in the range of 1.2-1.9 trillion solar masses,” said Loeb. While still subject to a range, this latest estimate is a significant improvement over previous estimates. What’s more, these estimates are consistent with our current cosmological model that attempts to account for all visible matter in the Universe, as well as dark matter and dark energy – the Lambda-CDM model.
“The inferred Milky Way mass is in the range expected within the standard cosmological model,” said Loeb, “where the amount of dark matter is about five times larger than that of ordinary (luminous) matter.”
Based on this breakdown, it can be said that normal matter in our galaxy – i.e. stars, planets, dust and gas – accounts for between 240 and 380 billion Solar Masses. So not only does this latest study provide more precise mass constraints for our galaxy, it could also help us determine exactly how many star systems are out there – current estimates say that the Milky Way has between 200 and 400 billion stars and 100 billion planets.
Beyond that, this study is also significant to the study of cosmic formation and evolution. By placing more precise estimates on our galaxy’s mass, ones which are consistent with the current breakdown of normal matter and dark matter, cosmologists will be able to construct more accurate accounts of how our Universe came to be. One step closer to understanding the Universe on the grandest of scales!
When astronomers first noted the detection of a Fast Radio Burst (FRB) in 2007 (aka. the Lorimer Burst), they were both astounded and intrigued. This high-energy burst of radio pulses, which lasted only a few milliseconds, appeared to be coming from outside of our galaxy. Since that time, astronomers have found evidence of many FRBs in previously-recorded data, and are still speculating as to what causes them.
Thanks to subsequent discoveries and research, astronomers now know that FRBs are far more common than previously thought. In fact, according to a new study by a team of researchers from the Harvard-Smithsonian Center for Astrophysics (CfA), FRBs may occur once every second within the observable Universe. If true, FRBs could be a powerful tool for researching the origins and evolution of the cosmos.
As noted, FRBs have remained something of a mystery since they were first discovered. Not only do their causes remain unknown, but much about their true nature is still not understood. As Dr. Fialkov told Universe Today via email:
“FRBs (or fast radio bursts) are astrophysical signals of an undetermined nature. The observed bursts are short (or millisecond duration), bright pulses in the radio part of the electromagnetic spectrum (at GHz frequencies). Only 24 bursts have been observed so far and we still do not know for sure which physical processes trigger them. The most plausible explanation is that they are launched by rotating magnetized neutron stars. However, this theory is to be confirmed.”
For the sake of their study, Fialkov and Loeb relied on observations made by multiple telescopes of the repeating fast radio burst known as FRB 121102. This FRB was first observed in 2012 by researchers using the Arecibo radio telescope in Puerto Rico, and has since been confirmed to be coming from a galaxy located 3 billion light years away in the direction of the Auriga constellation.
Since it was discovered, additional bursts have been detected coming from its location, making FRB 121102 the only known example of a repeating FRB. This repetitive nature has also allowed astronomers to conduct more detailed studies of it than any other FRB. As Prof. Loeb told Universe Today via email, these and other reasons made it an ideal target for their study:
“FRB 121102 is the only FRB for which a host galaxy and a distance were identified. It is also the only repeating FRB source from which we detected hundreds of FRBs by now. The radio spectrum of its FRBs is centered on a characteristic frequency and not covering a very broad band. This has important implications for the detectability of such FRBs, because in order to find them the radio observatory needs to be tuned to their frequency.”
Based on what is known about FRB 121102, Fialkov and Loeb conducted a series of calculations that assumed its behavior was representative of all FRBs. They then projected how many FRBs would exist across the entire sky and determined that within the observable Universe, an FRB would likely be taking place once every second. As Dr. Fialkov explained:
“Assuming that FRBs are produced by galaxies of a particular type (e.g., similar to FRB 121102) we can calculate how many FRBs have to be produced by each galaxy to explain the existing observations (i.e., 2000 per sky per day). With this number in mind we can infer the production rate for the entire population of galaxies. This calculation shows that an FRB occurs every second when accounting for all the faint events.”
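The scale of Fialkov’s claim can be sanity-checked with back-of-the-envelope arithmetic from the two rates quoted above; the interpretation of the ratio as a faint-event boost factor is our reading, not a figure from the paper.

```python
bright_frbs_per_sky_per_day = 2000   # observed rate quoted by Fialkov

# One FRB per second, expressed per day:
events_per_day_at_one_per_second = 24 * 60 * 60  # 86,400

# Implied factor by which faint, currently undetectable events
# must outnumber the bright observed ones (assumed interpretation):
boost = events_per_day_at_one_per_second / bright_frbs_per_sky_per_day
print(boost)  # 43.2
```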
While the exact nature and origins of FRBs are still unknown – suggestions include rotating neutron stars and even alien intelligence! – Fialkov and Loeb indicate that they could be used to study the structure and evolution of the Universe. If indeed they occur with such regular frequency throughout the cosmos, then more distant sources could act as probes which astronomers would then rely on to plumb the depths of space.
For instance, over vast cosmic distances, there is a significant amount of intervening material that makes it difficult for astronomers to study the Cosmic Microwave Background (CMB) – the leftover radiation from the Big Bang. Studies of this intervening material could lead to new estimates of just how dense space is – i.e. how much of it is composed of ordinary matter, dark matter, and dark energy – and how rapidly it is expanding.
And as Prof. Loeb indicated, FRBs could also be used to explore enduring cosmological questions, like how the “Dark Age” of the Universe ended:
“FRBs can be used to measure the column of free electrons towards their source. This can be used to measure the density of ordinary matter between galaxies in the present-day universe. In addition, FRBs at early cosmic times can be used to find out when the ultraviolet light from the first stars broke up the primordial atoms of hydrogen left over from the Big Bang into their constituent electrons and protons.”
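The “column of free electrons” Loeb refers to is the dispersion measure (DM): free electrons along the line of sight delay lower radio frequencies more than higher ones, so the burst’s arrival time sweeps downward in frequency. A minimal sketch using the standard cold-plasma delay formula follows; the DM value for FRB 121102 is an approximate figure from the literature, and the observing band is chosen for illustration.

```python
def dispersion_delay_ms(dm_pc_cm3, f_lo_ghz, f_hi_ghz):
    """Arrival-time delay (ms) between a low and a high observing
    frequency for a given dispersion measure (pc cm^-3), via the
    standard cold-plasma formula:
        dt ~ 4.15 ms * DM * (f_lo^-2 - f_hi^-2), frequencies in GHz."""
    return 4.15 * dm_pc_cm3 * (f_lo_ghz**-2 - f_hi_ghz**-2)

# FRB 121102 has DM ~ 557 pc cm^-3; across a 1.2-1.5 GHz band:
delay = dispersion_delay_ms(557, 1.2, 1.5)
print(f"{delay:.0f} ms")  # ~578 ms
```

Because the measured DM integrates every free electron between us and the source, a population of FRBs at known distances would directly weigh the otherwise invisible ionized gas between galaxies.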
The “Dark Age”, which occurred between 380,000 and 150 million years after the Big Bang, was characterized by a “fog” of hydrogen atoms interacting with photons. As a result of this, the radiation of this period is undetectable by our current instruments. At present, scientists are still attempting to resolve how the Universe made the transition between these “Dark Ages” and subsequent epochs when the Universe was filled with light.
This period of “reionization”, which took place 150 million to 1 billion years after the Big Bang, was when the first stars and quasars formed. It is generally believed that UV light from the first stars in the Universe traveled outwards to ionize the hydrogen gas (thus clearing the fog). A recent study also suggested that black holes that existed in the early Universe created the necessary “winds” that allowed this ionizing radiation to escape.
To this end, FRBs could be used to probe into this early period of the Universe and determine what broke down this “fog” and allowed light to escape. Studying very distant FRBs could allow scientists to study where, when and how this process of “reionization” occurred. Looking ahead, Fialkov and Loeb explained how future radio telescopes will be able to discover many FRBs.
“Future radio observatories, like the Square Kilometer Array, will be sensitive enough to detect FRBs from the first generation of galaxies at the edge of the observable universe,” said Prof. Loeb. “Our work provides the first estimate of the number and properties of the first flashes of radio waves that lit up in the infant universe.”
“[W]e find that a next generation telescope (with a much better sensitivity than the existing ones) is expected to see many more FRBs than what is observed today,” said Dr. Fialkov. “This would allow to characterize the population of FRBs and identify their origin. Understanding the nature of FRBs will be a major breakthrough. Once the properties of these sources are known, FRBs can be used as cosmic beacons to explore the Universe. One application is to study the history of reionization (cosmic phase transition when the inter-galactic gas was ionized by stars).”
It is an inspired thought, using natural cosmic phenomena as research tools. In that respect, using FRBs to probe the most distant objects in space (and as far back in time as we can) is kind of like using quasars as navigational beacons. In the end, advancing our knowledge of the Universe allows us to explore more of it.