Matt Williams is the Curator of Universe Today's Guide to Space. He is also a freelance writer, a science fiction author and a Taekwon-Do instructor. He lives with his family on Vancouver Island in beautiful British Columbia.
Scientists have long understood that the most abundant elements in the Universe are simple gases like hydrogen and helium. These make up the vast majority of its observable mass, dwarfing all the heavier elements combined. Between the two, helium is the second lightest and second most abundant element, accounting for about 24% of the observable Universe's elemental mass.
Whereas we tend to think of helium as the hilarious gas that does strange things to your voice and allows balloons to float, it is actually crucial to our existence. In addition to being a key component of stars, helium is a major constituent of gas giants. This is due in part to its very high nuclear binding energy, plus the fact that it is produced by both nuclear fusion and radioactive decay. And yet, scientists have only been aware of its existence since the late 19th century.
On January 20th, 2016, researchers Konstantin Batygin and Michael E. Brown of Caltech announced that they had found evidence that hinted at the existence of a massive planet at the edge of the Solar System. Based on mathematical modeling and computer simulations, they predicted that this planet would be a super-Earth, two to four times Earth’s size and 10 times as massive. They also estimated that, given its distance and highly elliptical orbit, it would take 10,000 – 20,000 years to orbit the Sun.
Since that time, many researchers have responded with their own studies about the possible existence of this mysterious "Planet 9". One of the latest comes from the University of Arizona, where a research team from the Lunar and Planetary Laboratory has argued that the extreme eccentricities of distant Kuiper Belt Objects (KBOs) might indicate that they crossed paths with a massive planet in the past.
For some time now, it has been understood that a few known KBOs have dynamics that differ from those of other belt objects. Whereas most are significantly controlled in their current orbits by the gravity of the gas giant planets (particularly Neptune), certain members of the Kuiper Belt's scattered disk population have unusually closely spaced orbits.
When Batygin and Brown first announced their findings back in January, they indicated that these objects instead appeared to be highly clustered with respect to their perihelion positions and orbital planes. What's more, their calculations showed that the odds of this being a chance occurrence were extremely low (they calculated a probability of 0.007%).
Instead, they theorized that it was a distant eccentric planet that was responsible for maintaining the orbits of these KBOs. In order to do this, the planet in question would have to be over ten times as massive as Earth, and have an orbit that lay roughly on the same plane (but with a perihelion oriented 180° away from those of the KBOs).
Such a planet would not only offer an explanation for the presence of high-perihelion Sedna-like objects – i.e. planetoids with extremely eccentric orbits around the Sun – it would also help to explain where distant and highly inclined objects in the outer Solar System come from, since their origins have been unclear up to this point.
In a paper titled "Corralling a distant planet with extreme resonant Kuiper belt objects", the University of Arizona research team – which included Professor Renu Malhotra, Dr. Kathryn Volk, and Xianyu Wang – looked at things from another angle. If Planet 9 were in fact crossing paths with certain high-eccentricity KBOs, they reasoned, it was a good bet that its orbit was in resonance with these objects.
To break it down: small bodies are ejected from the Solar System all the time due to encounters with larger objects that perturb their orbits. To avoid being ejected, smaller bodies need to be protected by orbital resonances. While the smaller and larger objects may cross each other's orbital paths, they are never close enough to exert a significant influence on each other.
This is how Pluto has remained a part of the Solar System, despite having an eccentric orbit that periodically crosses Neptune's path. Though their orbits intersect, the two bodies are never close enough for Neptune's influence to force Pluto out of our Solar System. Using this same reasoning, the researchers hypothesized that the KBOs examined by Batygin and Brown might be in orbital resonance with Planet 9.
As Dr. Malhotra, Volk and Wang told Universe Today via email:
“The extreme Kuiper belt objects we investigate in our paper are distinct from the others because they all have very distant, very elliptical orbits, but their closest approach to the Sun isn’t really close enough for them to meaningfully interact with Neptune. So we have these six observed objects whose orbits are currently fairly unaffected by the known planets in our Solar System. But if there’s another, as yet unobserved planet located a few hundred AU from the Sun, these six objects would be affected by that planet.”
After examining the orbital periods of these six KBOs – among them Sedna, 2010 GB174, 2004 VN112, 2012 VP113, and 2013 GP136 – they concluded that a hypothetical planet with an orbital period of about 17,117 years (i.e. a semimajor axis of about 665 AU) would have the necessary period ratios with these objects. This falls within the parameters estimated by Batygin and Brown for the planet's orbital period (10,000 – 20,000 years).
Their analysis also offered suggestions as to what kind of resonance the planet has with the KBOs in question. Whereas Sedna's orbital period would be in a 3:2 resonance with the planet's, 2010 GB174 would be in a 5:2 resonance, 2004 VN112 in a 3:1, 2012 VP113 in a 4:1, and 2013 GP136 in a 9:1. Such resonances are simply not likely without the presence of a larger planet.
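Kepler's third law ties the proposed planet's semimajor axis to its orbital period, and each resonance then implies a definite period for the corresponding KBO. The following sketch (using the simple relation P² = a³ for orbits around the Sun, with P in years and a in AU; the 665 AU figure is taken from the study above) shows how the numbers hang together:

```python
def orbital_period_years(a_au: float) -> float:
    """Kepler's third law for a solar orbit: P^2 = a^3 (P in years, a in AU)."""
    return a_au ** 1.5

# Semimajor axis of ~665 AU implies a period of roughly 17,100 years,
# matching the ~17,117-year figure quoted in the study.
planet_period = orbital_period_years(665)

# Resonances reported for the KBOs (planet period : KBO period)
resonances = {"Sedna": (3, 2), "2010 GB174": (5, 2), "2004 VN112": (3, 1),
              "2012 VP113": (4, 1), "2013 GP136": (9, 1)}

for name, (p, q) in resonances.items():
    kbo_period = planet_period * q / p
    print(f"{name}: implied orbital period ~{kbo_period:,.0f} years ({p}:{q})")
```

For instance, the 3:2 resonance implies an orbital period for Sedna of roughly 11,400 years, consistent with its observed orbit.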
“For a resonance to be dynamically meaningful in the outer Solar System, you need one of the objects to have enough mass to have a reasonably strong gravitational effect on the other,” said the research team. “The extreme Kuiper belt objects aren’t really massive enough to be in resonances with each other, but the fact that their orbital periods fall along simple ratios might mean that they each are in resonance with a massive, unseen object.”
But what is perhaps most exciting is that their findings could help to narrow the range of Planet 9’s possible location. Since each orbital resonance provides a geometric relationship between the bodies involved, the resonant configurations of these KBOs can help point astronomers to the right spot in our Solar System to find it.
But of course, Malhotra and her colleagues freely admit that several unknowns remain, and further observation and study are necessary before Planet 9 can be confirmed:
“There are a lot of uncertainties here. The orbits of these extreme Kuiper belt objects are not very well known because they move very slowly on the sky and we’ve only observed very small portions of their orbital motion. So their orbital periods might differ from the current estimates, which could make some of them not resonant with the hypothetical planet. It could also just be chance that the orbital periods of the objects are related; we haven’t observed very many of these types of objects, so we have a limited set of data to work with.”
Ultimately, astronomers and the rest of us will simply have to wait on further observations and calculations. But in the meantime, I think we can all agree that the possibility of a 9th Planet is certainly an intriguing one! For those who grew up thinking that the Solar System had nine planets, these past few years (where Pluto was demoted and that number fell to eight) have been hard to swallow.
But with the possible confirmation of this Super-Earth at the outer edge of the Solar System, that number could be pushed back up to nine soon enough!
In about 4 billion years, the Andromeda and Milky Way galaxies are expected to collide, based on data from the Hubble Space Telescope. When they merge, they will give rise to a super-galaxy that some are already calling Milkomeda or Milkdromeda (I know, awful isn't it?). While this may sound like a cataclysmic event, these sorts of galactic collisions are quite common on a cosmic timescale.
As an international group of researchers from Japan and California has found, galactic "hookups" were quite common during the early universe. Using data from the Hubble Space Telescope and the Subaru Telescope on Mauna Kea, Hawaii, they discovered that 1.2 billion years after the Big Bang, galactic clumps grew into large galaxies by merging. Gathered as part of the Hubble Space Telescope (HST) "Cosmic Evolution Survey" (COSMOS), this information could tell us a great deal about the formation of the early universe.
Continuing with our "Definitive Guide to Terraforming", Universe Today is happy to present our guide to terraforming Venus. It might be possible to do this someday, when our technology advances far enough. But the challenges are numerous and quite specific.
The planet Venus is often referred to as Earth's "Sister Planet", and rightly so. In addition to being almost the same size, Venus and Earth are similar in mass and have very similar compositions (both being terrestrial planets). As a neighboring planet to Earth, Venus also orbits the Sun within its "Goldilocks Zone" (aka. the habitable zone). But of course, there are many key differences between the planets that make Venus uninhabitable.
For starters, its atmosphere is over 90 times thicker than Earth's, its average surface temperature is hot enough to melt lead, and its air is a toxic mix of carbon dioxide and sulfuric acid. As such, if humans want to live there, some serious ecological engineering – aka. terraforming – is needed first. And given its similarities to Earth, many scientists think Venus would be a prime candidate for terraforming, even more so than Mars!
Over the past century, the concept of terraforming Venus has appeared multiple times, both in terms of science fiction and as the subject of scholarly study. Whereas treatments of the subject were largely fantastical in the early 20th century, a transition occurred with the beginning of the Space Age. As our knowledge of Venus improved, so too did the proposals for altering the landscape to be more suitable for human habitation.
Examples in Fiction:
Since the early 20th century, the idea of ecologically transforming Venus has been explored in fiction. The earliest known example is Olaf Stapledon's Last and First Men (1930), two chapters of which are dedicated to describing how humanity's descendants terraform Venus after Earth becomes uninhabitable; and in the process, commit genocide against the native aquatic life.
By the 1950s and 60s, owing to the beginning of the Space Age, terraforming began to appear in many works of science fiction. Poul Anderson wrote extensively about terraforming in the 1950s. In his 1954 novel, The Big Rain, Venus is altered through planetary engineering techniques over a very long period of time. The book was so influential that the term "Big Rain" has since come to be synonymous with the terraforming of Venus.
In 1991, author G. David Nordley suggested in his short story "The Snows of Venus" that Venus might be spun up to a day-length of 30 Earth days by exporting its atmosphere via mass drivers. Author Kim Stanley Robinson became famous for his realistic depiction of terraforming in the Mars Trilogy – which included Red Mars, Green Mars and Blue Mars.
In 2012, he followed this series up with the release of 2312, a science fiction novel that dealt with the colonization of the entire Solar System – which includes Venus. The novel also explored the many ways in which Venus could be terraformed, ranging from global cooling to carbon sequestration, all of which were based on scholarly studies and proposals.
The first proposed method of terraforming Venus was made in 1961 by Carl Sagan. In a paper titled “The Planet Venus“, he argued for the use of genetically engineered bacteria to transform the carbon in the atmosphere into organic molecules. However, this was rendered impractical due to the subsequent discovery of sulfuric acid in Venus’ clouds and the effects of solar wind.
In his 1991 study "Terraforming Venus Quickly", British scientist Paul Birch proposed bombarding Venus' atmosphere with hydrogen. The resulting reaction would produce graphite and water, the latter of which would fall to the surface and cover roughly 80% of it in oceans. Given the amount of hydrogen needed, it would have to be harvested directly from the ices of one of the gas giants or their moons.
The proposal would also require iron aerosol to be added to the atmosphere, which could be derived from a number of sources (i.e. the Moon, asteroids, Mercury). The remaining atmosphere, estimated to be around 3 bars (three times that of Earth), would mainly be composed of nitrogen, some of which would dissolve into the new oceans, reducing atmospheric pressure further.
Another idea is to bombard Venus with refined magnesium and calcium, which would sequester carbon in the form of calcium and magnesium carbonates. In their 1996 paper, “The stability of climate on Venus“, Mark Bullock and David H. Grinspoon of the University of Colorado at Boulder indicated that Venus’ own deposits of calcium and magnesium oxides could be used for this process. Through mining, these minerals could be exposed to the surface, thus acting as carbon sinks.
However, Bullock and Grinspoon also claim this would have a limited cooling effect – to about 400 K (126.85 °C; 260.33 °F) – and would only reduce the atmospheric pressure to an estimated 43 bars. Hence, additional supplies of calcium and magnesium would be needed to achieve the 8×10²⁰ kg of calcium or 5×10²⁰ kg of magnesium required, which would most likely have to be mined from asteroids.
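A back-of-envelope stoichiometry check shows where figures of this order come from. The sketch below assumes Venus' atmosphere has a total mass of roughly 4.8×10²⁰ kg and is ~96.5% CO₂ (both well-established values, though not stated above), and that each atom of calcium or magnesium locks up one molecule of CO₂ as carbonate:

```python
# Assumed inputs (not from the article text): Venus' atmospheric mass
# and CO2 fraction, plus standard molar masses.
M_ATMOSPHERE = 4.8e20   # kg, approximate total mass of Venus' atmosphere
CO2_FRACTION = 0.965    # CO2 share of that mass (approximation)

MOLAR_CO2 = 0.04401     # kg/mol
MOLAR_CA = 0.04008      # kg/mol
MOLAR_MG = 0.02431      # kg/mol

# One Ca (or Mg) atom sequesters one CO2 molecule as CaCO3 (or MgCO3)
moles_co2 = M_ATMOSPHERE * CO2_FRACTION / MOLAR_CO2
ca_needed = moles_co2 * MOLAR_CA   # ~4e20 kg of calcium
mg_needed = moles_co2 * MOLAR_MG   # ~2.6e20 kg of magnesium
```

This simple estimate lands within a factor of about two of the 8×10²⁰ kg and 5×10²⁰ kg figures quoted above, i.e. the same order of magnitude; the published numbers account for additional details of the sequestration chemistry.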
The concept of solar shades has also been explored, which would involve using either a series of small spacecraft or a single large lens to divert sunlight from a planet's surface, thus reducing global temperatures. For Venus, which receives roughly twice as much sunlight as Earth, solar radiation is believed to have played a major role in the runaway greenhouse effect that has made it what it is today.
Such a shade could be space-based, located at the Sun–Venus L1 Lagrangian point, where it would prevent some sunlight from reaching Venus. In addition, this shade would also serve to block the solar wind, thus reducing the amount of radiation Venus' surface is exposed to (another key issue when it comes to habitability). This cooling would result in the liquefaction or freezing of atmospheric CO₂, which would then be deposited on the surface as dry ice (and could be shipped off-world or sequestered underground).
Alternately, solar reflectors could be placed in the atmosphere or on the surface. These could consist of large reflective balloons, sheets of carbon nanotubes or graphene, or other high-albedo material. Atmospheric reflectors offer two advantages: for one, they could be built in-situ, using locally sourced carbon; second, Venus' atmosphere is dense enough that such structures could easily float atop the clouds.
NASA scientist Geoffrey A. Landis has also proposed that cities could be built above Venus’ clouds, which in turn could act as both a solar shield and as processing stations. These would provide initial living spaces for colonists, and would act as terraformers, gradually converting Venus’ atmosphere into something livable so the colonists could migrate to the surface.
Another suggestion has to do with Venus' rotational speed. Venus rotates once every 243 days, which is by far the slowest rotation period of any of the major planets. As such, Venus experiences extremely long days and nights, which could prove difficult for most known Earth species of plants and animals to adapt to. The slow rotation also probably accounts for the lack of a significant magnetic field.
To address this, British Interplanetary Society member Paul Birch suggested creating a system of orbital solar mirrors near the L1 Lagrange point between Venus and the Sun. Combined with a soletta mirror in polar orbit, these would provide a 24-hour light cycle.
It has also been suggested that Venus could be spun up by striking the surface with impactors or by conducting close fly-bys with bodies larger than 96.5 km (60 mi) in diameter. There is also the suggestion of using mass drivers and dynamic compression members to generate the rotational force needed to speed Venus up to the point where it experiences a day-night cycle identical to Earth's (see above).
Then there's the possibility of removing some of Venus' atmosphere, which could be accomplished in a number of ways. For starters, impactors directed at the surface would blow some of the atmosphere off into space. Other methods include space elevators and mass accelerators (ideally placed on balloons or platforms above the clouds), which could gradually scoop gas from the atmosphere and eject it into space.
One of the main reasons for colonizing Venus, and altering its climate for human settlement, is the prospect of creating a "backup location" for humanity. And given the range of choices – Mars, the Moon, and the Outer Solar System – Venus has several things going for it that the others do not. All of these highlight why Venus is known as Earth's "Sister Planet".
For starters, Venus is a terrestrial planet that is similar in size, mass and composition to Earth. This is why Venus has gravity similar to Earth's – about 90% of what we experience (0.904 g, to be exact). As a result, humans living on Venus would be at a far lower risk of developing the health problems associated with time spent in weightlessness and microgravity environments – such as osteoporosis and muscle degeneration.
Venus' relative proximity to Earth would also make transportation and communications easier than with most other locations in the Solar System. With current propulsion systems, launch windows to Venus occur every 584 days, compared to every 780 days for Mars. Flight time is also somewhat shorter, since Venus is the closest planet to Earth. At its closest approach, it is 40 million km distant, compared to 55 million km for Mars.
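The launch-window figures above follow from the synodic period, the time it takes Earth and the target planet to return to the same relative alignment. A quick sketch (using approximate sidereal orbital periods, which are standard values rather than figures from the article):

```python
def synodic_period_days(p_earth: float, p_target: float) -> float:
    """Time between successive alignments: 1/P_syn = |1/P_1 - 1/P_2|."""
    return 1.0 / abs(1.0 / p_earth - 1.0 / p_target)

# Approximate sidereal orbital periods in days
EARTH = 365.25
VENUS = 224.70
MARS = 686.98

venus_window = synodic_period_days(EARTH, VENUS)  # ~584 days
mars_window = synodic_period_days(EARTH, MARS)    # ~780 days
```

The results reproduce the 584-day and 780-day spacings quoted above.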
Another reason has to do with Venus’ runaway greenhouse effect, which is the reason for the planet’s extreme heat and atmospheric density. In testing out various ecological engineering techniques, our scientists would learn a great deal about their effectiveness. This information, in turn, will come in mighty handy in the ongoing fight against Climate Change here on Earth.
And in the coming decades, this fight is likely to become rather intense. As the NOAA reported in March of 2015, carbon dioxide levels in the atmosphere have now surpassed 400 ppm, a level not seen since the Pliocene – when global temperatures and sea levels were significantly higher. And as a series of scenarios computed by NASA show, this trend is likely to continue until 2100, with severe consequences.
In one scenario, carbon dioxide emissions will level off at about 550 ppm toward the end of the century, resulting in an average temperature increase of 2.5 °C (4.5 °F). In the second scenario, carbon dioxide emissions rise to about 800 ppm, resulting in an average increase of about 4.5 °C (8 °F). Whereas the increases predicted in the first scenario are sustainable, in the latter scenario, life will become untenable on many parts of the planet.
So in addition to creating a second home for humanity, terraforming Venus could also help to ensure that Earth remains a viable home for our species. And of course, the fact that Venus is a terrestrial planet means that it has abundant natural resources that could be harvested, helping humanity to achieve a “post-scarcity” economy.
Beyond the similarities Venus has with Earth (i.e. size, mass and composition), there are numerous differences that would make terraforming and colonizing it a major challenge. For one, reducing the heat and pressure of Venus' atmosphere would require a tremendous amount of energy and resources. It would also require infrastructure that does not yet exist and would be very expensive to build.
For instance, it would require immense amounts of metal and advanced materials to build an orbital shade large enough to cool Venus’ atmosphere to the point that its greenhouse effect would be arrested. Such a structure, if positioned at L1, would also need to be four times the diameter of Venus itself. It would have to be assembled in space, which would require a massive fleet of robot assemblers.
Similarly, increasing the speed of Venus' rotation would require tremendous energy, not to mention a significant number of impactors that would have to come from the outer Solar System – mainly from the Kuiper Belt. In all of these cases, a large fleet of spaceships would be needed to haul the necessary material, and they would need to be equipped with advanced drive systems that could make the trip in a reasonable amount of time.
Currently, no such drive systems exist, and conventional methods – ranging from ion engines to chemical propellants – are neither fast nor economical enough. To illustrate, NASA's New Horizons mission took more than nine years to make its historic rendezvous with Pluto in the Kuiper Belt, using conventional rockets and the gravity-assist method.
Meanwhile, the Dawn mission, which relied on ionic propulsion, took almost four years to reach Vesta in the Asteroid Belt. Neither method is practical for making repeated trips to the Kuiper Belt and hauling back icy comets and asteroids, and humanity has nowhere near the number of ships we would need to do this.
The same problem of resources holds true for the concept of placing solar reflectors above the clouds. The amount of material required would be very large, and it would have to remain in place long after the atmosphere had been modified. Also, Venus already has highly reflective clouds (an albedo of about 0.65), so any approach would have to significantly surpass that existing reflectivity to make a difference.
And when it comes to removing Venus' atmosphere, things are equally challenging. In 1994, James B. Pollack and Carl Sagan conducted calculations indicating that an impactor measuring 700 km in diameter striking Venus at high velocity would remove less than a thousandth of the total atmosphere. What's more, there would be diminishing returns as the atmosphere's density decreases, which means thousands of giant impactors would be needed.
In addition, most of the ejected atmosphere would go into solar orbit near Venus and – without further intervention – could be captured by Venus' gravitational field and become part of the atmosphere once again. Removing atmospheric gas using space elevators would be difficult because the planet's geostationary orbit lies an impractical distance above the surface, while removal using mass accelerators would be time-consuming and very expensive.
In sum, the potential benefits of terraforming Venus are clear. Humanity would have a second home, we would be able to add its resources to our own, and we would learn valuable techniques that could help prevent cataclysmic change here on Earth. However, getting to the point where those benefits could be realized is the hard part.
As with most proposed terraforming ventures, many obstacles need to be addressed beforehand. Foremost among these are transportation and logistics – mobilizing a massive fleet of robot workers and hauling craft to harness the necessary resources. After that, a multi-generational commitment would need to be made, providing the financial resources to see the job through to completion. Not an easy task under the most ideal of conditions.
Suffice it to say, this is something that humanity cannot do in the short-run. However, looking to the future, the idea of Venus becoming our “Sister Planet” in every way imaginable – with oceans, arable land, wildlife and cities – certainly seems like a beautiful and feasible goal. The only question is, how long will we have to wait?
Since it was first launched in 1990, the Hubble Space Telescope has provided people all over the world with breathtaking views of the Universe. Using its high-tech suite of instruments, Hubble has helped resolve some long-standing problems in astronomy, and helped to raise new questions. And always, its operators have been pushing it to the limit, hoping to gaze farther and farther into the great beyond and see what’s lurking there.
And as NASA announced in a recent press release, using the HST, an international team of astronomers just shattered the cosmic distance record by measuring the farthest galaxy ever seen in the universe. In so doing, they have not only looked deeper into the cosmos than ever before, but deeper into its past. And what they have seen could tell us much about the early Universe and its formation.
Because light travels at a finite speed, astronomers know that when they view objects in deep space, they are seeing them as they were millions or even billions of years ago. Ergo, an object located 13.4 billion light-years away will appear to us as it was 13.4 billion years ago, when its light first began the trip to our little corner of the Universe.
This is precisely what the team of astronomers witnessed when they gazed upon GN-z11, a distant galaxy located in the direction of the constellation of Ursa Major. With this one galaxy, the team of astronomers – which includes scientists from Yale University, the Space Telescope Science Institute (STScI), and the University of California – were able to see what a galaxy in our Universe looked like just 400 million years after the Big Bang.
Prior to this, the most distant galaxy ever viewed by astronomers was located 13.2 billion light years away. Using the same spectroscopic techniques, the Hubble team confirmed that GN-z11 was nearly 200 million light years more distant. This was a big surprise, as it took astronomers into a region of the Universe that was thought to be unreachable using the Hubble Space Telescope.
In fact, astronomers did not suspect that they would be able to probe this deep into space and time without using Spitzer, or until the deployment of the James Webb Space Telescope – which is scheduled to launch in October 2018. As Pascal Oesch of Yale University, the principal investigator of the study, explained:
“We’ve taken a major step back in time, beyond what we’d ever expected to be able to do with Hubble. We see GN-z11 at a time when the universe was only three percent of its current age. Hubble and Spitzer are already reaching into Webb territory.”
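The "three percent of its current age" figure can be checked with a standard cosmology calculation. The sketch below assumes a flat ΛCDM universe (H₀ = 67.7 km/s/Mpc, Ωm = 0.31 – standard parameter values, not figures from the article) and uses GN-z11's measured redshift of z ≈ 11.1, which comes from the published study rather than the text above:

```python
import math

H0 = 67.7 / 977.79          # Hubble constant converted from km/s/Mpc to 1/Gyr
OMEGA_M, OMEGA_L = 0.31, 0.69

def age_at_redshift_gyr(z: float) -> float:
    """Analytic age of a flat matter + Lambda universe at redshift z."""
    prefactor = 2.0 / (3.0 * H0 * math.sqrt(OMEGA_L))
    return prefactor * math.asinh(math.sqrt(OMEGA_L / OMEGA_M) * (1 + z) ** -1.5)

age_then = age_at_redshift_gyr(11.1)   # ~0.4 Gyr after the Big Bang
age_now = age_at_redshift_gyr(0.0)     # ~13.8 Gyr, the current age
fraction = age_then / age_now          # ~0.03, i.e. about three percent
```

The result reproduces both numbers in the article: a universe roughly 400 million years old at GN-z11's redshift, about three percent of its present age.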
In addition, the findings have implications for previous distance estimates. In the past, astronomers had estimated the distance of GN-z11 by relying on Hubble's and Spitzer's color imaging techniques. This time, they used Hubble's Wide Field Camera 3 to measure the galaxy's redshift spectroscopically for the first time. In so doing, they determined that GN-z11 was farther away than previously thought, which could mean that some particularly bright galaxies whose distances have been estimated from Hubble imaging could also be farther away.
The results also reveal surprising new clues about the nature of the very early universe. For starters, the Hubble images (combined with data from Spitzer) showed that GN-z11 is 25 times smaller than the Milky Way is today, and has just one percent of our galaxy’s mass in stars. At the same time, it is forming stars at a rate that is 20 times greater than that of our own galaxy.
As Garth Illingworth – one of the team's investigators, from the University of California, Santa Cruz – explained:
“It’s amazing that a galaxy so massive existed only 200 million to 300 million years after the very first stars started to form. It takes really fast growth, producing stars at a huge rate, to have formed a galaxy that is a billion solar masses so soon. This new record will likely stand until the launch of the James Webb Space Telescope.”
Last, but not least, they provide a tantalizing clue as to what future missions – like the James Webb Space Telescope – will be finding. Once deployed, astronomers will likely be looking ever farther into space, and farther into the past. With every step, we are closing in on seeing what the very first galaxies that formed in our Universe looked like.
Very recently, a team of scientists from the Commonwealth Scientific and Industrial Research Organization (CSIRO) achieved an historic first by being able to pinpoint the source of fast radio bursts (FRBs). With the help of observatories around the world, they determined that these radio signals originated in an elliptical galaxy 6 billion light years from Earth. But as it turns out, this feat has been followed by yet another historic first.
In all previous cases where FRBs were detected, they appeared to be one-off events, lasting for mere milliseconds. However, after running the data from a recent FRB through a supercomputer, a team of scientists at McGill University in Montreal determined that in this instance, the signal was repeating in nature. This finding has some serious implications for the astronomical community, and has even been floated by some as possible evidence of extra-terrestrial intelligence.
FRBs have puzzled astronomers since they were first detected in 2007. This event, known as the Lorimer Burst, lasted a mere five milliseconds and appeared to come from a direction near the Small Magellanic Cloud, though billions of light years beyond it. Since that time, a total of 16 FRBs have been detected. And in all but this one case, the duration was extremely short and was not followed by any additional bursts.
Because of their short duration and one-off nature, many scientists have reasoned that FRBs must be the result of cataclysmic events – such as a star going supernova or a neutron star collapsing into a black hole. However, after sifting through data obtained by the Arecibo radio telescope in Puerto Rico, a team of students from McGill University – led by PhD student Paul Scholz – determined that an FRB detected in 2012 did not conform to this pattern.
In an article published in Nature, Scholz and his associates describe how this particular signal – FRB 121102 – was followed by several bursts with properties consistent with the original signal. Running the data, gathered in May and June, through a supercomputer at the McGill High Performance Computing Center, they determined that FRB 121102 had emitted a total of 10 new bursts after its initial detection.
This would seem to indicate that FRBs have more than just one cause, which presents some rather interesting possibilities. As Paul Scholz told Universe Today via email:
“All previous Fast Radio Bursts have only been one-time events, so a lot of explanations for them have involved a cataclysmic event that destroys the source of the bursts, such as a neutron star collapsing into a black hole. Our discovery of repeating bursts from FRB 121102 shows that the source cannot have been destroyed and it must have been due to a phenomenon that can repeat, such as bright pulses from a rotating neutron star.”
Another possibility which is making the rounds is that this signal is not natural in origin. Since their discovery, FRBs and other “transient signals” – i.e. seemingly random and temporary signals – from the Universe have been the subject of speculation. As would be expected, there have been some who have suggested that they might be the long sought-after proof that extra-terrestrial civilizations exist.
For example, in 1967, after receiving a strange reading from a radio array in a Cambridge field, astrophysicist Jocelyn Bell Burnell and her team considered the possibility that what they were seeing was an alien message. This would later be shown to be incorrect – it was, in fact, the first discovery of a pulsar. However, the possibility these signals are alien in origin has remained fixed in the public (and scientific) imagination.
This has certainly been the case since the discovery of FRBs. In an article published by New Scientist in April of 2015 – titled “Cosmic Radio Plays An Alien Tune” – writer and astrophysicist Sarah Scoles explores whether the strange regularity of some FRBs that appeared to be coming from within the Milky Way could be seen as evidence of alien intelligence.
However, the likelihood that these signals are being sent by extra-terrestrials is quite low. For one, FRBs are not an effective way to send a message. As Dr. Maura McLaughlin of West Virginia University – who was part of the first FRB discovery – has explained, it takes a lot of energy to make a signal that spreads across lots of frequencies (which is a distinguishing feature of FRBs).
And if these bursts came from outside of our galaxy, which certainly seems to be the case, they would have to be incredibly energetic to get this far. As Dr. McLaughlin explained to Universe Today via email:
“The total amount of power required to produce just one FRB pulse is as much as the Sun produces in a month! Although we might expect extraterrestrial civilizations to send short-duration signals, sending a signal over the very wide radio bandwidths over which FRBs are detected would require an improbably immense amount of energy. We expect that extraterrestrial civilizations would transmit over a very narrow range of radio frequencies, much like a radio station on Earth.”
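Dr. McLaughlin’s comparison is easy to sanity-check with a back-of-envelope calculation. The sketch below simply multiplies the standard value of the Sun’s luminosity by the number of seconds in a 30-day month; the resulting ~10^33 joules is the scale of energy she is describing for a single FRB pulse.

```python
# Back-of-envelope check of the "Sun in a month" comparison.
L_SUN = 3.828e26            # solar luminosity, in watts
MONTH = 30 * 24 * 3600      # seconds in a 30-day month

solar_month_energy = L_SUN * MONTH  # joules the Sun radiates in a month
print(f"Sun's monthly output: {solar_month_energy:.2e} J")  # ~9.9e32 J
```

That is roughly a billion times the Sun’s *per-second* output, packed into a few milliseconds of radio emission.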
But regardless of whether these signals are natural or extra-terrestrial in origin, they do present some rather exciting possibilities for astronomical research and our knowledge of the Universe. Moving forward, Scholz and his team hope to identify the galaxy where the radio bursts originated, and plan to test out some recently developed techniques in the process.
“Next we would like to localize the source of the bursts to identify the galaxy that they are coming from,” he said. “This will let us know about the environment around the source. To do this, we need to use radio interferometry to get a precise enough sky location. But, to do this we need to detect a burst while we are looking at the source with such a radio telescope array. Since the source is not always bursting we will have to wait until we get a detection of a burst while we are looking with radio interferometry. So, if we’re patient, eventually we should be able to pinpoint the galaxy that the bursts are coming from.”
In the end, we may find that fast radio bursts are a more common occurrence than we thought. In all likelihood, they are being regularly emitted by rare and powerful stellar objects, ones which we’ve only begun to notice. As for the other possibility? Well, we’re not saying it’s aliens, but we’re quite sure others will be!
With all the talk about manned missions to Mars by the 2030s, it’s easy to overlook another major proposal for the next great leap. In recent years, the European Space Agency has been quite vocal about its plan to go back to the Moon by the 2020s. More importantly, they have spoken often about their plans to construct a moon base, one which would serve as a staging platform for future missions to Mars and beyond.
These plans were detailed at a recent international symposium that took place on Dec. 15th at the European Space Research and Technology Center in Noordwijk, Netherlands. During the symposium, which was titled “Moon 2020-2030 – A New Era of Coordinated Human and Robotic Exploration”, the new Director General of the ESA – Jan Woerner – articulated his agency’s vision.
The purpose of the symposium – which saw 200 scientists and experts coming together to discuss plans and missions for the next decade – was to outline common goals for lunar exploration, and draft methods on how these can be achieved cooperatively. Intrinsic to this was the International Space Exploration Coordinated Group‘s (ISECG) Global Exploration Roadmap, an agenda for space exploration that was drafted by the group’s 14 members – which includes NASA, the ESA, Roscosmos, and other federal agencies.
This roadmap not only lays out the strategic significance of the Moon as a global space exploration endeavor, but also calls for a shared international vision on how to go about exploring the Moon and using it as a stepping stone for future goals. When it came time to discuss how the ESA might contribute to this shared vision, Woerner outlined his agency’s plan to establish an international lunar base.
In the past, Woerner has expressed his interest in a base on the Moon that would act as a sort of successor to the International Space Station. Looking ahead, he envisions how an international community would live and perform research in this environment, which would be constructed using robotic workers, 3D printing techniques, and in-situ resource utilization.
The construction of such a base would also offer opportunities to leverage new technologies and forge lucrative partnerships between federal space agencies and private companies. Already, the ESA has collaborated with the architectural design firm Foster + Partners to come up with the plan for their lunar village, and other private companies have also been recruited to help investigate other aspects of building it.
Going forward, the plan calls for a series of manned missions to the Moon beginning in the 2020s, which would involve robot workers paving the way for human explorers to land later. These robots would likely be controlled through telepresence, and would combine lunar regolith with magnesium oxide and a binding salt to print out the shield walls of the habitat.
At present, the plan is for the base to be built in the southern polar region, which exists in a near-state of perpetual twilight. Whether or not this will serve as a suitable location will be the subject of the upcoming Lunar Polar Sample Return mission – a joint effort between the ESA and Roscosmos that will involve sending a robotic probe to the Moon’s South Pole-Aitken Basin by 2020 to retrieve samples of ice.
This mission follows in the footsteps of NASA’s Lunar Reconnaissance Orbiter (LRO), which showed that the Shackleton crater – located in the Moon’s southern polar region – has an abundant supply of water ice. This could not only provide the Moon base with a source of drinking water, but could also be split into hydrogen and oxygen to refuel spacecraft on their way to and from Earth.
As Woerner was quoted as saying by the Daily Mail during the course of the symposium, this lunar base would provide the opportunity for scientists from many different nations to live and work together:
The future of space travel needs a new vision. Right now we have the Space Station as a common international project, but it won’t last forever. If I say Moon Village, it does not mean single houses, a church, a town hall and so on… My idea only deals with the core of the concept of a village: people working and living together in the same place. And this place would be on the Moon. In the Moon Village we would like to combine the capabilities of different spacefaring nations, with the help of robots and astronauts. The participants can work in different fields, perhaps they will conduct pure science and perhaps there will even be business ventures like mining or tourism.
Naturally, the benefits would go beyond scientific research and international cooperation. As NexGen Space LLC (a consultant company for NASA) recently stated, such a base would be a major stepping stone on the way to Mars. In fact, the company estimated that if such a base included refueling stations, it could cut the cost of any future Mars missions by about $10 billion a year.
And of course, a lunar base would also yield valuable scientific data that would come in handy for future missions. Located far from Earth’s protective magnetic field, astronauts on the Moon (and in circumpolar orbit) would be subjected to levels of cosmic radiation that astronauts in orbit around Earth (i.e. aboard the ISS) are not. This data will prove immeasurably useful when plotting upcoming missions to Mars or into deep space.
An additional benefit is the possibility of creating an international presence on the Moon that would ensure that the spirit of the Outer Space Treaty endures. Signed back in 1967 at the height of the “Moon Race”, this treaty stated that “the exploration and use of outer space shall be carried out for the benefit and in the interests of all countries and shall be the province of all mankind.”
In other words, the treaty was meant to ensure that no nation or space agency could claim anything in space, and that issues of territorial sovereignty would not extend to the celestial sphere. But with multiple agencies discussing plans to build bases on the Moon – including NASA, Roscosmos, and JAXA – it is possible that issues of “Moon sovereignty” might emerge at some point in the future.
And having a base that could facilitate regular trips to the Moon would also be a boon for the burgeoning space tourism industry. Beyond offering trips into Low Earth Orbit (LEO) aboard Virgin Galactic, Richard Branson has also talked about the possibility of offering trips to the Moon by 2043. Golden Spike, another space tourism company, also hopes to offer round-trip lunar adventures someday (at a reported $750 million a pop).
Other private space ventures that are looking to make the Moon a tourist destination include Space Adventures and Excalibur Almaz – both of which are hoping to offer lunar fly-bys (no Moon walks, sorry) for $150 million apiece someday. Many analysts predict that in the coming decade, this industry will begin to (no pun intended) take flight. As such, establishing infrastructure there ahead of time would certainly be beneficial.
“We’re going back to the Moon”. That appeared to be the central message behind the recent symposium and the ESA’s plans for future space exploration. And this time, it seems, we will be staying there! And from there, who knows? The Universe is a big place…
Mapping the Universe with satellites and ground-based observatories has not only provided scientists with a pretty good understanding of its structure, but also of its composition. And for some time now, they have been working with a model that states that the Universe consists of 4.9% “normal” matter (i.e. that which we can see), 26.8% “dark matter” (that which we can’t), and 68.3% “dark energy”.
From what they have observed, scientists have also concluded that the normal matter in the Universe is concentrated in web-like filaments, which make up about 20% of the Universe by volume. But a recent study performed by the Institute of Astro- and Particle Physics at the University of Innsbruck in Austria has found that a surprising amount of normal matter may live in the voids, and that black holes may have deposited it there.
In a paper submitted to the Royal Astronomical Society, Dr. Haider and his team described how they performed measurements of the mass and volume of the Universe’s filamentary structures to get a better idea of where the Universe’s mass is located. To do this, they used data from the Illustris project – a large computer simulation of the evolution and formation of galaxies.
As an ongoing research project run by an international collaboration of scientists (and using supercomputers from around the world), Illustris has created the most detailed simulations of our Universe to date. Beginning with conditions roughly 300,000 years after the Big Bang, these simulations track how gravity and the flow of matter changed the structure of the cosmos up to the present day, roughly 13.8 billion years later.
The process begins with the supercomputers simulating a cube of space in the universe, which measures some 350 million light years on each side. Both normal and dark matter are dealt with, particularly the gravitational effect that dark matter has on normal matter. Using this data, Haider and his team noticed something very interesting about the distribution of matter in the cosmos.
Essentially, they found that about 50% of the total mass of the Universe is contained in the galaxies we see, which are compressed into just 0.2% of its volume. A further 44% is located in the enveloping filaments, consisting of gas particles and dust. The remaining 6% is located in the empty spaces that fall between them (aka. the voids), which make up 80% of the Universe’s volume.
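A quick way to see what those figures imply: dividing each component’s mass fraction by its volume fraction gives its mean density relative to the cosmic average. The short sketch below uses only the percentages quoted above (the component names are just labels for illustration).

```python
# Mean density of each cosmic component relative to the cosmic average,
# computed from the mass and volume fractions quoted in the study.
components = {
    # name: (mass fraction, volume fraction)
    "galaxies":  (0.50, 0.002),
    "filaments": (0.44, 0.20),
    "voids":     (0.06, 0.80),
}

for name, (mass, volume) in components.items():
    contrast = mass / volume
    print(f"{name:9s}: {contrast:6.2f}x the mean cosmic density")
# galaxies come out ~250x denser than average, voids only ~0.08x
```

The contrast is striking: galaxies are hundreds of times denser than the cosmic mean, while the voids sit at less than a tenth of it.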
However, a surprising fraction of this normal matter (20%) appears to have been transported there, apparently by the supermassive black holes located at the centers of galaxies. The delivery mechanism appears to lie in how black holes convert some of the matter falling towards them into energy, which is then transferred to the surrounding gas, driving large outflows of matter.
These outflows stretch for hundreds of thousands of light-years beyond the host galaxy, filling the voids with invisible mass. As Dr. Haider explains, the conclusions supported by this data are rather startling. “This simulation,” he said, “one of the most sophisticated ever run, suggests that the black holes at the center of every galaxy are helping to send matter into the loneliest places in the universe. What we want to do now is refine our model, and confirm these initial findings.”
The findings are also significant because they just may offer an explanation to the so-called “missing baryon problem”. In short, this problem describes how there is an apparent discrepancy between our current cosmological models and the amount of normal matter we can see in the Universe. Even when dark matter and dark energy are factored in, half of the remaining 4.9% of the Universe’s normal matter still remains unaccounted for.
For decades, scientists have been working to find this “missing matter”, and several suggestions have been made as to where it might be hiding. For instance, in 2011, a team of students at the Monash School of Physics in Australia confirmed that some of it takes the form of low-density, high-energy matter that can only be observed in X-ray wavelengths.
In 2012, using data from the Chandra X-ray Observatory, a NASA research team reported that our galaxy, and the nearby Large and Small Magellanic Clouds, were surrounded by an enormous halo of hot gas that was invisible at normal wavelengths. These findings indicated that all galaxies may be surrounded by mass that, while not visible to the naked eye, is nevertheless detectable using current methods.
And just days ago, researchers from the Commonwealth Scientific and Industrial Research Organization (CSIRO) described how they had used fast radio bursts (FRBs) to measure the density of cosmic baryons in the intergalactic medium – which yielded results that seem to indicate that our current cosmological models are correct.
Factor in all the mass that is apparently being delivered to the void by supermassive black holes, and it could be that we finally have a complete inventory of all the normal matter of the Universe. This is certainly an exciting prospect, as it means that one of the greatest cosmological mysteries of our time could finally be solved.
Now if we could just account for the “abnormal” matter in the Universe, and all that dark energy, we’d be in business!
In July of 2012, researchers at the CERN laboratory made history when they announced the discovery of the Higgs Boson. Though its existence had been hypothesized for over half a century, confirming its existence was a major boon for scientists. In discovering this one particle, the researchers were also able to confirm the Standard Model of particle physics. Much the same is true of our current cosmological model.
For decades, scientists have been going by the theory that the Universe consists of about 70% dark energy, 25% dark matter and 5% “luminous matter” – i.e. the matter we can see. But even when all the visible matter is added up, there is a discrepancy where much of it is still considered “missing”. But thanks to the efforts of a team from the Commonwealth Scientific and Industrial Research Organization (CSIRO), scientists now have strong evidence that we have it right.
This began on April 18th, 2015, when the CSIRO’s Parkes Observatory in Australia detected a fast radio burst (FRB) coming from space. An international alert was immediately issued, and within a few hours, telescopes all around the world were looking for the signal. The CSIRO team began tracking it as well with the Australian Telescope Compact Array (ATCA) located at the Paul Wild Observatory (north of Parkes).
With the help of the National Astronomical Observatory of Japan’s (NAOJ) Subaru telescope in Hawaii, they were able to pinpoint where the signal was coming from. As the CSIRO team described in a paper submitted to Nature, they identified the source, which was an elliptical galaxy located 6 billion light years from Earth.
This was a historic accomplishment, since pinpointing the source of an FRB had never before been possible. Not only do the signals last mere milliseconds, but they are also subject to dispersion – a frequency-dependent delay caused by the ionized material they pass through. And while FRBs have been detected in the past, the teams tracking them have only been able to obtain measurements of the dispersion, but never the signal’s redshift.
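To see what dispersion looks like in practice, the sketch below applies the standard cold-plasma delay formula: lower radio frequencies arrive later than higher ones, by an amount proportional to the dispersion measure (DM). The DM value used here is an assumed, illustrative figure, not the measured value for any particular FRB.

```python
# Differential arrival delay between two radio frequencies, using the
# standard cold-plasma dispersion formula.
K_DM = 4.149  # dispersion constant in ms, for frequencies in GHz and DM in pc/cm^3

def dispersion_delay_ms(dm, f_low_ghz, f_high_ghz):
    """Extra arrival delay (ms) of the lower frequency vs. the higher one."""
    return K_DM * dm * (f_low_ghz**-2 - f_high_ghz**-2)

# e.g. an assumed DM of 500 pc/cm^3, observed between 1.2 and 1.5 GHz:
print(f"{dispersion_delay_ms(500, 1.2, 1.5):.1f} ms")
# roughly half a second of extra delay at the lower frequency
```

The larger the delay, the more ionized material the burst has passed through – which is why the DM alone hints at distance, but cannot pin it down without an independent redshift.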
Redshift occurs when light from a receding object is stretched toward longer wavelengths. For decades, scientists have been using it to determine how fast other galaxies are moving away from our own, and hence the rate of expansion of the Universe. Relying on optical data obtained by the Subaru telescope, the CSIRO team was able to obtain both the dispersion and the redshift data from this signal.
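The measurement itself is simple in principle: compare the observed wavelength of a known spectral line with its rest-frame value. The example below uses the hydrogen H-alpha line with an assumed, illustrative observed wavelength; it is not the actual measurement from the CSIRO team’s paper.

```python
# Redshift from a shifted spectral line: z is the fractional stretching
# of the wavelength between emission and observation.
def redshift(observed_nm, rest_nm):
    """z = (observed - rest) / rest."""
    return observed_nm / rest_nm - 1

# H-alpha has a rest wavelength of 656.3 nm; suppose it is observed at 984.4 nm:
z = redshift(984.4, 656.3)
print(f"z = {z:.2f}")  # → z = 0.50
```

With both the DM (how much material the burst traversed) and z (how far away the host galaxy is), the density of matter along the line of sight can be computed – which is exactly the "weighing the Universe" measurement described below.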
As stated in their paper, this information yielded a “direct measurement of the cosmic density of ionized baryons in the intergalactic medium”. Or, as Dr. Simon Johnston – of the CSIRO’s Astronomy and Space Science division and the co-author of the study – explains, the team was not only able to locate the source of the signal, but also to obtain measurements confirming the distribution of matter in the Universe.
“Until now, the dispersion measure is all we had,” he said. “By also having a distance we can now measure how dense the material is between the point of origin and Earth, and compare that with the current model of the distribution of matter in the Universe. Essentially this lets us weigh the Universe, or at least the normal matter it contains.”
Dr. Evan Keane of the SKA Organization, and lead author on the paper, was similarly enthused about the team’s discovery. “[W]e have found the missing matter,” he said. “It’s the first time a fast radio burst has been used to conduct a cosmological measurement.”
As already noted, FRB signals are quite rare, and only 16 have been detected in the past. Most of these were found by sifting through data months or years after the signal arrived, by which time it was too late for any follow-up observations. To address this, Dr. Keane and his team developed a system to detect FRBs and immediately alert other telescopes, so that the source could be pinpointed.
Further help will come from the Square Kilometer Array (SKA), an international effort led by the SKA Organization to build the world’s largest radio telescope. Combining extreme sensitivity, resolution and a wide field of view, the SKA is expected to trace many FRBs to their host galaxies. In so doing, it is hoped the array will provide more measurements confirming the distribution of matter in the Universe, as well as more information on dark energy.
In the end, these and other discoveries by the SKA could have far-reaching consequences. Knowing the distribution of matter in the universe, and improving our understanding of dark matter (and perhaps even dark energy) could go a long way towards developing a Theory Of Everything (TOE). And knowing how all the fundamental forces of our universe interact will go a long way to finally knowing with certainty how it came to be.
These are exciting times indeed. With every step, we are peeling back the layers of our universe!
On July 14th, 2015, the New Horizons space probe made history when it became the first spacecraft to conduct a flyby of the dwarf planet of Pluto. Since that time, it has been making its way through the Kuiper Belt, on its way to joining Voyager 1 and 2 in interstellar space. With this milestone reached, many are wondering where we should send our spacecraft next.
Naturally, there are those who recommend we set our sights on our nearest star – particularly proponents of interstellar travel and exoplanet hunters. Beyond being the Sun’s closest stellar neighbor, the system may host one or more exoplanets. Confirming their existence would be one of the main reasons to go. But more than that, it would be a major accomplishment!