The Tools Humanity Will Need for Living in the Year 1 Trillion

A new study considers what life could be like for civilizations 1 trillion years from now, when every star in the Universe will expand beyond the cosmic horizon. Credit: ESO/S. Brunier

Since the 1990s, astrophysicists have known that for the past few billion years, the Universe has been experiencing an accelerated rate of expansion. This gave rise to the theory that the Universe is permeated by a mysterious invisible energy known as “dark energy”, which acts against gravity and is pushing the cosmos apart. In time, this energy will become the dominant force in the Universe, causing all stars and galaxies to spread beyond the cosmic horizon.

At this point, stars and galaxies in the Universe will no longer be visible or accessible to one another. The question remains: what will intelligent civilizations (such as our own) do for resources and energy at that point? This question was addressed in a recent paper by Dr. Abraham Loeb – the Frank B. Baird, Jr., Professor of Science at Harvard University and Chair of the Harvard Astronomy Department.

The paper, “Securing Fuel for our Frigid Cosmic Future”, recently appeared online. As he indicates in his study, when the Universe is ten times its current age (roughly 138 billion years old), all stars outside the Local Group of galaxies will no longer be accessible to us, since they will be receding from us faster than the speed of light. For this reason, he recommends that humanity follow the lesson of Aesop’s fable, “The Ants and the Grasshopper”.

This classic tale tells the story of ants who spent the summer collecting food for the winter while the grasshopper chose to enjoy himself. While different versions of the story exist that offer different takes on the importance of hard work, charity, and compassion, the lesson is simple: always be prepared. In this respect, Loeb recommends that advanced species migrate to rich clusters of galaxies.

These clusters represent the largest reservoirs of matter bound by gravity and would therefore be better able to resist the accelerated expansion of the Universe. As Dr. Loeb told Universe Today via email:

“In my essay I point out that mother Nature was kind to us as it spontaneously gave birth to the same massive reservoir of fuel that we would have aspired to collect by artificial means. Primordial density perturbations from the early universe led to the gravitational collapse of regions as large as tens of millions of light years, assembling all the matter in them into clusters of galaxies – each containing the equivalent of a thousand Milky Way galaxies.”

Dr. Loeb also indicated where humanity (or other advanced civilizations) should consider relocating before the accelerating expansion carries everything beyond the Local Group over the cosmic horizon. Within 50 million light years, he notes, lies the Virgo Cluster, which contains about a thousand times more matter than the Milky Way Galaxy. The second closest is the Coma Cluster, a collection of over 1,000 galaxies located about 336 million light years away.

Diagram showing the Virgo Supercluster. Credit: Wikipedia Commons/Andrew Z. Colvin

In addition to offering a solution to the accelerating expansion of the Universe, Dr. Loeb’s study also presents some interesting possibilities when it comes to the search for extra-terrestrial intelligence (SETI). If, in fact, there are already advanced civilizations migrating to prepare for the inevitable expansion of the Universe, they may be detectable by various means. As Dr. Loeb explained:

“If traveling civilizations transmit powerful signals then we might be able to see evidence for their migration towards clusters of galaxies. Moreover, we would expect a larger concentration of advanced civilizations in clusters than would be expected simply by counting the number of galaxies there. Those that settle there could establish more prosperous communities, in analogy to civilizations near rivers or lakes on Earth.”

This paper is similar to a study Dr. Loeb conducted back in 2011, which appeared in the Journal of Cosmology and Astroparticle Physics under the title “Cosmology with Hypervelocity Stars”. At the time, Dr. Loeb was addressing what would happen in the distant future, when all extragalactic light sources cease to be visible or accessible due to the accelerating expansion of the Universe.

This study was a follow-up to a 2001 paper – published in the journal Physical Review Letters under the title “The Long-Term Future of Extragalactic Astronomy” – in which Dr. Loeb addressed what would become of the Universe billions of years from now. Shortly thereafter, Dr. Loeb and Freeman Dyson himself began to correspond about what could be done to address this problem.

An artist’s conception of a hypervelocity star that has escaped the Milky Way. Credit: NASA

Their correspondence was the subject of an article by Nathan Sanders (a writer for Astrobites) who recounted what Dr. Loeb and Dr. Dyson had to say on the matter. As Dr. Loeb recalls:

“A decade ago I wrote a few papers on the long-term future of the Universe, trillions of years from now. Since the cosmic expansion is accelerating, I showed that once the universe will age by a factor of ten (about a hundred billion years from now), all matter outside our Local Group of galaxies (which includes the Milky Way and the Andromeda galaxy, along with their satellites) will be receding away from us faster than light. After one of my papers was posted in 2011, Freeman Dyson wrote to me and suggested a vast “cosmic engineering project” in which we will concentrate matter from a large-scale region around us to a small enough volume such that it will stay bound by its own gravity and not expand with the rest of the Universe.”

At the time, Dr. Loeb indicated that data gathered by the Sloan Digital Sky Survey (SDSS) indicated that attempts at “super-engineering” did not appear to be taking place. This was based on the fact that the galaxy clusters observed by the SDSS were not overdense, nor did they exhibit particularly high velocities (as would be expected if such a project were underway). To this, Dr. Dyson wrote: “That is disappointing. On the other hand, if our colleagues have been too lazy to do the job, we have plenty of time to start doing it ourselves.”

A similar idea was presented in a recent paper by Dr. Dan Hooper, an astrophysicist from the Fermi National Accelerator Laboratory (FNAL) and the University of Chicago. In his study, Dr. Hooper suggested that advanced species could survive the era – roughly 100 billion years from now – when all stars beyond the Local Group will have receded past the cosmic horizon, by harvesting stars from across tens of millions of light years beforehand.

Artist impression of the 14 galaxies detected by ALMA as they appear in the very early, very distant universe. These galaxies are in the process of merging and will eventually form the core of a massive galaxy cluster. Credit: NRAO/AUI/NSF; S. Dagnello

This harvesting would consist of building unconventional Dyson Spheres that would use the energy they collect from stars to propel those stars towards the center of the species’ civilization. However, only stars with masses between 0.2 and 1 Solar Masses would be usable, as high-mass stars would evolve off the main sequence before reaching their destination, while low-mass stars would not generate enough energy to accelerate them there in time.

But as Dr. Loeb indicates, there are additional limitations to this approach that make migrating more attractive than harvesting.

“First, we do not know of any technology that enables moving stars around, and moreover Sun-like stars only shine for about ten billion years (of order the current age of the Universe) and cannot serve as nuclear furnaces that would keep us warm into the very distant future. Therefore, an advanced civilization does not need to embark on a giant construction project as suggested by Dyson and Hooper, but only needs to propel itself towards the nearest galaxy cluster and take advantage of the cluster resources as fuel for its future prosperity.”

While this may seem like a truly far-off concern, it does raise some interesting questions about the long-term evolution of the Universe and how intelligent civilizations may be forced to adapt. In the meantime, if it offers some additional possibilities for searching for extra-terrestrial intelligences (ETIs), then so much the better.

And as Dr. Dyson said, if there are currently no ETIs preparing for the coming “cosmic winter” with cosmic engineering projects, perhaps it is something humanity can plan to tackle someday!

Further Reading: arXiv, Journal of Cosmology and Astroparticle Physics, astrobites, astrobites (2)

How an Advanced Civilization Could Stop Dark Energy From Preventing Their Future Exploration

This illustration shows the evolution of the Universe, from the Big Bang on the left, to modern times on the right. Image: NASA

During the 1930s, astronomers came to realize that the Universe is in a state of expansion. By the 1990s, they realized that the rate at which it is expanding is accelerating, giving rise to the theory of “Dark Energy”. Because of this, it is estimated that within the next 100 billion years, all stars beyond the Local Group – the gravitationally bound collection of some 54 galaxies that includes the Milky Way – will pass beyond the cosmic horizon.

At this point, these stars will be not only unobservable but also inaccessible – meaning that no advanced civilization will be able to harness their energy. Addressing this, Dr. Dan Hooper – an astrophysicist from the Fermi National Accelerator Laboratory (FNAL) and the University of Chicago – recently conducted a study that indicated how a sufficiently advanced civilization might be able to harvest these stars before they are carried out of reach.


Big Bang Theory: Evolution of Our Universe

The Big Bang Theory: A history of the Universe starting from a singularity and expanding ever since. Credit: grandunificationtheory.com

How was our Universe created? How did it come to be the seemingly infinite place we know of today? And what will become of it, ages from now? These are the questions that have puzzled philosophers and scholars since the beginning of time, and they have led to some pretty wild and interesting theories. Today, the consensus among scientists, astronomers and cosmologists is that the Universe as we know it was created in a massive explosion that produced not only the majority of matter, but also the physical laws that govern our ever-expanding cosmos. This is known as The Big Bang Theory.

For decades, the term has been bandied about by scholars and non-scholars alike. This should come as no surprise, seeing as it is the most widely accepted theory of our cosmic origins. But what exactly does it mean? How was our Universe conceived in a massive explosion, what proof is there of this, and what does the theory say about the long-term projections for our Universe?

The basics of the Big Bang theory are fairly simple. In short, the Big Bang hypothesis states that all of the current and past matter in the Universe came into existence at the same time, roughly 13.8 billion years ago. At this time, all matter was compacted into a very small ball with infinite density and intense heat called a Singularity. Suddenly, the Singularity began expanding, and the universe as we know it began.


The Search for Dark Energy Just Got Easier

The Victor M. Blanco telescope at Cerro Tololo Interamerican Observatory (CTIO) in the Chilean Andes. Credit: Berkeley Lab

Since the late 1990s, scientists and physicists have been grappling with how and why the Universe appears to be expanding at an accelerating rate. The most widely accepted explanation is that the cosmos is permeated by a mysterious force known as “dark energy”. In addition to driving cosmic acceleration, this energy is thought to make up roughly 68.3% of the Universe’s total mass-energy content.

Much like dark matter, this invisible force has never been detected directly; its existence is inferred from observable phenomena and from how well it fits our current models of cosmology. Scientists must instead rely on indirect observations, watching how fast cosmic objects (specifically Type Ia supernovae) recede from us as the Universe expands.

This process would be extremely tedious for scientists – like those who work for the Dark Energy Survey (DES) – were it not for the new algorithms developed collaboratively by researchers at Lawrence Berkeley National Laboratory and UC Berkeley.

“Our algorithm can classify a detection of a supernova candidate in about 0.01 seconds, whereas an experienced human scanner can take several seconds,” said Danny Goldstein, a UC Berkeley graduate student who developed the code to automate the process of supernova discovery on DES images.

Currently in its second season, the DES takes nightly pictures of the Southern Sky with DECam – a 570-megapixel camera that is mounted on the Victor M. Blanco telescope at Cerro Tololo Interamerican Observatory (CTIO) in the Chilean Andes. Every night, the camera generates between 100 Gigabytes (GB) and 1 Terabyte (TB) of imaging data, which is sent to the National Center for Supercomputing Applications (NCSA) and DOE’s Fermilab in Illinois for initial processing and archiving.

By studying Type Ia supernova, astronomers can measure dark energy and the expansion of the universe. Credit: NASA/CXC/M. Weiss

Object recognition programs developed at the National Energy Research Scientific Computing Center (NERSC) and implemented at NCSA then comb through the images in search of possible detections of Type Ia supernovae. These powerful explosions occur in binary star systems where one star is a white dwarf, which accretes material from a companion star until it reaches a critical mass and explodes in a Type Ia supernova.

“These explosions are remarkable because they can be used as cosmic distance indicators to within 3-10 percent accuracy,” says Goldstein.

Distance is important because the further away an object is located in space, the further back in time we are seeing it. By tracking Type Ia supernovae at different distances, researchers can measure cosmic expansion throughout the universe’s history. This allows them to put constraints on how fast the universe is expanding and maybe even provide other clues about the nature of dark energy.
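To see why a “standard candle” is so useful, consider the standard distance-modulus relation: because a Type Ia supernova’s peak intrinsic brightness is roughly known, its observed brightness immediately yields a distance. The sketch below is a simplified, illustrative calculation (the -19.3 peak absolute magnitude and the example apparent magnitude are typical textbook values, not DES measurements):

```python
def luminosity_distance_mpc(apparent_mag, absolute_mag=-19.3):
    """Convert an apparent magnitude into a luminosity distance (in Mpc)
    using the distance modulus: m - M = 5 * log10(d / 10 pc).

    The default absolute magnitude of -19.3 is a typical peak value
    quoted for standardized Type Ia supernovae (an assumption here,
    not a DES measurement).
    """
    distance_pc = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return distance_pc / 1e6  # parsecs -> megaparsecs

# Example: a supernova peaking at apparent magnitude 24 has a
# luminosity distance of roughly 4,600 Mpc (a few gigaparsecs).
print(f"{luminosity_distance_mpc(24.0):.0f} Mpc")
```

Comparing such distances with the supernovae’s redshifts is what lets cosmologists trace how the expansion rate has changed over time.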

“Scientifically, it’s a really exciting time because several groups around the world are trying to precisely measure Type Ia supernovae in order to constrain and understand the dark energy that is driving the accelerated expansion of the universe,” says Goldstein, who is also a student researcher in Berkeley Lab’s Computational Cosmology Center (C3).

UC Berkeley/Berkeley Lab graduate student Danny Goldstein developed a new code using the machine learning technique Random Forest to vet detections of supernova candidates automatically, in real time, optimizing it for the Dark Energy Survey. Credit: Danny Goldstein, UC Berkeley/Berkeley Lab

The DES begins its search for Type Ia explosions by uncovering changes in the night sky, which is where the image subtraction pipeline developed and implemented by researchers in the DES supernova working group comes in. The pipeline subtracts images that contain known cosmic objects from new images that are exposed nightly at CTIO.
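In essence, difference imaging flags anything that has changed between a new exposure and a reference image of the same patch of sky. The toy sketch below illustrates the basic idea; it assumes images that are already aligned and flux-calibrated (the genuinely hard part in practice) and is not representative of the actual DES pipeline:

```python
import numpy as np

def difference_image_candidates(new_image, template_image, threshold=5.0):
    """Subtract a reference (template) image from a new exposure and
    flag pixels that brightened significantly -- candidate transients.

    Assumes the two images are already astrometrically aligned and
    photometrically matched, which is where most of the effort in a
    production pipeline goes.
    """
    diff = new_image - template_image
    noise = np.std(diff)                       # crude noise estimate
    candidates = np.argwhere(diff > threshold * noise)
    return diff, candidates

# Toy example: a flat patch of sky in which one new source appears.
rng = np.random.default_rng(42)
template = rng.normal(100.0, 1.0, size=(64, 64))
new = template + rng.normal(0.0, 1.0, size=(64, 64))
new[30, 30] += 50.0                            # the "new" transient
_, cands = difference_image_candidates(new, template)
print(cands)                                   # should report pixel (30, 30)
```

The real pipeline performs carefully PSF-matched subtractions over full survey images, which is why it produces the huge, artifact-heavy candidate lists described below.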

Each night, the pipeline produces between 10,000 and a few hundred thousand detections of supernova candidates that need to be validated.

“Historically, trained astronomers would sit at the computer for hours, look at these dots, and offer opinions about whether they had the characteristics of a supernova, or whether they were caused by spurious effects that masquerade as supernovae in the data. This process seems straightforward until you realize that the number of candidates that need to be classified each night is prohibitively large and only one in a few hundred is a real supernova of any type,” says Goldstein. “This process is extremely tedious and time-intensive. It also puts a lot of pressure on the supernova working group to process and scan data fast, which is hard work.”

To simplify the task of vetting candidates, Goldstein developed a code that uses the machine learning technique “Random Forest” to vet detections of supernova candidates automatically and in real time, optimized for the DES. The technique employs an ensemble of decision trees to automatically ask the types of questions that astronomers would typically consider when classifying supernova candidates.

Evolution of a Type Ia supernova. Credit: NASA/ESA/A. Feild

At the end of the process, each detection of a candidate is given a score based on the fraction of decision trees that considered it to have the characteristics of a detection of a supernova. The closer the classification score is to one, the stronger the candidate. Goldstein notes that in preliminary tests, the classification pipeline achieved 96 percent overall accuracy.
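A minimal sketch of this kind of classifier, using scikit-learn’s RandomForestClassifier on made-up data, is shown below. The features and training labels are purely illustrative stand-ins for the quantities the real pipeline extracts from difference images, but the score returned by predict_proba plays the same role as the fraction-of-trees vote described above:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features for each detection (stand-ins for whatever the
# real pipeline measures), e.g. signal-to-noise, shape elongation,
# distance to the nearest bad pixel, and PSF-likeness.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 4))
# Labels from human scanners: 1 = real supernova detection, 0 = artifact.
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3]
           + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# An ensemble of decision trees, each asking a sequence of yes/no
# questions about the features.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Score new detections: effectively the averaged vote of the trees.
# Scores near 1 indicate strong supernova candidates.
X_new = rng.normal(size=(5, 4))
print(clf.predict_proba(X_new)[:, 1])
```

The production classifier naturally relies on far more carefully engineered features, but the workflow is the same: fit once on human-labeled detections, then score each night’s candidates in a hundredth of a second apiece.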

“When you do subtraction alone you get far too many ‘false-positives’ — instrumental or software artifacts that show up as potential supernova candidates — for humans to sift through,” says Rollin Thomas, of Berkeley Lab’s C3, who was Goldstein’s collaborator.

He notes that with the classifier, researchers can quickly and accurately strain out the artifacts from supernova candidates. “This means that instead of having 20 scientists from the supernova working group continually sift through thousands of candidates every night, you can just appoint one person to look at maybe a few hundred strong candidates,” says Thomas. “This significantly speeds up our workflow and allows us to identify supernovae in real time, which is crucial for conducting follow-up observations.”

“Using about 60 cores on a supercomputer, we can classify 200,000 detections in about 20 minutes, including time for database interaction and feature extraction,” says Goldstein.

Goldstein and Thomas note that the next step in this work is to add a second level of machine learning to the pipeline to improve the classification accuracy. This extra layer would take into account how the object was classified in previous observations when determining the probability that the candidate is “real”. The researchers and their colleagues are currently working on different approaches to achieve this capability.
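One simple way to picture such a second level – purely an illustrative sketch, not the approach the researchers describe – is to combine the per-detection scores an object accumulates over several nights into a single object-level probability:

```python
import numpy as np

def object_level_score(detection_scores):
    """Combine one candidate object's per-detection scores from several
    nights into a single probability-like score that the object is real.

    Real supernovae tend to be detected repeatedly with high scores,
    while artifacts usually score well only occasionally, so a weighted
    average (favoring recent epochs) suppresses them. This is a simple
    heuristic for illustration, not the DES team's method.
    """
    scores = np.asarray(detection_scores, dtype=float)
    weights = np.linspace(0.5, 1.0, len(scores))   # favor recent epochs
    return float(np.average(scores, weights=weights))

print(object_level_score([0.91, 0.88, 0.95]))  # consistently high: likely real
print(object_level_score([0.70, 0.10, 0.05]))  # mostly low: likely an artifact
```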

Further Reading: Berkeley Lab