These 25 Billion Galaxies are Definitely Living in a Simulation

Understanding the Universe and how it has evolved over the course of billions of years is a rather daunting task. On the one hand, it involves painstakingly looking billions of light-years into deep space (and thus, billions of years back in time) to see how its large-scale structure has changed over time. Then, massive amounts of computing power are needed to simulate what it should look like (based on known physics) and to see whether the two match up.

That is what a team of astrophysicists from the University of Zurich (UZH) did using the “Piz Daint” supercomputer. With this sophisticated machine, they simulated the formation of our entire Universe and produced a catalog of about 25 billion virtual galaxies. This catalog will be used by the ESA’s Euclid mission, launching in 2020, which will spend six years probing the Universe to investigate dark matter.

The team’s work was detailed in a study that appeared recently in the journal Computational Astrophysics and Cosmology. Led by Douglas Potter, the team spent the past three years developing an optimized code to describe (with unprecedented accuracy) the dynamics of dark matter as well as the formation of large-scale structures in the Universe.

The code, known as PKDGRAV3, was specifically designed to make optimal use of the available memory and processing power of modern supercomputing architectures. After being executed on the “Piz Daint” supercomputer – located at the Swiss National Computing Center (CSCS) – for a period of only 80 hours, it managed to generate a virtual Universe of two trillion macro-particles, from which a catalog of 25 billion virtual galaxies was extracted.
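PKDGRAV3’s key trick is the fast multipole method, which approximates the gravity of distant groups of particles instead of summing every pair. As a hedged illustration of the baseline it improves upon, here is the naive O(N²) direct-summation force calculation in Python – a toy sketch in arbitrary units with an assumed softening length, not the production algorithm:

```python
import numpy as np

def gravity_accelerations(pos, mass, G=1.0, eps=1e-2):
    """Direct-summation gravitational accelerations for N particles.

    This is the naive O(N^2) pairwise sum. Tree and fast-multipole
    methods (as in PKDGRAV3) approximate distant groups of particles,
    which is what makes trillion-particle runs feasible.
    """
    diff = pos[None, :, :] - pos[:, None, :]      # r_j - r_i for every pair
    dist2 = (diff ** 2).sum(-1) + eps ** 2        # softened squared distance
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                 # exclude self-force
    # a_i = G * sum_j m_j * (r_j - r_i) / |r_j - r_i|^3
    return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

# Toy example: 100 equal-mass particles in a unit box
rng = np.random.default_rng(0)
pos = rng.random((100, 3))
acc = gravity_accelerations(pos, np.ones(100))
print(acc.shape)  # (100, 3)
```

Because the pairwise forces are equal and opposite, the accelerations of equal-mass particles sum to nearly zero, which makes momentum conservation a handy correctness check.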

Intrinsic to their calculations was the way in which the dark matter fluid would have evolved under its own gravity, leading to the formation of small concentrations known as “dark matter halos”. It is within these halos – a theoretical component that is thought to extend well beyond the visible extent of a galaxy – that galaxies like the Milky Way are believed to have formed.

Naturally, this presented quite the challenge. It required not only a precise calculation of how the structure of dark matter evolves, but also that they consider how this would influence every other part of the Universe. As Joachim Stadel, a professor with the Center for Theoretical Astrophysics and Cosmology at UZH and a co-author on the paper, told Universe Today via email:

“We simulated 2 trillion such dark matter “pieces”, the largest calculation of this type that has ever been performed. To do this we had to use a computation technique known as the “fast multipole method” and use one of the fastest computers in the world, “Piz Daint” at the Swiss National Supercomputing Centre, which among other things has very fast graphics processing units (GPUs) which allow an enormous speed-up of the floating point calculations needed in the simulation. The dark matter clusters into dark matter “halos” which in turn harbor the galaxies. Our calculation accurately produces the distribution and properties of the dark matter, including the halos, but the galaxies, with all of their properties, must be placed within these halos using a model. This part of the task was performed by our colleagues at Barcelona under the direction of Pablo Fossalba and Francisco Castander. These galaxies then have the expected colors, spatial distribution and the emission lines (important for the spectra observed by Euclid) and can be used to test and calibrate various systematics and random errors within the entire instrument pipeline of Euclid.”

Artist impression of the Euclid probe, which is set to launch in 2020. Credit: ESA

Thanks to the high precision of their calculations, the team was able to turn out a catalog that met the requirements of the European Space Agency’s Euclid mission, whose main objective is to explore the “dark universe”. This kind of research is essential to understanding the Universe on the largest of scales, mainly because the vast majority of the Universe is dark.

Between the 23% of the Universe which is made up of dark matter and the 72% that consists of dark energy, only one-twentieth of the Universe is actually made up of matter that we can see with normal instruments (aka. “luminous” or baryonic matter). Despite being proposed in the 1930s and the late 1990s respectively, dark matter and dark energy remain two of the greatest cosmological mysteries.

Though their existence is required in order for our current cosmological models to work, it has only ever been inferred through indirect observation. This is precisely what the Euclid mission will do over the course of its six-year mission, capturing light from billions of galaxies and measuring it for subtle distortions caused by the presence of mass in the foreground.

Much as background light is distorted by a gravitational field lying between the source and the observer (gravitational lensing, a time-honored test of General Relativity), dark matter in the foreground will exert a measurable gravitational influence on that light. As Stadel explained, their simulated Universe will play an important role in the Euclid mission – providing a framework that will be used during and after the mission.

Diagram showing the Lambda-CDM universe, from the Big Bang to the current era. Credit: Alex Mittelmann/Coldcreation

“In order to forecast how well the current components will be able to make a given measurement, a Universe populated with galaxies as close as possible to the real observed Universe must be created,” he said. “This ‘mock’ catalogue of galaxies is what was generated from the simulation and will be now used in this way. However, in the future when Euclid begins taking data, we will also need to use simulations like this to solve the inverse problem. We will then need to be able to take the observed Universe and determine the fundamental parameters of cosmology; a connection which currently can only be made at a sufficient precision by large simulations like the one we have just performed. This is a second important aspect of how such simulation work [and] is central to the Euclid mission.”

From the Euclid data, researchers hope not only to obtain new information on the nature of dark matter, but also to discover new physics that goes beyond the Standard Model of particle physics – i.e. a modified version of general relativity or a new type of particle. As Stadel explained, the best outcome for the mission would be one in which the results do not conform to expectations.

“While it will certainly make the most accurate measurements of fundamental cosmological parameters (such as the amount of dark matter and energy in the Universe) far more exciting would be to measure something that conflicts or, at the very least, is in tension with the current ‘standard lambda cold dark matter‘ (LCDM) model,” he said. “One of the biggest questions is whether the so called ‘dark energy’ of this model is actually a form of energy, or whether it is more correctly described by a modification to Einstein’s general theory of relativity. While we may just begin to scratch the surface of such questions, they are very important and have the potential to change physics at a very fundamental level.”

In the future, Stadel and his colleagues hope to be running simulations on cosmic evolution that take into account both dark matter and dark energy. Someday, these exotic aspects of nature could form the pillars of a new cosmology, one which reaches beyond the physics of the Standard Model. In the meantime, astrophysicists from around the world will likely be waiting for the first batch of results from the Euclid mission with bated breath.

Euclid is one of several missions that are currently engaged in the hunt for dark matter and the study of how it shaped our Universe. Others include the Alpha Magnetic Spectrometer (AMS-02) experiment aboard the ISS, the ESO’s Kilo Degree Survey (KiDS), and CERN’s Large Hadron Collider. With luck, these experiments will reveal pieces of the cosmological puzzle that have remained elusive for decades.

Further Reading: UZH, Computational Astrophysics and Cosmology

How Do We Know the Universe is Flat? Discovering the Topology of the Universe

Does This Look Flat?


Whenever we talk about the expanding Universe, everyone wants to know how this is going to end. Sure, they say, the fact that most of the galaxies we can see are speeding away from us in all directions is really interesting. Sure, they say, the Big Bang makes sense, in that everything was closer together billions of years ago.

But how does it end? Does this go on forever? Do galaxies eventually slow down, come to a stop, and then hurtle back together in a Big Crunch? Will we get a non-stop cycle of Big Bangs, forever and ever?

The Big Bang Theory: A history of the Universe starting from a singularity and expanding ever since. Credit: grandunificationtheory.com

We’ve done a bunch of articles on many different aspects of this question, and the current conclusion astronomers have reached is that because the Universe is flat, it’s never going to collapse in on itself and start another Big Bang.

But wait, what does it mean to say that the Universe is “flat”? Why is that important, and how do we even know?

Before we can get started talking about the flatness of the Universe, we need to talk about flatness in general. What does it mean to say that something is flat?

If you’re in a square room and walk around the corners, you’ll return to your starting point having made four 90-degree turns. You can say that your room is flat. This is Euclidean geometry.

Earth, seen from space, above the Pacific Ocean. Credit: NASA

But make the same journey on the surface of the Earth: start at the equator, walk up to the North Pole, make a 90-degree turn, return to the equator, make another 90-degree turn, and walk along the equator back to your starting point, where one final 90-degree turn faces you the way you began.

In one situation, you made 4 turns to return to your starting point; in the other it only took 3. That’s because the geometry of the surface you were walking on determines what happens when you take a 90-degree turn.

You can imagine an even more extreme example, where you’re walking around inside a crater, and it takes more than 4 turns to return to your starting point.
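The turn-counting trick is really the triangle angle-sum rule in disguise: on a sphere, a triangle’s interior angles add up to more than 180 degrees by an amount proportional to its area divided by the radius squared (the “spherical excess”). A quick sketch of the equator-to-pole walk, assuming a perfectly spherical Earth:

```python
import math

# On a sphere of radius R, a triangle's interior angles sum to
#   180 degrees + (area / R^2) expressed in degrees  (spherical excess)
def triangle_angle_sum_deg(area, R):
    return 180.0 + math.degrees(area / R**2)

R = 6371.0                              # Earth's mean radius in km (rough)
octant_area = 4 * math.pi * R**2 / 8    # equator-to-pole triangle covers 1/8 of the sphere

print(triangle_angle_sum_deg(octant_area, R))   # 270.0 -- three 90-degree corners
```

The flat-room case corresponds to zero excess: area divided by an effectively infinite radius of curvature gives an angle sum of exactly 180 degrees per triangle.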

Another analogy, of course, is the idea of parallel lines. Set off two parallel lines from the equator – two lines of longitude, say – and, following the curvature of the Earth, they come together and cross at the poles.

Got that? Great.

Omega Centauri. Credits: NASA, ESA and the Hubble SM4 ERO Team

Now, what about the Universe itself? You can imagine that same analogy. Imagine flying out into space on a rocket for billions of light-years, performing 90-degree maneuvers and returning to your starting point.

You can’t do it in 3, or 5; you need 4, which means that the geometry of the Universe is flat. Which is totally intuitive, right? I mean, that would be your assumption.

But astronomers were skeptical and needed to know for certain, and so, they set out to test this assumption.

In order to prove the flatness of the Universe, you would need to travel a long way, so astronomers use the largest observation they can make: the Cosmic Microwave Background Radiation, the afterglow of the Big Bang, visible in all directions as a red-shifted, fading snapshot of the moment when the Universe became transparent, about 380,000 years after the Big Bang.

Cosmic Microwave Background Radiation. Image credit: NASA

When this radiation was released, the entire Universe was at approximately 2,700 °C. This was the moment when it was cool enough that photons were finally free to roam across the Universe. The expansion of the Universe stretched these photons out over their 13.8-billion-year journey, shifting them down into the microwave spectrum, just 2.7 degrees above absolute zero.
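That temperature shift follows a simple scaling, T ∝ 1/(1+z): the stretch factor is just the ratio of the emitted temperature to the observed one. A quick back-of-the-envelope check of the numbers above:

```python
# CMB temperature scales as 1/(1+z): photons released at roughly 3,000 K
# (about 2,700 C) arrive today at 2.725 K, stretched ~1,100-fold.
T_emitted = 2700 + 273.15   # recombination temperature in kelvin
T_today = 2.725             # CMB temperature today, kelvin

z = T_emitted / T_today - 1
print(round(z))             # about 1090
```

That redshift of roughly 1,100 is why light emitted in the visible and infrared now arrives as microwaves.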

With the most sensitive space-based telescopes they have available, astronomers are able to detect tiny variations in the temperature of this background radiation.

And here’s the part that blows my mind every time I think about it. These tiny temperature variations correspond to the largest-scale structures of the observable Universe. A region that was a fraction of a degree warmer became a vast galaxy cluster, hundreds of millions of light-years across.

Having a non-flat universe would cause distortions between what we saw in the CMBR compared to the current universe. Credit: NASA / WMAP Science Team

The Cosmic Microwave Background Radiation just gives and gives, and when it comes to figuring out the geometry of the Universe, it has the answer we need. If the Universe were curved in any way, these temperature variations would appear distorted compared to the actual sizes of the structures we see today.

But they’re not. To the best of its ability, ESA’s Planck space telescope can’t detect any distortion at all. The Universe is flat.

Illustration of the ESA Planck Telescope in Earth orbit (Credit: ESA)

Well, that’s not exactly true. According to the best measurements astronomers have ever been able to make, the curvature of the Universe falls within a range of error bars that indicates it’s flat. Future observations by some super Planck telescope could show a slight curvature, but for now, the best measurements out there say… flat.

We say that the Universe is flat, and this means that parallel lines will always remain parallel. 90-degree turns behave as true 90-degree turns, and everything makes sense.

But what are the implications for the entire Universe? What does this tell us?

Unfortunately, the biggest thing is what it doesn’t tell us. We still don’t know if the Universe is finite or infinite. If we could measure its curvature, we could know that we’re in a finite Universe, and get a sense of what its actual true size is, out beyond the observable Universe we can measure.

The observable – or inferable – universe. This may be just a small component of the whole ball game.

We know that the volume of the Universe is at least 100 times more than we can observe. At least. If the flatness error bars get brought down, the minimum size of the Universe goes up.

And remember, an infinite Universe is still on the table.

Flatness also causes a problem for the original Big Bang theory, one that required the development of a theory like inflation.

Since the Universe is flat now, it must have been flat in the past, when the Universe was incredibly dense. And for it to maintain this level of flatness over 13.8 billion years of expansion is kind of amazing.

In fact, astronomers estimate that the early Universe must have been flat to within 1 part in 10^57.

Which seems like an insane coincidence. The development of inflation, however, solves this, by expanding the Universe an incomprehensible amount moments after the Big Bang. Pre and post inflation Universes can have vastly different levels of curvature.

In the olden days, cosmologists used to say that the flatness of the Universe had implications for its future. If the Universe were curved such that you could complete a full circuit in fewer than 4 turns, it was closed and destined to collapse in on itself.

And if it took more than 4 turns, it was open and destined to expand forever.

New results from NASA’s Galaxy Evolution Explorer and the Anglo-Australian Telescope atop Siding Spring Mountain in Australia confirm that dark energy (represented by purple grid) is a smooth, uniform force that now dominates over the effects of gravity (green grid). Image credit: NASA/JPL-Caltech

Well, that doesn’t really matter any more. In 1998, astronomers discovered dark energy, the mysterious force accelerating the expansion of the Universe. Whether the Universe is open, closed or flat, it’s going to keep on expanding. In fact, that expansion is going to accelerate, forever.

I hope this gives you a little more understanding of what cosmologists mean when they say that the Universe is flat. And how do we know it’s flat? Very precise measurements in the Cosmic Microwave Background Radiation.

Is there anything that all-pervasive relic of the early Universe can’t do?

New Explanation for Dark Energy? Tiny Fluctuations of Time and Space

Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this realization has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.

This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter which explained the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating the Universe is expanding due to fluctuations in space and time.

The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervisions of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.

Diagram showing the Lambda-CDM universe, from the Big Bang to the current era. Credit: Alex Mittelmann/Coldcreation

The team began by addressing the inconsistencies arising out of the two main theories that together explain all natural phenomena in the Universe. These theories are none other than General Relativity and quantum mechanics, which effectively explain how the Universe behaves on the largest of scales (i.e. stars, galaxies, clusters) and the smallest (subatomic particles).

Unfortunately, these two theories are not consistent when it comes to a little matter known as gravity, which scientists are still unable to explain in terms of quantum mechanics. The existence of Dark Energy and the expansion of the Universe are another point of disagreement. For starters, candidate theories like vacuum energy – which is one of the most popular explanations for Dark Energy – present serious incongruities.

According to quantum mechanics, vacuum energy would have an incredibly large energy density to it. But if this is true, then General Relativity predicts that this energy would have an incredibly strong gravitational effect, one which would be powerful enough to cause the Universe to explode in size. As Prof. Unruh shared with Universe Today via email:

“The problem is that any naive calculation of the vacuum energy gives huge values. If one assumes that there is some sort of cutoff so one cannot get energy densities much greater than the Planck energy density (or about 10^95 Joules/meter³) then one finds that one gets a Hubble constant – the time scale on which the Universe roughly doubles in size – of the order of 10^-44 sec. So, the usual approach is to say that somehow something reduces that down so that one gets the actual expansion rate of about 10 billion years instead. But that ‘somehow’ is pretty mysterious and no one has come up with an even half convincing mechanism.”
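As a rough sanity check on those figures – a sketch assuming the textbook Planck energy density, c⁷/ħG², rather than any particular cutoff – plugging that density into the Friedmann equation does yield a doubling timescale of order 10⁻⁴⁴ seconds:

```python
import math

G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s

# Planck energy density, c^7 / (hbar * G^2) -> roughly 4.6e113 J/m^3
rho_planck = c**7 / (hbar * G**2)

# Friedmann equation for an energy density rho: H^2 = 8*pi*G*rho / (3*c^2)
H = math.sqrt(8 * math.pi * G * rho_planck / (3 * c**2))
print(f"1/H = {1/H:.1e} s")   # of order 1e-44 seconds
```

Compare that to the observed timescale of roughly 10 billion years (about 3 × 10¹⁷ seconds), and the mismatch Unruh describes spans some 60 orders of magnitude.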

Timeline of the Big Bang and the expansion of the Universe. Credit: NASA

Whereas other scientists have sought to modify the theories of General Relativity and quantum mechanics in order to resolve these inconsistencies, Wang and his colleagues sought a different approach. As Wang explained to Universe Today via email:

“Previous studies are either trying to modify quantum mechanics in some way to make vacuum energy small or trying to modify General Relativity in some way to make gravity numb for vacuum energy. However, quantum mechanics and General Relativity are the two most successful theories that explain how our Universe works… Instead of trying to modify quantum mechanics or General Relativity, we believe that we should first understand them better. We take the large vacuum energy density predicted by quantum mechanics seriously and just let it gravitate according to General Relativity without modifying either of them.”

For the sake of their study, Wang and his colleagues performed new sets of calculations on vacuum energy that took its predicted high energy density into account. They then considered the possibility that on the tiniest of scales – billions of times smaller than electrons – the fabric of spacetime is subject to wild fluctuations, oscillating at every point between expansion and contraction.

Could fluctuations at the tiniest levels of space time explain Dark Energy and the expansion of the cosmos? Credit: University of Washington

As it swings back and forth, the result of these oscillations is a net effect where the Universe expands slowly, but at an accelerating rate. After performing their calculations, they noted that such an explanation was consistent with both the existence of quantum vacuum energy density and General Relativity. On top of that, it is also consistent with what scientists have been observing in our Universe for almost a century. As Unruh described it:

“Our calculations showed that one could consistently regard [that] the Universe on the tiniest scales is actually expanding and contracting at an absurdly fast rate; but that on a large scale, because of an averaging over those tiny scales, physics would not notice that ‘quantum foam’. It has a tiny residual effect in giving an effective cosmological constant (dark energy type effect). In some ways it is like waves on the ocean which travel as if the ocean were perfectly smooth but really we know that there is this incredible dance of the atoms that make up the water, and waves average over those fluctuations, and act as if the surface was smooth.”
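A toy numerical illustration of that averaging (emphatically not the paper’s calculation): a quantity that oscillates wildly at a short period, with a minuscule net drift per cycle, looks perfectly smooth and slowly growing once coarse-grained over many cycles – waves on Unruh’s ocean, in code.

```python
import numpy as np

# Toy model: a "scale factor" with huge, rapid wiggles plus a tiny drift.
t = np.linspace(0, 100, 200_001)
fast = np.sin(2 * np.pi * 50 * t)     # rapid oscillation, period 0.02
a = 1.0 + 0.5 * fast + 1e-3 * t       # violent wiggles + minuscule growth

# Coarse-grain: moving average over 100 full oscillation periods.
window = 4000
coarse = np.convolve(a, np.ones(window) / window, mode="valid")

print(coarse[0], coarse[-1])   # ~1.001 ... ~1.099: only the slow drift survives
```

The raw signal swings by ±0.5 at every instant, yet the averaged curve climbs by barely a tenth over the whole run: the oscillations cancel and only the residual growth remains, which is the spirit of the effective-cosmological-constant argument.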

In contrast to theories in which the various forces governing the Universe must precisely cancel each other out, Wang and his colleagues present a picture in which the Universe is constantly in motion. In this scenario, the effects of vacuum energy are largely self-cancelling, with the tiny residual giving rise to the expansion and acceleration we have been observing all this time.

While it may be too soon to tell, this image of a Universe that is highly-dynamic (even on the tiniest scales) could revolutionize our understanding of spacetime. At the very least, these theoretical findings are sure to stimulate debate within the scientific community, as well as experiments designed to offer direct evidence. And that, as we know, is the only way we can advance our understanding of this thing known as the Universe.

Further Reading: UBC News, Physical Review D

Rise Of The Super Telescopes: The Wide Field Infrared Survey Telescope

NASA's Wide Field Infrared Survey Telescope (WFIRST) will capture Hubble-quality images covering swaths of sky 100 times larger than Hubble does, enabling cosmic evolution studies. Its Coronagraph Instrument will directly image exoplanets and study their atmospheres. Credits: NASA/GSFC/Conceptual Image Lab

We humans have an insatiable hunger to understand the Universe. As Carl Sagan said, “Understanding is Ecstasy.” But to understand the Universe, we need better and better ways to observe it. And that means one thing: big, huge, enormous telescopes.

In this series we’ll look at the world’s upcoming Super Telescopes:

The Wide Field Infrared Survey Telescope (WFIRST)

It’s easy to forget the impact that the Hubble Space Telescope has had on our state of knowledge about the Universe. In fact, that might be the best measurement of its success: We take the Hubble, and all we’ve learned from it, for granted now. But other space telescopes are being developed, including the WFIRST, which will be much more powerful than the Hubble. How far will these telescopes extend our understanding of the Universe?

“WFIRST has the potential to open our eyes to the wonders of the universe, much the same way Hubble has.” – John Grunsfeld, NASA Science Mission Directorate

The WFIRST might be the true successor to the Hubble, even though the James Webb Space Telescope (JWST) is often touted as such. But it may be incorrect to even call WFIRST a telescope; it’s more accurate to call it an astrophysics observatory. That’s because one of its primary science objectives is to study Dark Energy, that rather mysterious force that drives the expansion of the Universe, and Dark Matter, the difficult-to-detect matter that slows that expansion.

WFIRST will have a 2.4 meter mirror, the same size as the Hubble. But, it will have a camera that will expand the power of that mirror. The Wide Field Instrument is a 288-megapixel multi-band near-infrared camera. Once it’s in operation, it will capture images that are every bit as sharp as those from Hubble. But there is one huge difference: The Wide Field Instrument will capture images that cover over 100 times the sky that Hubble does.

Alongside the Wide Field Instrument, WFIRST will have the Coronagraphic Instrument. The Coronagraphic Instrument will advance the study of exoplanets. It’ll use a system of filters and masks to block out the light from other stars, and home in on planets orbiting those stars. This will allow very detailed study of the atmospheres of exoplanets, one of the main ways of determining habitability.

WFIRST is slated to launch in 2025, although it’s too soon to have an exact date. When it launches, the plan is for WFIRST to travel to the Sun-Earth Lagrange Point 2 (L2). L2 is a gravitationally balanced point in space where WFIRST can do its work without interruption. The mission is set to last about 6 years.
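The distance to L2 can be estimated with the same algebra that gives a planet’s Hill sphere; for the Sun-Earth system it works out to roughly 1.5 million kilometers beyond Earth, on the side away from the Sun:

```python
# Hill-sphere-style estimate of the Sun-Earth L2 distance:
#   r = R * (M_earth / (3 * M_sun))**(1/3)
M_sun   = 1.989e30   # kg
M_earth = 5.972e24   # kg
R       = 1.496e11   # mean Earth-Sun distance, m

r_L2 = R * (M_earth / (3 * M_sun)) ** (1 / 3)
print(f"{r_L2 / 1e9:.2f} million km beyond Earth")   # about 1.5 million km
```

At that distance a spacecraft orbits the Sun in lockstep with Earth, with the Sun, Earth and Moon all on one side of the sky, which is what makes the spot so attractive for infrared observatories.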

Probing Dark Energy

“WFIRST has the potential to open our eyes to the wonders of the universe, much the same way Hubble has,” said John Grunsfeld, astronaut and associate administrator of NASA’s Science Mission Directorate at Headquarters in Washington. “This mission uniquely combines the ability to discover and characterize planets beyond our own solar system with the sensitivity and optics to look wide and deep into the universe in a quest to unravel the mysteries of dark energy and dark matter.”

In a nutshell, there are two proposals for what Dark Energy can be. The first is the cosmological constant, where Dark Energy is uniform throughout the cosmos. The second is what’s known as scalar fields, where the density of Dark Energy can vary in time and space.

We used to think that the Universe expanded at a steady rate. Then in the 1990s we discovered that the expansion had started accelerating about 5 billion years ago. Dark Energy is the name given to the force driving that expansion. Image: NASA/STSci/Ann Feild

Since the 1990s, observations have shown us that the expansion of the Universe is accelerating. That acceleration started about 5 billion years ago. We think that Dark Energy is responsible for that accelerated expansion. By providing such large, detailed images of the cosmos, WFIRST will let astronomers map expansion over time and over large areas. WFIRST will also precisely measure the shapes, positions and distances of millions of galaxies to track the distribution and growth of cosmic structures, including galaxy clusters and the Dark Matter accompanying them. The hope is that this will give us a next level of understanding when it comes to Dark Energy.

If that all sounds too complicated, look at it this way: We know the Universe is expanding, and we know that the expansion is accelerating. We want to know why it’s expanding, and how. We’ve given the name ‘Dark Energy’ to the force that’s driving that expansion, and now we want to know more about it.

Probing Exoplanets

Dark Energy and the expansion of the Universe is a huge mystery, and a question that drives cosmologists. (They really want to know how the Universe will end!) But for many of the rest of us, another question is even more compelling: Are we alone in the Universe?

There’ll be no quick answer to that one, but any answer we find begins with studying exoplanets, and that’s something that WFIRST will also excel at.

Artist’s concept of the TRAPPIST-1 star system, an ultra-cool dwarf that has seven Earth-size planets orbiting it. We’re going to keep finding more and more solar systems like this, but we need observatories like WFIRST, with starshades, to understand the planets better. Credits: NASA/JPL-Caltech

“WFIRST is designed to address science areas identified as top priorities by the astronomical community,” said Paul Hertz, director of NASA’s Astrophysics Division in Washington. “The Wide-Field Instrument will give the telescope the ability to capture a single image with the depth and quality of Hubble, but covering 100 times the area. The coronagraph will provide revolutionary science, capturing the faint, but direct images of distant gaseous worlds and super-Earths.”

“The coronagraph will provide revolutionary science, capturing the faint, but direct images of distant gaseous worlds and super-Earths.” – Paul Hertz, NASA Astrophysics Division

The difficulty in studying exoplanets is that they are all orbiting stars. Stars are so bright they make it impossible to see their planets in any detail. It’s like staring into a lighthouse miles away and trying to study an insect near the lighthouse.

The Coronagraphic Instrument on board WFIRST will excel at blocking out the light of distant stars. It does that with a system of mirrors and masks. This is what makes studying exoplanets possible. Only when the light from the star is dealt with, can the properties of exoplanets be examined.

This will allow detailed measurements of the chemical composition of an exoplanet’s atmosphere. By doing this over thousands of planets, we can begin to understand the formation of planets around different types of stars. There are some limitations to the Coronagraphic Instrument, though.

The Coronagraphic Instrument was kind of a late addition to WFIRST. Some of the other instrumentation on WFIRST isn’t optimized to work with it, so there are some restrictions to its operation. It will only be able to study gas giants, and so-called Super-Earths. These larger planets don’t require as much finesse to study, simply because of their size. Earth-like worlds will likely be beyond the power of the Coronagraphic Instrument.

These limitations are no big deal in the long run. The Coronagraph is actually more of a technology demonstration, and it doesn’t represent the end-game for exoplanet study. Whatever is learned from this instrument will help us in the future. There will be an eventual successor to WFIRST some day, perhaps decades from now, and by that time Coronagraph technology will have advanced a great deal. At that future time, direct snapshots of Earth-like exoplanets may well be possible.

But maybe we won’t have to wait that long.

Starshade To The Rescue?

There is a plan to boost the effectiveness of the Coronagraph on WFIRST that would allow it to image Earth-like planets. It’s called the EXO-S Starshade.

The EXO-S Starshade is a 34-meter diameter deployable shading system that will block starlight from impairing the function of WFIRST. It would actually be a separate craft, launched on its own and sent to rendezvous with WFIRST at L2. It would not be tethered, but would orient itself with WFIRST through a system of cameras and guide lights. In fact, part of the power of the Starshade is that it would fly about 40,000 to 50,000 km away from WFIRST.
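The geometry behind that standoff distance can be sketched with a small-angle estimate. This is a back-of-the-envelope illustration only, not mission math; the 40,000 km figure is simply taken from the range quoted above.

```python
import math

# Back-of-the-envelope sketch: the angle a 34 m starshade subtends
# when flown ~40,000 km from the telescope, using the small-angle
# approximation theta = diameter / distance.

def angular_size_arcsec(diameter_m, distance_m):
    """Angle subtended (arcseconds), small-angle approximation."""
    return math.degrees(diameter_m / distance_m) * 3600

# A 34 m shade at 40,000 km subtends a small fraction of an arcsecond,
# comparable to the tiny star-planet separations the instrument targets.
print(round(angular_size_arcsec(34, 40_000_000), 3))
```

At 50,000 km the same shade subtends proportionally less, which hints at the trade-off involved in choosing the standoff distance.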

Dark Energy and Exoplanets are priorities for WFIRST, but there are always other discoveries awaiting better telescopes. It’s not possible to predict everything that we’ll learn from WFIRST. With images as detailed as Hubble’s, but 100 times larger, we’re in for some surprises.

“This mission will survey the universe to find the most interesting objects out there.” – Neil Gehrels, WFIRST Project Scientist

“In addition to its exciting capabilities for dark energy and exoplanets, WFIRST will provide a treasure trove of exquisite data for all astronomers,” said Neil Gehrels, WFIRST project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “This mission will survey the universe to find the most interesting objects out there.”

With all of the Super Telescopes coming on line in the next few years, we can expect some amazing discoveries. In 10 to 20 years time, our knowledge will have advanced considerably. What will we learn about Dark Matter and Dark Energy? What will we know about exoplanet populations?

Right now it seems like we’re just groping towards a better understanding of these things, but with WFIRST and the other Super Telescopes, we’re poised for more purposeful study.

James Webb Space Telescope Celebrated in Stunning New Video

Behold, the mighty primary mirror of the James Webb Space Telescope, in all its gleaming glory! Image: NASA/Chris Gunn

NASA has some high hopes for the James Webb Space Telescope, which finished the “cold” phase of its construction at the end of November 2016. The result of 20 years of engineering and construction, this telescope is seen as Hubble’s natural successor. Once it is deployed in October of 2018, it will use a 6.5 meter (21 ft 4 in) primary mirror to examine the Universe in the visible, near-infrared and mid-infrared wavelengths.

All told, the JWST will be 100 times more powerful than its predecessor, and will be capable of looking back over 13 billion years in time. To honor the completion of the telescope, Northrop Grumman – the company contracted by NASA to build it – and Crazy Boat Pictures teamed up to produce a short film about it. Titled “Into the Unknown – the Story of NASA’s James Webb Space Telescope“, the video chronicles the project from inception to completion.

The film (which you can watch at the bottom of the page) shows the construction of the telescope’s large mirrors, its instrument package and its framework. It also features conversations with the scientists and engineers who were involved, and some stunning visuals. In addition to detailing the creation process, the film also delves into the telescope’s mission and all the cosmological questions it will address.

In addressing the nature of James Webb’s mission, the film also pays homage to the Hubble Space Telescope and its many accomplishments. Over the course of its 26 years of operation, it has revealed auroras and supernovae, and discovered billions of stars, galaxies and exoplanets, some of which were shown to orbit within their stars’ respective habitable zones.

On top of that, Hubble was used to determine the age of the Universe (13.8 billion years) and confirmed the existence of the supermassive black hole (SMBH) – aka. Sagittarius A* – at the center of our galaxy, not to mention SMBHs at the centers of many other galaxies. It was also responsible for measuring the rate at which the Universe is expanding – in other words, measuring the Hubble Constant.

This played a pivotal role in helping scientists to develop the theory of Dark Energy, one of the most profound discoveries since Edwin Hubble (the telescope’s namesake) proposed that the Universe is in a state of expansion back in 1929. So it goes without saying that the deployment of the Hubble Space Telescope led to some of the greatest discoveries in modern astronomy.
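The relationship that the Hubble Constant quantifies can be sketched in a few lines. Hubble’s law says recession velocity is proportional to distance, v = H0 × d; the H0 value below is an approximate modern figure and the example distance is hypothetical.

```python
# A minimal sketch of Hubble's law, v = H0 * d: a galaxy's recession
# velocity is proportional to its distance. H0 here is an approximate
# modern value; the example distance is hypothetical.

H0 = 70.0  # Hubble constant, km/s per megaparsec (approximate)

def recession_velocity(distance_mpc):
    """Recession velocity (km/s) predicted by Hubble's law."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs away recedes at about 7,000 km/s:
print(recession_velocity(100))
```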

That being said, Hubble is still subject to limitations, which astronomers are now hoping to push past. For one, its instruments are not able to pick up the most distant (and hence, dimmest) galaxies in the Universe, which date to just a few hundred million years after the Big Bang. Even with “The Deep Fields” initiative, Hubble is still limited to seeing back to about half a billion years after the Big Bang.

Illustration of the depth by which Hubble imaged galaxies in prior Deep Field initiatives, in units of the Age of the Universe. The goal of the Frontier Fields is to peer back further than the Hubble Ultra Deep Field and get a wealth of images of galaxies as they existed in the first several hundred million years after the Big Bang. Note that the unit of time is not linear in this illustration. Illustration Credit: NASA and A. Feild (STScI)

As Dr. John Mather, the project scientist for the James Webb Telescope, told Universe Today via email:

“Hubble showed us that we could not see the first galaxies being born, because they’re too far away, too faint, and too red. JWST is bigger, colder, and observes infrared light to see those first galaxies.  Hubble showed us there’s a black hole in the center of almost every galaxy. JWST will look as far back in time as possible to see when and how that happened: did the galaxy form the black hole, or did the galaxy grow around a pre-existing black hole?  Hubble showed us great clouds of glowing gas and dust where stars are being born. JWST will look through the dust clouds to see the stars themselves as they form in the cloud. Hubble showed us that we can see some planets around other stars, and that we can get chemical information about other planets that happen to pass directly in front of their stars.  JWST will extend this to longer wavelengths with a bigger telescope, with a possibility of detecting water on a super-Earth exoplanet. Hubble showed us details of planets and asteroids close to home, and JWST will give a closer look, though it’s still better to send a visiting robot if we can.”
Basically, the JWST will be able to see farther back in time, to about 100 million years after the Big Bang, when the first stars and galaxies were born. It is also designed to operate at the L2 Lagrange Point, farther away from the Earth than Hubble – which was designed to remain in low-Earth orbit. This means the JWST will be subject to less thermal and optical interference from the Earth and the Moon, but its distance will also make it more difficult to service.

With its much larger set of segmented mirrors, it will observe the Universe as it captures light from the first galaxies and stars. Its extremely sensitive suite of optics will also be able to gather information in the long-wavelength (orange-red) and infrared wavelengths with greater accuracy, measuring the redshift of distant galaxies, and even helping in the hunt for extra-solar planets.

One of the primary mirror segments of the James Webb Space Telescope, made of beryllium. Credit: NASA/MSFC/David Higginbotham/Emmett Given

With the assembly of its major components now complete, the telescope will spend the next two years undergoing tests before its scheduled launch date in October of 2018. These will include stress tests that will subject the telescope to the types of intense vibrations, sounds and g forces (ten times Earth normal) it will experience inside the Ariane 5 rocket that will take it into space.

Six months before its deployment, NASA also plans to send the JWST to the Johnson Space Center, where it will be subjected to the kinds of conditions it will experience in space. This will consist of scientists placing the telescope in a chamber where temperatures will be lowered to 53 K (-220 °C; -370 °F), which will simulate its operating conditions at the L2 Lagrange Point.

Once all of that is complete and the JWST checks out, it will be launched aboard an Ariane 5 rocket from Arianespace’s ELA-3 launch pad in French Guiana. And thanks to experience gained from Hubble and updated algorithms, the telescope will be focused and gathering information shortly after it is launched. And as Dr. Mather explained, the big cosmological questions it is expected to address are numerous:

“Where did we come from? The Big Bang gave us hydrogen and helium spread out almost uniformly across the universe. But something, presumably gravity, stopped the expansion of the material and turned it into galaxies and stars and black holes. JWST will look at all these processes: how did the first luminous objects form, and what were they? How and where did the black holes form, and what did they do to the growing galaxies? How did the galaxies cluster together, and how did galaxies like the Milky Way grow and develop their beautiful spiral structure? Where is the cosmic dark matter and how does it affect ordinary matter? How much dark energy is there, and how does it change with time?”

Needless to say, NASA and the astronomical community are quite excited that the James Webb Telescope has finished construction, and can’t wait until it is deployed and begins to send back data. One can only imagine the kinds of things it will see deep in the cosmic field. But in the meantime, be sure to check out the film and see how this effort all came together:

Further Reading: NASA – JWST, Northrop Grumman

ESO Survey Shows Dark Matter to be Pretty “Smooth”

Dark Matter has been something of a mystery ever since it was first proposed. In addition to trying to find some direct evidence of its existence, scientists have also spent the past few decades developing theoretical models to explain how it works. In recent years, the popular conception has been that Dark Matter is “cold”, and distributed in clumps throughout the Universe, an observation supported by the Planck mission data.

However, a new study produced by an international team of researchers paints a different picture. Using data from the Kilo Degree Survey (KiDS), these researchers studied how the light coming from millions of distant galaxies was affected by the gravitational influence of matter on the largest of scales. What they found was that Dark Matter appears to be more smoothly distributed throughout space than previously thought.

For the past five years, the KiDS survey has been using the VLT Survey Telescope (VST) – the largest telescope at the ESO’s La Silla Paranal Observatory in Chile – to survey 1500 square degrees of the southern night sky. This volume of space has been monitored in four bands (UV, IR, green and red) using weak gravitational lensing and photometric redshift measurements.

All-sky survey data from ESA's Planck space telescope. Credit: ESA

Consistent with Einstein’s Theory of General Relativity, gravitational lensing involves studying how the gravitational field of a massive object will bend light. Meanwhile, redshift attempts to gauge the speed at which other galaxies are moving away from ours by measuring the extent to which their light is shifted towards the red end of the spectrum (i.e. its wavelength becomes longer the faster the source is moving away).
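The redshift measurement described above amounts to a simple ratio of wavelengths. A minimal sketch, using a hypothetical observation of the hydrogen-alpha line (the observed wavelength is illustrative, not a KiDS value):

```python
# Illustrative sketch of a redshift calculation: z is the fractional
# stretching of a spectral line's wavelength. The observed wavelength
# below is a hypothetical example, not survey data.

def redshift(lambda_observed, lambda_rest):
    """z = (observed - rest) / rest wavelength."""
    return (lambda_observed - lambda_rest) / lambda_rest

# Hydrogen-alpha has a rest wavelength of 656.3 nm; suppose a galaxy's
# spectrum shows it at 721.9 nm instead:
z = redshift(721.9, 656.3)
print(f"z = {z:.3f}")  # ~0.100, i.e. the wavelength is stretched ~10%
```

The larger the redshift, the faster the source is receding (and, for cosmological sources, the farther back in time we are seeing it).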

Gravitational lensing is especially useful when it comes to determining how the Universe came to be. Our current cosmological model, known as the Lambda Cold Dark Matter (Lambda CDM) model, states that Dark Energy is responsible for the late-time acceleration in the expansion of the Universe, and that Dark Matter is made up of massive particles that are responsible for cosmological structure formation.

Using a slight variation on this technique known as cosmic shear, the research team studied light from distant galaxies to determine how it is warped by the presence of the largest structures in the Universe (such as superclusters and filaments). As Dr. Hendrik Hildebrandt – an astronomer from the Argelander Institute for Astronomy (AIfA) and the lead author of the paper – told Universe Today via email:

“Usually one thinks of one big mass like a galaxy cluster that causes this light deflection. But there is also matter all throughout the Universe. The light from distant galaxies gets continuously deflected by this so-called large-scale structure. This results in galaxies that are close on the sky to be “pointing” in the same direction. It’s a tiny effect but it can be measured with statistical methods from large samples of galaxies. When we have measured how strongly galaxies are “pointing” in the same direction we can infer from this the statistical properties of the large-scale structure, e.g. the mean matter density and how strongly the matter is clumped/clustered.”

A rare phenomenon known as “sprites” seen above the La Silla Observatory in Chile. Credit: ESO/Petr Horálek

Using this technique, the research team conducted an analysis of 450 square degrees of KiDS data, which corresponds to about 1% of the entire sky. Within this volume of space, they observed how the light coming from about 15 million galaxies interacted with all the matter that lies between them and Earth.

Combining the extremely sharp images obtained by VST with advanced computer software, the team was able to carry out one of the most precise measurements ever made of cosmic shear. Interestingly enough, the results were not consistent with those produced by the ESA’s Planck mission, which has been the most comprehensive mapper of the Universe to date.

The Planck mission has provided some wonderfully detailed and accurate information about the Cosmic Microwave Background (CMB). This has helped astronomers to map the early Universe, as well as develop theories of how matter was distributed during this period. As Hildebrandt explained:

“Planck measures many cosmological parameters with exquisite precision from the temperature fluctuations of the cosmic microwave background, i.e. physical processes that happened 400,000 years after the Big Bang. Two of those parameters are the mean matter density of the Universe and a measure of how strongly this matter is clumped. With cosmic shear, we also measure these two parameters but at much later cosmic times (a few billion years ago or ~10 billion years after the Big Bang), i.e. in our more recent past.”

The LCDM cosmological model assumes the existence of Dark Matter and Dark Energy, and that both played an active role in the formation of the Universe. Credit: Wikipedia Commons/Alex Mittelmann

However, Hildebrandt and his team found values for these parameters that were significantly lower than those found by Planck. Basically, their cosmic shear results suggest that there is less matter in the Universe and that it is less clustered than what the Planck results predicted. These results are likely to have an impact on cosmological studies and theoretical physics in the coming years.

As it stands, Dark Matter remains undetectable using standard methods. Like black holes, its existence can only be inferred from the observable gravitational effects it has on visible matter. In this case, its presence and fundamental nature are measured by how it has affected the evolution of the Universe over the past 13.8 billion years. But since the results appear to be conflicting, astronomers may now have to reconsider some of their previously held notions.

“There are several options: because we do not understand the dominant ingredients of the Universe (dark matter and dark energy) we can play with the properties of both,” said Hildebrandt. “For example, different forms of dark energy (more complex than the simplest possibility, which is Einstein’s “cosmological constant”) could explain our measurements. Another exciting possibility is that this is a sign that the laws of gravity on the scale of the Universe are different from General Relativity. All we can say for now is that something appears to be not quite right!”

Further Reading: ESO, arXiv

New Theory of Gravity Does Away With Need for Dark Matter


Erik Verlinde explains his new view of gravity

Let’s be honest. Dark matter’s a pain in the butt. Astronomers have gone to great lengths to explain why it must exist, and exist in huge quantities, yet it remains hidden. Unknown. Emitting no visible energy yet apparently strong enough to keep galaxies in clusters from busting free like wild horses, it’s everywhere in vast quantities. What is the stuff – axions, WIMPs, gravitinos, Kaluza Klein particles?

Estimated distribution of matter and energy in the universe. Credit: NASA
Estimated distribution of matter and energy in the universe. Credit: NASA

It’s estimated that invisible dark matter makes up 27% of the mass-energy of the universe, while everything from PB&J sandwiches to quasars accounts for just 4.9%. But a new theory of gravity proposed by theoretical physicist Erik Verlinde of the University of Amsterdam has found a way to dispense with the pesky stuff.

Snowflakes exemplify the concept of emergence with their complex symmetrical and fractal patterns created when much simpler pieces join together. Credit: Bob King

Unlike the traditional view of gravity as a fundamental force of nature, Verlinde sees it as an emergent property of space.  Emergence is a process where nature builds something large using small, simple pieces such that the final creation exhibits properties that the smaller bits don’t. Take a snowflake. The complex symmetry of a snowflake begins when a water droplet freezes onto a tiny dust particle. As the growing flake falls, water vapor freezes onto this original crystal, naturally arranging itself into a hexagonal (six-sided) structure of great beauty. The sensation of temperature is another emergent phenomenon, arising from the motion of molecules and atoms.

So too with gravity, which according to Verlinde, emerges from entropy. We all know about entropy and messy bedrooms, but it’s a bit more subtle than that. Entropy is a measure of disorder in a system or put another way, the number of different microscopic states a system can be in. One of the coolest descriptions of entropy I’ve heard has to do with the heat our bodies radiate. As that energy dissipates in the air, it creates a more disordered state around us while at the same time decreasing our own personal entropy to ensure our survival. If we didn’t get rid of body heat, we would eventually become disorganized (overheat!) and die.
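Boltzmann’s formula S = k_B ln W makes the “number of microscopic states” idea concrete. A toy sketch, using coin flips as a stand-in for physical microstates (an illustration of the definition, not anything from Verlinde’s paper):

```python
import math

# Toy illustration of entropy as a count of microstates, via
# Boltzmann's formula S = k_B * ln(W). We count the ways W that
# 100 coins can show a given number of heads.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(num_microstates):
    """Boltzmann entropy for a given number of microstates."""
    return K_B * math.log(num_microstates)

w_ordered = math.comb(100, 0)   # exactly one way to be "all tails"
w_mixed = math.comb(100, 50)    # vastly many ways to be half-and-half

print(entropy(w_ordered))            # 0.0 for the perfectly ordered state
print(entropy(w_mixed) > entropy(w_ordered))  # disorder means higher entropy
```

The huge number of “mixed” microstates is why systems drift toward disorder: there are simply far more ways to be disordered than ordered.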

The more massive the object, the more it distorts space-time, shown here as the green mesh. Earth orbits the Sun by rolling around the dip created by the Sun’s mass in the fabric of space-time. It doesn’t fall into the Sun because it also possesses forward momentum. Credit: LIGO/T. Pyle

Emergent or entropic gravity, as the new theory is called, predicts the exact same deviation in the rotation rates of stars in galaxies currently attributed to dark matter. Gravity emerges in Verlinde’s view from changes in fundamental bits of information stored in the structure of space-time, that four-dimensional continuum revealed by Einstein’s general theory of relativity. In a word, gravity is a consequence of entropy and not a fundamental force.

Space-time, comprised of the three familiar dimensions in addition to time, is flexible. Mass warps the 4-D fabric into hills and valleys that direct the motion of smaller objects nearby. The Sun doesn’t so much “pull” on the Earth as envisaged by Isaac Newton but creates a great pucker in space-time that Earth rolls around in.

In a 2010 article, Verlinde showed how Newton’s law of gravity, which describes everything from how apples fall from trees to little galaxies orbiting big galaxies, derives from these underlying microscopic building blocks.

His latest paper, titled Emergent Gravity and the Dark Universe, delves into dark energy’s contribution to the mix.  The entropy associated with dark energy, a still-unknown form of energy responsible for the accelerating expansion of the universe, turns the geometry of spacetime into an elastic medium.

“We find that the elastic response of this ‘dark energy’ medium takes the form of an extra ‘dark’ gravitational force that appears to be due to ‘dark matter’,” writes Verlinde. “So the observed dark matter phenomena is a remnant, a memory effect, of the emergence of spacetime together with the ordinary matter in it.”

This diagram shows rotation curves of stars in M33, a typical spiral galaxy. The vertical scale is speed and the horizontal is distance from the galaxy’s nucleus. Normally, we expect stars to slow down the farther they are from galactic center (bottom curve), but in fact they revolve much faster (top curve). The discrepancy between the two curves is accounted for by adding a dark matter halo surrounding the galaxy. Credit: Public domain / Wikipedia
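The discrepancy in that diagram can be sketched numerically. Assuming, purely for illustration, a galaxy whose visible matter (a hypothetical round number of 10^10 solar masses) is treated as a central point mass, Newtonian gravity predicts orbital speeds that fall off as 1/sqrt(r):

```python
import math

# Sketch of the expected Keplerian rotation curve for a hypothetical
# galaxy, treating its visible mass as a central point: v = sqrt(G*M/r).

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # meters per kiloparsec

def v_keplerian(r_kpc, m_visible_suns):
    """Orbital speed (km/s) if only the visible mass were present."""
    r_m = r_kpc * KPC
    return math.sqrt(G * m_visible_suns * M_SUN / r_m) / 1000.0

# Speeds should drop steadily with radius for 1e10 solar masses:
for r in (5, 10, 20):
    print(r, "kpc:", round(v_keplerian(r, 1e10), 1), "km/s")
```

Observed speeds instead stay roughly flat out to large radii; a surrounding dark matter halo, or in Verlinde’s picture an extra entropic contribution to gravity, is what makes up the difference.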

I’ll be the first one to say how complex Verlinde’s concept is, wrapped in arcane entanglement entropy, tensor fields and the holographic principle, but the basic idea, that gravity is not a fundamental force, makes for a fascinating new way to look at an old face.

Physicists have tried for decades to reconcile gravity with quantum physics with little success. And while Verlinde’s theory should rightly be taken with a grain of salt, he may offer a way to combine the two disciplines into a single narrative that describes how everything from falling apples to black holes is connected in one coherent theory.

Weekly Space Hangout – October 28, 2016: Dr. Derrick Pitts of the Fels Planetarium

Host: Fraser Cain (@fcain)

Special Guest:
Dr. Derrick Pitts, Chief Astronomer and Director of the Fels Planetarium at The Franklin Institute. He has been a NASA Solar System Ambassador since 2009 and serves as the Astrobiology Ambassador for the NASA/MIRS/UNCF Special Program Corporation’s Astrobiology Partnership Program. Additionally, Derrick was recently appointed to the outreach advisory board for the Thirty-Meter-Telescope at Mauna Kea in Hawaii.

Guests:
Morgan Rehnberg (MorganRehnberg.com / @MorganRehnberg)
Kimberly Cartier ( KimberlyCartier.org / @AstroKimCartier )
Nicole Gugliucci (cosmoquest.org / @noisyastronomer)
Paul M. Sutter (pmsutter.com / @PaulMattSutter)

Their stories this week:

Dark energy not real?

Young star system caught breaking apart

Are most “habitable” planets around small stars water worlds?

We use a tool called Trello to submit and vote on stories we would like to see covered each week, and then Fraser will be selecting the stories from there. Here is the link to the Trello WSH page (http://bit.ly/WSHVote), which you can see without logging in. If you’d like to vote, just create a login and help us decide what to cover!

If you would like to join the Weekly Space Hangout Crew, visit their site here and sign up. They’re a great team who can help you join our online discussions!

If you would like to sign up for the AstronomyCast Solar Eclipse Escape, where you can meet Fraser and Pamela, plus WSH Crew and other fans, visit our site linked above and sign up!

We record the Weekly Space Hangout every Friday at 12:00 pm Pacific / 3:00 pm Eastern. You can watch us live on Universe Today, or the Universe Today YouTube page.

Dark Energy Illuminated By Largest Galactic Map Ten Years In The Making

In 1929, Edwin Hubble forever changed our understanding of the cosmos by showing that the Universe is in a state of expansion. By the 1990s, astronomers determined that the rate at which it is expanding is actually speeding up, which in turn led to the theory of “Dark Energy“. Since that time, astronomers and physicists have sought to determine the existence of this force by measuring the influence it has on the cosmos.

The latest in these efforts comes from the Sloan Digital Sky Survey III (SDSS III), where an international team of researchers have announced that they have finished creating the most precise measurements of the Universe to date. Known as the Baryon Oscillation Spectroscopic Survey (BOSS), their measurements have placed new constraints on the properties of Dark Energy.

The new measurements were presented by Harvard University astronomer Daniel Eisenstein at a recent meeting of the American Astronomical Society. As the director of the Sloan Digital Sky Survey III (SDSS-III), he and his team have spent the past ten years measuring the cosmos and the periodic fluctuations in the density of normal matter to see how galaxies are distributed throughout the Universe.

An illustration of baryon acoustic oscillations, which are imprinted in the early universe and can still be seen today in galaxy surveys like BOSS. Credit: Chris Blake and Sam Moorfield

And after a decade of research, the BOSS team was able to produce a three-dimensional map of the cosmos that covers more than six billion light-years. And while other recent surveys have looked further afield – up to distances of 9 and 13 billion light years – the BOSS map is unique in that it boasts the highest accuracy of any cosmological map.

In fact, the BOSS team was able to measure the distribution of galaxies in the cosmos, at a distance of 6 billion light-years, to within an unprecedented 1% margin of error. Determining the nature of cosmic objects at great distances is no easy matter, due to the effects of relativity. As Dr. Eisenstein told Universe Today via email:

“Distances are a long-standing challenge in astronomy. Whereas humans often can judge distance because of our binocular vision, galaxies beyond the Milky Way are much too far away to use that. And because galaxies come in a wide range of intrinsic sizes, it is hard to judge their distance. It’s like looking at a far-away mountain; one’s judgement of its distance is tied up with one’s judgement of its height.”

In the past, astronomers have made accurate measurements of objects within the local universe (i.e. planets, neighboring stars, star clusters) by relying on everything from radar to redshift – the degree to which the wavelength of light is shifted towards the red end of the spectrum. However, the greater the distance of an object, the greater the degree of uncertainty.

An artist’s concept of the latest, highly accurate measurement of the Universe from BOSS. Credit: Zosia Rostomian/Lawrence Berkeley National Laboratory

And until now, only objects that are a few thousand light-years from Earth – i.e. within the Milky Way galaxy – have had their distances measured to within a one-percent margin of error. As the largest of the four projects that make up the Sloan Digital Sky Survey III (SDSS-III), what sets BOSS apart is the fact that it relies primarily on the measurement of what are called “baryon acoustic oscillations” (BAOs).

These are essentially subtle periodic ripples in the distribution of visible baryonic (i.e. normal) matter in the cosmos. As Dr. Daniel Eisenstein explained:

“BOSS measures the expansion of the Universe in two primary ways. The first is by using the baryon acoustic oscillations (hence the name of the survey). Sound waves traveling in the first 400,000 years after the Big Bang create a preferred scale for separations of pairs of galaxies. By measuring this preferred separation in a sample of many galaxies, we can infer the distance to the sample. 

“The second method is to measure how clustering of galaxies differs between pairs oriented along the line of sight compared to transverse to the line of sight. The expansion of the Universe can cause this clustering to be asymmetric if one uses the wrong expansion history when converting redshifts to distance.”
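The “standard ruler” logic in Eisenstein’s first method can be sketched with a small-angle estimate. This is an illustration only: the angle below is hypothetical, and real BAO analyses work with comoving distances and full correlation statistics rather than a single pair separation.

```python
import math

# Sketch of the BAO "standard ruler": the sound horizon sets a preferred
# physical separation between galaxy pairs (~490 million light-years
# today, approximately). Measuring the angle that separation subtends
# on the sky yields a distance via D = r / theta (small-angle relation).

R_BAO_MLY = 490.0  # BAO scale, millions of light-years (approximate)

def distance_from_angle(theta_degrees):
    """Distance (millions of light-years) at which the BAO ruler
    subtends the given angle on the sky."""
    theta_rad = math.radians(theta_degrees)
    return R_BAO_MLY / theta_rad

# A hypothetical sample where the preferred separation appears at ~4.7
# degrees corresponds to a distance of roughly 6 billion light-years:
print(round(distance_from_angle(4.7)))
```

Because the ruler’s true length is known from early-universe physics, the smaller it appears on the sky, the farther away the galaxy sample must be.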

With these new, highly-accurate distance measurements, BOSS astronomers will be able to study the influence of Dark Energy with far greater precision. “Different dark energy models vary in how the acceleration of the expansion of the Universe proceeds over time,” said Eisenstein. “BOSS is measuring the expansion history, which allows us to infer the acceleration rate. We find results that are highly consistent with the predictions of the cosmological constant model, that is, the model in which dark energy has a constant density over time.”

An international team of researchers have produced the largest 3-D map of the universe to date, which validates Einstein's theory of General Relativity. Credit: NAOJ/CFHT/SDSS

In addition to measuring the distribution of normal matter to determine the influence of Dark Energy, the SDSS-III Collaboration is working to map the Milky Way and search for extrasolar planets. The BOSS measurements are detailed in a series of articles that were submitted to journals by the BOSS collaboration last month, all of which are now available online.

And BOSS is not the only effort to understand the large-scale structure of our Universe, and how all its mysterious forces have shaped it. Just last month, Professor Stephen Hawking announced that the COSMOS supercomputing center at Cambridge University would be creating the most detailed 3D map of the Universe to date.

Relying on CMB data obtained by the ESA’s Planck satellite and information from the Dark Energy Survey, they also hope to measure the influence Dark Energy has had on the distribution of matter in our Universe. Who knows? In a few years’ time, we may very well come to understand how all the fundamental forces governing the Universe work together.

Further Reading: SDSS III

Professor Stephen Hawking Intends To Map The Known Universe

Back in 1997, a team of leading scientists and cosmologists came together to establish the COSMOS supercomputing center at Cambridge University. Under the auspices of famed physicist Stephen Hawking, this facility and its supercomputer are dedicated to the research of cosmology, astrophysics and particle physics – ultimately, for the purpose of unlocking the deeper mysteries of the Universe.

Yesterday, in what was themed as a “tribute to Stephen Hawking”, the COSMOS center announced that it will be embarking on what is perhaps the boldest experiment in cosmological mapping. Essentially, they intend to create the most detailed 3D map of the early universe to date, plotting the position of billions of cosmic structures including supernovas, black holes, and galaxies.

This map will be created using the facility’s supercomputer, located in Cambridge’s Department of Applied Mathematics and Theoretical Physics. Currently, it is the largest shared-memory computer in Europe, boasting 1,856 Intel Xeon E5 processor cores, 31 Intel Many Integrated Core (MIC) co-processors, and 14.5 terabytes of globally shared memory.

The COSMOS IX supercomputer. Credit: cosmos.damtp.cam.ac.uk
The COSMOS IX supercomputer. Credit: cosmos.damtp.cam.ac.uk

The 3D map will also rely on data obtained by two previous surveys – the ESA’s Planck satellite and the Dark Energy Survey. From the former, the COSMOS team will use the detailed images of the Cosmic Microwave Background (CMB) – the radiation leftover from the Big Bang – that were released in 2013. These images of the oldest light in the cosmos allowed physicists to refine their estimates for the age of the Universe (13.82 billion years) and its rate of expansion.
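The age estimate mentioned above follows from integrating the expansion history backward in time. Here is a minimal sketch assuming a flat Lambda-CDM model with rough Planck-2013-era parameters (an illustration of the calculation, not the Planck team's actual pipeline; radiation density is neglected):

```python
import math

# Illustrative flat Lambda-CDM parameters (assumed, Planck-2013-era values).
H0 = 67.8                # Hubble constant, km/s/Mpc
OMEGA_M = 0.308          # matter density parameter
OMEGA_L = 1.0 - OMEGA_M  # constant dark-energy density
KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one billion years

def hubble(z):
    """H(z) for flat Lambda-CDM, in km/s/Mpc."""
    return H0 * math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def age_gyr(z_max=2000.0, steps=200_000):
    """Age t0 = integral of dz / ((1+z) * H(z)) from 0 to infinity,
    truncated at z_max (midpoint rule); result in billions of years."""
    dz = z_max / steps
    total = 0.0
    for i in range(steps):
        z = (i + 0.5) * dz
        total += dz / ((1 + z) * hubble(z))
    return total * KM_PER_MPC / SEC_PER_GYR

print(f"Age ~ {age_gyr():.2f} billion years")
```

With these assumed parameters the integral lands near 13.8 billion years, in line with the figure quoted above.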

This information will be combined with data from the Dark Energy Survey which shows the expansion of the Universe over the course of the last 10 billion years. From all of this, the COSMOS team will compare the early distribution of matter in the Universe with its subsequent expansion to see how the two link up.

While cosmological simulations that looked at the evolution and large-scale structure of the Universe have been performed in the past – such as the Evolution and Assembly of GaLaxies and their Environments (EAGLE) project and the survey performed by the Institute for the Physics and Mathematics of the Universe at Tokyo University – this will be the first time that scientists compare data from the early Universe with its subsequent evolution.

The project is also expected to receive a boost from the deployment of the ESA’s Euclid probe, which is scheduled for launch in 2020. This mission will measure the shapes and redshifts of galaxies (looking 10 billion years into the past), thereby helping scientists to understand the geometry of the “dark Universe” – i.e. how dark matter and dark energy influence it as a whole.

Artist impression of the Euclid probe, which is set to launch in 2020. Credit: ESA

The plans for the COSMOS center’s 3D map will be unveiled at the Starmus science conference, which will be taking place from June 27th to July 2nd, 2016, in Tenerife – the largest of the Canary Islands, a Spanish archipelago off the coast of northwest Africa. At this conference, Hawking will be discussing the details of the COSMOS project.

In addition to Hawking being the man who brought the COSMOS team together, the conference’s theme – “Beyond the Horizon – Tribute to Stephen Hawking” – was selected because of his long-standing commitment to physics and cosmology. “Hawking is a great theorist but he always wants to test his theories against observations,” said Prof. Shellard in a Cambridge press release. “What will emerge is a 3D map of the universe with the positions of billions of galaxies.”

Hawking will also present the first ever Stephen Hawking Medal for Science Communication, an award established by Hawking that will be bestowed on those who help promote science to the public through media – i.e. cinema, music, writing and art. Other speakers who will be attending the event include Neil deGrasse Tyson, Chris Hadfield, Martin Rees, Adam Riess, Rusty Schweickart, Eric Betzig, Neil Turok, and Kip Thorne.

Professor Hawking and colleagues from the Royal Society announcing the launch of the Stephen Hawking Medal for Science Communication, Dec. 16th, 2015. Credit: starmus.com

Naturally, it is hoped that the creation of this 3D map will confirm current cosmological theories, including the estimated age of the Universe and whether or not the Standard Model of cosmology – aka. the Lambda Cold Dark Matter (ΛCDM) model – is in fact the correct one. As Hawking is surely hoping, this could bring us one step closer to a Theory of Everything!

Further Reading: Cambridge News