Astronomy Cast Ep. 488: Dark Energy: 2018 Edition

The updates continue. Last week we talked about dark matter, and this week we continue with its partner dark energy. Of course, they’re not really partners, unless you consider mysteriousness to be an attribute. Dark energy, that force that’s accelerating the expansion of the Universe. What have we learned?

We usually record Astronomy Cast every Friday at 3:00 pm EST / 12:00 pm PST / 20:00 UTC. You can watch us live on AstronomyCast.com, or the AstronomyCast YouTube page.

Visit the Astronomy Cast Page to subscribe to the audio podcast!

If you would like to support Astronomy Cast, please visit our page at Patreon here – https://www.patreon.com/astronomycast. We greatly appreciate your support!

If you would like to join the Weekly Space Hangout Crew, visit their site here and sign up. They’re a great team who can help you join our online discussions!

Astronomers Just Found 72 Stellar Explosions, but Don’t Know What’s Causing Them

Images of one of the transient events, from eight days before the maximum brightness to 18 days afterwards. This outburst took place at a distance of 4 billion light years. Credit: M. Pursiainen / University of Southampton and DES collaboration

A supernova is one of the most impressive natural phenomena in the Universe. Unfortunately, such events are often brief and transient, temporarily becoming as bright as an entire galaxy and then fading away. But given what these bright explosions – which occur when a star reaches the end of its life cycle – can teach us about the Universe, scientists are naturally very interested in studying them.

Using data from the Dark Energy Survey Supernova (DES-SN) program, a team of astronomers recently detected 72 supernovae, the largest number of events discovered to date. These supernovae were not only very bright, but also very brief – a finding which the team is still struggling to explain. The results of their study were presented on Tuesday, April 3rd, at the European Week of Astronomy and Space Science in Liverpool.

The team was led by Miika Pursiainen, a PhD researcher from the University of Southampton. For the sake of their study, the team relied on data from the 4-meter telescope at the Cerro Tololo Inter-American Observatory (CTIO). This telescope is part of the Dark Energy Survey, a global effort to map hundreds of millions of galaxies and thousands of supernovae in order to find patterns in the cosmic structure that will reveal the nature of dark energy.

This image shows the incredibly distant and ancient supernova DES16C2nm. The supernova was discovered by the Dark Energy Survey. Image: Mat Smith and DES collaboration.

As Pursiainen commented in a recent Southampton news release:

“The DES-SN survey is there to help us understand dark energy, itself entirely unexplained. That survey then also reveals many more unexplained transients than seen before. If nothing else, our work confirms that astrophysics and cosmology are still sciences with a lot of unanswered questions!”

As noted, these events were very peculiar: while they reached maximum brightnesses similar to those of known types of supernovae, they were visible for far less time. Whereas supernovae typically remain visible for several months or more, these transient events lasted from about a week to a month. The events also appeared to be very hot, with temperatures ranging from 10,000 to 30,000 °C (18,000 to 54,000 °F).

They also vary considerably in size, ranging from a few times the distance between the Earth and the Sun – 150 million km or 93 million mi (1 AU) – to hundreds of times that distance. However, the events do appear to be expanding and cooling over time, which is what one would expect from a supernova. Because of this mix of familiar and unfamiliar behavior, there is much debate about the origin of these transients.

Artistic impression of a star going supernova, casting its chemically enriched contents into the universe. Credit: NASA/Swift/Skyworks Digital/Dana Berry

A possible explanation is that these stars shed a lot of material before they exploded, and that this could have shrouded them in matter. This material may then have been heated by the supernovae themselves, causing it to rise to very high temperatures. This would mean that in these cases, the team was seeing the hot clouds rather than the exploding stars themselves.

This certainly would explain the observations made by Pursiainen and his team, though a lot more data will be needed to confirm this. In the future, the team hopes to examine more transients and see how often they occur compared to more common supernovae. The study of this powerful and mysterious phenomenon will also benefit from the use of next-generation telescopes.

When the James Webb Space Telescope is deployed in 2020, it will study the most distant supernovae in the Universe. This information, as well as studies performed by ground-based observatories, is expected to not only shed light on the life cycle of stars and dark energy, but also on the formation of black holes and gravitational waves.

Further Reading: University of Southampton

Precise New Measurements From Hubble Confirm the Accelerating Expansion of the Universe. Still no Idea Why it’s Happening

These Hubble Space Telescope images showcase two of the 19 galaxies analyzed in a project to improve the precision of the universe's expansion rate, a value known as the Hubble constant. The color-composite images show NGC 3972 (left) and NGC 1015 (right), located 65 million light-years and 118 million light-years, respectively, from Earth. The yellow circles in each galaxy represent the locations of pulsating stars called Cepheid variables. Credits: NASA, ESA, A. Riess (STScI/JHU)

In the 1920s, Edwin Hubble made the groundbreaking revelation that the Universe is in a state of expansion. Originally predicted as a consequence of Einstein’s Theory of General Relativity, this expansion came to be quantified by what is now known as Hubble’s Constant. In the ensuing decades, and thanks to the deployment of next-generation telescopes – like the aptly-named Hubble Space Telescope (HST) – scientists have repeatedly revised this value.

In short, in the past few decades, the ability to see farther into space (and deeper into time) has allowed astronomers to make more accurate measurements about how rapidly the early Universe expanded. And thanks to a new survey performed using Hubble, an international team of astronomers has been able to conduct the most precise measurements of the expansion rate of the Universe to date.

This survey was conducted by the Supernova H0 for the Equation of State (SH0ES) team, an international group of astronomers that has been on a quest to refine the accuracy of the Hubble Constant since 2005. The group is led by Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University, and includes members from the American Museum of Natural History, the Niels Bohr Institute, the National Optical Astronomy Observatory, and many prestigious universities and research institutions.

Illustration of the depth by which Hubble imaged galaxies in prior Deep Field initiatives, in units of the Age of the Universe. Credit: NASA and A. Feild (STScI)

The study which describes their findings recently appeared in The Astrophysical Journal under the title “Type Ia Supernova Distances at Redshift >1.5 from the Hubble Space Telescope Multi-cycle Treasury Programs: The Early Expansion Rate”. For the sake of their study, and consistent with their long-term goals, the team sought to construct a new and more accurate “distance ladder”.

This tool is how astronomers have traditionally measured cosmic distances: it relies on distance markers like Cepheid variables – pulsating stars whose distances can be inferred by comparing their intrinsic brightness with their apparent brightness. These measurements are then compared to the way light from distant galaxies is redshifted to determine how fast the space between galaxies is expanding.

From this, the Hubble Constant is derived. To build their distance ladder, Riess and his team used Hubble’s Wide Field Camera 3 (WFC3) to conduct parallax measurements of eight newly-analyzed Cepheid variable stars in the Milky Way. These stars are about 10 times farther away than any studied previously – between 6,000 and 12,000 light-years from Earth – and pulsate at longer intervals.
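The geometry behind this first rung of the ladder is simple, even if the measurement is not. A minimal sketch, where the parallax value is an illustrative number rather than one from the paper:

```python
# Distance from parallax: d[parsecs] = 1 / p[arcseconds]; 1 parsec ≈ 3.26 light-years.
def parallax_to_light_years(p_arcsec):
    return (1.0 / p_arcsec) * 3.26

# A Cepheid roughly 8,000 light-years away (mid-range for the stars in this
# survey) shifts by only ~0.4 milliarcseconds as the Earth orbits the Sun:
p = 4.1e-4  # arcseconds (illustrative value, not from the paper)
print(f"{parallax_to_light_years(p):,.0f} light-years")  # ≈ 7,951 light-years
```

That sub-milliarcsecond shift is why the team needed the new measurement technique described below.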

To ensure accuracy that would account for the wobbles of these stars, the team also developed a new method where Hubble would measure a star’s position a thousand times a minute every six months for four years. The team then compared the brightness of these eight stars with more distant Cepheids to ensure that they could calculate the distances to other galaxies with more precision.

Illustration showing three steps astronomers used to measure the universe’s expansion rate (Hubble constant) to an unprecedented accuracy, reducing the total uncertainty to 2.3 percent. Credits: NASA/ESA/A. Feild (STScI)/and A. Riess (STScI/JHU)

Using the new technique, Hubble was able to capture the change in position of these stars relative to others, which simplified things immensely. As Riess explained in a NASA press release:

“This method allows for repeated opportunities to measure the extremely tiny displacements due to parallax. You’re measuring the separation between two stars, not just in one place on the camera, but over and over thousands of times, reducing the errors in measurement.”
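The error reduction Riess describes is the familiar statistics of repeated measurement: averaging N independent measurements shrinks the uncertainty by a factor of the square root of N. A minimal illustration, with the single-measurement error in arbitrary units:

```python
import math

# Averaging N independent measurements shrinks the uncertainty by sqrt(N).
sigma_single = 1.0  # uncertainty of one position measurement (arbitrary units)
for n in (1, 1_000, 1_000_000):
    sigma_mean = sigma_single / math.sqrt(n)  # standard error of the mean
    print(f"{n:>9} measurements -> uncertainty {sigma_mean:.4f}")
```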

Compared to previous surveys, the team was able to analyze stars at distances up to 10 times farther. However, their results also contradicted those obtained by the European Space Agency’s (ESA) Planck satellite, which has been measuring the Cosmic Microwave Background (CMB) – the leftover radiation created by the Big Bang – since it was deployed in 2009.

By mapping the CMB, Planck has been able to trace the expansion of the cosmos during the early Universe – circa 378,000 years after the Big Bang. Planck’s results predicted that the Hubble constant value should now be 67 kilometers per second per megaparsec (a megaparsec being 3.3 million light-years), and could be no higher than 69 kilometers per second per megaparsec.

The Big Bang timeline of the Universe. Cosmic neutrinos affect the CMB at the time it was emitted, and physics takes care of the rest of their evolution until today. Credit: NASA/JPL-Caltech/A. Kashlinsky (GSFC).

Based on their survey, Riess’s team obtained a value of 73 kilometers per second per megaparsec, a discrepancy of 9%. Essentially, their results indicate that galaxies are moving at a faster rate than that implied by observations of the early Universe. Because the Hubble data was so precise, astronomers cannot dismiss the gap between the two results as errors in any single measurement or method. As Riess explained:

“The community is really grappling with understanding the meaning of this discrepancy… Both results have been tested multiple ways, so barring a series of unrelated mistakes, it is increasingly likely that this is not a bug but a feature of the universe.”
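The size of the disagreement is straightforward arithmetic on the two Hubble constant values quoted in this article:

```python
H0_planck = 67.0  # km/s/Mpc, inferred from the early Universe (Planck CMB data)
H0_shoes  = 73.0  # km/s/Mpc, measured with the distance ladder (SH0ES)

discrepancy = (H0_shoes - H0_planck) / H0_planck
print(f"discrepancy: {discrepancy:.0%}")  # 9%

# At any given distance, recession velocity is v = H0 * d, so the two values
# predict measurably different speeds, e.g. for a galaxy 100 Mpc away:
d = 100.0  # Mpc
print(H0_planck * d, "vs", H0_shoes * d, "km/s")  # 6700.0 vs 7300.0 km/s
```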

These latest results therefore suggest that some previously unknown force or some new physics might be at work in the Universe. In terms of explanations, Riess and his team have offered three possibilities, all of which have to do with the 95% of the Universe that we cannot see (i.e. dark matter and dark energy). In 2011, Riess and two other scientists were awarded the Nobel Prize in Physics for their 1998 discovery that the expansion of the Universe is accelerating.

Consistent with that, they suggest that Dark Energy could be pushing galaxies apart with increasing strength. Another possibility is that there is an undiscovered subatomic particle out there that is similar to a neutrino, but interacts with normal matter by gravity instead of subatomic forces. These “sterile neutrinos” would travel at close to the speed of light and are collectively referred to as “dark radiation”.

This illustration shows the evolution of the Universe, from the Big Bang on the left, to modern times on the right. Credit: NASA

Any of these possibilities would mean that the contents of the early Universe were different, thus forcing a rethink of our cosmological models. At present, Riess and colleagues don’t have any answers, but plan to continue fine-tuning their measurements. So far, the SH0ES team has decreased the uncertainty of the Hubble Constant to 2.3%.

This is in keeping with one of the central goals of the Hubble Space Telescope, which was to help reduce the uncertainty value in Hubble’s Constant, for which estimates once varied by a factor of 2.

So while this discrepancy opens the door to new and challenging questions, it also reduces our uncertainty substantially when it comes to measuring the Universe. Ultimately, this will improve our understanding of how the Universe evolved after it was created in a fiery cataclysm 13.8 billion years ago.

Further Reading: NASA, The Astrophysical Journal

These 25 Billion Galaxies are Definitely Living in a Simulation

A section of the virtual universe, a billion light years across, showing how dark matter is distributed in space, with dark matter halos (the yellow clumps) interconnected by dark filaments. Cosmic voids, shown as the white areas, are the lowest-density regions in the Universe. Credit: Joachim Stadel, UZH

Understanding the Universe and how it has evolved over the course of billions of years is a rather daunting task. On the one hand, it involves painstakingly looking billions of light years into deep space (and thus, billions of years back in time) to see how its large-scale structure changed over time. Then, massive amounts of computing power are needed to simulate what it should look like (based on known physics) and seeing if they match up.

That is what a team of astrophysicists from the University of Zurich (UZH) did using the “Piz Daint” supercomputer. With this sophisticated machine, they simulated the formation of our entire Universe and produced a catalog of about 25 billion virtual galaxies. This catalog will be used by the ESA’s Euclid mission, launching in 2020, which will spend six years probing the Universe to investigate dark matter.

The team’s work was detailed in a study that appeared recently in the journal Computational Astrophysics and Cosmology. Led by Douglas Potter, the team spent the past three years developing an optimized code to describe (with unprecedented accuracy) the dynamics of dark matter as well as the formation of large-scale structures in the Universe.

The code, known as PKDGRAV3, was specifically designed to optimally use the available memory and processing power of modern supercomputing architectures. After being executed on the “Piz Daint” supercomputer – located at the Swiss National Computing Center (CSCS) – for a period of only 80 hours, it managed to generate a virtual Universe of two trillion macro-particles, from which a catalogue of 25 billion virtual galaxies was extracted.

Intrinsic to their calculations was the way in which the dark matter fluid would have evolved under its own gravity, leading to the formation of small concentrations known as “dark matter halos”. It is within these halos – a theoretical component thought to extend well beyond the visible extent of a galaxy – that galaxies like the Milky Way are believed to have formed.

Naturally, this presented quite the challenge. It required not only a precise calculation of how the structure of dark matter evolves, but also required that they consider how this would influence every other part of the Universe. As Joachim Stadel, a professor with the Center for Theoretical Astrophysics and Cosmology at UZH and a co-author on the paper, told Universe Today via email:

“We simulated 2 trillion such dark matter “pieces”, the largest calculation of this type that has ever been performed. To do this we had to use a computation technique known as the “fast multipole method” and use one of the fastest computers in the world, “Piz Daint” at the Swiss National Supercomputing Centre, which among other things has very fast graphics processing units (GPUs) which allow an enormous speed-up of the floating point calculations needed in the simulation. The dark matter clusters into dark matter “halos” which in turn harbor the galaxies. Our calculation accurately produces the distribution and properties of the dark matter, including the halos, but the galaxies, with all of their properties, must be placed within these halos using a model. This part of the task was performed by our colleagues at Barcelona under the direction of Pablo Fossalba and Francisco Castander. These galaxies then have the expected colors, spatial distribution and the emission lines (important for the spectra observed by Euclid) and can be used to test and calibrate various systematics and random errors within the entire instrument pipeline of Euclid.”
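The clustering Stadel describes can be illustrated with a toy direct-summation N-body integrator. To be clear, this is nothing like PKDGRAV3 (which uses the fast multipole method to reach trillions of particles); it is a sketch in arbitrary units that only shows the basic idea of collisionless “pieces” of dark matter falling together under their own gravity:

```python
import random

# Toy N-body sketch: N equal-mass particles, direct O(N^2) force summation,
# with a softening length so close encounters stay finite. Units are arbitrary.
G, SOFT, DT, N, STEPS = 1.0, 0.05, 0.01, 50, 50
random.seed(1)
pos = [[random.random() for _ in range(3)] for _ in range(N)]  # random unit cube
vel = [[0.0, 0.0, 0.0] for _ in range(N)]                     # start at rest

def accelerations(pos):
    acc = [[0.0, 0.0, 0.0] for _ in range(N)]
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            d = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(x * x for x in d) + SOFT ** 2  # softened separation
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * d[k] * inv_r3      # all masses equal 1
    return acc

for _ in range(STEPS):  # simple kick-drift integration
    acc = accelerations(pos)
    for i in range(N):
        for k in range(3):
            vel[i][k] += acc[i][k] * DT
            pos[i][k] += vel[i][k] * DT
```

Run long enough, the particles fall into clumps – the cartoon version of the halos that the real simulation resolves with two trillion particles.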

Artist impression of the Euclid probe, which is set to launch in 2020. Credit: ESA

Thanks to the high precision of their calculations, the team was able to turn out a catalog that met the requirements of the European Space Agency’s Euclid mission, whose main objective is to explore the “dark universe”. This kind of research is essential to understanding the Universe on the largest of scales, mainly because the vast majority of the Universe is dark.

Between the 23% of the Universe which is made up of dark matter and the 72% that consists of dark energy, only one-twentieth of the Universe is actually made up of matter that we can see with normal instruments (aka. “luminous” or baryonic matter). Despite being proposed during the 1960s and 1990s respectively, dark matter and dark energy remain two of the greatest cosmological mysteries.

Their existence is required for our current cosmological models to work, yet it has only ever been inferred through indirect observation. Such indirect observation is precisely what the Euclid mission will perform over its six-year lifetime, capturing light from billions of galaxies and measuring it for the subtle distortions caused by the presence of mass in the foreground.

Much in the same way that background light is distorted by a gravitational field lying between it and the observer (a time-honored test of General Relativity), the presence of dark matter exerts a gravitational influence on the light. As Stadel explained, their simulated Universe will play an important role in the Euclid mission – providing a framework that will be used during and after the mission.

Diagram showing the Lambda-CDM universe, from the Big Bang to the current era. Credit: Alex Mittelmann/Coldcreation

“In order to forecast how well the current components will be able to make a given measurement, a Universe populated with galaxies as close as possible to the real observed Universe must be created,” he said. “This ‘mock’ catalogue of galaxies is what was generated from the simulation and will be now used in this way. However, in the future when Euclid begins taking data, we will also need to use simulations like this to solve the inverse problem. We will then need to be able to take the observed Universe and determine the fundamental parameters of cosmology; a connection which currently can only be made at a sufficient precision by large simulations like the one we have just performed. This is a second important aspect of how such simulation work [and] is central to the Euclid mission.”

From the Euclid data, researchers hope not only to obtain new information on the nature of dark matter, but also to discover new physics that goes beyond the Standard Model of particle physics – i.e. a modified version of general relativity or a new type of particle. As Stadel explained, the best outcome for the mission would be one in which the results do not conform to expectations.

“While it will certainly make the most accurate measurements of fundamental cosmological parameters (such as the amount of dark matter and energy in the Universe) far more exciting would be to measure something that conflicts or, at the very least, is in tension with the current ‘standard lambda cold dark matter‘ (LCDM) model,” he said. “One of the biggest questions is whether the so called ‘dark energy’ of this model is actually a form of energy, or whether it is more correctly described by a modification to Einstein’s general theory of relativity. While we may just begin to scratch the surface of such questions, they are very important and have the potential to change physics at a very fundamental level.”

In the future, Stadel and his colleagues hope to be running simulations on cosmic evolution that take into account both dark matter and dark energy. Someday, these exotic aspects of nature could form the pillars of a new cosmology, one which reaches beyond the physics of the Standard Model. In the meantime, astrophysicists from around the world will likely be waiting for the first batch of results from the Euclid mission with bated breath.

Euclid is one of several missions currently engaged in the hunt for dark matter and the study of how it shaped our Universe. Others include the Alpha Magnetic Spectrometer (AMS-02) experiment aboard the ISS, the ESO’s Kilo Degree Survey (KiDS), and CERN’s Large Hadron Collider. With luck, these experiments will reveal pieces of the cosmological puzzle that have remained elusive for decades.

Further Reading: UZH, Computational Astrophysics and Cosmology

How Do We Know the Universe is Flat? Discovering the Topology of the Universe

Does This Look Flat?


Whenever we talk about the expanding Universe, everyone wants to know how this is going to end. Sure, they say, the fact that most of the galaxies we can see are speeding away from us in all directions is really interesting. Sure, they say, the Big Bang makes sense, in that everything was closer together billions of years ago.

But how does it end? Does this go on forever? Do galaxies eventually slow down, come to a stop, and then hurtle back together in a Big Crunch? Will we get a non-stop cycle of Big Bangs, forever and ever?

Illustration of the Big Bang Theory
The Big Bang Theory: A history of the Universe starting from a singularity and expanding ever since. Credit: grandunificationtheory.com

We’ve done a bunch of articles on many different aspects of this question, and the current conclusion astronomers have reached is that because the Universe is flat, it’s never going to collapse in on itself and start another Big Bang.

But wait, what does it mean to say that the Universe is “flat”? Why is that important, and how do we even know?

Before we can get started talking about the flatness of the Universe, we need to talk about flatness in general. What does it mean to say that something is flat?

If you’re in a square room and walk around the corners, you’ll return to your starting point having made four 90-degree turns. You can say that your room is flat. This is Euclidean geometry.

Earth, seen from space, above the Pacific Ocean. Credit: NASA

But now make the same journey on the surface of the Earth: start at the equator, make a 90-degree turn, walk up to the North Pole, make another 90-degree turn, return to the equator, and one more 90-degree turn brings you back to your starting point.

In one situation, you made 4 turns to return to your starting point; in the other, it only took 3. That’s because the topology of the surface you’re walking on decides what happens when you take a 90-degree turn.
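The turn-counting is really a statement about angle sums, and the arithmetic is worth making explicit:

```python
# Flat (Euclidean) square walk: four 90° turns, interior angles summing to 360°.
flat_square = 4 * 90
# Equator-pole-equator walk: a spherical triangle with three 90° corners.
spherical_triangle = 3 * 90
# A flat triangle's angles always total 180°; the leftover "spherical excess"
# equals the triangle's area divided by the sphere's radius squared, so it is
# a direct measure of curvature.
excess = spherical_triangle - 180
print(flat_square, spherical_triangle, excess)  # 360 270 90
```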

You can imagine an even more extreme example, where you’re walking around inside a crater, and it takes more than 4 turns to return to your starting point.

Another analogy, of course, is the idea of parallel lines. If you fire off two lines from the North Pole, they move away from each other, following the topology of the Earth, and then come back together at the South Pole.

Got that? Great.

Omega Centauri. Credits: NASA, ESA and the Hubble SM4 ERO Team

Now, what about the Universe itself? You can apply the same test. Imagine flying out into space on a rocket for billions of light-years, performing 90-degree maneuvers and returning to your starting point.

You can’t do it in 3, or 5, you need 4, which means that the topology of the Universe is flat. Which is totally intuitive, right? I mean, that would be your assumption.

But astronomers were skeptical and needed to know for certain, and so, they set out to test this assumption.

In order to prove the flatness of the Universe, you would need to travel a long way, so astronomers use the largest observation they can make: the Cosmic Microwave Background Radiation, the afterglow of the Big Bang, visible in all directions as a red-shifted, fading snapshot of the moment the Universe became transparent, about 380,000 years after the Big Bang.

Cosmic Microwave Background Radiation. Image credit: NASA

When this radiation was released, the entire Universe was approximately 2,700 °C. This was the moment when it was cool enough that photons were finally free to roam across the Universe. The expansion of the Universe stretched these photons out over their 13.8-billion-year journey, shifting them down into the microwave spectrum, just 2.7 degrees above absolute zero.
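That cooling is just the redshift at work: temperature scales as 1/(1 + z), and the CMB has been stretched by a factor of roughly 1,100 since it was released (that redshift value is a standard estimate, not a figure from this article):

```python
# Temperature scales with redshift as T_now = T_then / (1 + z).
T_then = 2700 + 273.15  # ~2,700 °C at recombination, converted to kelvins
z = 1100                # approximate redshift of the CMB (standard estimate)
T_now = T_then / (1 + z)
print(f"{T_now:.1f} K above absolute zero")  # 2.7 K
```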

With the most sensitive space-based telescopes they have available, astronomers are able to detect tiny variations in the temperature of this background radiation.

And here’s the part that blows my mind every time I think about it. These tiny temperature variations correspond to the largest scale structures of the observable Universe. A region that was a fraction of a degree warmer became a vast galaxy cluster, hundreds of millions of light-years across.

Having a non-flat universe would cause distortions between what we saw in the CMBR compared to the current universe. Credit: NASA / WMAP Science Team

The Cosmic Microwave Background Radiation just gives and gives, and when it comes to figuring out the topology of the Universe, it has the answer we need. If the Universe were curved in any way, these temperature variations would appear distorted relative to the actual sizes of the structures we see today.

But they’re not. To the best of its ability, ESA’s Planck space telescope can’t detect any distortion at all. The Universe is flat.

Illustration of the ESA Planck Telescope in Earth orbit (Credit: ESA)

Well, that’s not exactly true. According to the best measurements astronomers have ever been able to make, the curvature of the Universe falls within a range of error bars that indicates it’s flat. Future observations by some super Planck telescope could show a slight curvature, but for now, the best measurements out there say… flat.

We say that the Universe is flat, and this means that parallel lines will always remain parallel. 90-degree turns behave as true 90-degree turns, and everything makes sense.

But what are the implications for the entire Universe? What does this tell us?

Unfortunately, the biggest thing is what it doesn’t tell us. We still don’t know if the Universe is finite or infinite. If we could measure its curvature, we could know that we’re in a finite Universe, and get a sense of what its actual true size is, out beyond the observable Universe we can measure.

The observable – or inferable – universe. This may just be a small component of the whole ball game.

We know that the volume of the Universe is at least 100 times more than we can observe. At least. If the flatness error bars get brought down, the minimum size of the Universe goes up.

And remember, an infinite Universe is still on the table.

Another thing this does is cause a problem for the original Big Bang theory, one that required the development of a theory like inflation.

Since the Universe is flat now, it must have been flat in the past, when the Universe was an incredibly dense singularity. And for it to maintain this level of flatness over 13.8 billion years of expansion is kind of amazing.

In fact, astronomers estimate that the Universe must have been flat to within 1 part in 1×10^57.

Which seems like an insane coincidence. The development of inflation, however, solves this by expanding the Universe an incomprehensible amount moments after the Big Bang. Pre- and post-inflation Universes can have vastly different levels of curvature.
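Here’s where that insane-coincidence number comes from: any departure from perfect flatness grows as the Universe expands, roughly as the square of the scale factor while radiation dominates and linearly while matter dominates. The sketch below uses round textbook values for the scale factors and today’s bound on the density parameter; different choices of starting epoch give fine-tuning figures in the 10^57 to 10^63 range, which is the ballpark quoted above:

```python
omega_today = 0.005        # rough current bound on |Omega - 1| (assumed round value)
a_equality  = 1e-4         # scale factor at matter-radiation equality (approximate)
a_early     = 1e-32        # scale factor just after inflation (very rough)

growth_matter    = 1 / a_equality               # |Omega - 1| grows ~linearly in a after equality
growth_radiation = (a_equality / a_early) ** 2  # and ~as a^2 before equality

omega_early = omega_today / (growth_matter * growth_radiation)
print(f"|Omega - 1| needed in the early Universe: < {omega_early:.0e}")  # 5e-63
```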

In the olden days, cosmologists used to say that the flatness of the Universe had implications for its future. If the Universe were curved such that you could complete the journey with fewer than 4 turns, it was closed and destined to collapse in on itself.

And if it took more than 4 turns, it was open and destined to expand forever.

New results from NASA’s Galaxy Evolution Explorer and the Anglo-Australian Telescope atop Siding Spring Mountain in Australia confirm that dark energy (represented by purple grid) is a smooth, uniform force that now dominates over the effects of gravity (green grid). Image credit: NASA/JPL-Caltech

Well, that doesn’t really matter any more. In 1998, astronomers discovered dark energy, the mysterious force accelerating the expansion of the Universe. Whether the Universe is open, closed or flat, it’s going to keep on expanding. In fact, that expansion is going to accelerate, forever.

I hope this gives you a little more understanding of what cosmologists mean when they say that the Universe is flat. And how do we know it’s flat? Very precise measurements in the Cosmic Microwave Background Radiation.

Is there anything that all-pervasive relic of the early Universe can’t do?

New Explanation for Dark Energy? Tiny Fluctuations of Time and Space

A new study from researchers from the University of British Columbia offers a new explanation of Dark Energy. Credit: NASA

Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this realization has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.

This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much as with Dark Matter, which was invoked to explain the “missing mass”, it then became necessary to find this elusive energy, or at least to provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.

The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.

Diagram showing the Lambda-CDM universe, from the Big Bang to the current era. Credit: Alex Mittelmann/Coldcreation

The team began by addressing the inconsistencies arising out of the two main theories that together explain all natural phenomena in the Universe. These theories are none other than General Relativity and quantum mechanics, which effectively explain how the Universe behaves on the largest of scales (i.e. stars, galaxies, clusters) and the smallest (subatomic particles).

Unfortunately, these two theories are not consistent when it comes to a little matter known as gravity, which scientists are still unable to explain in terms of quantum mechanics. The existence of Dark Energy and the expansion of the Universe are another point of disagreement. For starters, candidate theories like vacuum energy – which is one of the most popular explanations for Dark Energy – present serious incongruities.

According to quantum mechanics, vacuum energy would have an incredibly large energy density. But if this is true, then General Relativity predicts that this energy would have an incredibly strong gravitational effect, one powerful enough to cause the Universe to explode in size. As Prof. Unruh shared with Universe Today via email:

“The problem is that any naive calculation of the vacuum energy gives huge values. If one assumes that there is some sort of cutoff so one cannot get energy densities much greater than the Planck energy density (or about 10⁹⁵ Joules/meter³) then one finds that one gets a Hubble constant – the time scale on which the Universe roughly doubles in size – of the order of 10⁻⁴⁴ sec. So, the usual approach is to say that somehow something reduces that down so that one gets the actual expansion rate of about 10 billion years instead. But that ‘somehow’ is pretty mysterious and no one has come up with an even half convincing mechanism.”
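The scale of the mismatch Unruh describes can be sketched with a few lines of back-of-the-envelope arithmetic. The constants are standard CODATA values, but the observed dark-energy density is a rough, illustrative figure, and the whole calculation is only an order-of-magnitude sketch:

```python
import math

# Back-of-the-envelope version of the "vacuum catastrophe" (SI units).
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J*s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2

# Planck energy density: the naive cutoff scale for vacuum energy
planck_density = c**7 / (hbar * G**2)   # ~1e113 J/m^3

# Observed dark-energy density (~69% of the critical density), approximate
observed_density = 6e-10                # J/m^3

mismatch = planck_density / observed_density
print(f"naive prediction : {planck_density:.1e} J/m^3")
print(f"observed         : {observed_density:.1e} J/m^3")
print(f"mismatch         : ~10^{round(math.log10(mismatch))}")
```

This is the famous discrepancy of roughly 120 orders of magnitude, often called the worst prediction in the history of physics.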

Timeline of the Big Bang and the expansion of the Universe. Credit: NASA

Whereas other scientists have sought to modify the theories of General Relativity and quantum mechanics in order to resolve these inconsistencies, Wang and his colleagues sought a different approach. As Wang explained to Universe Today via email:

“Previous studies are either trying to modify quantum mechanics in some way to make vacuum energy small or trying to modify General Relativity in some way to make gravity numb for vacuum energy. However, quantum mechanics and General Relativity are the two most successful theories that explain how our Universe works… Instead of trying to modify quantum mechanics or General Relativity, we believe that we should first understand them better. We take the large vacuum energy density predicted by quantum mechanics seriously and just let it gravitate according to General Relativity without modifying either of them.”

For the sake of their study, Wang and his colleagues performed new sets of calculations on vacuum energy that took its predicted high energy density into account. They then considered the possibility that on the tiniest of scales – billions of times smaller than electrons – the fabric of spacetime is subject to wild fluctuations, oscillating at every point between expansion and contraction.

Could fluctuations at the tiniest levels of space time explain Dark Energy and the expansion of the cosmos? Credit: University of Washington

As spacetime swings back and forth, the net effect of these oscillations is that the Universe expands slowly, but at an accelerating rate. After performing their calculations, the team noted that such an explanation was consistent with both the existence of quantum vacuum energy density and General Relativity. On top of that, it is also consistent with what scientists have been observing in our Universe for almost a century. As Unruh described it:

“Our calculations showed that one could consistently regard [that] the Universe on the tiniest scales is actually expanding and contracting at an absurdly fast rate; but that on a large scale, because of an averaging over those tiny scales, physics would not notice that ‘quantum foam’. It has a tiny residual effect in giving an effective cosmological constant (dark energy type effect). In some ways it is like waves on the ocean which travel as if the ocean were perfectly smooth but really we know that there is this incredible dance of the atoms that make up the water, and waves average over those fluctuations, and act as if the surface was smooth.”

In contrast to theories in which the various forces governing the Universe cannot be resolved and must simply cancel each other out, Wang and his colleagues present a picture of a Universe that is constantly in motion. In this scenario, the effects of vacuum energy are actually self-cancelling, and also give rise to the expansion and acceleration we have been observing all this time.

While it may be too soon to tell, this image of a Universe that is highly-dynamic (even on the tiniest scales) could revolutionize our understanding of spacetime. At the very least, these theoretical findings are sure to stimulate debate within the scientific community, as well as experiments designed to offer direct evidence. And that, as we know, is the only way we can advance our understanding of this thing known as the Universe.

Further Reading: UBC News, Physical Review D

Rise Of The Super Telescopes: The Wide Field Infrared Survey Telescope

NASA's Wide Field Infrared Survey Telescope (WFIRST) will capture Hubble-quality images covering swaths of sky 100 times larger than Hubble does. These enormous images will allow astronomers to study the evolution of the cosmos. Its Coronagraph Instrument will directly image exoplanets and study their atmospheres. Credits: NASA/GSFC/Conceptual Image Lab

We humans have an insatiable hunger to understand the Universe. As Carl Sagan said, “Understanding is Ecstasy.” But to understand the Universe, we need better and better ways to observe it. And that means one thing: big, huge, enormous telescopes.

In this series we’ll look at the world’s upcoming Super Telescopes:

The Wide Field Infrared Survey Telescope (WFIRST)

It’s easy to forget the impact that the Hubble Space Telescope has had on our state of knowledge about the Universe. In fact, that might be the best measurement of its success: We take the Hubble, and all we’ve learned from it, for granted now. But other space telescopes are being developed, including the WFIRST, which will be much more powerful than the Hubble. How far will these telescopes extend our understanding of the Universe?

“WFIRST has the potential to open our eyes to the wonders of the universe, much the same way Hubble has.” – John Grunsfeld, NASA Science Mission Directorate

The WFIRST might be the true successor to the Hubble, even though the James Webb Space Telescope (JWST) is often touted as such. But it may be incorrect to even call WFIRST a telescope; it’s more accurate to call it an astrophysics observatory. That’s because one of its primary science objectives is to study Dark Energy, that rather mysterious force that drives the expansion of the Universe, and Dark Matter, the difficult-to-detect matter that slows that expansion.

WFIRST will have a 2.4 meter mirror, the same size as Hubble’s. But it will have a camera that expands the power of that mirror: the Wide Field Instrument, a 288-megapixel multi-band near-infrared camera. Once it’s in operation, it will capture images every bit as sharp as those from Hubble. But there is one huge difference: the Wide Field Instrument will capture images covering over 100 times the area of sky that Hubble does.

Alongside the Wide Field Instrument, WFIRST will have the Coronagraphic Instrument. The Coronagraphic Instrument will advance the study of exoplanets. It’ll use a system of filters and masks to block out the light from other stars, and home in on planets orbiting those stars. This will allow very detailed study of the atmospheres of exoplanets, one of the main ways of determining habitability.

WFIRST is slated to be launched in 2025, although it’s too soon to have an exact date. But when it launches, the plan is for WFIRST to travel to the Sun-Earth Lagrange Point 2 (L2). L2 is a gravitationally balanced point in space where WFIRST can do its work without interruption. The mission is set to last about six years.

Probing Dark Energy

“WFIRST has the potential to open our eyes to the wonders of the universe, much the same way Hubble has,” said John Grunsfeld, astronaut and associate administrator of NASA’s Science Mission Directorate at Headquarters in Washington. “This mission uniquely combines the ability to discover and characterize planets beyond our own solar system with the sensitivity and optics to look wide and deep into the universe in a quest to unravel the mysteries of dark energy and dark matter.”

In a nutshell, there are two proposals for what Dark Energy can be. The first is the cosmological constant, where Dark Energy is uniform throughout the cosmos. The second is what’s known as scalar fields, where the density of Dark Energy can vary in time and space.
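The difference between the two proposals is often captured by the equation-of-state parameter w, which controls how the dark-energy density changes as the Universe grows. The scaling formula below is standard cosmology; the sample values of w and the scale factor are purely illustrative:

```python
# Dark-energy density scales with the cosmic scale factor a as
#   rho(a) = rho_0 * a**(-3 * (1 + w))
# A cosmological constant has w = -1 exactly, so the density never changes;
# a scalar field ("quintessence") allows w != -1, so the density evolves.
def dark_energy_density(a, w, rho_0=1.0):
    return rho_0 * a ** (-3 * (1 + w))

for w in (-1.0, -0.9, -1.1):
    densities = [f"{dark_energy_density(a, w):.3f}" for a in (0.5, 1.0, 2.0)]
    print(f"w = {w:+.1f}: rho/rho_0 at a = 0.5, 1, 2 ->", densities)
```

Surveys like the one WFIRST will perform aim to pin down whether w really is exactly -1 at all times, which would favor the cosmological constant.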

We used to think that the Universe expanded at a steady rate. Then in the 1990s we discovered that the expansion had started accelerating about 5 billion years ago. Dark Energy is the name given to the force driving that expansion. Image: NASA/STSci/Ann Feild

Since the 1990s, observations have shown us that the expansion of the Universe is accelerating. That acceleration started about 5 billion years ago. We think that Dark Energy is responsible for that accelerated expansion. By providing such large, detailed images of the cosmos, WFIRST will let astronomers map expansion over time and over large areas. WFIRST will also precisely measure the shapes, positions and distances of millions of galaxies to track the distribution and growth of cosmic structures, including galaxy clusters and the Dark Matter accompanying them. The hope is that this will give us a next level of understanding when it comes to Dark Energy.

If that all sounds too complicated, look at it this way: We know the Universe is expanding, and we know that the expansion is accelerating. We want to know why it’s expanding, and how. We’ve given the name ‘Dark Energy’ to the force that’s driving that expansion, and now we want to know more about it.
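The expansion itself is quantified by Hubble’s law: the farther away a galaxy is, the faster it recedes from us. A toy calculation, assuming an approximate modern value for the Hubble constant:

```python
# Hubble's law: recession velocity grows linearly with distance, v = H0 * d.
H0 = 70.0  # km/s per megaparsec (approximate value; surveys aim to refine it)

def recession_velocity(d_mpc):
    """Recession velocity in km/s for a galaxy d_mpc megaparsecs away."""
    return H0 * d_mpc

for d in (10, 100, 1000):
    print(f"a galaxy {d:5d} Mpc away recedes at ~{recession_velocity(d):,.0f} km/s")
```

Mapping how this relation changes over cosmic time, rather than today only, is what reveals the acceleration attributed to Dark Energy.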

Probing Exoplanets

Dark Energy and the expansion of the Universe are a huge mystery, and a question that drives cosmologists. (They really want to know how the Universe will end!) But for many of the rest of us, another question is even more compelling: Are we alone in the Universe?

There’ll be no quick answer to that one, but any answer we find begins with studying exoplanets, and that’s something that WFIRST will also excel at.

Artist’s concept of the TRAPPIST-1 star system, an ultra-cool dwarf that has seven Earth-size planets orbiting it. We’re going to keep finding more and more solar systems like this, but we need observatories like WFIRST to understand the planets better. Credits: NASA/JPL-Caltech

“WFIRST is designed to address science areas identified as top priorities by the astronomical community,” said Paul Hertz, director of NASA’s Astrophysics Division in Washington. “The Wide-Field Instrument will give the telescope the ability to capture a single image with the depth and quality of Hubble, but covering 100 times the area. The coronagraph will provide revolutionary science, capturing the faint, but direct images of distant gaseous worlds and super-Earths.”

“The coronagraph will provide revolutionary science, capturing the faint, but direct images of distant gaseous worlds and super-Earths.” – Paul Hertz, NASA Astrophysics Division

The difficulty in studying exoplanets is that they are all orbiting stars. Stars are so bright they make it impossible to see their planets in any detail. It’s like staring into a lighthouse miles away and trying to study an insect near the lighthouse.

The Coronagraphic Instrument on board WFIRST will excel at blocking out the light of distant stars. It does that with a system of mirrors and masks. This is what makes studying exoplanets possible: only when the light from the star is dealt with can the properties of exoplanets be examined.

This will allow detailed measurements of the chemical composition of an exoplanet’s atmosphere. By doing this over thousands of planets, we can begin to understand the formation of planets around different types of stars. There are some limitations to the Coronagraphic Instrument, though.

The Coronagraphic Instrument was kind of a late addition to WFIRST. Some of the other instrumentation on WFIRST isn’t optimized to work with it, so there are some restrictions to its operation. It will only be able to study gas giants, and so-called Super-Earths. These larger planets don’t require as much finesse to study, simply because of their size. Earth-like worlds will likely be beyond the power of the Coronagraphic Instrument.

These limitations are no big deal in the long run. The Coronagraph is actually more of a technology demonstration, and it doesn’t represent the end-game for exoplanet study. Whatever is learned from this instrument will help us in the future. There will be an eventual successor to WFIRST some day, perhaps decades from now, and by that time Coronagraph technology will have advanced a great deal. At that future time, direct snapshots of Earth-like exoplanets may well be possible.

But maybe we won’t have to wait that long.

Starshade To The Rescue?

There is a plan to boost the effectiveness of the Coronagraph on WFIRST that would allow it to image Earth-like planets. It’s called the EXO-S Starshade.

The EXO-S Starshade is a 34 m diameter deployable shading system that will block starlight from impairing the function of WFIRST. It would actually be a separate craft, launched on its own and sent to rendezvous with WFIRST at L2. It would not be tethered, but would orient itself with WFIRST through a system of cameras and guide lights. In fact, part of the power of the Starshade is that it would be about 40,000 to 50,000 km away from WFIRST.
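Those distances aren’t arbitrary. A rough small-angle estimate, using the 34 m diameter and the 40,000 to 50,000 km separations quoted for the EXO-S design, shows the shade would subtend only a fraction of an arcsecond: just enough to cover a star while leaving its planets visible around it.

```python
# Angular size of the starshade as seen from WFIRST (small-angle approximation).
ARCSEC_PER_RAD = 206265        # arcseconds in one radian
shade_diameter_m = 34.0        # EXO-S Starshade diameter

def subtended_arcsec(distance_km):
    """Angle the shade covers on the sky, in arcseconds, at a given distance."""
    return shade_diameter_m / (distance_km * 1e3) * ARCSEC_PER_RAD

for d_km in (40_000, 50_000):
    print(f"at {d_km:,} km the shade subtends ~{subtended_arcsec(d_km):.2f} arcsec")
```

A shade that small on the sky can blot out the star itself while planets separated by a few tenths of an arcsecond or more remain outside its shadow.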

Dark Energy and Exoplanets are priorities for WFIRST, but there are always other discoveries awaiting better telescopes. It’s not possible to predict everything that we’ll learn from WFIRST. With images as detailed as Hubble’s, but 100 times larger, we’re in for some surprises.

“This mission will survey the universe to find the most interesting objects out there.” – Neil Gehrels, WFIRST Project Scientist

“In addition to its exciting capabilities for dark energy and exoplanets, WFIRST will provide a treasure trove of exquisite data for all astronomers,” said Neil Gehrels, WFIRST project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “This mission will survey the universe to find the most interesting objects out there.”

With all of the Super Telescopes coming on line in the next few years, we can expect some amazing discoveries. In 10 to 20 years time, our knowledge will have advanced considerably. What will we learn about Dark Matter and Dark Energy? What will we know about exoplanet populations?

Right now it seems like we’re just groping towards a better understanding of these things, but with WFIRST and the other Super Telescopes, we’re poised for more purposeful study.

James Webb Space Telescope Celebrated in Stunning New Video

Behold, the mighty primary mirror of the James Webb Space Telescope, in all its gleaming glory! Image: NASA/Chris Gunn

NASA has some high hopes for the James Webb Space Telescope, which finished the “cold” phase of its construction at the end of November 2016. The result of 20 years of engineering and construction, this telescope is seen as Hubble’s natural successor. Once it is deployed in October of 2018, it will use a 6.5 meter (21 ft 4 in) primary mirror to examine the Universe in the visible, near-infrared, and mid-infrared wavelengths.

All told, the JWST will be 100 times more powerful than its predecessor and will be capable of looking back over 13 billion years in time. To honor the completion of the telescope, Northrop Grumman – the company contracted by NASA to build it – and Crazy Boat Pictures teamed up to produce a short film about it. Titled “Into the Unknown – the Story of NASA’s James Webb Space Telescope”, the video chronicles the project from inception to completion.

The film (which you can watch at the bottom of the page) shows the construction of the telescope’s large mirrors, its instrument package, and its framework. It also features conversations with the scientists and engineers who were involved and some stunning visuals. In addition to detailing the creation process, the film also delves into the telescope’s mission and all the cosmological questions it will address.

In addressing the nature of James Webb’s mission, the film also pays homage to the Hubble Space Telescope and its many accomplishments. Over the course of its 26 years of operation, Hubble has revealed auroras and supernovae, and discovered billions of stars, galaxies, and exoplanets, some of which were shown to orbit within their stars’ respective habitable zones.

On top of that, Hubble was used to determine the age of the Universe (13.8 billion years) and confirmed the existence of the supermassive black hole (SMBH) – Sagittarius A* – at the center of our galaxy, not to mention many others. It was also responsible for measuring the rate at which the Universe is expanding – in other words, measuring the Hubble Constant.

This played a pivotal role in helping scientists to develop the theory of Dark Energy, one of the most profound discoveries since Edwin Hubble (the telescope’s namesake) proposed that the Universe is in a state of expansion back in 1929. So it goes without saying that the deployment of the Hubble Space Telescope led to some of the greatest discoveries in modern astronomy.

That being said, Hubble is still subject to limitations, which astronomers are now hoping to push past. For one, its instruments are not able to pick up the most distant (and hence, dimmest) galaxies in the Universe, which date to just a few hundred million years after the Big Bang. Even with “The Deep Fields” initiative, Hubble is still limited to seeing back to about half a billion years after the Big Bang.

Illustration of the depth by which Hubble imaged galaxies in prior Deep Field initiatives, in units of the Age of the Universe. The goal of the Frontier Fields is to peer back further than the Hubble Ultra Deep Field and get a wealth of images of galaxies as they existed in the first several hundred million years after the Big Bang. Note that the unit of time is not linear in this illustration. Illustration Credit: NASA and A. Feild (STScI)

As Dr. John Mather, the project scientist for the James Webb Telescope, told Universe Today via email:

“Hubble showed us that we could not see the first galaxies being born, because they’re too far away, too faint, and too red. JWST is bigger, colder, and observes infrared light to see those first galaxies.  Hubble showed us there’s a black hole in the center of almost every galaxy. JWST will look as far back in time as possible to see when and how that happened: did the galaxy form the black hole, or did the galaxy grow around a pre-existing black hole?  Hubble showed us great clouds of glowing gas and dust where stars are being born. JWST will look through the dust clouds to see the stars themselves as they form in the cloud. Hubble showed us that we can see some planets around other stars, and that we can get chemical information about other planets that happen to pass directly in front of their stars.  JWST will extend this to longer wavelengths with a bigger telescope, with a possibility of detecting water on a super-Earth exoplanet. Hubble showed us details of planets and asteroids close to home, and JWST will give a closer look, though it’s still better to send a visiting robot if we can.”
Basically, the JWST will be able to see farther back in time, to about 100 million years after the Big Bang, when the first stars and galaxies were born. It is also designed to operate at the L2 Lagrange Point, farther away from the Earth than Hubble, which was designed to remain in low-Earth orbit. This means the JWST will be subject to less thermal and optical interference from the Earth and the Moon, but it will also be more difficult to service.

With its much larger set of segmented mirrors, it will observe the Universe as it captures light from the first galaxies and stars. Its extremely-sensitive suite of optics will also be able to gather information in the long-wavelength (orange-red) and infrared wavelengths with greater accuracy, measuring the redshift of distant galaxies and even helping in the hunt for extra-solar planets.

One of the primary mirror segments of the James Webb Space Telescope, made of beryllium. Credit: NASA/MSFC/David Higginbotham/Emmett Given

With the assembly of its major components now complete, the telescope will spend the next two years undergoing tests before its scheduled launch date in October 2018. These will include stress tests that will subject the telescope to the types of intense vibrations, sounds, and g forces (ten times Earth’s gravity) it will experience inside the Ariane 5 rocket that will take it into space.

Six months before its deployment, NASA also plans to send the JWST to the Johnson Space Center, where it will be subjected to the kinds of conditions it will experience in space. This will consist of scientists placing the telescope in a chamber where temperatures will be lowered to 53 K (-220 °C; -364 °F), which will simulate its operating conditions at the L2 Lagrange Point.

Once all of that is complete, and the JWST checks out, it will be launched aboard an Ariane 5 rocket from Arianespace’s ELA-3 launch pad in French Guiana. And thanks to experience gained from Hubble and updated algorithms, the telescope will be focused and gathering information shortly after it is launched. And as Dr. Mather explained, the big cosmological questions it is expected to address are numerous:

“Where did we come from? The Big Bang gave us hydrogen and helium spread out almost uniformly across the universe. But something, presumably gravity, stopped the expansion of the material and turned it into galaxies and stars and black holes. JWST will look at all these processes: how did the first luminous objects form, and what were they? How and where did the black holes form, and what did they do to the growing galaxies? How did the galaxies cluster together, and how did galaxies like the Milky Way grow and develop their beautiful spiral structure? Where is the cosmic dark matter and how does it affect ordinary matter? How much dark energy is there, and how does it change with time?”

Needless to say, NASA and the astronomical community are quite excited that the James Webb Telescope has finished construction, and can’t wait until it is deployed and begins to send back data. One can only imagine the kinds of things it will see deep in the cosmic field. But in the meantime, be sure to check out the film and see how this effort all came together:

Further Reading: NASA – JWST, Northrop Grumman

ESO Survey Shows Dark Matter to be Pretty “Smooth”

The technique of gravitational lensing relies on the presence of a large cluster of matter between the observer and the object to magnify light coming from that object. Credit: NASA

Dark Matter has been something of a mystery ever since it was first proposed. In addition to trying to find some direct evidence of its existence, scientists have also spent the past few decades developing theoretical models to explain how it works. In recent years, the popular conception has been that Dark Matter is “cold”, and distributed in clumps throughout the Universe, an observation supported by the Planck mission data.

However, a new study produced by an international team of researchers paints a different picture. Using data from the Kilo Degree Survey (KiDS), these researchers studied how the light coming from millions of distant galaxies was affected by the gravitational influence of matter on the largest of scales. What they found was that Dark Matter appears to be more smoothly distributed throughout space than previously thought.


New Theory of Gravity Does Away With Need for Dark Matter

Erik Verlinde explains his new view of gravity. Credit: University of Amsterdam

Let’s be honest. Dark matter’s a pain in the butt. Astronomers have gone to great lengths to explain why it must exist, and exist in huge quantities, yet it remains hidden. Unknown. Emitting no visible energy, yet apparently strong enough to keep galaxies in clusters from busting free like wild horses, it’s everywhere in vast quantities. What is the stuff – axions, WIMPs, gravitinos, Kaluza-Klein particles?

Estimated distribution of matter and energy in the universe. Credit: NASA

It’s estimated that dark matter makes up 27% of the mass-energy content of the universe, while everything from PB&J sandwiches to quasars accounts for just 4.9%. But a new theory of gravity proposed by theoretical physicist Erik Verlinde of the University of Amsterdam has found a way to dispense with the pesky stuff.
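The arithmetic behind those figures is simple: whatever fraction of the cosmic energy budget is neither dark matter nor ordinary matter gets attributed to dark energy, the remaining two-thirds or so.

```python
# Cosmic energy budget, using the fractions quoted above (approximate).
dark_matter     = 0.27    # invisible matter
ordinary_matter = 0.049   # everything from PB&J sandwiches to quasars

# The remainder is attributed to dark energy.
dark_energy = 1.0 - dark_matter - ordinary_matter

print(f"dark matter:     {dark_matter:.1%}")
print(f"ordinary matter: {ordinary_matter:.1%}")
print(f"dark energy:     {dark_energy:.1%}")
```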

Snowflakes exemplify the concept of emergence with their complex symmetrical and fractal patterns created when much simpler pieces join together. Credit: Bob King

Unlike the traditional view of gravity as a fundamental force of nature, Verlinde sees it as an emergent property of space.  Emergence is a process where nature builds something large using small, simple pieces such that the final creation exhibits properties that the smaller bits don’t. Take a snowflake. The complex symmetry of a snowflake begins when a water droplet freezes onto a tiny dust particle. As the growing flake falls, water vapor freezes onto this original crystal, naturally arranging itself into a hexagonal (six-sided) structure of great beauty. The sensation of temperature is another emergent phenomenon, arising from the motion of molecules and atoms.

So too with gravity, which according to Verlinde, emerges from entropy. We all know about entropy and messy bedrooms, but it’s a bit more subtle than that. Entropy is a measure of disorder in a system or put another way, the number of different microscopic states a system can be in. One of the coolest descriptions of entropy I’ve heard has to do with the heat our bodies radiate. As that energy dissipates in the air, it creates a more disordered state around us while at the same time decreasing our own personal entropy to ensure our survival. If we didn’t get rid of body heat, we would eventually become disorganized (overheat!) and die.

The more massive the object, the more it distorts space-time, shown here as the green mesh. Earth orbits the Sun by rolling around the dip created by the Sun’s mass in the fabric of space-time. It doesn’t fall into the Sun because it also possesses forward momentum. Credit: LIGO/T. Pyle

Emergent or entropic gravity, as the new theory is called, predicts the exact same deviation in the rotation rates of stars in galaxies currently attributed to dark matter. Gravity emerges in Verlinde’s view from changes in fundamental bits of information stored in the structure of space-time, that four-dimensional continuum revealed by Einstein’s general theory of relativity. In a word, gravity is a consequence of entropy and not a fundamental force.

Space-time, composed of the three familiar dimensions in addition to time, is flexible. Mass warps the 4-D fabric into hills and valleys that direct the motion of smaller objects nearby. The Sun doesn’t so much “pull” on the Earth as envisaged by Isaac Newton but creates a great pucker in space-time that Earth rolls around in.

In a 2010 article, Verlinde showed how Newton’s law of gravity, which describes everything from how apples fall from trees to little galaxies orbiting big galaxies, derives from these underlying microscopic building blocks.

His latest paper, titled Emergent Gravity and the Dark Universe, delves into dark energy’s contribution to the mix.  The entropy associated with dark energy, a still-unknown form of energy responsible for the accelerating expansion of the universe, turns the geometry of spacetime into an elastic medium.

“We find that the elastic response of this ‘dark energy’ medium takes the form of an extra ‘dark’ gravitational force that appears to be due to ‘dark matter’,” writes Verlinde. “So the observed dark matter phenomena is a remnant, a memory effect, of the emergence of spacetime together with the ordinary matter in it.”

This diagram shows rotation curves of stars in M33, a typical spiral galaxy. The vertical scale is speed and the horizontal is distance from the galaxy’s nucleus. Normally, we expect stars to slow down the farther they are from galactic center (bottom curve), but in fact they revolve much faster (top curve). The discrepancy between the two curves is accounted for by adding a dark matter halo surrounding the galaxy. Credit: Public domain / Wikipedia
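The gap between the two curves can be illustrated with the Keplerian expectation, v = √(GM/r), which any theory based on visible mass alone predicts. The galaxy mass below is a rough, assumed value (about 5 × 10¹⁰ solar masses), chosen only to show the shape of the falloff:

```python
import math

# If visible mass were all there is, orbital speeds beyond the luminous disk
# should fall off Keplerian-style: v = sqrt(G * M / r). Observed rotation
# curves stay roughly flat instead -- the gap that dark matter (or, in
# Verlinde's picture, entropic gravity) has to explain.
G         = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_visible = 1.0e41      # kg, illustrative visible mass of a spiral galaxy
KPC       = 3.086e19    # meters per kiloparsec

def keplerian_speed_kms(r_kpc):
    """Expected orbital speed (km/s) at r_kpc if only visible mass gravitates."""
    return math.sqrt(G * M_visible / (r_kpc * KPC)) / 1e3

for r in (5, 10, 20, 40):
    print(f"r = {r:2d} kpc: expected v ~ {keplerian_speed_kms(r):.0f} km/s")
```

The predicted speeds drop steadily with distance, while real stars in galaxies like M33 keep orbiting at roughly constant speed far beyond the visible disk.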

I’ll be the first one to say how complex Verlinde’s concept is, wrapped in arcane entanglement entropy, tensor fields and the holographic principle, but the basic idea – that gravity is not a fundamental force – makes for a fascinating new way to look at an old face.

Physicists have tried for decades to reconcile gravity with quantum physics, with little success. And while Verlinde’s theory should rightly be taken with a grain of salt, he may offer a way to combine the two disciplines into a single narrative that describes how everything from falling apples to black holes is connected in one coherent theory.