For decades, various physicists have theorized that even the slightest changes in the fundamental laws of nature would make it impossible for life to exist. This idea, also known as the “Fine-Tuned Universe” argument, suggests that the occurrence of life in the Universe is very sensitive to the values of certain fundamental physics. Alter any of these values (as the logic goes), and life would not exist, meaning we must be very fortunate to be here!
But can this really be the case, or is it possible that life can emerge under different physical constants, and we just don’t know it? This question was recently tackled by Luke A. Barnes, a postdoctoral researcher at the Sydney Institute for Astronomy (SIfA) in Australia. In their recent book, A Fortunate Universe: Life in a Finely Tuned Cosmos, Barnes and Sydney astrophysics professor Geraint F. Lewis argued that a fine-tuned Universe makes sense from a physics standpoint.
The authors also summarized these arguments in an invited contribution paper, which appeared in the Routledge Companion to Philosophy of Physics (1st ed.). In this paper, titled “The Fine-Tuning of the Universe for Life,” Barnes explains how “fine-tuning” consists of explaining observations by employing a “suspiciously precise assumption.” This, he argues, has been symptomatic of incomplete theories throughout history and is a common feature of modern cosmology and particle physics.
In some respects, this idea is similar to the Anthropic Principle, which states that any attempt to explain the properties of the Universe cannot ignore our existence as lifeforms. This stands in stark contrast to the Cosmological Principle – aka. Copernican Principle, named after Nicolaus Copernicus, who formulated the heliocentric model of the Universe – which states that there is nothing unique or special about humans or our place in the Universe.
In a previous paper, Barnes and Lewis argued that far from being a case of arrogance or “religion in disguise,” the Anthropic Principle is a necessary part of science. When addressing the coincidence between humanity’s existence and a Universe that is old enough and governed by physics that favor the emergence of intelligent life (i.e., us), they derived a simple maxim: “Any account of the coincidence must consider how the Universe makes beings that are capable of measuring [it].”
But as Barnes explained to Universe Today via email, there are some significant differences between the Anthropic Principle and the Fine-Tuned Universe:
“I understand the relationship between fine-tuning and the anthropic principle as follows. Fine-tuning refers to the fact that small changes to the constants of nature would have resulted in a universe incapable of supporting life. The Anthropic Principle says that if physical life-forms exist, they must observe that they are in a universe that is capable of sustaining their existence.”
Put another way, Barnes states that the Anthropic Principle is an unfalsifiable statement (aka. a tautology) that results from the “selection effect” of our own existence. Since we do not have a population of intelligent life and civilizations to select from, the principle itself cannot be falsified. Meanwhile, says Barnes, the fine-tuning argument is a “surprising fact about the laws of nature as we know them.”
The Fine-Tuned Universe argument dates back to the 1970s, when physicists began to note that small changes to the fundamental constants of nature, or in the Universe’s initial conditions, would rule out life as we know it. Had the cosmos and the laws of physics themselves evolved differently, the stability required for living creatures to exist (in all their complexity) would not be possible.
But as Barnes notes in his summary paper, this logic runs afoul of the same old problem. Like the geocentric model of antiquity, it contains suspiciously precise assumptions, which he proceeds to address one by one. The first has to do with the Cosmological Constant (CC), an idea Einstein proposed in 1917 as a temporary addition to his field equations for General Relativity. Denoted by the character Lambda, the CC was a force that would “counterbalance gravity” and thus ensure the Universe remained static (a popular view at the time).
While Einstein ditched the CC a few years later when he learned that astronomers had proven that the Universe is expanding, the idea has been reinterpreted since the 1990s. With the realization that cosmic expansion is accelerating, physicists began postulating that Einstein’s CC could be the mysterious force known as “Dark Energy” (DE). This led to the widely accepted cosmological theory known as the Lambda Cold Dark Matter (LCDM) model.
However, the CC also represents one of the most significant theoretical problems in modern physics. Like Dark Matter, the existence of DE or a reinvented CC was proposed to explain the difference between observations and theoretical predictions. Like Ptolemy’s “epicycles” that were used to rationalize observations that didn’t conform with the geocentric model, the CC is an assumption that is “suspiciously precise.”
In addition, there are the inconsistencies the CC has with quantum field theory (QFT), which describes particles as configurations of a field. According to QFT, a particular configuration known as a “vacuum state” will still exist in the absence of particles. But if theories regarding the CC and DE are to be believed, this would mean that there is a considerable amount of energy in the vacuum state.
The only way to explain this in terms acceptable to QFT and General Relativity is by assuming that the contributions of vacuum energy and quantum fields cancel each other out. Once again, this requires a “suspiciously precise” coincidence between several independent factors. In another vein, the Standard Model of Particle Physics tells us that matter consists of 25 different types of subatomic particles divided into four groups (Quarks, Leptons, Gauge Bosons, and Scalar Bosons).
The existence of these particles and their respective properties (mass, charge, and spin) have all been verified through rigorous experimentation. The slightest deviation to any of these properties would significantly affect how they interact and behave, leading to the complete instability of matter. Much the same is true of the dimensionality of spacetime, where three dimensions of space (as postulated by Newton) are needed for stable atoms and stable planetary orbits.
A Universe with three spatial dimensions and one dimension of time (as described by General Relativity) is also essential. Any more, says Barnes, and atomic systems could not remain stable. In other words, while the CC may raise theoretical problems, the Standard Model and the dimensionality of space-time are consistent with the fine-tuned model. As Barnes put it:
“The cosmological constant is unexplained in our equations and is consistent with a life-permitting universe only in a very small range. Its value is an unmotivated and precise assumption, in the context of the standard models of particle physics and cosmology. Many of the other constants of the standard model are the same.”
The question, then, is how does one resolve these issues in our conventional models? What else could explain the fact that our Universe is life-permitting while variations of the smallest kind would make that impossible? To this, Barnes and Lewis suggest that the Multiverse could come to the rescue. “Perhaps the multiverse – our universe is life-permitting by chance, and there are lots of other variegated universes out there,” he said.
But in the meantime, there is still the possibility that any inconsistencies or incongruities indicate what the truth is. Like Copernicus, who realized that the motions of the planets (which required epicycles and equants to make sense) were actually an indication that the model was wrong, fine-tuning may be an indication of physics beyond the standard model or that the model itself needs revision.
“I think fine-tuning in general is a clue to a deeper explanation. Small probabilities might just be small probabilities, or they might be generated by some incorrect assumptions,” Barnes added. “The interesting thing about the fine-tuning of the fundamental constants is that they’re at the bottom floor of scientific explanations at the moment. They’re as deep as physics goes (at least, while it’s supported by evidence.)”
Barnes and Lewis are also responsible for The Cosmic Revolutionary’s Handbook: (Or: How to Beat the Big Bang), which further details their theories on cosmology and the fine-tuned model (published in 2019).
Further Reading: arXiv
The preprint is not published under science but under History and Philosophy of Physics, and is a rehash of philosophical ideas, as scientifically irrelevant terms like “tautology” attest.
Even if any of the characterizations of the vacuum energy density, Lambda in LambdaCDM models, were tautological, they are testable, and that is precisely what happened here. The paper itself helpfully describes this:
“The anthropic prediction by Weinberg (1987) of the cosmological constant provides an excellent test case. … Weinberg’s analytic calculation gives an upper limit of ρ_Λ,max ≈ 550ρ_0 ≈ 3 × 10^−121 where ρ_0 is the present cosmic mass density. Weinberg made this prediction before observation showed that ρ_Λ ≈ 1.2 × 10^−123.”
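As a rough sanity check on the numbers quoted above (a sketch only, taking the two dimensionless values exactly as given in the excerpt), the observed value does indeed fall below Weinberg’s anthropic upper limit, by a factor of roughly 250:

```python
# Values as quoted in the excerpt (dimensionless, same units for both).
weinberg_upper_limit = 3e-121    # Weinberg's (1987) anthropic upper limit on rho_Lambda
observed_rho_lambda = 1.2e-123   # value later inferred from observation

# The prediction is borne out: the observed value sits below the bound.
assert observed_rho_lambda < weinberg_upper_limit

# How much headroom the bound leaves above the observed value.
ratio = weinberg_upper_limit / observed_rho_lambda
print(f"Bound exceeds observation by a factor of ~{ratio:.0f}")  # ~250
```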
Weinberg’s prediction is on the short list of candidates in the eBOSS 20-year cosmological summary paper [ https://arxiv.org/pdf/2007.08991.pdf ].
“Nevertheless, the observed consistency with flat ?CDM at the higher precision of this work points increasingly towards a pure cosmological constant solution, for example, as would be produced by a vacuum energy finetuned to have a small value. This fine-tuning represents a theoretical difficulty without any agreed-upon resolution and one that may not be resolvable through fundamental physics considerations alone (Weinberg 1989; Brax & Valageas 2019). This difficulty has been substantially sharpened by the observations presented here.”
And recently the BICEP3/Keck observations provided strong tests that inflation is slow roll, with the ratio r of gravitational tensor fluctuations to inflationary scalar field fluctuations approaching zero, a physical process that naturally results in multiverses [ https://physics.aps.org/articles/v14/135 ].
Scientists don’t like this explanation but the universe is powered by magic. I’m serious. Much like the human mind with its logical lobe and creative one, so too is the “mind” of the creator. The creator is a living contradiction. The universe was created with strict physical rules but the contradiction of its origin and precision is because the energy source is the creator, who is magical. Magic is the impossible which is possible. Same for the creator. When you accept the paradox as reality then you begin to grasp how the universe can exist. Science alone will never figure it out. You need the wholeness of rationality powered by irrationality of magic. How do I know this? I’ve been fortunate to have been touched by the hand of the creator in thy physical world, as have others. It is only possible when your mind is truly open to all possibilities.