It takes a rich and diverse set of complex molecules for stars, galaxies, planets and lifeforms like us to exist. But before humans and all the complex molecules we’re made of could exist, there had to be that first primordial molecule that started the long chain of chemical events leading to everything you see around you today.
Though it has long been theorized to exist, the lack of observational evidence for that molecule was a problem for scientists. Now they’ve found it, and those scientists can rest easy. Their predictive theory wins!
Looking deep into the observable Universe – and hence, back to the earliest periods of time – is an immensely fascinating thing. In so doing, astronomers are able to see the earliest galaxies in the Universe and learn more about how they evolved over time. From this, they are not only able to see how large-scale structures (like galaxies and galaxy clusters) formed, but also the role played by dark matter.
As they indicate in their study, this protocluster (designated SPT2349-56) was first observed by the National Science Foundation’s South Pole Telescope. Using the Atacama Pathfinder Experiment (APEX), the team conducted follow-up observations that confirmed that it was an extremely distant galactic source, which was then observed with ALMA. Using ALMA’s superior resolution and sensitivity, they were able to distinguish the individual galaxies.
What they found was that these galaxies were forming stars at a rate 1,000 times faster than our own galaxy, and were crammed inside a region of space only about three times the size of the Milky Way. Using the ALMA data, the team was also able to create sophisticated computer simulations that demonstrate how this collection of galaxies will likely grow and evolve over billions of years.
These simulations indicated that once these galaxies merge, the resulting galaxy cluster will rival some of the most massive clusters we see in the Universe today. As Scott Chapman, an astrophysicist at Dalhousie University and a co-author on the study, explained:
“Having caught a massive galaxy cluster in the throes of formation is spectacular in and of itself. But, the fact that this is happening so early in the history of the universe poses a formidable challenge to our present-day understanding of the way structures form in the universe.”
The current scientific consensus among astrophysicists states that a few million years after the Big Bang, normal matter and dark matter began to form larger concentrations, eventually giving rise to galaxy clusters. These objects are the largest structures in the Universe, containing trillions of stars, thousands of galaxies, immense amounts of dark matter and massive black holes.
However, current theories and computer models have suggested that protoclusters – like the one observed by ALMA – should have taken much longer to evolve. Finding one that dates to just 1.4 billion years after the Big Bang was therefore quite the surprise. As Tim Miller, who is currently a doctoral candidate at Yale University, indicated:
“How this assembly of galaxies got so big so fast is a bit of a mystery, it wasn’t built up gradually over billions of years, as astronomers might expect. This discovery provides an incredible opportunity to study how galaxy clusters and their massive galaxies came together in these extreme environments.”
Looking to the future, Chapman and his colleagues hope to conduct further studies of SPT2349-56 to see how this protocluster eventually became a galaxy cluster. “ALMA gave us, for the first time, a clear starting point to predict the evolution of a galaxy cluster,” he said. “Over time, the 14 galaxies we observed will stop forming stars and will collide and coalesce into a single gigantic galaxy.”
The study of this and other protoclusters will be made possible thanks to instruments like ALMA, but also next-generation observatories like the Square Kilometre Array (SKA). Equipped with more sensitive arrays and more advanced computer models, astronomers may be able to create a truly accurate timeline of how our Universe became what it is today.
In February of 2016, scientists working for the Laser Interferometer Gravitational-Wave Observatory (LIGO) made the first-ever detection of gravitational waves. Since that time, multiple detections have taken place, thanks in large part to improvements in instruments and greater levels of collaboration between observatories. Looking ahead, it’s possible that missions not designed for this purpose could also “moonlight” as gravitational wave detectors.
For example, the Gaia spacecraft – which is busy creating the most detailed 3D map of the Milky Way – could also be instrumental when it comes to gravitational wave research. That’s what a team of astronomers from the University of Cambridge recently claimed. According to their study, the Gaia satellite has the necessary sensitivity to study ultra-low frequency gravitational waves that are produced by supermassive black hole mergers.
To recap, gravitational waves (GWs) are ripples in space-time that are created by violent events, such as black hole mergers, collisions between neutron stars, and even the Big Bang. Originally predicted by Einstein’s Theory of General Relativity, observatories like LIGO and Advanced Virgo detect these waves by measuring the way space-time flexes and squeezes in response to GWs passing through Earth.
However, passing GWs would also cause the Earth to oscillate in its location with respect to the stars. As a result, an orbiting space telescope (such as Gaia) would be able to pick up on this by noting a temporary shift in the position of distant stars. Launched in 2013, the Gaia observatory has spent the past few years conducting high-precision observations of the positions of stars in our Galaxy (aka astrometry).
In this respect, Gaia would look for small displacements in the massive field of stars it is monitoring to determine if gravitational waves have passed through the Earth’s neighborhood. To investigate whether Gaia was up to the task, Christopher Moore and his colleagues performed calculations to determine if the telescope had the necessary sensitivity to detect ultra-low frequency GWs.
To this end, Moore and his colleagues simulated gravitational waves produced by a binary supermassive black hole – i.e. two SMBHs orbiting one another. What they found was that by compressing the data sets by a factor of more than 10⁶ (measuring 100,000 stars instead of a billion at a time), GWs could be recovered from Gaia data with only a 1% loss of sensitivity.
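The compression idea can be sketched with a toy model: replace groups of stars with “virtual stars” at their average positions. Because a passing gravitational wave deflects the stars coherently while measurement noise averages down, the compressed catalogue keeps essentially the full sensitivity. This is only an illustrative sketch under invented numbers and an arbitrary grouping factor, not the Cambridge team’s actual pipeline:

```python
import math
import random

random.seed(42)

# Toy model of astrometric data compression. A passing gravitational
# wave deflects every star by a tiny common amount SIGNAL, while each
# individual measurement carries much larger independent noise.
N_STARS = 100_000   # stars in the toy catalogue (Gaia tracks ~1 billion)
GROUP = 10          # stars averaged into each "virtual star"
SIGNAL = 1.0        # common GW-induced deflection (arbitrary units)
NOISE = 300.0       # per-star measurement noise (same units)

measurements = [SIGNAL + random.gauss(0.0, NOISE) for _ in range(N_STARS)]

# Compress: replace each group of stars with its mean position.
virtual = [sum(measurements[i:i + GROUP]) / GROUP
           for i in range(0, N_STARS, GROUP)]

def snr(values, noise_per_value):
    """Signal-to-noise ratio of the mean of independent measurements."""
    mean = sum(values) / len(values)
    return mean / (noise_per_value / math.sqrt(len(values)))

# Averaging shrinks the per-entry noise by sqrt(GROUP), so the
# compressed catalogue retains the sensitivity of the full one.
full_snr = snr(measurements, NOISE)
compressed_snr = snr(virtual, NOISE / math.sqrt(GROUP))
print(f"entries stored: {len(virtual)} (was {N_STARS})")
print(f"SNR full: {full_snr:.3f}  SNR compressed: {compressed_snr:.3f}")
```

In this idealized case the two signal-to-noise values agree exactly; the 1% loss quoted above comes from details of the real analysis that the toy model ignores.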
This method would be similar to that used in Pulsar Timing Arrays, where a set of millisecond pulsars are examined to determine if gravitational waves modify the frequency of their pulses. However, in this case, stars are being monitored to see if they are oscillating with a characteristic pattern, rather than pulsing. By looking at a field of 100,000 stars at a time, researchers would be able to detect induced apparent motions (see figure above).
Because of this, the full release of Gaia data (scheduled for the early 2020s) is likely to be a major opportunity for those hunting for GW signals. As Moore explained in an APS Physics press release:
“Gaia will make measuring this effect a realistic prospect for the first time. Many factors contribute to the feasibility of the approach, including the precision and long duration of the astrometric measurements. Gaia will observe about a billion stars over 5–10 years, locating each one of them at least 80 times during that period. Observing so many stars is the major advance provided by Gaia.”
It is also interesting to note that the potential for GW detection was something that researchers recognized when Gaia was still being designed. One such individual was Sergei A. Klioner, a researcher from the Lohrmann Observatory and the leader of the Gaia group at TU Dresden. As he indicated in his 2017 study, “Gaia-like astrometry and gravitational waves”, Gaia could detect GWs caused by merging SMBHs years after the event:
“It is clear that the most promising sources of gravitational waves for astrometric detection are supermassive binary black holes in the centers of galaxies… It is believed that binary supermassive black holes are a relatively common product of interaction and merging of galaxies in the typical course of their evolution. This sort of objects can give gravitational waves with both frequencies and amplitudes potentially within the reach of space astrometry. Moreover, the gravitational waves from those objects can often be considered to have virtually constant frequency and amplitude during the whole period of observations of several years.”
But of course, there are no guarantees that sifting through the Gaia data will reveal additional GW signals. For one thing, Moore and his colleagues acknowledge that waves at these ultra-low frequencies could be too weak for even Gaia to detect. In addition, researchers will have to be able to distinguish between GWs and conflicting signals that result from changes in the spacecraft’s orientation – which is no easy challenge!
Still, there is hope that missions like Gaia will be able to reveal GWs that are not easily visible to ground-based interferometric detectors like LIGO and Advanced Virgo. Such detectors are subject to terrestrial noise (such as seismic vibrations) that prevents them from seeing extremely low-frequency waves – for instance, the primordial waves produced during the inflationary epoch of the Big Bang.
In this sense, gravitational wave research is not unlike exoplanet research and many other branches of astronomy. In order to find the hidden gems, observatories may need to take to space to eliminate atmospheric interference and increase their sensitivity. It is possible then that other space telescopes will be retooled for GW research, and that next-generation GW detectors will be mounted aboard spacecraft.
In the past few years, scientists have gone from making the first detection of gravitational waves to developing new and better ways of detecting them. At this rate, it won’t be long before astronomers and cosmologists are able to incorporate gravitational waves into their cosmological models. In other words, they will be able to show what role these waves played in the history and evolution of the Universe.
It is a well-known fact among astronomers and cosmologists that the farther into the Universe you look, the further back in time you are seeing. And the closer astronomers are able to see to the Big Bang, which took place 13.8 billion years ago, the more interesting the discoveries tend to become. It is these finds that teach us the most about the earliest periods of the Universe and its subsequent evolution.
As with other SMBHs, this particular discovery (designated J1342+0928) is a quasar, a class of super bright objects that consist of a black hole accreting matter at the center of a massive galaxy. The object was discovered during the course of a survey for distant objects, which combined infrared data from the WISE mission with ground-based surveys. The team then followed up with data from the Carnegie Observatory’s Magellan telescopes in Chile.
As with all distant cosmological objects, J1342+0928’s distance was determined by measuring its redshift. By measuring how much the wavelength of an object’s light is stretched by the expansion of the Universe before it reaches Earth, astronomers are able to determine how far it had to travel to get here. In this case, the quasar had a redshift of 7.54, which means that it took more than 13 billion years for its light to reach us.
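The redshift-to-travel-time conversion can be sketched with a short numerical integration of the Friedmann equation. The cosmological parameters below are generic flat-ΛCDM values assumed for illustration, not numbers quoted from the study:

```python
import math

# Flat Lambda-CDM cosmology (approximate Planck-era parameters,
# assumed here for illustration).
H0 = 67.7                      # Hubble constant [km/s/Mpc]
OMEGA_M = 0.31                 # matter density parameter
OMEGA_L = 1.0 - OMEGA_M        # dark-energy density parameter
HUBBLE_TIME_GYR = 977.8 / H0   # 1/H0 in Gyr (977.8 converts km/s/Mpc)

def age_gyr(z, steps=200_000):
    """Age of the Universe at redshift z, integrating da / (a * H(a))."""
    a_end = 1.0 / (1.0 + z)    # scale factor at redshift z
    da = a_end / steps
    total = 0.0
    for i in range(steps):
        a = (i + 0.5) * da     # midpoint rule avoids the a = 0 endpoint
        # a * H(a)/H0 = sqrt(OMEGA_M / a + OMEGA_L * a^2) for flat LCDM
        total += da / math.sqrt(OMEGA_M / a + OMEGA_L * a * a)
    return HUBBLE_TIME_GYR * total

z = 7.54                       # measured redshift of J1342+0928
age_then = age_gyr(z)          # age of the Universe when the light left
lookback = age_gyr(0.0) - age_then
print(f"age of the Universe at z={z}: {age_then:.2f} Gyr")
print(f"light travel (lookback) time: {lookback:.1f} Gyr")
```

Under these assumed parameters the quasar is seen roughly 0.7 billion years after the Big Bang, with its light having traveled a little over 13 billion years, consistent with the figures quoted in the article.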
As Xiaohui Fan of the University of Arizona’s Steward Observatory (and a co-author on the study) explained in a Carnegie press release:
“This great distance makes such objects extremely faint when viewed from Earth. Early quasars are also very rare on the sky. Only one quasar was known to exist at a redshift greater than seven before now, despite extensive searching.”
Given its age and mass, the discovery of this quasar was quite the surprise for the study team. As Daniel Stern, an astrophysicist at NASA’s Jet Propulsion Laboratory and a co-author on the study, indicated in a NASA press release, “This black hole grew far larger than we expected in only 690 million years after the Big Bang, which challenges our theories about how black holes form.”
Essentially, this quasar existed at a time when the Universe was just beginning to emerge from what cosmologists call the “Dark Ages”. During this period, which lasted from roughly 380,000 years to 150 million years after the Big Bang, the Universe’s hydrogen was neutral and no stars had yet formed to light it up. As a result, the radiation of this period is undetectable by our current instruments – hence the name.
The Universe remained in this state, without any luminous sources, until gravity condensed matter into the first stars and galaxies. This period is known as the “Reionization Epoch”, which lasted from 150 million to 1 billion years after the Big Bang and was characterized by the formation of the first stars, galaxies and quasars. It is so-named because the energy released by these ancient galaxies caused the neutral hydrogen of the Universe to become ionized.
Once the Universe became reionized, photons could travel freely throughout space and the Universe officially became transparent to light. This is what makes the discovery of this quasar so interesting. As the team observed, much of the hydrogen surrounding it is neutral, which means it is not only the most distant quasar ever observed, but also the only example of a quasar that existed before the Universe became reionized.
In other words, J1342+0928 existed during a major transition period for the Universe, which happens to be one of the current frontiers of astrophysics. As if this wasn’t enough, the team was also confounded by the object’s mass. For a black hole to have become so massive during this early period of the Universe, there would have to be special conditions to allow for such rapid growth.
What these conditions are, however, remains a mystery. Whatever the case may be, this newly-found SMBH appears to be consuming matter at the center of a galaxy at an astounding rate. And while its discovery has raised many questions, it is anticipated that the deployment of future telescopes will reveal more about this quasar and its cosmological period. As Stern said:
“With several next-generation, even-more-sensitive facilities currently being built, we can expect many exciting discoveries in the very early universe in the coming years.”
These next-generation missions include the European Space Agency’s Euclid mission and NASA’s Wide-field Infrared Survey Telescope (WFIRST). Whereas Euclid will study objects located 10 billion years in the past in order to measure how dark energy influenced cosmic evolution, WFIRST will perform wide-field near-infrared surveys to measure the light coming from a billion galaxies.
Both missions are expected to reveal more objects like J1342+0928. At present, scientists predict that there are only 20 to 100 quasars as bright and as distant as J1342+0928 in the sky. As such, they were most pleased with this discovery, which is expected to provide us with fundamental information about the Universe when it was only 5% of its current age.
At the Amundsen–Scott South Pole Station in Antarctica lies the IceCube Neutrino Observatory – a facility dedicated to the study of elementary particles known as neutrinos. This array consists of 5,160 spherical optical sensors – Digital Optical Modules (DOMs) – buried within a cubic kilometer of clear ice. At present, this observatory is the largest neutrino detector in the world and has spent the past seven years studying how these particles behave and interact.
The most recent study released by the IceCube collaboration, with the assistance of physicists from Pennsylvania State University, has, for the first time, measured the Earth’s ability to block neutrinos. Consistent with the Standard Model of Particle Physics, they determined that while trillions of neutrinos pass through Earth (and us) on a regular basis, some are occasionally stopped by it.
Back in 2013, the first detections of high-energy neutrinos were made by the IceCube collaboration. These neutrinos – which were believed to be astrophysical in origin – were in the peta-electron volt range, making them the highest-energy neutrinos discovered to date. IceCube searches for signs of these interactions by looking for Cherenkov radiation, which fast-moving charged particles produce when they travel through the ice faster than light itself does in that medium.
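The Cherenkov condition is simple enough to work out directly: light is emitted only when a charged particle’s speed exceeds the phase velocity of light in the medium, i.e. β > 1/n. A back-of-envelope calculation, assuming an approximate optical refractive index for deep glacial ice, gives the minimum energy a muon needs to radiate:

```python
import math

# Cherenkov threshold for a muon in ice. The refractive index is an
# assumed approximate optical value, not a number from the study.
N_ICE = 1.31             # refractive index of ice (optical, approx.)
MUON_MASS_MEV = 105.66   # muon rest energy [MeV]

beta_min = 1.0 / N_ICE                           # Cherenkov condition
gamma_min = 1.0 / math.sqrt(1.0 - beta_min ** 2) # Lorentz factor at threshold
e_threshold = gamma_min * MUON_MASS_MEV          # total energy at threshold

print(f"minimum speed: {beta_min:.3f} c")
print(f"muon Cherenkov threshold in ice: ~{e_threshold:.0f} MeV")
```

The threshold comes out to a few hundred MeV; the PeV-scale events IceCube detects produce muons millions of times more energetic than this, which is why their tracks light up the array so clearly.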
By detecting neutrinos that interact with the clear ice, the IceCube instruments were able to estimate the energy and direction of travel of the neutrinos. Despite these detections, however, the mystery remained as to whether or not any kind of matter could stop a neutrino as it journeyed through space. In accordance with the Standard Model of Particle Physics, this is something that should happen on occasion.
After observing interactions at IceCube for a year, the science team found that the neutrinos that had to travel the farthest through Earth were less likely to reach the detector. As Doug Cowen, a professor of physics and astronomy/astrophysics at Penn State, explained in a Penn State press release:
“This achievement is important because it shows, for the first time, that very-high-energy neutrinos can be absorbed by something – in this case, the Earth. We knew that lower-energy neutrinos pass through just about anything, but although we had expected higher-energy neutrinos to be different, no previous experiments had been able to demonstrate convincingly that higher-energy neutrinos could be stopped by anything.”
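The absorption Cowen describes follows from simple scaling: the neutrino–nucleon cross-section grows with energy, while the column of matter along an Earth-crossing path is fixed. A rough estimate, using an illustrative cross-section of order 10⁻³³ cm² for tens-of-TeV neutrinos and Earth’s mean density (ignoring the denser core), shows the planet really is opaque at these energies:

```python
import math

# Back-of-envelope: is Earth opaque to ~tens-of-TeV neutrinos?
# The cross-section is an illustrative order of magnitude, not a
# value taken from the IceCube analysis.
SIGMA_CM2 = 1.0e-33          # nu-nucleon cross-section (approx.) [cm^2]
EARTH_DIAMETER_CM = 1.274e9  # Earth's diameter [cm]
MEAN_DENSITY = 5.51          # Earth's mean density [g/cm^3]
AVOGADRO = 6.022e23          # ~nucleons per gram of ordinary matter

column = MEAN_DENSITY * EARTH_DIAMETER_CM   # g/cm^2 along a diameter
nucleons_per_cm2 = column * AVOGADRO        # targets per unit area
optical_depth = SIGMA_CM2 * nucleons_per_cm2
survival = math.exp(-optical_depth)         # fraction passing through

print(f"optical depth through Earth: {optical_depth:.1f}")
print(f"fraction of neutrinos surviving: {survival:.3f}")
```

With these assumed numbers the optical depth is a few, so only a small percentage of diameter-crossing neutrinos survive, while at much lower energies the cross-section is smaller by many orders of magnitude and the Earth is effectively transparent.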
The existence of neutrinos was first proposed in 1930 by theoretical physicist Wolfgang Pauli, who postulated their existence as a way of explaining beta decay in terms of the law of conservation of energy. They are so-named because they are electrically neutral, and only interact with matter very weakly – i.e. through the weak subatomic force and gravity. Because of this, neutrinos pass through normal matter on a regular basis.
Whereas neutrinos are regularly produced by stars and by nuclear reactors here on Earth, the first neutrinos were formed during the Big Bang. The study of their interaction with normal matter can therefore tell us much about how the Universe evolved over the course of billions of years. Many scientists anticipate that the study of neutrinos will point to new physics that goes beyond the Standard Model.
Because of this, the science team was somewhat surprised (and perhaps disappointed) with their results. As Francis Halzen – the principal investigator for the IceCube Neutrino Observatory and a professor of physics at the University of Wisconsin-Madison – explained:
“Understanding how neutrinos interact is key to the operation of IceCube. We were of course hoping for some new physics to appear, but we unfortunately find that the Standard Model, as usual, withstands the test.”
For the most part, the neutrinos selected for this study were more than one million times more energetic than those that are produced by our Sun or nuclear power plants. The analysis also included some that were astrophysical in nature – i.e. produced beyond Earth’s atmosphere – and may have been accelerated towards Earth by supermassive black holes (SMBHs).
Darren Grant, a professor of physics at the University of Alberta, is also the spokesperson for the IceCube Collaboration. As he indicated, this latest interaction study opens doors for future neutrino research. “Neutrinos have quite a well-earned reputation of surprising us with their behavior,” he said. “It is incredibly exciting to see this first measurement and the potential it holds for future precision tests.”
This study not only provided the first measurement of the Earth’s absorption of neutrinos, it also offers opportunities for geophysical researchers who are hoping to use neutrinos to explore Earth’s interior. Given that Earth is capable of stopping some of the billions of high-energy particles that routinely pass through it, scientists could develop a method for studying the Earth’s inner and outer core, placing more accurate constraints on their sizes and densities.
It also shows that the IceCube Observatory is capable of reaching beyond its original purpose, which was particle physics research and the study of neutrinos. As this latest study clearly shows, it is capable of contributing to planetary science research and nuclear physics as well. Physicists also hope to use the full 86-string IceCube array to conduct a multi-year analysis, examining even higher ranges of neutrino energies.
As James Whitmore – the program director in the National Science Foundation’s (NSF) physics division (which provides support for IceCube) – indicated, this could allow them to truly search for physics that goes beyond the Standard Model.
“IceCube was built to both explore the frontiers of physics and, in doing so, possibly challenge existing perceptions of the nature of universe. This new finding and others yet to come are in that spirit of scientific discovery.”
Ever since the discovery of the Higgs boson in 2012, physicists have been secure in the knowledge that the long journey to confirm the Standard Model was complete. Since then, they have set their sights farther, hoping to find new physics that could resolve some of the deeper mysteries of the Universe – i.e. supersymmetry, a Theory of Everything (ToE), etc.
This, as well as studying how physics works at the highest energy levels (similar to those that existed during the Big Bang), is the current preoccupation of physicists. If they are successful, we might just come to understand how this massive thing known as the Universe works.
Since the deployment of the Hubble Space Telescope, astronomers have been able to look deeper into the cosmic web than ever before. The farther they’ve looked, the deeper back in time they are able to see, and thus learn what the Universe looked like billions of years ago. With the deployment of other cutting-edge telescopes and observatories, scientists have been able to learn a great deal more about the history and evolution of the cosmos.
Most recently, an international team of astronomers using the Gemini North Telescope in Hawaii was able to spot a spiral galaxy located 11 billion light-years away. Thanks to a new technique that combined gravitational lensing and spectroscopy, they were able to see an object that existed just 2.6 billion years after the Big Bang. This makes the spiral galaxy, known as A1689B11, the oldest and most distant spiral galaxy spotted to date.
The team relied on the gravitational lensing technique to spot A1689B11. This technique has become a mainstay for astronomers, and involves using a large object (like a galaxy cluster) to bend and magnify the light of a galaxy located behind it. As Dr. Tiantian Yuan, a Swinburne astronomer and the lead author on the research study, explained in a Swinburne press statement:
“This technique allows us to study ancient galaxies in high resolution with unprecedented detail. We are able to look 11 billion years back in time and directly witness the formation of the first, primitive spiral arms of a galaxy.”
They then used the Near-infrared Integral Field Spectrograph (NIFS) on the Gemini North telescope to verify the structure and nature of this spiral galaxy. The instrument was built by Peter McGregor of the Australian National University (ANU), which is now responsible for maintaining it. Thanks to this latest discovery, astronomers now have some additional clues as to how galaxies took on the forms that we are familiar with today.
Based on the classification scheme developed by famed astronomer Edwin Hubble (the “Hubble Sequence”), galaxies are divided into three broad classes based on their shapes – ellipticals, lenticulars and spirals – with a fourth category reserved for “irregularly-shaped” galaxies. In accordance with this scheme, galaxies start out as elliptical structures before branching off to become spiraled, lenticular, or irregular.
As such, the discovery of such an ancient spiral galaxy is crucial to determining when and how the earliest galaxies began changing from being elliptical to taking on their modern forms. As Dr Renyue Cen, an astronomer from Princeton University and a co-author on the study, says:
“Studying ancient spirals like A1689B11 is a key to unlocking the mystery of how and when the Hubble sequence emerges. Spiral galaxies are exceptionally rare in the early Universe, and this discovery opens the door to investigating how galaxies transition from highly chaotic, turbulent discs to tranquil, thin discs like those of our own Milky Way galaxy.”
On top of that, this study showed that the A1689B11 spiral galaxy has some surprising features which could also help inform (and challenge) our understanding of this period in cosmic history. As Dr. Yuan explained, these features are in stark contrast to galaxies as they exist today. But equally interesting is the fact that it also differentiates this spiral galaxy from other galaxies that are similar in age.
“This galaxy is forming stars 20 times faster than galaxies today – as fast as other young galaxies of similar masses in the early Universe,” said Dr. Yuan. “However, unlike other galaxies of the same epoch, A1689B11 has a very cool and thin disc, rotating calmly with surprisingly little turbulence. This type of spiral galaxy has never been seen before at this early epoch of the Universe!”
In the future, the team hopes to conduct further studies of this galaxy to further resolve its structure and nature, and to compare it to other spiral galaxies from this epoch. Of particular interest to them is when the onset of spiral arms takes place, which should serve as a sort of boundary marker between ancient elliptical galaxies and modern spiral, lenticular and irregular shapes.
They will continue to rely on the NIFS to conduct these studies, but the team also hopes to rely on data collected by the James Webb Space Telescope (which will be launched in 2019). These and other surveys in the coming years are expected to reveal vital information about the earliest galaxies in the Universe, and reveal further clues as to how it changed over time.
Neutrinos are one of the fundamental particles that make up the Universe. Compared to other types of particles, they have very little mass, no charge, and only interact with other matter via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive detectors located deep underground to shield them from interference.
However, using the Spallation Neutron Source (SNS), a research facility located at Oak Ridge National Laboratory (ORNL), an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. Conducted as part of the COHERENT experiment, their results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.
The study that details their findings, titled “Observation of coherent elastic neutrino-nucleus scattering”, was recently published in the journal Science. The research was conducted as part of the COHERENT experiment, a collaboration of 80 researchers from 19 institutions in 4 nations that had been searching for what is known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS) for over a year.
In finding evidence of this behavior, COHERENT has essentially made history. As Jason Newby, an ORNL physicist and the technical coordinator for COHERENT, said in an ORNL press statement:
“The one-of-a-kind particle physics experiment at Oak Ridge National Laboratory was the first to measure coherent scattering of low-energy neutrinos off nuclei.”
To break it all down, the Standard Model of particle physics indicates that neutrinos are leptons, a class of particles that interact with other matter very weakly. They are created through radioactive decay, the nuclear reactions that power stars, and by supernovae. The Big Bang model of cosmology also predicts that neutrinos are the most abundant particles in existence, since they are a byproduct of the creation of the Universe.
As such, their study has been a major focal point for theoretical physicists and cosmologists. In previous studies, neutrino interactions were detected by using literally tons of target material and then examining the particle transformations that resulted from neutrinos hitting them.
Examples include the Super-Kamiokande Observatory in Japan, an underground facility where the target material is 50,000 tons of ultrapure water. In the case of SNOLAB’s Sudbury Neutrino Observatory – which is located in a former mine complex near Sudbury, Ontario – the SNO neutrino detector relies on heavy water for neutrino detection while the SNO+ experiment will use a liquid scintillator.
And the IceCube Neutrino Observatory – the largest neutrino detector in the world, located at the Amundsen–Scott South Pole Station in Antarctica – relies on Antarctic ice to detect neutrino interactions. In all cases, the facilities are extremely isolated and rely on very expensive equipment.
The COHERENT experiment, however, is immensely smaller and more economical by comparison, weighing a mere 14.5 kg (32 lbs) and occupying far less space. The experiment was created to take advantage of the existing SNS accelerator-based system, which produces the most intense pulsed neutron beams in the world by smashing beams of protons into mercury atoms.
This process creates massive amounts of neutrons, which are used for various scientific experiments. However, it also creates a significant number of neutrinos as a byproduct. To take advantage of this, the COHERENT team built their neutrino experiment in “neutrino alley”, a basement corridor just 20 meters (about 66 feet) from the mercury tank, where thick concrete walls and gravel provide natural shielding.
The corridor is also fitted with large water tanks to block out neutrons, cosmic rays and other background particles. But unlike other experiments, the COHERENT detectors look for signs of neutrinos bumping into the nuclei of other atoms. To do this, the team outfitted the corridor with detectors based on a cesium iodide scintillator crystal doped with sodium, which increases the prominence of the light signals caused by neutrino interactions.
Juan Collar, a physicist from the University of Chicago, led the design team that created the detector used at SNS. As he explained, this was a “back-to-basics” approach that did away with more expensive and massive detectors:
“They are arguably the most pedestrian kind of radiation detector available, having been around for a century. Sodium-doped cesium iodide merges all of the properties required to work as a small, ‘handheld’ coherent neutrino detector. Very often, less is more.”
Thanks to their experiment and the sophistication of the SNS, the researchers were able to determine that neutrinos are capable of coupling to quarks through the exchange of neutral Z bosons. This process, which is known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS), was first predicted in 1973. But until now, no experiment had been able to confirm it.
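The “coherent” in CEvNS is what makes such a small detector viable: the neutrino exchanges a Z boson with the nucleus as a whole, so the cross-section scales roughly with the square of the neutron number N rather than one nucleon at a time. A quick sketch of that scaling for the two nuclei in the CsI crystal (the N² rule is a leading-order approximation; real rates also include nuclear form factors and recoil kinematics):

```python
# Rough CEvNS scaling sketch. Because the Z boson's coupling to
# protons is small, the coherent cross-section grows roughly as the
# square of the neutron number N. Illustrative only.
nuclei = {
    "Cs-133": {"Z": 55, "N": 78},   # cesium in the CsI[Na] crystal
    "I-127":  {"Z": 53, "N": 74},   # iodine in the CsI[Na] crystal
}

# Enhancement relative to scattering off a single free neutron (~N^2):
enhancements = {name: n["N"] ** 2 for name, n in nuclei.items()}

for name, boost in enhancements.items():
    print(f"{name}: coherent enhancement ~ {boost}x")
```

A boost of several thousand per nucleus is what lets a 14.5 kg crystal see a process that would otherwise demand tons of target material.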
As Jason Newby indicated, the experiment succeeded in large part thanks to the sophistication of the existing facility. “The energy of the SNS neutrinos is almost perfectly tuned for this experiment—large enough to create a detectable signal, but small enough to take advantage of the coherence condition,” he said. “The only smoking gun of the interaction is a small amount of energy imparted to a single nucleus.”
The data it produced was also cleaner than that of previous experiments, since the neutrinos (like the SNS neutron beam that produced them) were pulsed. This allowed the signal to be easily separated from background noise, an advantage over steady-state neutrino sources – such as those produced by nuclear reactors.
The team also detected three “flavors” of neutrinos, which included muon neutrinos, muon antineutrinos, and electron neutrinos. Whereas the muon neutrinos emerged instantaneously, the others were detected a few microseconds later. From this, the COHERENT team not only validated the theory of CEvNS, but also the Standard Model of particle physics. Their findings also have implications for astrophysics and cosmology.
As Kate Scholberg, a physicist from Duke University and COHERENT’s spokesperson, explained:
“When a massive star collapses and then explodes, the neutrinos dump vast energy into the stellar envelope. Understanding the process feeds into understanding of how these dramatic events occur… COHERENT’s data will help with interpretation of measurements of neutrino properties by experiments worldwide. We may also be able to use coherent scattering to better understand the structure of the nucleus.”
While there is no need for further confirmation of their results, the COHERENT researchers plan to conduct additional measurements in order to observe coherent neutrino interactions at distinct rates (another signature of the process). From this, they hope to expand their knowledge of the nature of CEvNS, as well as other basic neutrino properties – such as their intrinsic magnetism.
This discovery was certainly impressive in its own right, given that it validates an aspect of both the Standard Model of particle physics and Big Bang cosmology. But the fact that the method offers cleaner results and relies on instruments that are significantly smaller and less expensive than other experiments – that is very impressive!
The implications of this research are sure to be far-reaching, and it will be interesting to see what other discoveries it enables in the future!
Just a couple of weeks ago, astronomers from Caltech announced the third detection of gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO).
As with the previous two detections, astronomers determined that the waves were generated when two massive stellar black holes slammed into each other, sending out ripples of distorted spacetime.
One black hole had 31.2 times the mass of the Sun, while the other had 19.4 solar masses. The two spiraled inward toward each other until they merged into a single black hole with 48.7 solar masses. And if you do the math, about two Suns’ worth of mass was converted into gravitational wave energy as the black holes merged.
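The arithmetic here is easy to check yourself. A minimal sketch, using only the figures quoted above:

```python
# Masses from the detection described above, in solar masses
m1, m2 = 31.2, 19.4      # the two inspiraling black holes
m_final = 48.7           # the single merged black hole

# Whatever mass didn't end up in the final black hole was
# radiated away as gravitational wave energy (E = mc^2)
radiated = (m1 + m2) - m_final
print(round(radiated, 1))  # 1.9 -- roughly "twice the mass of the Sun"
```

Two solar masses converted entirely into spacetime ripples in a fraction of a second: that is why these events are detectable billions of light-years away.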
These gravitational waves traveled outward from the colossal collision at the speed of light, stretching and compressing spacetime like a tsunami wave crossing the ocean until they reached Earth, located about 2.9 billion light-years away.
The waves swept past each of the two LIGO facilities, located in different parts of the United States, minutely stretching the lengths measured by their carefully calibrated lasers. And from this, researchers were able to estimate the direction, distance and strength of the original merger.
Seriously, if this isn’t one of the coolest things you’ve ever heard, I’m clearly easily impressed.
Now that the third detection has been made, I think it’s safe to say we’re entering a brand new field of gravitational astronomy. In the coming decades, astronomers will use gravitational waves to peer into regions they could never see before.
Being able to perceive gravitational waves is like getting a whole new sense. It’s like having eyes and then suddenly getting the ability to perceive sound.
This whole new science will take decades to unlock, and we’re just getting started.
As Einstein predicted, masses accelerating through space generate ripples in spacetime. When you’re just walking along, you’re actually generating tiny ripples. If you can detect these ripples, you can work backwards to figure out what size of mass made them, what direction it was moving, and so on.
Even in places that you couldn’t see in any other way. Let me give you a couple of examples.
Black holes, obviously, are the low hanging fruit. When they’re not actively feeding, they’re completely invisible, only detectable by how they gravitationally attract objects or bend light from objects passing behind them.
But seen in gravitational waves, they’re like ships moving across the ocean, leaving ripples of distorted spacetime behind them.
With our current capabilities through LIGO, astronomers can only detect the most massive compact objects moving at a significant fraction of the speed of light. A merger of ordinary, lighter black holes doesn’t do the trick – there’s not enough mass. Even a supermassive black hole merger isn’t detectable yet, because those mergers unfold too slowly for LIGO to sense.
This is why all the detections so far have been of heavyweight stellar black holes, with dozens of times the mass of our Sun. And we can only detect them at the moment they’re merging together, when they’re generating the most intense gravitational waves.
If we can boost the sensitivity of our gravitational wave detectors, we should be able to spot mergers of both less massive and more massive black holes.
But merging isn’t the only thing they do. Black holes are born when stars many times the mass of our Sun collapse in on themselves and explode as supernovae. Some stars, we’ve now learned, simply implode into black holes without ever generating a supernova, so the process happens entirely hidden from us.
Is there a singularity at the center of a black hole’s event horizon, or is there something else there – some kind of object smaller than a neutron star but bigger than an infinitely small point? As black holes merge together, we might probe beyond the event horizon with gravitational waves, mapping out the invisible region within to get a sense of what’s going on down there.
We want to know about even less massive objects like neutron stars, which can also form from a supernova explosion. These neutron stars can orbit one another and merge, generating some of the most powerful explosions in the Universe: gamma-ray bursts. But do neutron stars have surface features? Different densities? Could we detect a wobble in the gravitational waves in the last moments before a merger?
And not everything needs to merge. Sensitive gravitational wave detectors could sense binary objects with a large imbalance, like a black hole or neutron star orbiting around a main sequence star. We could detect future mergers by their gravitational waves.
Are gravitational waves a momentary distortion of spacetime, or do they leave some kind of permanent dent on the Universe that we could trace back? Will we see echoes of gravity from gravitational waves reflecting and refracting through the fabric of the cosmos?
Perhaps the greatest challenge will be using gravitational waves to see beyond the Cosmic Microwave Background Radiation. This radiation shows us the Universe 380,000 years after the Big Bang, when everything was cool enough for light to move freely through the Universe.
But there was mass there, before that moment. Moving, merging mass that would have generated gravitational waves. As we explained in a previous article, astronomers are working to find the imprint of these gravitational waves on the Cosmic Microwave Background, like an echo, or a shadow. Perhaps there’s a deeper Cosmic Gravitational Background Radiation out there, one which will let us see right to the beginning of time, just moments after the Big Bang.
And as always, there will be the surprises. The discoveries in this new field that nobody ever saw coming. The “that’s funny” moments that take researchers down into whole new fields of discovery, and new insights into how the Universe works.
The LIGO project began back in 1994, and the first iteration operated from 2002 to 2012 without a single gravitational wave detection. It was clear that the facility wasn’t sensitive enough, so researchers went back and made massive improvements.
In 2008, work began on improving the facility, and in 2015, Advanced LIGO came online with much greater sensitivity. With the increased capabilities, Advanced LIGO made its first detection in September 2015 (announced in early 2016), and two more detections have since been added.
LIGO can currently only detect the general hemisphere of the sky where a gravitational wave was emitted. And so, LIGO’s next improvement will be to add another facility in India, called INDIGO. In addition to improving the sensitivity of LIGO, this will give astronomers three observations of each event, allowing them to triangulate the origin of the gravitational waves. Then visual astronomers could do follow-up observations, to match the event to a source in other wavelengths.
A European experiment known as Virgo has been operating for a few years as well, with its team agreeing to collaborate with the LIGO team on any detections. So far, Virgo hasn’t found anything, but it’s being upgraded to 10 times the sensitivity, and should be fully operational by 2018.
A Japanese experiment called the Kamioka Gravitational Wave Detector, or KAGRA, will come online in 2018 as well, and be able to contribute to the observations. It should be capable of detecting binary neutron star mergers out to nearly a billion light-years away.
Just as with visual astronomy, there’s a set of next-generation gravitational wave observatories in the works, which should come online in the next few decades.
The Europeans are building the Einstein Telescope, which will have detection arms 10 km long, compared to 4 km for LIGO. That’s like, 6 more km.
There’s the European Space Agency’s space-based Laser Interferometer Space Antenna, or LISA, which could launch in 2030. This will consist of a fleet of 3 spacecraft which will maintain a precise distance of 2.5 million km from each other. Compare that to the Earth-based detection distances, and you can see why the future of observations will come from space.
And that last idea – looking right back to the beginning of time – could be a possibility with the Big Bang Observer mission, which would have a fleet of 12 spacecraft flying in formation. This is all still in the proposal stage, so there’s no concrete date for if or when they’ll actually fly.
Gravitational wave astronomy is one of the most exciting fields of astronomy. This entirely new sense is pushing out our understanding of the cosmos in entirely new directions, allowing us to see regions we could never even imagine exploring before. I can’t wait to see what happens next.
Whenever we talk about the expanding Universe, everyone wants to know how this is going to end. Sure, they say, the fact that most of the galaxies we can see are speeding away from us in all directions is really interesting. Sure, they say, the Big Bang makes sense, in that everything was closer together billions of years ago.
But how does it end? Does this go on forever? Do galaxies eventually slow down, come to a stop, and then hurtle back together in a Big Crunch? Will we get a non-stop cycle of Big Bangs, forever and ever?
We’ve done a bunch of articles on many different aspects of this question, and the current conclusion astronomers have reached is that because the Universe is flat, it’s never going to collapse in on itself and start another Big Bang.
But wait, what does it mean to say that the Universe is “flat”? Why is that important, and how do we even know?
Before we can get started talking about the flatness of the Universe, we need to talk about flatness in general. What does it mean to say that something is flat?
If you’re in a square room and walk around its corners, you’ll return to your starting point having made four 90-degree turns. You can say that your room is flat. This is Euclidean geometry.
But now make the same journey on the surface of the Earth: start at the equator, make a 90-degree turn, walk up to the North Pole, make another 90-degree turn, walk back down to the equator, and one more 90-degree turn returns you to your starting point.
In one situation, you made 4 turns to return to your starting point; in the other, it only took 3. That’s because the geometry of the surface you were walking on determined what happens when you take a 90-degree turn.
You can imagine an even more extreme example, where you’re walking around inside a crater, and it takes more than 4 turns to return to your starting point.
Another analogy, of course, is the idea of parallel lines. If you fire off two parallel lines at the North Pole, they move away from each other, following the curvature of the Earth, and then come back together.
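You can check the walking-the-Earth example numerically. This sketch (plain Python, no special libraries) places the three corners of that journey on a unit sphere – the North Pole plus two equator points 90 degrees of longitude apart – and sums the interior angles of the resulting spherical triangle:

```python
import math

def angle_at(a, b, c):
    """Interior angle at vertex a of the spherical triangle abc,
    measured between the great-circle arcs a->b and a->c."""
    def tangent(p, q):
        # Direction of the arc p->q at p: the part of q perpendicular to p
        d = sum(pi * qi for pi, qi in zip(p, q))
        t = [qi - d * pi for pi, qi in zip(p, q)]
        n = math.sqrt(sum(ti * ti for ti in t))
        return [ti / n for ti in t]
    u, v = tangent(a, b), tangent(a, c)
    dot = sum(ui * vi for ui, vi in zip(u, v))
    return math.acos(max(-1.0, min(1.0, dot)))

# The journey from the article, on a unit-radius Earth
pole = (0.0, 0.0, 1.0)   # North Pole
eq1  = (1.0, 0.0, 0.0)   # starting point on the equator
eq2  = (0.0, 1.0, 0.0)   # equator, 90 degrees of longitude away

total = sum(math.degrees(angle_at(v, w, x)) for v, w, x in
            [(pole, eq1, eq2), (eq1, pole, eq2), (eq2, pole, eq1)])
print(round(total))  # 270 -- three right angles; a flat triangle gives 180
```

The 90-degree excess over a flat triangle’s 180 degrees is exactly the curvature effect the three-turn walk describes.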
Got that? Great.
Now, what about the Universe itself? You can apply the same test. Imagine flying out into space on a rocket for billions of light-years, performing 90-degree maneuvers, and returning to your starting point.
You can’t do it in 3 or 5; you need 4, which means that the geometry of the Universe is flat. Which is totally intuitive, right? I mean, that would be your assumption.
But astronomers were skeptical and needed to know for certain, and so, they set out to test this assumption.
In order to prove the flatness of the Universe, you would need to travel a very long way, so astronomers use the largest observation they can make: the Cosmic Microwave Background Radiation, the afterglow of the Big Bang, visible in all directions as a red-shifted snapshot of the moment the Universe became transparent, about 380,000 years after the Big Bang.
When this radiation was released, the entire Universe was at approximately 2,700 C. This was the moment when it was finally cool enough for photons to roam freely across the Universe. The expansion of the Universe stretched these photons out over their 13.8-billion-year journey, shifting them down into the microwave spectrum, just 2.7 degrees above absolute zero.
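Those two temperatures tell you directly how much space has stretched since the radiation was released. A minimal sketch (using the standard 2.725 K value for today’s CMB temperature):

```python
# Temperatures in kelvin
T_then = 2700 + 273.15   # ~2,700 deg C when the radiation was released
T_now  = 2.725           # CMB temperature today, just above absolute zero

# Photon wavelengths stretch in proportion to the expansion of space,
# so the radiation temperature drops by the same factor the Universe grew
stretch = T_then / T_now
print(round(stretch))  # 1091 -- space has stretched roughly 1,100-fold since then
```

That factor of about 1,100 is the same stretch that turned visible and infrared light into microwaves.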
With the most sensitive space-based telescopes they have available, astronomers are able to detect tiny variations in the temperature of this background radiation.
And here’s the part that blows my mind every time I think about it. These tiny temperature variations correspond to the largest-scale structures of the observable Universe. A region that was a fraction of a degree warmer became a vast galaxy cluster, hundreds of millions of light-years across.
The Cosmic Microwave Background Radiation just gives and gives, and when it comes to figuring out the geometry of the Universe, it has the answer we need. If the Universe were curved in any way, these temperature variations would appear distorted, their apparent sizes skewed relative to the actual sizes of the structures we see today.
But they’re not. To the best of its ability, ESA’s Planck space telescope can’t detect any distortion at all. The Universe is flat.
Well, that’s not exactly true. According to the best measurements astronomers have ever been able to make, the curvature of the Universe falls within a range of error bars that indicates it’s flat. Future observations by some super Planck telescope could show a slight curvature, but for now, the best measurements out there say… flat.
We say that the Universe is flat, and this means that parallel lines will always remain parallel. 90-degree turns behave as true 90-degree turns, and everything makes sense.
But what are the implications for the entire Universe? What does this tell us?
Unfortunately, the biggest thing is what it doesn’t tell us. We still don’t know if the Universe is finite or infinite. If we could measure its curvature, we could know that we’re in a finite Universe, and get a sense of what its actual true size is, out beyond the observable Universe we can measure.
We know that the volume of the Universe is at least 100 times more than we can observe. At least. If the flatness error bars get brought down, the minimum size of the Universe goes up.
And remember, an infinite Universe is still on the table.
Another thing flatness does is create a problem for the original Big Bang theory, a problem that required the development of a theory like inflation.
Since the Universe is flat now, it must have been flat in the past, all the way back to its incredibly dense beginnings. And for it to maintain this level of flatness over 13.8 billion years of expansion is kind of amazing.
In fact, astronomers estimate that the early Universe must have been flat to within 1 part in 10^57.
Which seems like an insane coincidence. The development of inflation, however, solves this, by expanding the Universe an incomprehensible amount moments after the Big Bang. Pre and post inflation Universes can have vastly different levels of curvature.
In the olden days, cosmologists used to say that the flatness of the Universe had implications for its future. If the Universe was curved such that you could complete a full journey with fewer than 4 turns, it was closed and destined to collapse in on itself.
And if it took more than 4 turns, it was open and destined to expand forever.
Well, that doesn’t really matter any more. In 1998, astronomers discovered dark energy, a mysterious force accelerating the expansion of the Universe. Whether the Universe is open, closed or flat, it’s going to keep on expanding. In fact, that expansion is going to accelerate, forever.
I hope this gives you a little more understanding of what cosmologists mean when they say that the Universe is flat. And how do we know it’s flat? Very precise measurements in the Cosmic Microwave Background Radiation.
Is there anything that all-pervasive relic of the early Universe can’t do?
The Big Bang. The discovery that the Universe has been expanding for billions of years is one of the biggest revelations in the history of science. In a single moment, the entire Universe popped into existence, and has been expanding ever since.
We know this because of multiple lines of evidence: the cosmic microwave background radiation, the ratio of elements in the Universe, and so on. But the most compelling is the simple fact that everything is expanding away from everything else. Which means that if you run the clock backwards, the Universe was once an extremely hot, dense region.
Let’s go backwards in time, billions of years. The closer you get to the Big Bang, the closer everything was, and the hotter it was. When you reach about 380,000 years after the Big Bang, the entire Universe was so hot that all matter was ionized, with atomic nuclei and electrons buzzing around each other.
Keep going backwards, and the entire Universe was the temperature and density of a star, which fused together the primordial helium and other elements that we see to this day.
Continue to the beginning of time, and there was a point where everything was so hot that atomic nuclei themselves couldn’t hold together, breaking into their constituent protons and neutrons. Further back still, even those protons and neutrons break apart into quarks. And before that, it’s just a big question mark: an infinitely dense Universe cosmologists call the singularity.
When you look out into the Universe in all directions, you see the cosmic microwave background radiation. That’s the point when the Universe cooled down enough that light could travel freely through space.
And the temperature of this radiation is almost exactly the same in all directions that you look. There are tiny tiny variations, detectable only by the most sensitive instruments.
When two things are the same temperature, like a spoon in your coffee, it means that those two things have had an opportunity to interact. The coffee transferred heat to the spoon, and now their temperatures have equalized.
When we see this in opposite sides of the Universe, that means that at some point, in the ancient past, those two regions were touching. That spot where the light left 13.8 billion years ago on your left, was once directly touching that spot on your right that also emitted its light 13.8 billion years ago.
This is a great theory, but there’s a problem: the Universe never had time for those opposite regions to interact. For the Universe to have the uniform temperature we see today, those regions would have needed enough time to mix together. But they didn’t have enough time; in fact, they had essentially no time at all to exchange temperature.
Imagine you dipped that spoon into the coffee and then pulled it out moments later before the heat could transfer, and yet the coffee and spoon are exactly the same temperature. What’s going on?
To address this problem, the cosmologist Alan Guth proposed the idea of cosmic inflation in 1980: moments after the Big Bang, the entire Universe expanded dramatically.
And by “moments”, I mean that the inflationary period started when the Universe was only 10^-36 seconds old, and ended when the Universe was 10^-32 seconds old.
And by “expanded dramatically”, I mean that it got 10^26 times larger. That’s a 1 followed by 26 zeroes.
Before inflation, the observable Universe was smaller than an atom. After inflation, it was about 0.88 millimeters across. Today, those regions have been stretched 93 billion light-years apart.
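Those numbers are easy to sanity-check against each other. Dividing the post-inflation size by the quoted expansion factor gives the pre-inflation size:

```python
size_after = 0.88e-3   # 0.88 millimeters, in meters
factor = 1e26          # the expansion factor quoted above

size_before = size_after / factor
print(size_before)     # about 8.8e-30 m, vastly smaller than an atom (~1e-10 m)
```

So inflation took a patch far smaller than any atom and blew it up to roughly a millimeter in a tiny fraction of a second, and ordinary expansion has been stretching it ever since.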
This concept of inflation was further developed by cosmologists Andrei Linde, Paul Steinhardt, Andy Albrecht and others.
Inflation resolved some of the shortcomings of the Big Bang Theory.
The first is known as the flatness problem. The most sensitive satellites we have today measure the Universe as flat. Not flat like a piece of paper, but flat in the sense that parallel lines will remain parallel forever as they travel through the Universe. Under the original Big Bang cosmology, you would expect any curvature of the Universe to grow with time.
The second is the horizon problem. And this is the problem I mentioned above, that two regions of the Universe shouldn’t have been able to see each other and interact long enough to be the same temperature.
The third is the monopole problem. According to the original Big Bang theory, there should be a vast number of heavy, stable “monopoles” – magnetic particles with only a single pole. Inflation diluted the number of monopoles in the Universe, which is why we don’t detect them today.
Although the cosmic microwave background radiation appears mostly even across the sky, there could still be evidence of that inflationary period baked into it.
In order to do this, astronomers have been focusing on searching for primordial gravitational waves. These are different from the gravitational waves generated through the collision of massive objects. Primordial gravitational waves are the echoes from that inflationary period which should be theoretically detectable through the polarization, or orientation, of light in the cosmic microwave background radiation.
A collaboration of scientists used an instrument known as the Background Imaging of Cosmic Extragalactic Polarization (or BICEP2) to search for this polarization, and in 2014, they announced that maybe, just maybe, they had detected it, which would prove the theory of cosmic inflation correct.
Unfortunately, another team working with the space-based Planck telescope published evidence that the fluctuations BICEP2 saw could be fully explained by intervening dust in the Milky Way.
The problem is that BICEP2 and Planck were designed to observe at different frequencies. In order to really get to the bottom of this question, more searches need to be done, scanning a series of overlapping frequencies. And that’s in the works now.
BICEP2, Planck, the newly developed South Pole Telescope and some observatories in Chile are all scanning the skies at different frequencies at the same time. Distortion from various types of foreground sources, like dust or radiation, should be brighter or dimmer at the different frequencies, while the light from the cosmic microwave background radiation should remain constant throughout.
There are more telescopes, searching more wavelengths of light, searching more of the sky. We could know the answer to this question with more certainty shortly.
One of the most interesting implications of cosmic inflation, if proven, is that our Universe is actually just one in a vast multiverse. While the Universe was undergoing that dramatic expansion, it could have created bubbles of spacetime that spawned other universes, with different laws of physics.
In fact, the father of inflation, Alan Guth, said, “It’s hard to build models of inflation that don’t lead to a multiverse.”
And so, if inflation does eventually get confirmed, then we’ll have a whole multiverse to search for in the cosmic microwave background radiation.
The Big Bang was one of the greatest theories in the history of science. Although it did have a few problems, cosmic inflation was developed to address them. Although there have been a few false starts, astronomers are now performing a sensitive enough search that they might find evidence of this amazing inflationary period. And then it’ll be Nobel Prizes all around.