The universe is a seemingly endless sea filled with stars, galaxies, and nebulae. In it, we see patterns and constellations that have inspired stories throughout history. But there is one cosmic pattern we still don’t understand. A question that remains unanswered: What is the shape of the universe? We thought we knew, but new research suggests otherwise, and it could point to a crisis in cosmology.
Back in 2013, the European Space Agency released its first analysis of the data gathered by the Planck observatory. Between 2009 and 2013, this spacecraft observed the remnants of the radiation that filled the Universe immediately after the Big Bang – the Cosmic Microwave Background (CMB) – with the highest sensitivity of any mission to date and in multiple wavelengths.
In addition to largely confirming current theories on how the Universe evolved, Planck’s first map also revealed a number of temperature anomalies – like the CMB “Cold Spot” – that are difficult to explain. Unfortunately, with the latest analysis of the mission data, the Planck Collaboration team has found no new evidence for these anomalies, which means that astrophysicists are still short of an explanation.
According to the Big Bang cosmological model, our Universe began 13.8 billion years ago when all the matter and energy in the cosmos began expanding. This period of “cosmic inflation” is believed to be what accounts for the large-scale structure of the Universe and why space and the Cosmic Microwave Background (CMB) appear to be largely uniform in all directions.
However, to date, no evidence has been discovered that can definitively prove the cosmic inflation scenario or rule out alternative theories. But thanks to a new study by a team of astronomers from Harvard University and the Harvard-Smithsonian Center for Astrophysics (CfA), scientists may have a new means of testing one of the key parts of the Big Bang cosmological model.
For thousands of years, human beings have been contemplating the Universe and seeking to determine its true extent. And whereas ancient philosophers believed that the world consisted of a disk, a ziggurat or a cube surrounded by celestial oceans or some kind of ether, the development of modern astronomy opened their eyes to new frontiers. By the 20th century, scientists began to understand just how vast (and maybe even unending) the Universe really is.
And in the course of looking farther out into space, and deeper back in time, cosmologists have discovered some truly amazing things. For example, during the 1960s, astronomers became aware of microwave background radiation that was detectable in all directions. Known as the Cosmic Microwave Background (CMB), the existence of this radiation has helped to inform our understanding of how the Universe began.
For decades, scientists have theorized that beyond the edge of the Solar System, at a distance of up to 50,000 AU (0.79 ly) from the Sun, there lies a massive cloud of icy planetesimals known as the Oort Cloud. Named in honor of Dutch astronomer Jan Oort, this cloud is believed to be where long-period comets originate. However, to date, no direct evidence has been provided to confirm the Oort Cloud’s existence.
This is due to the fact that the Oort Cloud is very difficult to observe, being rather far from the Sun and dispersed over a very large region of space. However, in a recent study, a team of astrophysicists from the University of Pennsylvania proposed a radical idea. Using maps of the Cosmic Microwave Background (CMB) created by the Planck mission and other telescopes, they believe that Oort Clouds around other stars can be detected.
The study – “Probing Oort clouds around Milky Way stars with CMB surveys“, which recently appeared online – was led by Eric J Baxter, a postdoctoral researcher from the Department of Physics and Astronomy at the University of Pennsylvania. He was joined by Pennsylvania professors Cullen H. Blake and Bhuvnesh Jain (Baxter’s primary mentor).
To recap, the Oort Cloud is a hypothetical region of space that is thought to extend from between 2,000 and 5,000 AU (0.03 and 0.08 ly) to as far as 50,000 AU (0.79 ly) from the Sun – though some estimates indicate it could reach as far as 100,000 to 200,000 AU (1.58 and 3.16 ly). Like the Kuiper Belt and the Scattered Disc, the Oort Cloud is a reservoir of trans-Neptunian objects, though it is over a thousand times more distant from the Sun than these other two.
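For readers who want to double-check the unit conversions quoted above, here is a quick sketch in Python. The constants are the standard IAU values for the astronomical unit and the light year, not figures from the study itself:

```python
# Sanity-check the Oort Cloud distances quoted above.
AU_M = 1.495978707e11        # metres per astronomical unit (IAU 2012 definition)
LY_M = 9.4607304725808e15    # metres per light year (IAU)

def au_to_ly(au: float) -> float:
    """Convert a distance in astronomical units to light years."""
    return au * AU_M / LY_M

for au in (2_000, 5_000, 50_000, 100_000, 200_000):
    print(f"{au:>7,} AU = {au_to_ly(au):.2f} ly")
```

Running this reproduces the figures in the text: 50,000 AU comes out to roughly 0.79 ly, and 200,000 AU to about 3.16 ly.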
This cloud is believed to have originated from a population of small, icy bodies within 50 AU of the Sun that were present when the Solar System was still young. Over time, it is theorized that orbital perturbations caused by the giant planets caused those objects that had highly-stable orbits to form the Kuiper Belt along the ecliptic plane, while those that had more eccentric and distant orbits formed the Oort Cloud.
According to Baxter and his colleagues, because the existence of the Oort Cloud played an important role in the formation of the Solar System, it is therefore logical to assume that other star systems have their own Oort Clouds – which they refer to as exo-Oort Clouds (EXOCs). As Dr. Baxter explained to Universe Today via email:
“One of the proposed mechanisms for the formation of the Oort cloud around our sun is that some of the objects in the protoplanetary disk of our solar system were ejected into very large, elliptical orbits by interactions with the giant planets. The orbits of these objects were then affected by nearby stars and galactic tides, causing them to depart from orbits restricted to the plane of the solar system, and to form the now-spherical Oort cloud. You could imagine that a similar process could occur around another star with giant planets, and we know that there are many stars out there that do have giant planets.”
As Baxter and his colleagues indicated in their study, detecting EXOCs is difficult, largely for the same reasons for why there is no direct evidence for the Solar System’s own Oort Cloud. For one, there is not a lot of material in the cloud, with estimates ranging from a few to twenty times the mass of the Earth. Second, these objects are very far away from our Sun, which means they do not reflect much light or have strong thermal emissions.
For this reason, Baxter and his team recommended using maps of the sky at the millimeter and submillimeter wavelengths to search for signs of Oort Clouds around other stars. Such maps already exist, thanks to missions like the Planck telescope which have mapped the Cosmic Microwave Background (CMB). As Baxter indicated:
“In our paper, we use maps of the sky at 545 GHz and 857 GHz that were generated from observations by the Planck satellite. Planck was pretty much designed *only* to map the CMB; the fact that we can use this telescope to study exo-Oort clouds and potentially processes connected to planet formation is pretty surprising!”
This is a rather revolutionary idea, as the detection of EXOCs was not part of the intended purpose of the Planck mission. By mapping the CMB, which is “relic radiation” left over from the Big Bang, astronomers have sought to learn more about how the Universe has evolved since the early Universe – circa 378,000 years after the Big Bang. However, their study does build on previous work led by Alan Stern (the principal investigator of the New Horizons mission).
In 1991, along with John Stocke (of the University of Colorado, Boulder) and Paul Weissman (from NASA’s Jet Propulsion Laboratory), Stern conducted a study titled “An IRAS search for extra-solar Oort clouds“. In this study, they suggested using data from the Infrared Astronomical Satellite (IRAS) to search for EXOCs. However, whereas that study focused on certain wavelengths and 17 star systems, Baxter and his team relied on data for tens of thousands of systems and at a wider range of wavelengths.
Other current and future telescopes which Baxter and his team believe could be useful in this respect include the South Pole Telescope, located at the Amundsen–Scott South Pole Station in Antarctica; the Atacama Cosmology Telescope and the Simons Observatory in Chile; the Balloon-borne Large Aperture Submillimeter Telescope (BLAST) in Antarctica; the Green Bank Telescope in West Virginia, and others.
“Furthermore, the Gaia satellite has recently mapped out very accurately the positions and distances of stars in our galaxy,” Baxter added. “This makes choosing targets for exo-Oort cloud searches relatively straightforward. We used a combination of Gaia and Planck data in our analysis.”
To test their theory, Baxter and his team constructed a series of models for the thermal emission of exo-Oort clouds. “These models suggested that detecting exo-Oort clouds around nearby stars (or at least putting limits on their properties) was feasible given existing telescopes and observations,” he said. “In particular, the models suggested that data from the Planck satellite could potentially come close to detecting an exo-Oort cloud like our own around a nearby star.”
In addition, Baxter and his team also detected a hint of a signal around some of the stars that they considered in their study – specifically in the Vega and Fomalhaut systems. Using this data, they were able to place constraints on the possible existence of EXOCs at a distance of 10,000 to 100,000 AUs from these stars, which roughly coincides with the distance between our Sun and the Oort Cloud.
However, additional surveys will be needed before the existence of any EXOCs can be confirmed. These surveys will likely involve the James Webb Space Telescope, which is scheduled to launch in 2021. In the meantime, this study has some rather significant implications for astronomers, and not just because it involves the use of existing CMB maps for extra-solar studies. As Baxter put it:
“Just detecting an exo-Oort cloud would be really interesting, since as I mentioned above, we don’t have any direct evidence for the existence of our own Oort cloud. If you did get a detection of an exo-Oort cloud, it could in principle provide insights into processes connected to planet formation and the evolution of protoplanetary disks. For instance, imagine that we only detected exo-Oort clouds around stars that have giant planets. That would provide pretty convincing evidence that the formation of an Oort cloud is connected to giant planets, as suggested by popular theories of the formation of our own Oort cloud.”
Further Reading: arXiv
For decades, the predominant cosmological model used by scientists has been based on the theory that in addition to baryonic matter – aka. “normal” or “luminous” matter, which we can see – the Universe also contains a substantial amount of invisible mass. This “Dark Matter” accounts for roughly 26.8% of the mass of the Universe, whereas normal matter accounts for just 4.9%.
While the search for Dark Matter is ongoing and direct evidence is yet to be found, scientists have also been aware that roughly 90% of the Universe’s normal matter still remained undetected. According to two new studies that were recently published, much of this normal matter – which consists of filaments of hot, diffuse gas that links galaxies together – may have finally been found.
The first study, titled “A Search for Warm/Hot Gas Filaments Between Pairs of SDSS Luminous Red Galaxies“, appeared in the Monthly Notices of the Royal Astronomical Society. The study was led by Hideki Tanimura, a then-PhD candidate at the University of British Columbia, and included researchers from the Canadian Institute for Advanced Research (CIFAR), Liverpool John Moores University and the University of KwaZulu-Natal.
The second study, which recently appeared online, was titled “Missing Baryons in the Cosmic Web Revealed by the Sunyaev-Zel’dovich Effect“. This team consisted of researchers from the University of Edinburgh and was led by Anna de Graaff, an undergraduate student from the Institute for Astronomy at Edinburgh’s Royal Observatory. Working independently of each other, the two teams tackled the problem of the Universe’s missing matter.
Based on cosmological simulations, the predominant theory has been that the previously-undetected normal matter of the Universe consists of strands of baryonic matter – i.e. protons, neutrons and electrons – floating between galaxies. These regions are what is known as the “Cosmic Web”, where low-density gas exists at temperatures of 10⁵ to 10⁷ K.
For the sake of their studies, both teams consulted data from the Planck Collaboration, a venture maintained by the European Space Agency (ESA) that includes all those who contributed to the Planck mission. This data was presented in 2015, when it was used to create a thermal map of the Universe by measuring the influence of the Sunyaev-Zel’dovich (SZ) effect.
This effect refers to a spectral distortion in the Cosmic Microwave Background, where photons are scattered by ionized gas in galaxies and larger structures. During its mission to study the cosmos, the Planck satellite measured the spectral distortion of CMB photons with great sensitivity, and the resulting thermal map has since been used to chart the large-scale structure of the Universe.
However, the filaments between galaxies appeared too faint for scientists to examine at the time. To remedy this, the two teams consulted data from the North and South CMASS galaxy catalogues, which were produced from the 12th data release of the Sloan Digital Sky Survey (SDSS). From this data set, they then selected pairs of galaxies and focused on the space between them.
They then stacked the thermal data obtained by Planck for these areas on top of each other in order to strengthen the signals caused by SZ effect between galaxies. As Dr. Hideki told Universe Today via email:
“The SDSS galaxy survey gives a shape of the large-scale structure of the Universe. The Planck observation provides an all-sky map of gas pressure with a better sensitivity. We combine these data to probe the low-dense gas in the cosmic web.”
While Tanimura and his team stacked data from 260,000 galaxy pairs, de Graaff and her team stacked data from over a million. In the end, the two teams came up with strong evidence of gas filaments, though their measurements differed somewhat. Whereas Tanimura’s team found that the density of these filaments was around three times the average density in the surrounding void, de Graaff and her team found that they were six times the average density.
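The stacking approach both teams used can be illustrated with a toy example. The sketch below uses synthetic data (it is not the teams' actual pipeline) to show why averaging many noisy cutouts makes a weak, shared signal emerge: uncorrelated noise averages down as 1/√N while the common signal does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs, size = 5000, 32

# A faint, filament-like excess common to every cutout (amplitude 0.05),
# buried under per-cutout noise with standard deviation 1.0.
y, x = np.mgrid[:size, :size]
signal = 0.05 * np.exp(-((x - size / 2) ** 2 + (y - size / 2) ** 2) / 50.0)

stack = np.zeros((size, size))
for _ in range(n_pairs):
    stack += signal + rng.normal(0.0, 1.0, (size, size))
stack /= n_pairs

# After stacking 5000 cutouts, the noise is down to ~1/sqrt(5000) ~ 0.014,
# so the 0.05 excess at the centre stands out at a few sigma.
print(f"stacked centre value ~ {stack[size // 2, size // 2]:.3f}")
```

In the real analyses, each "cutout" is a patch of the Planck thermal map centred between an SDSS galaxy pair, and the recovered excess is the SZ signal of the filament gas.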
“We detect the low-dense gas in the cosmic web statistically by a stacking method,” said Hideki. “The other team uses almost the same method. Our results are very similar. The main difference is that we are probing a nearby Universe, on the other hand, they are probing a relatively farther Universe.”
This particular aspect is particularly interesting, in that it hints that over time, baryonic matter in the Cosmic Web has become less dense. Between these two results, the studies accounted for between 15 and 30% of the total baryonic content of the Universe. While that would mean that a significant amount of the Universe’s baryonic matter still remains to be found, it is nevertheless an impressive find.
As Hideki explained, their results not only support the current cosmological model of the Universe (the Lambda CDM model) but also go beyond it:
“The detail in our universe is still a mystery. Our results shed light on it and reveals a more precise picture of the Universe. When people went out to the ocean and started making a map of our world, it was not used for most of the people then, but we use the world map now to travel abroad. In the same way, a map of the entire universe may not be valuable now because we do not have a technology to go far out to the space. However, it could be valuable 500 years later. We are in the first stage of making a map of the entire Universe.”
It also opens up opportunities for future studies of the Cosmic Web, which will no doubt benefit from the deployment of next-generation instruments like the James Webb Space Telescope, the Atacama Cosmology Telescope and the Q/U Imaging ExperimenT (QUIET). With any luck, they will be able to spot the remaining missing matter. Then, perhaps we can finally zero in on all the invisible mass!
Since the 1960s, astronomers have been aware of the electromagnetic background radiation that pervades the Universe. Known as the Cosmic Microwave Background, this radiation is the oldest light in the Universe and what is left over from the Big Bang. By 2004, astronomers also became aware that a large region within the CMB appeared to be colder than its surroundings.
Known as the “CMB Cold Spot”, scientists have puzzled over this anomaly for years, with explanations ranging from a data artifact to it being caused by a supervoid. According to a new study conducted by a team of scientists from Durham University, the presence of a supervoid has been ruled out. This conclusion once again opens the door to more exotic explanations – like the existence of a parallel Universe!
The Cold Spot is one of several anomalies that astronomers have been studying since the first maps of the CMB were created using data from the Wilkinson Microwave Anisotropy Probe (WMAP). These anomalies are regions in the CMB that fall beneath the average background temperature of 2.73 degrees above absolute zero (-270.42 °C; -454.76 °F). In the case of the Cold Spot, the area is just 0.00015° colder than its surroundings.
And yet, this temperature difference is enough that the Cold Spot has become something of a thorn in the side of standard models of cosmology. Previously, the smart money appeared to be on it being caused by a supervoid – an area of space measuring billions of light years across which contains few galaxies. To test this theory, the Durham team conducted a survey of the galaxies in the region.
To do this, they relied on redshift measurements. This technique, which measures the extent to which visible light coming from an object is shifted towards the red end of the spectrum, has been the standard method for determining the distance to other galaxies for over a century. For the sake of their study, the Durham team used data from the Anglo-Australian Telescope to conduct a survey in which they measured the redshifts of 7,000 nearby galaxies.
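For nearby galaxies like those in the Durham survey, converting a measured redshift into a distance is essentially Hubble's law. The sketch below assumes a round value of H₀ = 70 km/s/Mpc (an illustrative choice, not the survey's adopted cosmology) and is only valid in the low-redshift limit:

```python
C_KM_S = 299_792.458  # speed of light in km/s
H0 = 70.0             # Hubble constant in km/s/Mpc (assumed round value)

def hubble_distance_mpc(z: float) -> float:
    """Approximate distance in megaparsecs via d = c*z / H0 (valid for z << 1)."""
    return C_KM_S * z / H0

for z in (0.01, 0.05, 0.1):
    print(f"z = {z:0.2f}  ->  d ~ {hubble_distance_mpc(z):6.1f} Mpc")
```

At z = 0.1 this gives roughly 430 Mpc; at higher redshifts the simple linear relation breaks down and a full cosmological distance calculation is needed.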
Based on this high-fidelity dataset, the researchers found no evidence that the Cold Spot corresponded to a relative lack of galaxies. In other words, there was no indication that the region is a supervoid. The results of their study will be published in the Monthly Notices of the Royal Astronomical Society (MNRAS) under the title “Evidence Against a Supervoid Causing the CMB Cold Spot“.
As Ruari Mackenzie – a postdoctoral student in the Dept. of Physics at Durham University, a member of the Center for Extragalactic Astronomy, and the lead author on the paper – explained in an RAS press release:
“The voids we have detected cannot explain the Cold Spot under standard cosmology. There is the possibility that some non-standard model could be proposed to link the two in the future but our data place powerful constraints on any attempt to do that.”
Specifically, the Durham team found that the Cold Spot region could be split into smaller voids, each of which were surrounded by clusters of galaxies. This distribution was consistent with a control field the survey chose for the study, both of which exhibited the same “soap bubble” structure. The question therefore arises: if the Cold Spot is not the result of a void or a relative lack of galaxies, what is causing it?
This is where the more exotic explanations come in, which emphasize that the Cold Spot may be due to something that exists outside the standard model of cosmology. As Tom Shanks, a Professor with the Dept. of Physics at Durham and a co-author of the study, explained:
“Perhaps the most exciting of these is that the Cold Spot was caused by a collision between our universe and another bubble Universe. If further, more detailed, analysis of CMB data proves this to be the case then the Cold Spot might be taken as the first evidence for the multiverse – and billions of other Universes may exist like our own.”
Multiverse Theory, which was first proposed by philosopher and psychologist William James, states that there may be multiple or even an infinite number of Universes that exist parallel to our own. Together, these Universes comprise the entirety of existence and all cosmological phenomena – i.e. space, time, matter, energy, and all of the physical laws that bind them.
Whereas it is often treated as a philosophical concept, the theory arose in part from the study of cosmological forces, like black holes and problems arising from the Big Bang Theory. In addition, variations on multiverse theory have been suggested as potential resolutions to theories that go beyond the Standard Model of particle physics – such as String Theory and M-theory.
Another variation – the Many-Worlds interpretation – has also been offered as a possible resolution for the wavefunction of subatomic particles. Essentially, it states that all possible outcomes in quantum mechanics exist in alternate universes, and there really is no such thing as “wavefunction collapse”. Could it therefore be argued that an alternate or parallel Universe is too close to our own, and thus responsible for the anomalies we see in the CMB?
As explanations go, it certainly is exciting, if perhaps a bit fantastic. And the Durham team is not prepared to rule out that the Cold Spot could be the result of fluctuations that can be explained by the standard model of cosmology. Right now, the only thing that can be said definitively is that the Cold Spot cannot be explained by something as straightforward as a supervoid and the absence of galaxies.
And in the meantime, additional surveys and experiments need to be conducted. Otherwise, this mystery may become a real sticking point for cosmology!
Direction is something we humans are pretty accustomed to. Living in our friendly terrestrial environment, we are used to seeing things in terms of up and down, left and right, forwards or backwards. And to us, our frame of reference is fixed and doesn’t change, unless we move or are in the process of moving. But when it comes to cosmology, things get a little more complicated.
For a long time now, cosmologists have held the belief that the universe is homogeneous and isotropic – i.e. fundamentally the same in all directions. In this sense, there is no such thing as “up” or “down” when it comes to space, only points of reference that are entirely relative. And thanks to a new study by researchers from University College London, that view has been shown to be correct.
For the sake of their study, titled “How isotropic is the Universe?“, the research team used survey data of the Cosmic Microwave Background (CMB) – the thermal radiation left over from the Big Bang. This data was obtained by the ESA’s Planck spacecraft between 2009 and 2013.
The team then analyzed it using a supercomputer to determine if there were any polarization patterns that would indicate if space has a “preferred direction” of expansion. The purpose of this test was to see if one of the basic assumptions that underlies the most widely-accepted cosmological model is in fact correct.
The first of these assumptions is that the Universe was created by the Big Bang, which is based on the discovery that the Universe is in a state of expansion, and on the discovery of the Cosmic Microwave Background. The second assumption is that space is homogeneous and isotropic, meaning that there are no major differences in the distribution of matter over large scales.
This belief, which is also known as the Cosmological Principle, is based partly on the Copernican Principle (which states that Earth has no special place in the Universe) and Einstein’s Theory of Relativity – which demonstrated that the measurement of inertia in any system is relative to the observer.
This theory has always had its limitations, as matter is clearly not evenly distributed at smaller scales (i.e. star systems, galaxies, galaxy clusters, etc.). However, cosmologists have argued around this by saying that fluctuations on the small scale are due to quantum fluctuations that occurred in the early Universe, and that the large-scale structure is one of homogeneity.
By looking for fluctuations in the oldest light in the Universe, scientists have been attempting to determine if this is in fact correct. In the past thirty years, these kinds of measurements have been performed by multiple missions, such as the Cosmic Background Explorer (COBE) mission, the Wilkinson Microwave Anisotropy Probe (WMAP), and the Planck spacecraft.
For the sake of their study, the UCL research team – led by Daniela Saadeh and Stephen Feeney – looked at things a little differently. Instead of searching for imbalances in the microwave background, they looked for signs that space could have a preferred direction of expansion, and how these might imprint themselves on the CMB.
As Daniela Saadeh – a PhD student at UCL and the lead author on the paper – told Universe Today via email:
“We analyzed the temperature and polarization of the cosmic microwave background (CMB), a relic radiation from the Big Bang, using data from the Planck mission. We compared the real CMB against our predictions for what it would look like in an anisotropic universe. After this search, we concluded that there is no evidence for these patterns and that the assumption that the Universe is isotropic on large scales is a good one.”
Basically, their results showed that there is only a 1 in 121,000 chance that the Universe is anisotropic. In other words, the evidence indicates that the Universe has been expanding in all directions uniformly, thus removing any doubts about there being any actual sense of direction on the large scale.
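To put those odds in more familiar terms, a probability of 1 in 121,000 can be converted into an equivalent Gaussian significance level. The two-sided conversion sketched below is an illustration of that translation, not a statistic quoted in the paper itself:

```python
import math

p = 1.0 / 121_000.0  # odds quoted by the UCL team

def sigma_two_sided(p: float) -> float:
    """Invert p = erfc(n / sqrt(2)) for n by bisection (two-sided tail probability)."""
    lo, hi = 0.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if math.erfc(mid / math.sqrt(2.0)) > p:
            lo = mid   # tail probability still too large: push n up
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"p = {p:.2e}  ->  roughly {sigma_two_sided(p):.1f} sigma")
```

In particle-physics parlance, odds this small sit in the neighbourhood of a 4.5-sigma result, which is why the team describes the isotropy assumption as safe.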
And in a way, this is a bit disappointing, since a Universe that is not homogeneous and the same in all directions would correspond to a different set of solutions to Einstein’s field equations. By themselves, these equations do not impose any symmetries on spacetime, but the Standard Model (of which they are part) does accept homogeneity as a sort of given.
These solutions are known as the Bianchi models, which were proposed by Italian mathematician Luigi Bianchi in the late 19th century. These algebraic theories, which can be applied to three-dimensional spacetime, are obtained by being less restrictive, and thus allow for a Universe that is anisotropic.
On the other hand, the study performed by Saadeh, Feeney, and their colleagues has shown that one of the main assumptions that our current cosmological models rest on is indeed correct. In so doing, they have also provided a much-needed sense of closure to a long-running debate.
“In the last ten years there has been considerable discussion around whether there were signs of large-scale anisotropy lurking in the CMB,” said Saadeh. “If the Universe were anisotropic, we would need to revise many of our calculations about its history and content. Planck high-quality data came with a golden opportunity to perform this health check on the standard model of cosmology and the good news is that it is safe.”
So the next time you find yourself looking up at the night sky, remember… that’s a luxury you have only while you’re standing on Earth. Out there, it’s a whole ‘nother ballgame! So enjoy this thing we call “direction” when and where you can.
And be sure to check out this animation produced by the UCL team, which illustrates the Planck mission’s CMB data:
One of the defining characteristics of the New Space era is partnerships. Whether it is between the private and public sector, different space agencies, or different institutions across the world, collaboration has become the cornerstone to success. Consider the recent agreement between the Netherlands Space Office (NSO) and the Chinese National Space Agency (CNSA) that was announced earlier this week.
In an agreement made possible by the Memorandum of Understanding (MoU) signed in 2015 between the Netherlands and China, a Dutch-built radio antenna will travel to the Moon aboard the Chinese Chang’e 4 satellite, which is scheduled to launch in 2018. Once the lunar exploration mission reaches the Moon, it will deposit the radio antenna on the far side, where it will begin to provide scientists with fascinating new views of the Universe.
The radio antenna itself is also the result of collaboration, between scientists from Radboud University, the Netherlands Institute for Radio Astronomy (ASTRON) and the small satellite company Innovative Solutions in Space (ISIS). After years of research and development, these three organizations have produced an instrument which they hope will usher in a new era of radio astronomy.
Essentially, radio astronomy involves the study of celestial objects – ranging from stars and galaxies to pulsars, quasars, masers and the Cosmic Microwave Background (CMB) – at radio frequencies. Using radio antennas, radio telescopes, and radio interferometers, this method allows for the study of objects that might otherwise be invisible or hidden in other parts of the electromagnetic spectrum.
One drawback of radio astronomy is the potential for interference. Since only certain wavelengths can pass through the Earth’s atmosphere, and local radio wave sources can throw off readings, radio antennas are usually located in remote areas of the world. A good example of this is the Very-Long Baseline Array (VLBA) located across the US, and the Square Kilometer Array (SKA) under construction in Australia and South Africa.
One other solution is to place radio antennas in space, where they will not be subject to interference or local radio sources. The antenna being produced by Radboud, ASTRON and ISIS is being delivered to the far side of the Moon for just this reason. As the latest space-based radio antenna to be deployed, it will be able to search the cosmos in ways Earth-based arrays cannot, looking for vital clues to the origins of the universe.
As Heino Falcke – a professor of Astroparticle Physics and Radio Astronomy at Radboud – explained in a university press release, the deployment of this radio antenna on the far side of the Moon will be an historic achievement:
“Radio astronomers study the universe using radio waves, light coming from stars and planets, for example, which is not visible with the naked eye. We can receive almost all celestial radio wave frequencies here on Earth. We cannot detect radio waves below 30 MHz, however, as these are blocked by our atmosphere. It is these frequencies in particular that contain information about the early universe, which is why we want to measure them.”
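A rough calculation shows why the frequencies below the atmospheric cutoff are so valuable. Neutral hydrogen emits at a rest frequency of 1420.4 MHz (the 21-cm line), so any of that emission arriving below roughly 30 MHz must be redshifted by a factor of about 46 or more, placing its origin in the very early Universe. The sketch below illustrates that relation; it is a back-of-the-envelope aid, not part of the Dutch team's instrument design:

```python
REST_MHZ = 1420.405751768  # rest frequency of the 21-cm hydrogen line, MHz

def redshift_of(observed_mhz: float) -> float:
    """Redshift at which the 21-cm line would appear at a given observed frequency."""
    return REST_MHZ / observed_mhz - 1.0

# Frequencies at or below the ~30 MHz atmospheric cutoff correspond to
# very high redshifts, i.e. the earliest epochs of the Universe.
for f in (30.0, 10.0, 1.0):
    print(f"{f:5.1f} MHz  ->  z ~ {redshift_of(f):7.1f}")
```

This is why a receiver shielded from Earth's atmosphere and radio chatter, like one on the lunar far side, is needed to reach these signals at all.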
As it stands, very little is known about this part of the electromagnetic spectrum. As a result, the Dutch radio antenna could be the first to provide information on the development of the earliest structures in the Universe. It is also the first instrument to be sent into space as part of a Chinese space mission.
Alongside Heino Falcke, Marc Klein Wolt – the director of the Radboud Radio Lab – is one of the scientific advisors for the project. For years, he and Falcke have been working towards the deployment of this radio antenna, and have high hopes for the project. As Professor Wolt said about the scientific package he is helping to create:
“The instrument we are developing will be a precursor to a future radio telescope in space. We will ultimately need such a facility to map the early universe and to provide information on the development of the earliest structures in it, like stars and galaxies.”
Together with engineers from ASTRON and ISIS, the Dutch team has accumulated a great deal of expertise from their years working on other radio astronomy projects, which includes experience working on the Low Frequency Array (LOFAR) and the development of the Square Kilometre Array, all of which is being put to work on this new project.
Other tasks that this antenna will perform include monitoring space for solar storms, which are known to have a significant impact on telecommunications here on Earth. With a radio antenna on the far side of the Moon, astronomers will be able to better predict such events and prepare for them in advance.
Another benefit will be the ability to measure strong radio pulses from gas giants like Jupiter and Saturn, which will help us to learn more about their rotational speed. Combined with the recent ESO efforts to map Jupiter at IR frequencies, and the data that is already arriving from the Juno mission, this data is likely to lead to some major breakthroughs in our understanding of this mysterious planet.
Last, but certainly not least, the Dutch team wants to create the first map of the early Universe using low-frequency radio data. This map is expected to take shape after two years, once the Moon has completed a few full rotations around the Earth and computer analysis can be completed.
It is also expected that such a map will provide scientists with additional evidence that confirms the Standard Model of Big Bang cosmology (aka. the Lambda CDM model). As with other projects currently in the works, the results are likely to be exciting and groundbreaking!
Further Reading: Radboud University
The standard model of cosmology tells us that only 4.9% of the Universe is composed of ordinary matter (i.e. that which we can see), while the remainder consists of 26.8% dark matter and 68.3% dark energy. As the names would suggest, we cannot see them, so their existence has had to be inferred based on theoretical models, observations of the large-scale structure of the Universe, and its apparent gravitational effects on visible matter.
Since it was first proposed, there has been no shortage of suggestions as to what Dark Matter particles look like. Not long ago, many scientists proposed that Dark Matter consists of Weakly-Interacting Massive Particles (WIMPs), which are about 100 times the mass of a proton but interact like neutrinos. However, all attempts to find WIMPs using collider experiments have come up empty. As such, scientists have lately been exploring the idea that dark matter may be composed of something else entirely.