Since the “Golden Age of General Relativity” in the 1960s, scientists have held that much of the Universe consists of a mysterious invisible mass known as “Dark Matter”. Scientists have attempted to resolve this mystery with a two-pronged approach. On the one hand, astrophysicists have attempted to find a candidate particle that could account for this mass.
On the other, astrophysicists have tried to find a theoretical basis that could explain Dark Matter’s behavior. So far, the debate has centered on the question of whether it is “hot” or “cold”, with cold enjoying an edge because of its relative simplicity. However, a new study led by the Harvard-Smithsonian Center for Astrophysics (CfA) revisits the idea that Dark Matter might actually be “warm”.
According to the most widely-accepted cosmological theory, the first stars in our Universe formed roughly 150 million to 1 billion years after the Big Bang. Over time, these stars began to come together to form globular clusters, which slowly coalesced to form the first galaxies – including our very own Milky Way. For some time, astronomers have held that this process began for our galaxy some 13.51 billion years ago.
In accordance with this theory, astronomers believed that the oldest stars in the Universe were short-lived massive ones that have since died. However, a team of astronomers from Johns Hopkins University recently discovered a low-mass star in the Milky Way’s “thin disk” that is roughly 13.5 billion years old. This discovery indicates that some of the earliest stars in the Universe could still be around, and available for study.
Looking deep into the observable Universe – and hence, back to the earliest periods of time – is an immensely fascinating thing. In so doing, astronomers are able to see the earliest galaxies in the Universe and learn more about how they evolved over time. From this, they are not only able to see how large-scale structures (like galaxies and galaxy clusters) formed, but also the role played by dark matter.
As they indicate in their study, this protocluster (designated SPT2349-56) was first observed by the National Science Foundation’s South Pole Telescope. Using the Atacama Pathfinder Experiment (APEX), the team conducted follow-up observations that confirmed that it was an extremely distant galactic source, which was then observed with ALMA. Using ALMA’s superior resolution and sensitivity, they were able to distinguish the individual galaxies.
What they found was that these galaxies were forming stars at a rate 1,000 times faster than our galaxy, and were crammed inside a region of space that was only about three times the size of the Milky Way. Using the ALMA data, the team was also able to create sophisticated computer simulations that demonstrated how this current collection of galaxies will likely grow and evolve over billions of years.
These simulations indicated that once these galaxies merge, the resulting galaxy cluster will rival some of the most massive clusters we see in the Universe today. As Scott Chapman, an astrophysicist at Dalhousie University and a co-author on the study, explained:
“Having caught a massive galaxy cluster in the throes of formation is spectacular in and of itself. But, the fact that this is happening so early in the history of the universe poses a formidable challenge to our present-day understanding of the way structures form in the universe.”
The current scientific consensus among astrophysicists states that a few million years after the Big Bang, normal matter and dark matter began to form larger concentrations, eventually giving rise to galaxy clusters. These objects are the largest structures in the Universe, containing trillions of stars, thousands of galaxies, immense amounts of dark matter and massive black holes.
However, current theories and computer models have suggested that protoclusters – like the one observed by ALMA – should have taken much longer to evolve. Finding one that dates to just 1.4 billion years after the Big Bang was therefore quite the surprise. As Tim Miller, who is currently a doctoral candidate at Yale University, indicated:
“How this assembly of galaxies got so big so fast is a bit of a mystery. It wasn’t built up gradually over billions of years, as astronomers might expect. This discovery provides an incredible opportunity to study how galaxy clusters and their massive galaxies came together in these extreme environments.”
Looking to the future, Chapman and his colleagues hope to conduct further studies of SPT2349-56 to see how this protocluster eventually becomes a galaxy cluster. “ALMA gave us, for the first time, a clear starting point to predict the evolution of a galaxy cluster,” he said. “Over time, the 14 galaxies we observed will stop forming stars and will collide and coalesce into a single gigantic galaxy.”
The study of this and other protoclusters will be made possible thanks to instruments like ALMA, but also next-generation observatories like the Square Kilometer Array (SKA). Equipped with more sensitive arrays and more advanced computer models, astronomers may be able to create a truly accurate timeline of how our Universe became what it is today.
Since the 1960s, astrophysicists have postulated that in addition to all the matter that we can see, the Universe is also filled with a mysterious, invisible mass. Known as “Dark Matter”, its existence was proposed to explain the “missing mass” of the Universe, and it is now considered a fundamental part of it. Not only is it theorized to make up about 80% of the Universe’s mass, it is also believed to have played a vital role in the formation and evolution of galaxies.
However, a recent finding may throw this entire cosmological perspective sideways. Based on observations made using the NASA/ESA Hubble Space Telescope and other observatories around the world, astronomers have found a nearby galaxy (NGC 1052-DF2) that does not appear to have any dark matter. This object is unique among galaxies studied so far, and could force a reevaluation of our predominant cosmological models.
For the sake of their study, the team consulted data from the Dragonfly Telephoto Array (DFA), which was used to identify NGC 1052-DF2. Based on data from Hubble, the team was able to determine its distance – 65 million light-years from the Solar System – as well as its size and brightness. In addition, the team discovered that NGC 1052-DF2 is larger than the Milky Way but contains about 250 times fewer stars, which makes it an ultra-diffuse galaxy.
As Pieter van Dokkum of Yale University, who led the study, explained, NGC 1052-DF2 is so diffuse that it’s essentially transparent. “I spent an hour just staring at this image,” he said. “This thing is astonishing: a gigantic blob so sparse that you see the galaxies behind it. It is literally a see-through galaxy.”
Using data from the Sloan Digital Sky Survey (SDSS), the Gemini Observatory, and the Keck Observatory, the team studied the galaxy in more detail. By measuring the dynamical properties of ten globular clusters orbiting the galaxy, the team was able to infer an independent value of the galaxy’s mass – which is comparable to the mass of the stars in the galaxy.
This led the team to conclude that either NGC 1052-DF2 contains at least 400 times less dark matter than is predicted for a galaxy of its mass, or none at all. Such a finding is unprecedented in the history of modern astronomy and defies all predictions. As Allison Merritt – an astronomer from Yale University and the Max Planck Institute for Astronomy, and a co-author on the paper – explained:
“Dark matter is conventionally believed to be an integral part of all galaxies — the glue that holds them together and the underlying scaffolding upon which they are built…There is no theory that predicts these types of galaxies — how you actually go about forming one of these things is completely unknown.”
“This invisible, mysterious substance is by far the most dominant aspect of any galaxy. Finding a galaxy without any is completely unexpected; it challenges standard ideas of how galaxies work,” added van Dokkum.
However, it is important to note that the discovery of a galaxy without dark matter does not disprove the theory that dark matter exists. In truth, it merely demonstrates that dark matter and galaxies are capable of being separate, which could mean that dark matter is bound to ordinary matter through no force other than gravity. As such, it could actually help scientists refine their theories of dark matter and its role in galaxy formation and evolution.
In the meantime, the researchers already have some ideas as to why dark matter is missing from NGC 1052-DF2. On the one hand, it could have been the result of a cataclysmic event, where the birth of a multitude of massive stars swept out all the gas and dark matter. On the other hand, the growth of the nearby massive elliptical galaxy (NGC 1052) billions of years ago could have played a role in this deficiency.
However, these theories do not explain how the galaxy formed. To address this, the team is analyzing images that Hubble took of 23 other ultra-diffuse galaxies for more dark-matter deficient galaxies. Already, they have found three that appear to be similar to NGC 1052-DF2, which could indicate that dark-matter deficient galaxies could be a relatively common occurrence.
If these latest findings demonstrate anything, it is that the Universe is like an onion. Just when you think you have it figured out, you peel back an additional layer and find a whole new set of mysteries. They also demonstrate that after 28 years of faithful service, the Hubble Space Telescope is still capable of teaching us new things. Good thing too, seeing as the launch of its successor has been delayed until 2020!
Since time immemorial, philosophers and scholars have sought to determine how existence began. With the birth of modern astronomy, this tradition has continued and given rise to the field known as cosmology. And with the help of supercomputing, scientists are able to conduct simulations that show how the first stars and galaxies formed in our Universe and evolved over the course of billions of years.
Until recently, the most extensive and complete study was the “Illustris” simulation, which looked at the process of galaxy formation over the course of the past 13 billion years. Seeking to break their own record, the same team recently began conducting a simulation known as “Illustris, The Next Generation”, or “IllustrisTNG”. The first round of these findings was recently released, and several more are expected to follow.
Using the Hazel Hen supercomputer at the High-Performance Computing Center Stuttgart (HLRS) – one of the three world-class German supercomputing facilities that comprise the Gauss Centre for Supercomputing (GCS) – the team conducted a simulation that will help to verify and expand on existing experimental knowledge about the earliest stages of the Universe – i.e. what happened from 300,000 years after the Big Bang to the present day.
To create this simulation, the team combined equations (such as those of General Relativity) and data from modern observations into a massive computational cube that represented a large cross-section of the Universe. For some processes, such as star formation and the growth of black holes, the researchers were forced to rely on assumptions based on observations. They then employed numerical models to set this simulated Universe in motion.
Compared to their previous simulation, IllustrisTNG consisted of three universes at three different resolutions – the largest of which measured 1 billion light-years (300 megaparsecs) across. In addition, the research team included more precise accounting for magnetic fields, thus improving accuracy. In total, the simulation used 24,000 cores on the Hazel Hen supercomputer for a total of 35 million core hours.
As Prof. Dr. Volker Springel, professor and researcher at the Heidelberg Institute for Theoretical Studies and principal investigator on the project, explained in a Gauss Center press release:
“Magnetic fields are interesting for a variety of reasons. The magnetic pressure exerted on cosmic gas can occasionally be equal to thermal (temperature) pressure, meaning that if you neglect this, you will miss these effects and ultimately compromise your results.”
Another major difference was the inclusion of updated black hole physics based on recent observation campaigns. This includes evidence that demonstrates a correlation between supermassive black holes (SMBHs) and galactic evolution. In essence, SMBHs are known to send out a tremendous amount of energy in the form of radiation and particle jets, which can have an arresting effect on star formation in a galaxy.
While the researchers were certainly aware of this process during the first simulation, they did not factor in how it can arrest star formation completely. By including updated data on both magnetic fields and black hole physics in the simulation, the team saw a greater correlation between the data and observations. They are therefore more confident with the results and believe it represents the most accurate simulation to date.
But as Dr. Dylan Nelson – a physicist with the Max Planck Institute for Astronomy and an IllustrisTNG team member – explained, future simulations are likely to be even more accurate, assuming advances in supercomputers continue:
“Increased memory and processing resources in next-generation systems will allow us to simulate large volumes of the universe with higher resolution. Large volumes are important for cosmology, understanding the large-scale structure of the universe, and making firm predictions for the next generation of large observational projects. High resolution is important for improving our physical models of the processes going on inside of individual galaxies in our simulation.”
This latest simulation was also made possible thanks to extensive support provided by the GCS staff, who assisted the research team with matters related to their coding. It was also the result of a massive collaborative effort that brought together researchers from around the world and paired them with the resources they needed. Last, but not least, it shows how increased collaboration between applied research and theoretical research leads to better results.
Looking ahead, the team hopes that the results of this latest simulation will prove to be even more useful than the last. The original Illustris data release gained over 2,000 registered users and resulted in the publication of 130 scientific studies. Given that this one is more accurate and up-to-date, the team expects that it will find more users and result in even more groundbreaking research.
Who knows? Perhaps someday, we may create a simulation that captures the formation and evolution of our Universe with complete accuracy. In the meantime, be sure to enjoy this video of the first Illustris Simulation, courtesy of team member and MIT physicist Mark Vogelsberger:
In their pursuit of learning how our Universe came to be, scientists have probed very deep into space (and hence, very far back in time). Ultimately, their goal is to determine when the first galaxies in our Universe formed and what effect they had on cosmic evolution. Recent efforts to locate these earliest formations have probed to distances of up to 13 billion light-years from Earth – i.e. about 1 billion years after the Big Bang.
From this, scientists are now able to study how early galaxies affected the matter around them – in particular, the reionization of neutral atoms. Unfortunately, most early galaxies are very faint, which makes studying their interiors difficult. But thanks to a recent survey conducted by an international team of astronomers, a more luminous, massive galaxy was spotted that could provide a clear look at how early galaxies led to reionization.
In accordance with the Big Bang model of cosmology, reionization refers to the process that took place after the period known as the “Dark Ages”. This era lasted from roughly 380,000 to 150 million years after the Big Bang, during which the Universe was filled with neutral gas and contained no stars or other light sources. As a result, the radiation of this period is undetectable by our current instruments – hence the name.
Just prior to this period, “Recombination” occurred, in which the first hydrogen and helium atoms formed. Initially ionized (with no electrons bound to their nuclei), these nuclei gradually captured electrons as the Universe cooled, becoming neutral atoms. During the period that followed – i.e. between 150 million and 1 billion years after the Big Bang – the large-scale structure of the Universe began to form.
Intrinsic to this was the process of reionization, where the first stars and quasars formed and their radiation reionized the surrounding Universe. It is therefore clear why astronomers want to probe this era of the Universe. By observing the first stars and galaxies, and what effect they had on the cosmos, astronomers will get a clearer picture of how this early period led to the Universe as we know it today.
Luckily for the research team, the massive, star-forming galaxies of this period are known to contain a great deal of dust. While very faint in the optical band, these galaxies emit strong radiation at submillimeter wavelengths, which makes them detectable using today’s advanced telescopes – including the South Pole Telescope (SPT), the Atacama Pathfinder Experiment (APEX), and the Atacama Large Millimeter/submillimeter Array (ALMA).
For the sake of their study, Strandet and Weiss relied on data from the SPT to detect a series of dusty galaxies from the early Universe. As Maria Strandet and Axel Weiss of the Max Planck Institute for Radio Astronomy (the lead author and a co-author on the study, respectively) told Universe Today via email:
“We have used light of about 1 mm wavelength, which can be observed by mm telescopes like SPT, APEX or ALMA. At this wavelength the photons are produced by the thermal radiation of dust. The beauty of using this long wavelength is, that for a large redshift range (look back time), the dimming of galaxies [caused] by increasing distance is compensated by the redshift – so the observed intensity is independent of the redshift. This is because, for higher redshift galaxies, one is looking at intrinsically shorter wavelengths (by (1+z)) where the radiation is stronger for a thermal spectrum like the dust spectrum.”
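The compensation Strandet and Weiss describe – dimming with distance being offset by the redshift sliding the observation onto the rising part of the thermal dust spectrum – can be sketched numerically. The following Python snippet is illustrative only: the dust temperature (40 K), emissivity index (1.8), and cosmological parameters are assumed round values, not the study’s fitted ones.

```python
import numpy as np

# Assumed values (illustrative, not the study's): T_dust = 40 K, beta = 1.8,
# flat LCDM with H0 = 70 km/s/Mpc, Omega_m = 0.3.
H0_SI = 70 * 1000 / 3.086e22      # Hubble constant in 1/s
OMEGA_M, OMEGA_L = 0.3, 0.7
C = 2.998e8                       # speed of light, m/s
H_PLANCK, K_B = 6.626e-34, 1.381e-23

def luminosity_distance(z, steps=2000):
    """Comoving-distance integral for flat LCDM, times (1+z)."""
    zs = np.linspace(0.0, z, steps)
    integrand = 1.0 / np.sqrt(OMEGA_M * (1 + zs) ** 3 + OMEGA_L)
    d_c = (C / H0_SI) * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zs))
    return d_c * (1 + z)

def observed_flux(z, lam_obs=1e-3, T=40.0, beta=1.8):
    """Relative flux density at a fixed observed wavelength (1 mm) from a
    modified blackbody; the rest-frame frequency probed is (1+z) higher."""
    nu_rest = (1 + z) * C / lam_obs
    planck = nu_rest ** 3 / (np.exp(H_PLANCK * nu_rest / (K_B * T)) - 1.0)
    # S_obs ~ (1+z) * L_rest / (4 pi D_L^2); constant prefactors dropped
    return (1 + z) * nu_rest ** beta * planck / luminosity_distance(z) ** 2

for z in (1, 3, 6):
    print(f"z = {z}: relative 1 mm flux = {observed_flux(z) / observed_flux(1):.2f}")
```

For a thermal dust spectrum the 1 mm flux barely changes between z = 1 and z = 6, which is why millimeter surveys like the SPT’s can detect dusty galaxies almost regardless of their distance.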
This was followed by data from ALMA, which the team used to determine the distance of the galaxies by looking at the redshifted wavelength of carbon monoxide molecules in their interstellar mediums (ISM). From all the data they collected, they were able to constrain the properties of one of these galaxies – SPT0311-58 – by observing its spectral lines. In so doing, they determined that this galaxy existed just 760 million years after the Big Bang.
“Since the signal strength at 1mm is independent of the redshift (look back time), we do not have an a priori clue if an object is relatively near (in the cosmological sense) or at the epoch of reionization,” they said. “That is why we undertook a large survey to determine the redshifts via the emission of molecular lines using ALMA. SPT0311-58 turns out to be the highest redshift object discovered in this survey and in fact the most distant massive dusty star-forming galaxy so far discovered.”
From their observations, they also determined that SPT0311-58 has a mass of about 330 billion Solar-masses, which is about 66 times as much as the Milky Way Galaxy (which has about 5 billion Solar-masses). They also estimated that it is forming new stars at a rate of several thousand per year, which could also be the case for neighboring galaxies that are dated to this period.
This rare and distant object is one of the best candidates yet for studying what the early Universe looked like and how it has evolved since. This in turn will allow astronomers and cosmologists to test the theoretical basis for the Big Bang Theory. As Strandet and Weiss told Universe Today about their discovery:
“These objects are important to understanding the evolution of galaxies as a whole since the large amounts of dust already present in this source, only 760 million years after the Big Bang, means that it is an extremely massive object. The mere fact that such massive galaxies already existed when the Universe was still so young puts strong constraints on our understanding of galaxy mass buildup. Furthermore the dust needs to form in a very short time, which gives additional insights on the dust production from the first stellar population.”
The ability to look deeper into space, and farther back in time, has led to many surprising discoveries of late. And these have in turn challenged some of our assumptions about what happened in the Universe, and when. And in the end, they are helping scientists to create a more detailed and complete account of cosmic evolution. Someday soon, we might even be able to probe the earliest moments in the Universe, and watch creation in action!
Understanding the Universe and how it has evolved over the course of billions of years is a rather daunting task. On the one hand, it involves painstakingly looking billions of light-years into deep space (and thus, billions of years back in time) to see how its large-scale structure changed over time. On the other, it requires massive amounts of computing power to simulate what the Universe should look like (based on known physics) and to see if the simulations match the observations.
That is what a team of astrophysicists from the University of Zurich (UZH) did using the “Piz Daint” supercomputer. With this sophisticated machine, they simulated the formation of our entire Universe and produced a catalog of about 25 billion virtual galaxies. This catalog will be used by the ESA’s Euclid mission, which will launch in 2020 and spend six years probing the Universe for the sake of investigating dark matter.
The team’s work was detailed in a study that appeared recently in the journal Computational Astrophysics and Cosmology. Led by Douglas Potter, the team spent the past three years developing an optimized code to describe (with unprecedented accuracy) the dynamics of dark matter as well as the formation of large-scale structures in the Universe.
The code, known as PKDGRAV3, was specifically designed to optimally use the available memory and processing power of modern super-computing architectures. After being executed on the “Piz Daint” supercomputer – located at the Swiss National Computing Center (CSCS) – for a period of only 80 hours, it managed to generate a virtual Universe of two trillion macro-particles, from which a catalogue of 25 billion virtual galaxies was extracted.
Intrinsic to their calculations was the way in which dark matter fluid would have evolved under its own gravity, thus leading to the formation of small concentrations known as “dark matter halos”. It is within these halos – a theoretical component that is thought to extend well beyond the visible extent of a galaxy – that galaxies like the Milky Way are believed to have formed.
Naturally, this presented quite the challenge. It required not only a precise calculation of how the structure of dark matter evolves, but also required that they consider how this would influence every other part of the Universe. As Joachim Stadel, a professor with the Center for Theoretical Astrophysics and Cosmology at UZH and a co-author on the paper, told Universe Today via email:
“We simulated 2 trillion such dark matter “pieces”, the largest calculation of this type that has ever been performed. To do this we had to use a computation technique known as the “fast multipole method” and use one of the fastest computers in the world, “Piz Daint” at the Swiss National Supercomputing Centre, which among other things has very fast graphics processing units (GPUs) which allow an enormous speed-up of the floating point calculations needed in the simulation. The dark matter clusters into dark matter “halos” which in turn harbor the galaxies. Our calculation accurately produces the distribution and properties of the dark matter, including the halos, but the galaxies, with all of their properties, must be placed within these halos using a model. This part of the task was performed by our colleagues at Barcelona under the direction of Pablo Fossalba and Francisco Castander. These galaxies then have the expected colors, spatial distribution and the emission lines (important for the spectra observed by Euclid) and can be used to test and calibrate various systematics and random errors within the entire instrument pipeline of Euclid.”
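The calculation Stadel describes rests on ordinary Newtonian gravity applied to an enormous number of particles; the fast multipole method is what makes two trillion of them tractable. As a toy illustration only, here is a direct-summation (O(N²)) Python version of the same underlying physics, with all parameters (particle count, softening length, timestep) chosen arbitrarily rather than taken from PKDGRAV3:

```python
import numpy as np

# Toy N-body sketch: cold, collisionless "dark matter" particles evolving
# under their own softened Newtonian gravity, integrated with a
# kick-drift-kick leapfrog scheme. Parameters are arbitrary choices.
rng = np.random.default_rng(42)
N, G, EPS, DT = 64, 1.0, 0.05, 0.01

pos = rng.standard_normal((N, 3))   # random initial positions
vel = np.zeros((N, 3))              # cold start: zero initial velocities
mass = np.full(N, 1.0 / N)

def accelerations(pos):
    """Pairwise softened Newtonian gravity, direct summation."""
    diff = pos[None, :, :] - pos[:, None, :]     # r_j - r_i for every pair
    dist2 = (diff ** 2).sum(-1) + EPS ** 2       # softening avoids singularities
    np.fill_diagonal(dist2, np.inf)              # no self-force
    inv_d3 = dist2 ** -1.5
    return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

acc = accelerations(pos)
for _ in range(200):
    vel += 0.5 * DT * acc            # half kick
    pos += DT * vel                  # drift
    acc = accelerations(pos)
    vel += 0.5 * DT * acc            # half kick

print("rms radius after evolution:", np.sqrt((pos ** 2).sum(-1)).mean())
```

The point of the sketch is the scaling: direct summation costs O(N²) per step, which is hopeless for trillions of particles; tree and fast-multipole methods approximate distant groups of particles to bring the cost down to roughly O(N), which is what makes simulations on the scale Stadel describes feasible.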
Thanks to the high precision of their calculations, the team was able to turn out a catalog that met the requirements of the European Space Agency’s Euclid mission, whose main objective is to explore the “dark universe”. This kind of research is essential to understanding the Universe on the largest of scales, mainly because the vast majority of the Universe is dark.
Between the 23% of the Universe which is made up of dark matter and the 72% that consists of dark energy, only one-twentieth of the Universe is actually made up of matter that we can see with normal instruments (aka. “luminous” or baryonic matter). Despite being proposed during the 1960s and 1990s respectively, dark matter and dark energy remain two of the greatest cosmological mysteries.
Since their existence is required in order for our current cosmological models to work, but they cannot be seen directly, their presence has only ever been inferred through indirect observation. This is precisely what the Euclid mission will do: over the course of six years, it will capture light from billions of galaxies and measure it for subtle distortions caused by the presence of mass in the foreground.
Much in the same way that background light can be distorted by a gravitational field lying between the source and the observer (i.e. a time-honored test for General Relativity), the presence of dark matter will exert a gravitational influence on the light. As Stadel explained, their simulated Universe will play an important role in the Euclid mission – providing a framework that will be used during and after the mission.
“In order to forecast how well the current components will be able to make a given measurement, a Universe populated with galaxies as close as possible to the real observed Universe must be created,” he said. “This ‘mock’ catalogue of galaxies is what was generated from the simulation and will be now used in this way. However, in the future when Euclid begins taking data, we will also need to use simulations like this to solve the inverse problem. We will then need to be able to take the observed Universe and determine the fundamental parameters of cosmology; a connection which currently can only be made at a sufficient precision by large simulations like the one we have just performed. This is a second important aspect of how such simulation work [and] is central to the Euclid mission.”
From the Euclid data, researchers hope to obtain new information on the nature of dark matter, but also to discover new physics that goes beyond the Standard Model of particle physics – i.e. a modified version of general relativity or a new type of particle. As Stadel explained, the best outcome for the mission would be one in which the results do not conform to expectations.
“While it will certainly make the most accurate measurements of fundamental cosmological parameters (such as the amount of dark matter and energy in the Universe) far more exciting would be to measure something that conflicts or, at the very least, is in tension with the current ‘standard lambda cold dark matter‘ (LCDM) model,” he said. “One of the biggest questions is whether the so called ‘dark energy’ of this model is actually a form of energy, or whether it is more correctly described by a modification to Einstein’s general theory of relativity. While we may just begin to scratch the surface of such questions, they are very important and have the potential to change physics at a very fundamental level.”
In the future, Stadel and his colleagues hope to be running simulations on cosmic evolution that take into account both dark matter and dark energy. Someday, these exotic aspects of nature could form the pillars of a new cosmology, one which reaches beyond the physics of the Standard Model. In the meantime, astrophysicists from around the world will likely be waiting for the first batch of results from the Euclid mission with bated breath.
Euclid is one of several missions currently engaged in the hunt for dark matter and the study of how it shaped our Universe. Others include the Alpha Magnetic Spectrometer (AMS-02) experiment aboard the ISS, the ESO’s Kilo Degree Survey (KiDS), and CERN’s Large Hadron Collider. With luck, these experiments will reveal pieces of the cosmological puzzle that have remained elusive for decades.
Ever since human beings learned that the Milky Way was not unique or alone in the night sky, astronomers and cosmologists have sought to find out just how many galaxies there are in the Universe. And until recently, our greatest scientific minds believed they had a pretty good idea – between 100 and 200 billion.
However, a new study produced by researchers from the UK has revealed something startling about the Universe. Using Hubble’s Deep Field Images and data from other telescopes, they have concluded that these previous estimates were off by a factor of about 10. The Universe, as it turns out, may have had up to 2 trillion galaxies in it during the course of its history.
Led by Prof. Christopher Conselice of the University of Nottingham, U.K., the team combined images taken by the Hubble Space Telescope with other published data to produce a 3-D map of the Universe. They then incorporated a series of new mathematical models that allowed them to infer the existence of galaxies which are not bright enough to be observed by current instruments.
Using these, they then began reviewing how galaxies have evolved over the past 13 billion years. What they learned was quite fascinating. For one, they observed that the distribution of galaxies throughout the history of the Universe was not even. What’s more, they found that in order for everything in their calculations to add up, there had to be 10 times more galaxies in the early Universe than previously thought.
Most of these galaxies would be similar in mass to the satellite galaxies that have been observed around the Milky Way, and would be too faint to be spotted by today’s instruments. In other words, astronomers have only been able to see about 10% of the early Universe until now, because most of its galaxies were too small and faint to be visible.
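This kind of inference rests on extrapolating the galaxy luminosity function below current detection limits. The sketch below is a guess at the shape of such a calculation, not the paper’s actual models: it uses the standard Schechter parameterization with an assumed faint-end slope of -1.3 and illustrative luminosity limits to show how the counts balloon once very faint galaxies are included.

```python
import numpy as np

def schechter(x, alpha=-1.3):
    """Schechter luminosity function in units of L/L* (normalization dropped).
    The faint-end slope alpha = -1.3 is an illustrative assumption."""
    return x ** alpha * np.exp(-x)

def count_above(x_min, alpha=-1.3, x_max=10.0, steps=100_000):
    """Number of galaxies brighter than x_min (in L/L*): trapezoid-rule
    integral on a log-spaced grid, since the faint end diverges toward L = 0."""
    x = np.geomspace(x_min, x_max, steps)
    y = schechter(x, alpha)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

observable = count_above(0.1)      # galaxies brighter than ~0.1 L*
extrapolated = count_above(1e-4)   # include galaxies ~1000x fainter
print(f"count ratio once faint galaxies are included: {extrapolated / observable:.1f}")
```

With these assumed parameters the integrated counts grow by roughly an order of magnitude as the faint limit is pushed down, which mirrors the article’s point: the bulk of early galaxies are simply too faint for current instruments to see.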
As Prof. Conselice explained in a Hubble Science Release, this finding may also help resolve a lingering debate about the structure of the Universe:
“These results are powerful evidence that a significant galaxy evolution has taken place throughout the universe’s history, which dramatically reduced the number of galaxies through mergers between them — thus reducing their total number. This gives us a verification of the so-called top-down formation of structure in the universe.”
To break it down, the “top-down model” of galaxy formation states that galaxies formed from huge gas clouds larger than the resulting galaxies. These clouds began collapsing because their internal gravity was stronger than the pressures in the cloud. Based on the speed at which the gas clouds rotated, they would either form a spiral or an elliptical galaxy.
In contrast, the “bottom-up model” states that galaxies formed during the early Universe due to the merging of smaller clumps that were about the size of globular clusters. These galaxies could then have been drawn into clusters and superclusters by their mutual gravity.
In addition to helping resolve this debate, this study also offers a possible solution to Olbers’ Paradox (aka. “the dark night sky paradox”). Named after the 18th/19th-century German astronomer Heinrich Wilhelm Olbers, this paradox addresses the question of why, given the expanse of the Universe and all the luminous matter in it, the sky is dark at night.
Based on their results, the UK team has surmised that while every point in the night sky contains part of a galaxy, most of them are invisible to the human eye and modern telescopes. This is due to a combination of factors, which includes the effects of cosmic redshift, the fact that the Universe is dynamic (i.e. always expanding) and the absorption of light by cosmic dust and gas.
Needless to say, future missions will be needed to confirm the existence of all these unseen galaxies. And in that respect, Conselice and his colleagues are looking to future missions – ones that are capable of observing stars and galaxies in the non-visible spectrum – to make that happen.
“It boggles the mind that over 90 percent of the galaxies in the universe have yet to be studied,” he added. “Who knows what interesting properties we will find when we discover these galaxies with future generations of telescopes? In the near future, the James Webb Space Telescope will be able to study these ultra-faint galaxies.”
Understanding how many galaxies have existed over time is a fundamental aspect of understanding the Universe as a whole. With every passing study that attempts to resolve what we can see with our current cosmological models, we are getting that much closer!
And be sure to enjoy this video about some of Hubble’s most stunning images, courtesy of HubbleESA:
How has the universe evolved over time? A new supercomputer simulation has provided what scientists say is the most accurate and detailed large cosmological model of the evolution of the large-scale structure of the universe. Called the Bolshoi simulation, it gives physicists and astronomers a powerful new tool for understanding cosmic mysteries such as galaxy formation, dark matter, and dark energy.