When complete, the Square Kilometre Array (SKA) will be the largest radio telescope array in the world. The result of decades of work involving 40 institutions in 11 countries, the SKA will allow astronomers to monitor the sky in unprecedented detail and survey it far faster than any system currently in existence.
Such a large array will naturally gather an unprecedented amount of data on a regular basis. To sort through it all, the “brain” of this massive array will consist of two supercomputers. Recently, the SKA’s Science Data Processor (SDP) consortium concluded its engineering design work on one of them.
Since time immemorial, philosophers and scholars have sought to determine how existence began. With the birth of modern astronomy, this tradition has continued and given rise to the field known as cosmology. And with the help of supercomputing, scientists are able to conduct simulations that show how the first stars and galaxies formed in our Universe and evolved over the course of billions of years.
Until recently, the most extensive and complete study was the “Illustris” simulation, which looked at the process of galaxy formation over the course of the past 13 billion years. Seeking to break their own record, the same team recently began conducting a simulation known as “Illustris, The Next Generation,” or “IllustrisTNG”. The first round of these findings was recently released, and several more are expected to follow.
Using the Hazel Hen supercomputer at the High-Performance Computing Center Stuttgart (HLRS) – one of the three world-class German supercomputing facilities that comprise the Gauss Centre for Supercomputing (GCS) – the team conducted a simulation that will help to verify and expand on existing experimental knowledge about the earliest stages of the Universe – i.e. what happened from 300,000 years after the Big Bang to the present day.
To create this simulation, the team combined the equations of general relativity with data from modern observations in a massive computational cube representing a large cross-section of the Universe. For some processes, such as star formation and the growth of black holes, the researchers had to rely on assumptions grounded in observations. They then employed numerical models to set this simulated Universe in motion.
Compared to the original Illustris run, IllustrisTNG consists of three simulated universes at three different resolutions, the largest of which measures roughly 1 billion light-years (300 megaparsecs) across. In addition, the research team included a more precise accounting of magnetic fields, thus improving accuracy. In total, the simulation used 24,000 cores on the Hazel Hen supercomputer for a total of 35 million core hours.
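As a back-of-the-envelope check (illustrative arithmetic only, not from the project itself), those two figures imply roughly two months of continuous computing had all 24,000 cores run simultaneously:

```python
# Rough wall-clock estimate implied by the quoted IllustrisTNG figures.
# (Hypothetical calculation; the actual runs were split across many jobs.)
cores = 24_000            # Hazel Hen cores used
core_hours = 35_000_000   # total core hours consumed

wall_hours = core_hours / cores   # hours if every core ran the whole time
wall_days = wall_hours / 24

print(f"~{wall_hours:.0f} hours, or about {wall_days:.0f} days")
```

That works out to roughly 1,460 hours, or about two months of round-the-clock computing.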
As Prof. Dr. Volker Springel, researcher at the Heidelberg Institute for Theoretical Studies and principal investigator on the project, explained in a Gauss Centre press release:
“Magnetic fields are interesting for a variety of reasons. The magnetic pressure exerted on cosmic gas can occasionally be equal to thermal (temperature) pressure, meaning that if you neglect this, you will miss these effects and ultimately compromise your results.”
Another major difference was the inclusion of updated black hole physics based on recent observation campaigns. This includes evidence that demonstrates a correlation between supermassive black holes (SMBHs) and galactic evolution. In essence, SMBHs are known to send out a tremendous amount of energy in the form of radiation and particle jets, which can have an arresting effect on star formation in a galaxy.
While the researchers were certainly aware of this process during the first simulation, they did not factor in how completely it can shut down star formation. By including updated data on both magnetic fields and black hole physics, the team saw a greater correlation between the simulation and observations. They are therefore more confident in the results and believe they represent the most accurate simulation to date.
But as Dr. Dylan Nelson – a physicist with the Max Planck Institute for Astronomy and an IllustrisTNG member – explained, future simulations are likely to be even more accurate, assuming advances in supercomputers continue:
“Increased memory and processing resources in next-generation systems will allow us to simulate large volumes of the universe with higher resolution. Large volumes are important for cosmology, understanding the large-scale structure of the universe, and making firm predictions for the next generation of large observational projects. High resolution is important for improving our physical models of the processes going on inside of individual galaxies in our simulation.”
This latest simulation was also made possible thanks to extensive support from the GCS staff, who assisted the research team with matters related to their coding. It was also the result of a massive collaborative effort that brought together researchers from around the world and paired them with the resources they needed. Last, but not least, it shows how increased collaboration between applied and theoretical research leads to better results.
Looking ahead, the team hopes that the results of this latest simulation prove to be even more useful than the last. The original Illustris data release gained over 2,000 registered users and resulted in the publication of 130 scientific studies. Given that this one is more accurate and up-to-date, the team expects it will find more users and result in even more groundbreaking research.
Who knows? Perhaps someday, we may create a simulation that captures the formation and evolution of our Universe with complete accuracy. In the meantime, be sure to enjoy this video of the first Illustris Simulation, courtesy of team member and MIT physicist Mark Vogelsberger:
Behind every modern tale of cosmological discovery is the supercomputer that made it possible. Such was the case with yesterday’s announcement from the European Space Agency’s Planck mission team, which raised the age estimate for the universe to 13.82 billion years and tweaked the parameters for the amounts of dark matter, dark energy and plain old baryonic matter in the universe.
Planck built upon our understanding of the early universe by providing us the most detailed picture yet of the cosmic microwave background (CMB), the “fossil relic” of the Big Bang first discovered by Penzias & Wilson in 1965. Planck’s discoveries built upon the CMB map of the universe observed by the Wilkinson Microwave Anisotropy Probe (WMAP) and serve to further validate the Big Bang theory of cosmology.
But studying the tiny fluctuations in the faint cosmic microwave background isn’t easy, and that’s where Hopper comes in. From its vantage point at the Sun–Earth L2 Lagrange point, beyond the Moon’s orbit, Planck’s 72 onboard detectors observe the sky at nine separate frequencies, completing a full scan of the sky every six months. This first release of data is the culmination of 15 months’ worth of observations, representing close to a trillion samples overall. Planck records an average of 10,000 samples every second and scans every point in the sky about 1,000 times.
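A quick order-of-magnitude check on those figures (illustrative arithmetic only, not from the Planck team): 10,000 samples per second sustained over 15 months lands in the hundreds of billions of samples, consistent with the quoted total.

```python
# Order-of-magnitude check on Planck's quoted sampling rate.
# (Illustrative only; 30.44 days/month is an average month length.)
samples_per_second = 10_000        # average rate across all detectors
observing_days = 15 * 30.44        # 15 months of observations
seconds_per_day = 86_400

total_samples = samples_per_second * observing_days * seconds_per_day
print(f"~{total_samples:.1e} samples")   # a few times 10^11
```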
That’s a challenge to analyze, even for a supercomputer. Hopper is a Cray XE6 supercomputer based at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) at the Lawrence Berkeley National Laboratory in California. Named after computer scientist and pioneer Grace Hopper, the supercomputer has a whopping 217 terabytes of memory running across 153,216 computer cores with a peak performance of 1.28 petaflops (quadrillion floating-point operations per second). Hopper placed number five on a November 2010 list of the world’s top supercomputers. (The Tianhe-1A supercomputer at the National Supercomputing Center in Tianjin, China, was number one with a peak performance of 4.7 petaflops.)
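Dividing the quoted peak performance by the core count (again, just illustrative arithmetic) gives the per-core peak, a handy way to compare machines of very different sizes:

```python
# Per-core peak implied by Hopper's quoted specs (illustrative only).
peak_flops = 1.28e15    # 1.28 petaflops
cores = 153_216         # Hopper's core count

gflops_per_core = peak_flops / cores / 1e9
print(f"~{gflops_per_core:.1f} GFLOPS per core")
```

That comes to roughly 8 gigaflops per core, so the machine’s power lies in its sheer scale rather than in any single processor.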
One of the main challenges for the team sifting through the flood of CMB data generated by Planck was to filter out the “noise” and bias from the detectors themselves.
“It’s like more than just bugs on a windshield that we want to remove to see the light, but a storm of bugs all around us in every direction,” said Planck project scientist Charles Lawrence. To overcome this, Hopper runs simulations of how the sky would appear to Planck under different conditions and compares these simulations against observations to tease out data.
“By scaling up to tens of thousands of processors, we’ve reduced the time it takes to run these calculations from an impossible 1,000 years to a few weeks,” said Berkeley lab and Planck scientist Ted Kisner.
But Planck isn’t the only mission whose data Hopper crunches. Hopper and NERSC were also involved in last year’s discovery of the final neutrino mixing angle, and the machine is currently studying wave-plasma interactions, fusion plasmas and more. You can see the projects that NERSC computers are currently tasked with on their site, along with CPU core hours used in real time. Maybe a future descendant of Hopper could give Deep Thought of Hitchhiker’s Guide to the Galaxy fame some competition in solving the answer to Life, the Universe, and Everything.
Also, a big congrats to the Planck and NERSC researchers. Yesterday was a great day to be a cosmologist. At the very least, perhaps folks won’t continue to confuse the field with cosmetology… trust us, you don’t want a cosmologist styling your hair!