News

Looking for cracks in the standard cosmological model

An international team of astrophysicists has presented a project that simulates the formation of galaxies and large-scale cosmic structures across staggeringly large swaths of space. The first results of their MillenniumTNG (MTNG) project have just been published in a series of 10 articles in the Monthly Notices of the Royal Astronomical Society (MNRAS) journal. The new calculations will help put the standard cosmological model to precision tests that could shed light on fundamental aspects of the universe.

In recent decades, cosmologists have grown accustomed to a baffling picture: the matter in the universe is dominated by a mysterious “dark matter”, and the universe's accelerating expansion is driven by an even stranger “dark energy” that acts like a form of antigravity. Ordinary visible matter, the stuff that makes up planets, stars and galaxies such as our Milky Way, accounts for less than 5% of the cosmic mix. This seemingly bizarre cosmological model is known as ΛCDM.

ΛCDM can explain a large number of cosmological observations, ranging from the cosmic microwave background (the remnant heat left behind by the Big Bang) to the “cosmic web” arrangement of galaxies along a complex network of filaments. However, the physical nature of the two main ingredients of ΛCDM, dark matter and dark energy, remains unknown. This mystery has led astrophysicists to search for weaknesses in the ΛCDM theory, places where its predictions fail. The hunt for such “cracks”, however, requires extremely sensitive new observational data, as well as accurate predictions of what the ΛCDM model actually implies.

This week, an international team of researchers from the DIPC and Ikerbasque, together with the Max Planck Institute for Astrophysics (MPA), Harvard University, Durham University and York University, has taken a decisive step in the challenge of understanding the ΛCDM model. They have developed the most comprehensive set of cosmological simulations to date. For this project they used specialized software on two extremely powerful supercomputers (Cosma8 in the UK and SuperMUC-NG in Germany), where 300,000 computer cores tracked the formation of around a hundred million galaxies in a region of the universe around 2.3 billion light years across.
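
To get a rough sense of this scale, the quoted figures can be sanity-checked with a few lines of arithmetic. The numbers below are the rounded values from the text above, not the exact MTNG configuration:

```python
# Back-of-the-envelope scale of the simulated volume (illustrative only;
# inputs are the rounded values quoted in the article, not the exact MTNG setup).
LY_PER_MPC = 3.262e6           # light years per megaparsec

box_ly = 2.3e9                 # box side length quoted above, in light years
box_mpc = box_ly / LY_PER_MPC  # roughly 705 Mpc on a side

n_galaxies = 1e8               # "around a hundred million galaxies"
volume_mpc3 = box_mpc ** 3

print(f"Box side: {box_mpc:,.0f} Mpc")
print(f"Mean galaxy density: {n_galaxies / volume_mpc3:.2f} per Mpc^3")
```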

MillenniumTNG simulates galaxy formation processes directly in volumes so large that they can be considered representative of the universe as a whole. These simulations make it possible to accurately assess the impact of astrophysical processes such as supernova explosions and supermassive black holes on the distribution of cosmic matter. Furthermore, the calculations directly include massive neutrinos: extremely light fundamental particles that account for at most 2% of the mass in the universe. Although this contribution is tiny, data from cosmological surveys, such as those from the European Space Agency's recently launched Euclid satellite, will soon reach a level of precision that could make it possible to detect the imprint of cosmic neutrinos for the first time.
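
For context, the neutrino contribution quoted above follows from a standard cosmological relation, Ω_ν h² ≈ Σm_ν / 93.14 eV. A minimal sketch of the estimate; the summed neutrino mass used here is an illustrative assumption, not the value adopted in MTNG:

```python
# Illustrative estimate of the cosmic neutrino mass fraction, using the
# standard relation Omega_nu * h^2 = sum(m_nu) / 93.14 eV.
# The summed mass below is a hypothetical example, not the MTNG input.
sum_m_nu_eV = 0.3   # assumed summed neutrino mass [eV]
h = 0.68            # dimensionless Hubble parameter (typical ΛCDM value)
omega_m = 0.31      # total matter density parameter (typical ΛCDM value)

omega_nu = sum_m_nu_eV / 93.14 / h**2
print(f"Omega_nu ≈ {omega_nu:.4f}")
print(f"Neutrino share of the matter density ≈ {omega_nu / omega_m:.1%}")
```

With these assumed values the neutrino share comes out around 2%, consistent with the upper limit quoted above.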

The first results of the MillenniumTNG project yield a wealth of theoretical predictions that reinforce the importance of computer simulations in modern cosmology. The team has written ten introductory scientific papers for the project, which have just appeared simultaneously in MNRAS.

One of the studies, led by the DIPC, focused on creating tools that can quickly generate millions of virtual universes with different assumptions about the cosmological model. "The project has allowed us to develop new ways of understanding the relationship between dark matter and galaxies, and how we can use them to find cracks in the ΛCDM model," says Dr. Sergio Contreras, first author of this study from the MTNG team, who analyzed the data using supercomputers at the DIPC Supercomputing Center. Thanks to machine learning algorithms, the team was able to predict what the universe would look like if neutrinos had different masses or if the law of gravity were different from the one Einstein proposed. "In recent years we have seen a revolution in the way computational models are built, and it is exciting to see how many new doors have been opened to understand fundamental aspects of the cosmos," says Professor Raúl Angulo, an Ikerbasque researcher at the DIPC.
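
The article does not describe the team's method in detail, but the general idea behind such an emulator can be sketched in a few lines: train a regressor on a modest number of simulated cosmologies, then predict a summary statistic for new parameter combinations almost instantly. Everything below (the parameter choices, the toy statistic, the use of scikit-learn) is an illustrative assumption, not the MTNG pipeline:

```python
# Toy cosmological emulator: learn a mapping from cosmological parameters
# to a summary statistic, then evaluate new "virtual universes" instantly.
# This is an illustrative sketch, not the actual MTNG machinery.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(42)

# Training inputs: (matter density Omega_m, summed neutrino mass in eV)
params = rng.uniform([0.25, 0.0], [0.40, 0.6], size=(50, 2))

# Stand-in for an expensive simulation output (e.g., a clustering amplitude);
# a real emulator would be trained on measurements from the simulations.
def toy_statistic(p):
    omega_m, m_nu = p[:, 0], p[:, 1]
    return 0.8 * np.sqrt(omega_m / 0.31) * (1.0 - 0.08 * m_nu)

emulator = GaussianProcessRegressor().fit(params, toy_statistic(params))

# Predicting for a new cosmology takes microseconds, not CPU-millennia.
new_cosmology = np.array([[0.32, 0.1]])
print(emulator.predict(new_cosmology))
```

The design point is the trade-off: each training cosmology is expensive, but once the emulator is fit, scanning millions of hypothetical universes becomes essentially free.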

Other MTNG project studies examine how the shapes of galaxies are influenced by the large-scale distribution of matter; make predictions for the population of very massive galaxies in the early universe recently discovered with the James Webb Space Telescope; and construct virtual universes containing more than a billion galaxies.

The barrage of initial results from the MillenniumTNG simulations makes it clear that they will be of great help in devising better strategies for the analysis of upcoming cosmological data. The team's Principal Investigator, MPA Director Prof. Volker Springel, states that "MillenniumTNG combines recent advances in galaxy formation simulations with the field of large-scale cosmic structure, allowing for improved theoretical modeling of the connection of galaxies with the dark matter backbone of the universe. This can be very useful in advancing key questions in cosmology, such as how best to constrain the neutrino mass with data from large-scale structure." The main hydrodynamic simulation required approximately 170 million CPU hours, the equivalent of running a quad-core computer continuously for almost 5,000 years. In addition, the project produced over 2 petabytes of simulation data, material for further research that will keep the participating scientists busy for many years to come.
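
The quad-core comparison checks out with simple arithmetic:

```python
# Converting the quoted compute budget into wall-clock time on a small machine.
cpu_hours = 170e6          # total core hours quoted for the main run
cores = 4                  # a quad-core desktop
hours_per_year = 24 * 365

years = cpu_hours / cores / hours_per_year
print(f"~{years:,.0f} years on a quad-core machine")  # about 4,852 years
```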