Just over a century ago, in 1917, Albert Einstein proposed the first cosmological model based on general relativity, and in the 1920s and 1930s Georges Lemaître pushed it to its limits, anticipating much of what is now the standard cosmological model involving matter and dark energy, and even foreshadowing ideas that Stephen Hawking and John Wheeler, to name just two, would later explore in quantum cosmology.
Today, thanks to observational programs made possible by instruments such as the Planck satellite and, more recently, Euclid, we have an enormous and growing stream of observations of galaxies, galaxy clusters and the larger structures they form. To extract information about the physics behind these structures and about how they formed, powerful simulations must be run on supercomputers, increasingly with the help of artificial intelligence (AI) tools.
The idea is to see what numerical simulations, built on these equations and on certain observations, predict for the universe today and for how it has changed since the Big Bang. This makes it possible to test the equations and theories developed to understand the observable cosmos, such as the nature of dark matter or dark energy, with computers capable of performing calculations the human mind cannot, particularly when the equations become nonlinear.
For several decades, increasingly sophisticated simulations have been carried out, taking advantage of advances in computing power and in the volumes that can be simulated, such as the famous Millennium simulation. Many of these well-known simulations were made possible by the Virgo Consortium for Cosmological Supercomputer Simulations, founded in 1994, which quickly grew into an international group of scientists in the United Kingdom, Germany, Canada, the United States and Japan.
Eagle (Evolution and Assembly of GaLaxies and Their Environments) is a simulation aimed at understanding how galaxies form and evolve. This video, from a few years ago, shows sequences based on computer calculations that modeled the formation of structures in a cosmological volume 100 megaparsecs on a side (more than 300 million light-years). It was large enough to contain 10,000 galaxies the size of the Milky Way or larger, allowing comparison, for example, with the entire zoo of galaxies visible in the Hubble Deep Field. The simulation begins while the universe is still very uniform, before any stars or galaxies have formed, with cosmological parameters motivated by observations of the cosmic microwave background by the Planck satellite. The crucial parameters are the density of dark matter, which drives the growth of structures; the density of baryonic matter, the gas from which stars form; and the cosmological constant, responsible for cosmic acceleration. © Durham University
Problematic dark-matter-only simulations
In particular, Virgo had access to world-class supercomputing resources in the United Kingdom, at Durham University, which ran several simulations, including the one known as Eagle (Evolution and Assembly of GaLaxies and Their Environments). Today, researchers at this university, together with colleagues elsewhere as part of an international team of astronomers, have announced that they have gone beyond previous simulations by carrying out Flamingo, considered the largest cosmological simulation ever run, one that tracks not only dark matter but also ordinary matter.
Flamingo is therefore a new avatar of the Virgo consortium, whose acronym this time stands for Full-hydro Large-scale structure simulations with All-sky Mapping for the Interpretation of Next Generation Observations. It uses 300 billion “particles”, each with the mass of a small galaxy, in a cubic volume ten billion light-years on a side.
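Those two figures are mutually consistent, as a rough back-of-the-envelope check shows. The sketch below assumes a Hubble constant of about 70 km/s/Mpc and a matter fraction of roughly 30% of the critical density; these values are assumptions for the estimate, not numbers quoted in the article.

```python
# Rough back-of-the-envelope check of the particle mass quoted for Flamingo.
# Assumptions (not from the article): H0 ~ 70 km/s/Mpc and a matter fraction
# Omega_m ~ 0.3 of the critical density.

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
H0 = 70e3 / 3.086e22     # Hubble constant in s^-1 (70 km/s/Mpc)
rho_crit = 3 * H0**2 / (8 * 3.14159 * G)   # critical density, kg/m^3
rho_matter = 0.3 * rho_crit                # mean matter density

side = 1e10 * 9.461e15   # 10 billion light-years in metres
volume = side**3         # simulated cubic volume, m^3

n_particles = 300e9
m_particle = rho_matter * volume / n_particles   # kg per particle
m_sun = 1.989e30
print(f"~{m_particle / m_sun:.1e} solar masses per particle")
# -> a few times 10^9 solar masses, i.e. roughly the mass of a small galaxy
```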
Let’s explain the context. According to the standard cosmological model, as the Big Bang ended, concentrations of dark matter would have collapsed rapidly, dragging with them the ordinary baryonic matter that makes up stars, planets and, of course, our bodies. We have information about the concentrations of ordinary matter at the time the fossil radiation (the cosmic microwave background) was emitted, which provides the initial conditions for calculating how the matter that would form galaxies and galaxy clusters has evolved over the 13.8 billion years since then.
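To illustrate what such statistically specified initial conditions look like in practice, here is a purely illustrative sketch that generates a Gaussian random density field from an assumed toy power spectrum; it is not the method or code used by Flamingo.

```python
import numpy as np

# Illustrative sketch: a random Gaussian density field on a small grid with a
# simple power-law power spectrum, the kind of statistically specified initial
# condition that CMB observations pin down for real simulations.

N = 64                                  # grid cells per side
rng = np.random.default_rng(42)

kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(N)] * 3, indexing="ij")
k = np.sqrt(kx**2 + ky**2 + kz**2)
k[0, 0, 0] = 1.0                        # avoid division by zero at k = 0

power = k**-2.5                         # assumed toy power spectrum P(k)
noise = rng.normal(size=(N, N, N))      # white Gaussian noise
delta_k = np.fft.fftn(noise) * np.sqrt(power)
delta = np.real(np.fft.ifftn(delta_k))  # density contrast field delta(x)
delta -= delta.mean()                   # zero-mean fluctuations
print("rms density fluctuation:", delta.std())
```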
In the early 1980s, the computing power available to cosmologists was still rudimentary, and since dark matter is thought to dominate ordinary matter in mass, and therefore in gravitational influence, researchers proceeded as if cosmic structures had arisen from dark matter alone, which is significantly easier and less computationally demanding to simulate numerically than a model that also includes ordinary matter.
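To give an idea of what "dark matter only" means in practice, here is a minimal sketch of a direct-summation gravitational N-body step: collisionless particles interacting only through gravity. Real cosmological codes use far more efficient tree or particle-mesh methods, comoving coordinates and an expanding background; every name and number here is illustrative.

```python
import numpy as np

G = 1.0            # gravitational constant in code units
SOFTENING = 0.05   # softening length to avoid diverging forces at close range

def accelerations(pos, mass):
    """Pairwise gravitational acceleration on each particle (O(N^2))."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]      # (N, N, 3)
    dist2 = np.sum(diff**2, axis=-1) + SOFTENING**2
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)                              # no self-force
    return G * np.einsum('ijk,ij,j->ik', diff, inv_d3, mass)

def leapfrog_step(pos, vel, mass, dt):
    """Advance positions and velocities by one kick-drift-kick step."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Toy run: 1000 equal-mass particles starting from random positions at rest
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, size=(1000, 3))
vel = np.zeros((1000, 3))
mass = np.full(1000, 1.0 / 1000)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```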
Despite great successes, several snags and failed predictions then emerged from the computer models based on the most credible dark matter models. The most glaring is undoubtedly the prediction of many dwarf galaxies around large galaxies such as the Milky Way or the Andromeda Galaxy, whereas in reality only a few are observed.
Realistic simulations with supernovae and black holes
The most conservative solution to this problem, that is, one that does not call the standard model into question, is to carry out more complex simulations that take into account supernova explosions and the accretion of ordinary matter by giant black holes, without, of course, forgetting the gravity of the ordinary matter distribution itself. Indeed, supernova explosions, which eject ordinary matter from galaxies, or the winds driven by radiation from black holes swallowing matter, can alter the growth of galaxies, and especially of dwarf galaxies, by expelling gas into the intergalactic medium. More generally, the pressure of the ejected gas can also counteract the gravitational contraction that forms galaxies.
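As an illustration of how such "baryonic feedback" is often modelled, here is a minimal sketch in the spirit of stochastic thermal supernova feedback recipes used in hydrodynamic simulations; the numbers and function names are assumptions, not the Flamingo implementation.

```python
import numpy as np

# Purely illustrative subgrid recipe: each supernova releases ~1e51 erg;
# rather than spreading this thinly over all neighbouring gas (where it
# would be radiated away too quickly), a few randomly chosen gas particles
# are heated by a large, fixed temperature jump.

E_SN = 1e51            # erg, canonical supernova energy
DELTA_T = 10**7.5      # K, assumed temperature jump per heated particle
K_B = 1.381e-16        # erg/K, Boltzmann constant
MU = 0.6               # mean molecular weight of ionised gas
M_H = 1.673e-24        # g, proton mass

def heating_cost(m_gas):
    """Thermal energy (erg) needed to raise a gas particle of mass m_gas (g) by DELTA_T."""
    return 1.5 * K_B * DELTA_T * m_gas / (MU * M_H)

def apply_feedback(gas_masses, internal_energies, n_supernovae, rng):
    """Spend the supernova energy budget heating randomly chosen gas particles."""
    budget = n_supernovae * E_SN
    heated = 0
    for i in rng.permutation(len(gas_masses)):
        cost = heating_cost(gas_masses[i])
        if budget < cost:
            break
        internal_energies[i] += cost
        budget -= cost
        heated += 1
    return heated

# Toy usage: 100 gas particles of ~1e5 solar masses each, 10,000 supernovae
rng = np.random.default_rng(1)
masses = np.full(100, 2e38)    # grams
energies = np.zeros(100)       # erg
print(apply_feedback(masses, energies, n_supernovae=10_000, rng=rng), "particles heated")
```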
Over the last decade, thanks to advances in computing power, it has actually become possible to perform more realistic simulations of the formation and evolution of galaxies that take ordinary matter into account. The Flamingo simulation appears today as the latest culmination of these efforts to reproduce the evolution of the observable universe from the Big Bang to the present day. This has led to three articles in the Monthly Notices of the Royal Astronomical Society: one describing the methods, another presenting the simulations, and a third examining how well the simulations reproduce the large-scale structure of the universe and the galaxy populations currently being explored by the James Webb Space Telescope.
By varying certain fundamental parameters of the cold dark matter standard cosmological model from one simulation to another, one can try to find out which values best fit the observations. In particular, the cosmologists varied the strength of galactic winds, the mass of the neutrinos and key cosmological parameters.
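As a schematic illustration of this kind of parameter variation (not Flamingo's actual pipeline), one can imagine running simulations over a grid of parameter values and ranking them by how well a summary statistic matches observations; the data and function names below are entirely hypothetical.

```python
import itertools

# Hypothetical sketch: compare a simulated summary statistic (say, cluster
# counts per unit volume) against an observed value over a parameter grid.

def run_simulation(wind_strength, neutrino_mass_eV, sigma8):
    """Placeholder for an expensive simulation returning a summary statistic."""
    # In reality this would launch a full hydrodynamic run; here we fake it.
    return 10.0 * sigma8 - 0.5 * wind_strength - 2.0 * neutrino_mass_eV

observed_value, observed_error = 7.2, 0.3   # made-up observation

grid = itertools.product(
    [0.5, 1.0, 1.5],        # galactic wind strength (arbitrary units)
    [0.06, 0.12, 0.24],     # sum of neutrino masses in eV
    [0.75, 0.80, 0.85],     # sigma8, amplitude of matter fluctuations
)

best = min(
    grid,
    key=lambda p: ((run_simulation(*p) - observed_value) / observed_error) ** 2,
)
print("best-fitting (wind, m_nu, sigma8):", best)
```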
We do not yet know the masses of ordinary neutrinos very well, but we do know that through these masses neutrinos affect the growth of small structures such as dwarf galaxies, which is one way to understand why there are so few of them. Since neutrinos have been moving rapidly since the end of the Big Bang without reacting with other particles of ordinary matter, cosmological neutrinos are described as a small component of hot matter (the faster the particles of a gas move on average, the hotter the gas, and vice versa).
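For reference, a relation commonly used in cosmology (not quoted in the article) links the sum of the neutrino masses to their contribution to the cosmic energy budget, showing why even sub-eV masses matter for structure formation:

$$ \Omega_\nu h^2 \simeq \frac{\sum m_\nu}{93.14\ \mathrm{eV}} $$

where $h$ is the Hubble constant in units of 100 km/s/Mpc and $\Omega_\nu$ is the neutrino density parameter.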
After the Hubble tension, an S8 tension with galaxy clusters
We have known for some time that there is a tension in cosmology between the expansion rate of space measured from the fossil radiation, emitted some 380,000 years after the Big Bang, and that measured with supernovae over the past several billion years: the two routes give roughly 67 and 73 kilometres per second per megaparsec respectively. Either there is a well-hidden error somewhere, or new physics needs to be introduced.
Specifically, the estimates of the so-called Hubble-Lemaître constant disagree, and in this context English-speaking cosmologists speak of the “Hubble tension”. They are also talking more and more about an “S8 tension”.
What is it about?
Here again, the study of the fossil radiation provides a reference value for a kind of measure of the abundance of galaxy clusters or, more precisely, of the fluctuations in the density of matter, derived from the fluctuations in temperature and polarization of that radiation. Roughly speaking, the observable universe should have evolved in such a way that today, after more than 10 billion years, there is a certain number of clusters per unit volume.
But just as with the dwarf galaxies, this number is not what is found: fewer clusters than expected appear to have formed over the past few billion years. The density of galaxy clusters over this period can be studied by measuring the extent to which the clusters produce weak gravitational lensing effects.
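The parameter at stake in this comparison, usually written S8, is conventionally defined as

$$ S_8 = \sigma_8 \sqrt{\Omega_\mathrm{m}/0.3} $$

where $\sigma_8$ is the amplitude of matter density fluctuations on scales of about 8 Mpc/$h$ and $\Omega_\mathrm{m}$ is the matter density parameter; this convention is standard in the field, though not spelled out in the article.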
The simulations performed with Flamingo fail to account for the lower value of the S8 parameter observed today compared with the one derived from the standard cosmological model. Again, this could be a sign of new physics, in particular at the level of dark matter particles, which could interact with one another through still unknown forces, forces for which theoretical models have already been proposed within certain particle-physics theories beyond the Standard Model of high-energy physics.