For several years now there has been talk of a tension between two methods of determining the constant of the famous Hubble-Lemaître (HL) law. As far as we can tell, this really is a tension and not a crisis of cosmology. The most likely outcome is that it will end like the affair of the neutrinos that appeared to travel faster than light, that is, with the discovery of a source of error that eluded researchers for a while. Nor can we completely rule out that it is a sign that elements of new physics need to be introduced. But what is it all about?
The Hubble-Lemaître law links the distance of a galaxy from the Milky Way to the spectral shift of the light emitted by its stars, light that is measured today after a journey through space sometimes lasting billions of years. The constant in this law can be determined by measuring both the distance of these galaxies, by one method or another, and the spectral shift of their light. In practice, a whole series of successive methods is used to calibrate the HL law within the framework of the so-called cosmic distance ladder.
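To make this relationship concrete, here is a minimal sketch in Python; the value of the constant used (about 70 km/s per megaparsec) and the small-redshift approximation z ≈ v/c are illustrative assumptions, not measurements discussed in this article.

```python
# Minimal sketch of the Hubble-Lemaitre law: v = H0 * d.
# H0 here is an illustrative value; the whole point of the article is that
# its measured value differs depending on the method used.

C_KM_S = 299_792.458        # speed of light in km/s
H0_KM_S_PER_MPC = 70.0      # assumed Hubble-Lemaitre constant (illustrative)

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity (km/s) of a galaxy at the given distance (Mpc)."""
    return H0_KM_S_PER_MPC * distance_mpc

def approximate_redshift(distance_mpc: float) -> float:
    """Spectral shift z ~ v/c, valid only for small redshifts."""
    return recession_velocity(distance_mpc) / C_KM_S

if __name__ == "__main__":
    for d in (10.0, 100.0, 1000.0):  # distances in megaparsecs
        print(f"d = {d:7.1f} Mpc  ->  v = {recession_velocity(d):9.1f} km/s, "
              f"z ~ {approximate_redshift(d):.4f}")
```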
Within the framework of a given relativistic cosmological model, i.e. with a specific space-time geometry/topology (for example a spherical or toroidal space) and an equally specific content (for example with or without dark matter), it is possible to derive not only the HL law but also a more general law describing how the expansion rate of space has varied over time since the Big Bang, or nearly so.
In cosmology, the cosmic distance ladder refers to a series of methods that build on one another to determine, step by step, the distances of stars in the observable cosmos. It all starts with parallax measurements made from within the Solar System, i.e. the change in the angular position of a nearby star on the dome of the sky between two times of the year. From the geometry of the triangle, a distance can be inferred if the angles are large enough to be measurable. © Hubble, ESA
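As a reminder of how this first rung of the ladder works, here is a small Python example of the parallax rule d (in parsecs) = 1 / p (in arcseconds); the sample parallax value is purely illustrative.

```python
def parallax_distance_parsecs(parallax_arcsec: float) -> float:
    """Distance in parsecs from an annual parallax angle in arcseconds.

    The parsec is defined so that a star whose parallax is exactly one
    arcsecond lies at a distance of one parsec (about 3.26 light-years).
    """
    if parallax_arcsec <= 0:
        raise ValueError("Parallax must be positive and measurable.")
    return 1.0 / parallax_arcsec

PARSEC_IN_LIGHT_YEARS = 3.26  # approximate conversion factor

if __name__ == "__main__":
    p = 0.1  # illustrative parallax of 0.1 arcsecond, not a real catalogue value
    d_pc = parallax_distance_parsecs(p)
    print(f'parallax {p}" -> {d_pc:.1f} pc '
          f"(~{d_pc * PARSEC_IN_LIGHT_YEARS:.0f} light-years)")
```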
Divergent values for the Hubble-Lemaître constant
The very advanced analyses of the properties of the fossil radiation measured by the Planck satellite help determine which cosmological model we live in and what the HL constant was about 380,000 years after the Big Bang. The model then makes it possible to calculate the value that should be measured today by studying relatively nearby galaxies.
We can do the same by studying supernovae of type SN Ia. These are explosions of white dwarfs whose intrinsic brightness should not vary much from one event to another. Because these explosions are very luminous, they allow distances of several billion light-years to be measured: since the farther away a “standard candle” is, the fainter it appears, a distance can be determined by comparing apparent luminosity with absolute luminosity. By also measuring a spectral shift, we then derive the value of the HL constant.
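The logic of a standard candle can be summed up in a few lines of Python: compare the apparent brightness (here expressed as an apparent magnitude) with an assumed absolute brightness to obtain a distance, then divide the recession velocity deduced from the spectral shift by that distance. The magnitudes, redshift and SN Ia absolute magnitude (about -19.3) used below are illustrative assumptions, not values from the studies mentioned in this article.

```python
# Sketch of the "standard candle" reasoning applied to an SN Ia.
# All numerical inputs are illustrative, not actual survey data.

C_KM_S = 299_792.458          # speed of light in km/s
M_SN_IA = -19.3               # assumed absolute magnitude of an SN Ia at peak

def luminosity_distance_mpc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance (Mpc) from the distance modulus m - M = 5 log10(d / 10 pc)."""
    d_parsecs = 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)
    return d_parsecs / 1.0e6

def hubble_constant(apparent_mag: float, redshift: float) -> float:
    """H0 in km/s/Mpc, using v ~ c*z (valid only for small redshifts)."""
    d_mpc = luminosity_distance_mpc(apparent_mag, M_SN_IA)
    velocity = C_KM_S * redshift
    return velocity / d_mpc

if __name__ == "__main__":
    # A hypothetical supernova seen at apparent magnitude 17.35 with z = 0.05.
    print(f"H0 ~ {hubble_constant(17.35, 0.05):.1f} km/s/Mpc")
```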
This is the game that the physics Nobel laureate Adam Riess has played, together with colleagues such as Saul Perlmutter. But in recent years, as the error bars have shrunk, the gap between the HL constant determined from the fossil radiation and the one determined from supernovae has widened. The two values differ, and we still don't know exactly why.
Futura had already devoted a long article to the now-famous tension surrounding the HL law, on the occasion of Futura's 20th anniversary and an editorial by Françoise Combes.
One way to see things more clearly is to try to make the methods that use supernovae to determine the value of the Hubble-Lemaître constant even more precise. This is what Adam Riess set out to do with the Hubble telescope and its observations of the famous variable stars called Cepheids.
Adam Riess and his colleagues have used this strategy again, but this time with the sharper vision, particularly in the near-infrared, of the James Webb Telescope, as explained in a NASA press release and confirmed by an article in the Astrophysical Journal that can be read freely on arXiv.
The James Webb observations confirm and refine those of Hubble, and the immediate conclusion is that the conflict with the fossil-radiation measurements is reinforced.
In NASA's press release, Riess comments, for his part, on these new results: “This could indicate the presence of exotic dark energy, exotic dark matter, a revision of our understanding of gravity, or the manifestation of a unified theory of particles and fields. The most trivial explanation would be multiple measurement errors all pointing in the same direction (astronomers have ruled out a single error by using independent methods), which is why it was so important to redo the measurements with greater precision. With Webb confirming Hubble's measurements, the Webb measurements provide the strongest evidence yet that systematic errors in Hubble's Cepheid photometry do not play a significant role in the current tension. This leaves the more interesting possibilities on the table, and the mystery has deepened.”
Cepheids, a key to measuring cosmic distances
Still in the NASA press release, Adam Riess gives more details about what was done. He first recalls that Cepheids are variable stars for which, thanks to distance measurements made in the Milky Way using the parallax method, a connection has been established between the period of variation of their luminosity and their intrinsic luminosity. They can therefore also be used as standard candles to determine the distances of the nearest galaxies, distances that, once known, make it possible to calibrate the distance estimates of SN Ia supernovae and ultimately to calibrate the Hubble law over distances of not just a few million but several billion light-years.
The first problem is that these supergiant stars, the Cepheids, appear particularly faint beyond a hundred million light-years, so finding them requires high-resolution instruments. In addition, the dust and matter lying between these stars and terrestrial observers make their apparent brightness lower than it really is.
The James Webb telescope is less affected by these problems, which Hubble has already run into, because it has higher resolution and, above all, because dust clouds are partially transparent in the infrared range accessible to the JWST; the distortion of the apparent luminosity of the Cepheids is therefore smaller with James Webb than with Hubble.
Riess and his colleagues therefore focused on a more precise calibration of the period-luminosity relation of the Cepheids by studying 320 of them in the galaxy NGC 4258. This then enabled a more precise calibration of SN Ia supernovae in nearby galaxies hosting Cepheids.
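To illustrate the chaining that such a calibration performs, without reproducing the team's actual numbers, here is a schematic Python sketch: a Cepheid-based distance to a hypothetical nearby SN Ia host galaxy fixes the supernova's absolute magnitude, which is then reused to obtain the distance of a much more remote supernova.

```python
import math

# Schematic of the distance-ladder chaining described above.
# Every number here is hypothetical, chosen only to show the arithmetic.

def absolute_magnitude(apparent_mag: float, distance_mpc: float) -> float:
    """Absolute magnitude from M = m - 5 log10(d / 10 pc)."""
    d_parsecs = distance_mpc * 1.0e6
    return apparent_mag - 5.0 * math.log10(d_parsecs / 10.0)

def distance_mpc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance (Mpc) from the same distance-modulus relation, inverted."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0) / 1.0e6

if __name__ == "__main__":
    # Step 1: a Cepheid-based distance to a hypothetical nearby SN Ia host.
    host_distance_mpc = 20.0      # assumed, e.g. from the Cepheid ladder
    sn_apparent_in_host = 12.2    # assumed apparent peak magnitude

    m_abs_snia = absolute_magnitude(sn_apparent_in_host, host_distance_mpc)
    print(f"Calibrated SN Ia absolute magnitude: {m_abs_snia:.2f}")

    # Step 2: reuse that calibration for a much more distant supernova.
    far_sn_apparent = 19.0        # assumed apparent magnitude in the Hubble flow
    print(f"Distance of the far supernova: "
          f"{distance_mpc(far_sn_apparent, m_abs_snia):.0f} Mpc")
```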
Did you know?
At the beginning of the last century, despite the visionary arguments of Wright and Kant, most astronomers believed that galaxies were just peculiar objects within our own Milky Way. Everything was about to change when, in 1912, Henrietta Leavitt discovered a precise mathematical relationship linking the luminosity of certain variable stars she had found in the two Magellanic Clouds, the Cepheids, to their pulsation period.
We now know that Cepheids are Population I giant stars in the process of fusing helium into carbon in their cores. The star itself is therefore enriched in helium. As the star's temperature rises, the helium in its upper layers ionizes, which increases the star's opacity. The radiation pressure then builds up until it counteracts the gravitational forces, and the star expands, becoming brighter as its surface area increases. This expansion causes its temperature to drop, and the helium ions eventually recapture electrons. Since neutral helium is less opaque, the radiation pressure falls and the star's gravity makes it contract. Its surface area, and hence its luminosity, decreases, and the star is at the start of a new pulsation cycle.
Cepheids are four to fifteen times more massive than the Sun and are particularly bright, 100 to 300,000 times brighter than our star. The relationship found by Henrietta Leavitt provides a powerful way to determine the distances of galaxies containing Cepheids. Indeed, the precise relationship between luminosity and pulsation period gives an estimate of the absolute brightness of these stars. By comparing a star's apparent brightness with the absolute brightness deduced from the Leavitt relation, we can therefore estimate its distance. It is the same principle that lets us judge the distance of a candle from its apparent brightness: the farther away the candle, the fainter it appears.
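A back-of-the-envelope version of that reasoning can be written in a few lines of Python. The period-luminosity coefficients below are rough, illustrative values for classical Cepheids in the visible band (not the calibration used by Riess's team), and the sample period and apparent magnitude are invented.

```python
import math

# Rough sketch of the Leavitt (period-luminosity) relation used as a
# standard candle. The coefficients are approximate, illustrative values
# for classical Cepheids in the visible band.

def cepheid_absolute_magnitude(period_days: float) -> float:
    """Approximate absolute magnitude M_V of a classical Cepheid."""
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_parsecs(apparent_mag: float, absolute_mag: float) -> float:
    """Distance from the distance modulus m - M = 5 log10(d / 10 pc)."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

if __name__ == "__main__":
    # Hypothetical Cepheid: 30-day pulsation period, apparent magnitude 22.
    period, m_app = 30.0, 22.0
    m_abs = cepheid_absolute_magnitude(period)
    d_pc = distance_parsecs(m_app, m_abs)
    print(f"M ~ {m_abs:.2f}, d ~ {d_pc / 1e6:.1f} Mpc")
```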
Using Henrietta Leavitt's relationship, Hubble demonstrated in 1923 that the Andromeda Galaxy lies more than a million light-years away (this distance is now estimated at at least 2.4 million light-years). Given its apparent size, it must therefore be comparable in size to the Milky Way. The Kant-Wright realm of galaxies and island universes was now forcing itself upon humanity.
Henrietta Leavitt's relationship is calibrated using Cepheid distances determined, for example, by parallax, and is therefore not error-free. It in turn serves to calibrate the Hubble law, at the cost of additional uncertainty. Astronomers thus have at their disposal a ladder of distances determined by a series of tools operating at ever larger scales. Because errors propagate, distance estimates become less accurate as we probe deeper into the observable universe. In particular, beyond a hundred million light-years, Cepheids become too faint to be easily used: their light is drowned in that of the galaxies observed.