In the first working week of the new year, it is time to look back and recall the successes of 2019. The past year was marked by technological breakthroughs and new scientific puzzles. Let's take a closer look at the most interesting results.
Black hole portrait
In the spring, the Event Horizon Telescope collaboration presented the world with the first image of the supermassive black hole in the neighboring galaxy M87. This was the culmination of gigantic work: for several days, eight radio observatories around the world, including one in Antarctica, simultaneously observed the black hole. The data were processed on clusters at MIT and MPIfR, and they were delivered on hard drives – transferring volumes on the order of petabytes over the Internet from remote observatories (especially from Antarctica) is unrealistic. A few more months were spent on image processing and reconstruction. A good account of the experiment's details can be found, for example, on Elements (one, two) or on N + 1.
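To get a feel for why hard drives beat the network here, a back-of-the-envelope estimate (the ~5 PB figure is the commonly cited size of the 2017 campaign's raw data; the link speeds are purely illustrative assumptions, not actual observatory uplinks):

```python
# Rough estimate: shipping EHT data over the network vs. on hard drives.
# ~5 PB is the commonly cited volume of the 2017 campaign (assumption);
# the link speeds below are illustrative.

data_bits = 5e15 * 8                      # ~5 petabytes, in bits

for name, bps in [("1 Gbps fiber", 1e9),
                  ("10 Mbps satellite link", 1e7)]:
    seconds = data_bits / bps
    print(f"{name}: {seconds / 86400 / 365:.1f} years of continuous transfer")

# 1 Gbps: ~1.3 years; 10 Mbps (closer to Antarctic reality): ~127 years.
# Flying the drives out once the polar winter ends is simply faster.
```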
In the image itself, we see the accretion disk – heated matter spiraling inward before falling into the black hole. The spot in the center is not the black hole itself but rather its shadow: light passing near the black hole is bent by gravitational lensing, so the shadow is several times larger than the black hole's event horizon.
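How much larger? A minimal sketch, assuming the published mass (~6.5 billion solar masses) and distance (~16.8 Mpc) of M87* and an idealized non-rotating (Schwarzschild) black hole:

```python
import numpy as np

G, c = 6.674e-11, 2.998e8                 # SI units
M_sun, Mpc = 1.989e30, 3.086e22

M = 6.5e9 * M_sun                         # published mass of M87*
D = 16.8 * Mpc                            # published distance

r_g = G * M / c**2                        # gravitational radius
r_s = 2 * r_g                             # event-horizon (Schwarzschild) radius

# For a Schwarzschild black hole, photons with impact parameter below
# sqrt(27) * r_g are captured, so the shadow diameter is 2 * sqrt(27) * r_g.
shadow = 2 * np.sqrt(27) * r_g
print(f"shadow / event-horizon diameter = {shadow / (2 * r_s):.1f}")   # ~2.6

microarcsec = np.degrees(shadow / D) * 3600e6
print(f"angular size = {microarcsec:.0f} microarcseconds")  # ~40; EHT saw ~42
```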
The accretion disk of this black hole faces us nearly face-on. If it were turned edge-on to us (like the rings of Saturn), we would see something resembling the black hole from Interstellar. Unfortunately, astronomy is a purely observational science: we have not the slightest chance of influencing these grandiose processes, or even of looking at them from a different angle, given the vast distances involved.
It remains to add that this image alone does not prove the existence of black holes – we are confident of that thanks to a mass of other results. Its value lies rather in confirming our ideas about what happens in the nuclei of galaxies – and, of course, in the creation of a huge international collaboration capable of systematic observations with a telescope the size of the Earth. In the near future, the network should incorporate new, shorter-wavelength telescopes and study the dynamics of the processes around the black holes at the centers of M87 and the Milky Way.
Hubble Constant Problems
Our Universe is expanding: the distances between neighboring galaxies are constantly increasing, and the rate of this expansion is set by the Hubble constant. Measuring it accurately is one of the most important tasks in cosmology, and a very difficult one. Until recently, everything converged on a value of about 70 km/s per megaparsec. Good accuracy was achieved only the year before last: an analysis of data from the Planck satellite, which measured the anisotropy of the cosmic microwave background, yielded a Hubble constant of 67.4 ± 0.5 km/s/Mpc. The Dark Energy Survey collaboration, which studied fluctuations in the density of matter in the Universe with a network of optical telescopes, obtained the same result.
But 2019 brought surprises. Several groups that had accumulated long-term statistics on various objects – quasars, Cepheids, cosmic masers – converged on a value of about 74 km/s/Mpc (blue dots on the graph). Unlike the earlier results, all of these works measured distances to present-day objects directly, whereas the CMB anisotropy and the matter density fluctuations reflect what happened at the dawn of the Universe. As a result, we have a difference of more than four standard deviations between the values for the early and the present-day Universe, which is intriguing, to say the least, and fuels discussions about new physics.
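The four-sigma figure is easy to check, assuming the published Planck value (67.4 ± 0.5) and a representative local value of about 74.0 ± 1.4 km/s/Mpc (the exact local number varies slightly between groups):

```python
import math

# Tension between two independent measurements, in standard deviations:
# assume Gaussian errors and combine them in quadrature.
h_early, s_early = 67.4, 0.5   # Planck (CMB, early Universe)
h_late,  s_late  = 74.0, 1.4   # representative local-distance value (assumption)

tension = (h_late - h_early) / math.hypot(s_early, s_late)
print(f"tension: {tension:.1f} sigma")   # ~4.4 sigma
```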
There are plenty of contentious points here: for example, the distances to many of the objects were calibrated against the same standard candles, so the measurements cannot be considered independent. The cherry on top is a measurement based on red giants (red dot): it gives a compromise value of 69.8 km/s/Mpc, but, ironically, the distance calibration for these red giants is even less certain. There is now quite an active debate in the community on this topic, and the reason for the discrepancy remains unclear. One would like to believe that the paradox will begin to be resolved in the near future.
Proton radius
Something similar is happening in the microworld: measurements of the size of the proton (more precisely, of its charge radius) give differing results. And the discrepancies here are even more significant.
In general, there are two simple ways to measure the radius of a proton:
- Bombard protons with electrons: the closer an electron passes to the proton, the more strongly its path is bent. From the scattering pattern, one can reconstruct the radius within which the proton's charge is concentrated.
- Hydrogen spectroscopy. The hydrogen nucleus is a proton, and its size affects the energy levels available to the electron. By measuring the energies of two levels simultaneously, one can calculate the radius of the nucleus.
Both methods gave the same result: about 0.875 femtometers. In 2010, a team at MPQ replaced the electron in the hydrogen atom with a muon – a heavier elementary particle with otherwise similar properties. The heavy muon orbits much closer to the proton, so the proton's radius has a stronger effect on its energy levels. The measured value was unexpectedly smaller: 0.841 fm. The measurements were repeated in 2013 with the same outcome.
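A minimal sketch of why the muon is so much more sensitive to the proton's size, using the textbook Bohr-model radius with the reduced mass of the orbiting particle:

```python
# Bohr radius a = hbar / (mu * c * alpha), with reduced mass
# mu = m * m_p / (m + m_p) for the two-body system.

hbar, c, alpha = 1.055e-34, 2.998e8, 1 / 137.036
m_e = 9.109e-31                     # electron mass, kg
m_mu, m_p = 206.77 * m_e, 1836.15 * m_e

def bohr_radius(m):
    mu = m * m_p / (m + m_p)        # reduced mass of particle + proton
    return hbar / (mu * c * alpha)

a_e, a_mu = bohr_radius(m_e), bohr_radius(m_mu)
print(f"electronic hydrogen: {a_e:.2e} m")     # ~5.3e-11 m
print(f"muonic hydrogen:     {a_mu:.2e} m")    # ~2.8e-13 m, ~186x closer

# Sensitivity to the proton's finite size scales roughly as the probability
# of finding the orbiting particle inside the proton, i.e. as 1 / a**3:
print(f"sensitivity gain: ~{(a_e / a_mu)**3:.1e}")   # ~6e6
```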
While the whole world was wondering why muonic hydrogen behaves so differently, and whether any new physics is hiding here, MPQ decided to repeat the experiment with ordinary hydrogen – and again obtained the smaller proton radius! A year later, in 2018, spectroscopy of other levels in ordinary hydrogen was performed in Paris… and the old radius value came out! At that point the discussion shifted toward hunting for mundane errors, down to accounting for the difference in altitude between the two laboratories: precision spectroscopy is essentially a comparison against a well-known frequency/time standard, and according to general relativity, time flows slightly differently in Paris and Munich because of their different distances from the center of the Earth.
The past year brought two more experiments, this time from another continent. First, a group in Toronto repeated the hydrogen spectroscopy experiment and obtained the same result as MPQ. Soon afterwards, this was confirmed by an electron–proton scattering experiment from an American collaboration. In parallel, the MPQ group began exactly the same experiment the French had carried out in 2018 – a test of reproducibility unprecedented in modern science! There are already preliminary results, but the authors have not yet disclosed them, teasing only that they will be interesting. The reason for the discrepancy is still unknown, but it seems everything will become clear in the near future.
Quantum supremacy
In the fall, Nature published an article in which the Google team demonstrated quantum supremacy. Their 53-qubit quantum chip Sycamore solved a specific problem in 200 seconds that, by their estimate, would take a classical supercomputer 10,000 years.
The task itself, on which the result was demonstrated, turned out to be rather artificial. A quantum computer differs from an ordinary one in that it can, ahem, perform quantum operations inaccessible to classical computers (thanks, Cap!). So in the experiment, the quantum chip executed a random sequence of quantum operations, while a classical computer simulated the same sequence.
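A sketch of why such a simulation is hard for the classical side: a brute-force statevector simulator must store 2^n complex amplitudes. The toy below applies only random single-qubit gates (real supremacy circuits also interleave entangling two-qubit gates, which is what defeats cheaper shortcuts), but the memory wall is already visible:

```python
import numpy as np

def random_single_qubit_gate(rng):
    """Haar-random 2x2 unitary (a stand-in for Sycamore's actual gate set)."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_gate(state, gate, target, n):
    """Contract a 2x2 gate into qubit `target` of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

rng = np.random.default_rng(0)
n, depth = 10, 20                       # 10 qubits: trivial on a laptop
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                          # start in |00...0>
for _ in range(depth):
    for q in range(n):
        state = apply_gate(state, random_single_qubit_gate(rng), q, n)

print(f"{n} qubits: statevector takes {state.nbytes / 1e6:.2f} MB")
# Memory grows as 2**n: the statevector for Sycamore's 53 qubits alone needs
print(f"53 qubits: {2**53 * 16 / 1e15:.0f} PB of RAM")   # ~144 petabytes
```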
A serious discussion has unfolded around the result. For example, IBM researchers argue that an optimized classical algorithm would solve the problem not in thousands of years but in a couple of days. The issue of error correction is even more acute: quantum memory is so fragile that software error correction does not help here, while the known hardware correction mechanisms complicate the architecture of quantum chips by orders of magnitude. And scaling quantum chips from dozens of qubits to at least hundreds remains far beyond what is currently achievable. Google's result is therefore rather ambiguous: yes, we have stepped onto the threshold of the quantum era, but how far forward we can go – and whether we can go at all – remains unknown.
Squeezed Light for LIGO
Everyone has heard about the recent discovery of gravitational waves and the Nobel Prize that followed in 2017. There are now three sufficiently sensitive gravitational-wave observatories in the world: the two LIGO detectors in the United States and VIRGO in Italy. These are incredibly precise laser interferometers: to reach the current sensitivity, enormous effort went into measuring and suppressing noise of the most varied origins.
Today, the main noise source is the quantum shot noise of the light (the lilac curve in the detectors' noise budget): it arises because the laser emits photons at random moments. This noise can be tackled with squeezed light – light with engineered quantum correlations that redistribute noise between the beam's phase and amplitude, suppressing the quadrature that masks the signal at the cost of increasing the other. The technique had already been tested on the German GEO600 interferometer, and last year it was finally put into operation at both LIGO and VIRGO. Apparently, this is the first application of squeezed light to a practical problem. The sensitivity of the detectors will now increase appreciably (up to a factor of two in some frequency bands), and we can hope to hear more interesting phenomena from the far corners of the Universe.
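A minimal numerical sketch of both effects: Poisson photon statistics give the sqrt(N) shot noise, and a squeezing parameter r trades variance between the two quadratures (the ~6 dB level below is an illustrative value chosen to match the "up to two times" figure, not a quoted LIGO specification):

```python
import numpy as np

rng = np.random.default_rng(1)

# Shot noise: the laser emits photons at random (Poisson) moments, so a
# measurement that collects N photons on average fluctuates by sqrt(N).
for N in (1e4, 1e6, 1e8):
    counts = rng.poisson(N, size=100_000)
    print(f"N = {N:.0e}: relative noise {counts.std() / counts.mean():.1e},"
          f" expected 1/sqrt(N) = {1 / np.sqrt(N):.1e}")

# Squeezing with parameter r suppresses the variance of one quadrature by
# exp(-2r) while inflating the conjugate one by exp(+2r); their product is
# fixed by the uncertainty relation. r = ln 2 (~6 dB, illustrative) halves
# the noise amplitude in the useful quadrature.
r = np.log(2)
print(f"useful quadrature noise: x{np.exp(-r):.2f},"
      f" conjugate quadrature: x{np.exp(r):.2f}")
```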
This is also a result with a special connection to Habr – for it we should thank Mikhail Shkaff, who works directly on this topic and has written many interesting articles about LIGO and more. Thank you, and further successes!
Neutrino mass limit
Neutrinos remain among the most mysterious elementary particles: they hardly interact with matter at all and can pass right through the Earth. We know that they have at least some mass thanks to neutrino oscillations: on the way from the Sun to us, some neutrinos turn into neutrinos of a different type.
This transformation is a dynamic process, which means that time passes in the neutrino's reference frame – that is, neutrinos fly slower than the speed of light, which is only possible if they have mass.
Measuring this mass is much harder. Its lower limit – about 9 meV – is known from neutrino oscillations. The KATRIN project in Karlsruhe, Germany, set out to measure the upper limit. The idea is to observe the radioactive decay of tritium into helium-3, an electron, and an antineutrino: the latter is impossible to detect, but one can measure the energies of the remaining particles and infer the missing energy. In practice, it is easiest to work with the electrons: those with the highest energies come from decays in which the neutrino carried away almost nothing beyond its rest energy. Such events are rare, so the detector has to be carefully optimized for electrons of a specific energy.
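Just how rare can be seen from a toy model of the beta spectrum near the endpoint, keeping only the phase-space factor (the endpoint value and the 1 eV window are illustrative; KATRIN's real analysis is far more involved):

```python
import numpy as np

# Simplified tritium beta spectrum near the endpoint Q ~ 18.6 keV:
# dN/dE ~ (Q - E) * sqrt((Q - E)**2 - m**2), phase space only.

Q = 18_574.0          # endpoint energy in eV (approximate)

def spectrum(E, m):
    eps = Q - E       # energy left over for the neutrino
    return np.where(eps >= m, eps * np.sqrt(np.maximum(eps**2 - m**2, 0.0)), 0.0)

E = np.linspace(0.0, Q, 1_000_000)
weights = spectrum(E, 0.0)

# Only decays within ~1 eV of the endpoint are sensitive to the neutrino
# mass, and they are a vanishing fraction of the total:
frac = weights[E > Q - 1.0].sum() / weights.sum()
print(f"fraction of decays in the last 1 eV: {frac:.1e}")   # ~1.6e-13

# A nonzero mass m cuts the spectrum off at E = Q - m:
m = 1.1               # eV, KATRIN's 2019 upper limit
print(f"with m = {m} eV the spectrum ends at {Q - m:.1f} eV instead of {Q:.1f}")
```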
For this reason, KATRIN took a long time to prepare, but it delivered its first result after just a month of operation: an upper limit on the neutrino mass of 1.1 eV, a twofold improvement on the previous estimate. KATRIN is expected to keep accumulating statistics for another five years, improving the sensitivity to 0.2 eV. More advanced experiments based on the same idea could push the sensitivity down to 40 meV.
Instead of a conclusion
In my opinion, the past year turned out to be very social: the achievements it will be remembered for came from the joint efforts of many groups – and the new puzzles from the discrepancies between them. Teamwork in science, from tabletop experiments to international collaborations, is becoming ever more important for achieving meaningful results. I hope we will do everything to make our work even more productive, and that the coming year's results will be no less interesting.