US experiments find hint of a break in the laws of physics

At 9 pm India time on April 7, physicists at an American research facility delivered a shot in the arm to efforts to find flaws in a powerful theory that explains how the building blocks of the universe work.

Physicists are looking for flaws in it because the theory doesn’t have answers to some questions – like “what is dark matter?”. They hope to find a crack or a hole that might reveal the presence of a deeper, more powerful theory of physics that can lay unsolved problems to rest.

The story begins in 2001, when physicists performing an experiment at Brookhaven National Laboratory, New York, found that fundamental particles called muons weren’t behaving the way they were supposed to in the presence of a magnetic field. This was called the g-2 anomaly (after a number called the gyromagnetic factor).

An incomplete model

Muons are subatomic particles and can’t be seen with the naked eye, so it could’ve been that the instruments the physicists were using to study them indirectly were glitching. Or it could’ve been that the physicists had made a mistake in their calculations. Or, finally, what the physicists thought they knew about the behaviour of muons in a magnetic field was wrong.

In most stories we hear about scientists, the first two possibilities are true more often: they didn’t do something right, so the results weren’t what they expected. But in this case, the physicists were hoping they were wrong. This unusual wish was the product of working with the Standard Model of particle physics.

According to physicist Paul Kyberd, the fundamental particles in the universe “are classified in the Standard Model of particle physics, which theorises how the basic building blocks of matter interact, governed by fundamental forces.” The Standard Model has successfully predicted the numerous properties and behaviours of these particles. However, it’s also been clearly wrong about some things. For example, Kyberd has written:

When we collide two fundamental particles together, a number of outcomes are possible. Our theory allows us to calculate the probability that any particular outcome can occur, but at energies beyond which we have so far achieved, it predicts that some of these outcomes occur with a probability of greater than 100% – clearly nonsense.

The Standard Model also can’t explain what dark matter is, what dark energy could be or if gravity has a corresponding fundamental particle. It predicted the existence of the Higgs boson but was off about the particle’s mass by a factor of 100 quadrillion.

All these issues together imply that the Standard Model is incomplete, that it could be just one piece of a much larger ‘super-theory’ that works with more particles and forces than we currently know. To look for these theories, physicists have taken two broad approaches: to look for something new, and to find a mistake with something old.

For the former, physicists use particle accelerators, colliders and sophisticated detectors to look for heavier particles thought to exist at higher energies, and whose discovery would prove the existence of a physics beyond the Standard Model. For the latter, physicists take some prediction the Standard Model has made with a great degree of accuracy and test it rigorously to see if it holds up. Studies of muons in a magnetic field are examples of this.

According to the Standard Model, a number associated with the way a muon swivels in a magnetic field – its anomalous magnetic moment, defined as half the excess of its g-factor over 2 – should equal 0.00116591804 (with some give or take). This minuscule quantity is the handiwork of fleeting quantum effects in the muon’s immediate neighbourhood, which make it wobble. (For a glimpse of how hard these calculations can be, see this description.)
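In symbols (the standard convention, supplied here for reference):

$$ a_\mu \equiv \frac{g-2}{2}, \qquad a_\mu^{\text{SM}} \approx 0.00116591804 $$

The ‘g-2’ in the experiments’ names is literally this quantity, and the deviations quoted below are differences in a_μ.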

Fermilab result

In the early 2000s, the Brookhaven experiment measured this number to be slightly higher than the model’s prediction. Though the difference was small – about 0.00000000346 – the context made it a big deal. Scientists know that the Standard Model has a habit of being really right, so when it’s wrong, the wrongness becomes very important. And because we already know the model is wrong about other things, there’s a possibility that the two things could be linked. It’s a potential portal into ‘new physics’.

“It’s a very high-precision measurement – the value is unequivocal. But the Standard Model itself is unequivocal,” Thomas Kirk, an associate lab director at Brookhaven, had told Science in 2001. The disagreement between the values implied “that there must be physics beyond the Standard Model.”

This is why the results physicists announced today are important.

The Brookhaven experiment that ascertained the g-2 anomaly wasn’t sensitive enough to say with a meaningful amount of confidence that its measurement was really different from the Standard Model prediction – or whether the two values could still overlap.

Science writer Brianna Barbu has likened the mystery to “a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was – and still is – whether the presence of the hair is just a coincidence, or whether it is actually an important clue.”

So to go from ‘maybe’ to ‘definitely’, physicists shipped the 50-foot-wide, 15-tonne magnet that the Brookhaven facility used in its Muon g-2 experiment to Fermilab, the US’s premier high-energy physics research facility in Illinois, and built a more sensitive experiment there.

The new result comes from tests at this facility: the observed value differs from the Standard Model’s prediction by 0.00000000251 (give or take a bit).

The Fermilab results are expected to improve considerably in the coming years, but even now they represent an important contribution. On its own, the Brookhaven result fell short of the statistical standard particle physicists demand of a discovery; the combined significance of the two results, at 4.2 sigma, is well above the three-sigma bar for claiming evidence of a new effect, though still below the five-sigma bar for a discovery.
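To see how the two experiments’ numbers are pooled, here is a minimal sketch of the standard inverse-variance combination. The inputs are a_μ values and one-sigma uncertainties in units of 10⁻¹¹, quoted from memory of the April 2021 papers – treat the exact digits as illustrative assumptions rather than the article’s data:

```python
import math

# Inverse-variance combination of two independent measurements, compared
# against a theoretical prediction. Values in units of 1e-11, quoted from
# memory of the April 2021 papers -- treat as illustrative assumptions.
bnl_val,  bnl_err  = 116592089, 63   # Brookhaven a_mu and 1-sigma error (assumed)
fnal_val, fnal_err = 116592040, 54   # Fermilab a_mu and 1-sigma error (assumed)
sm_val,   sm_err   = 116591810, 43   # Standard Model prediction (assumed)

# Weight each measurement by the inverse of its variance.
w_bnl, w_fnal = 1 / bnl_err**2, 1 / fnal_err**2
combined = (w_bnl * bnl_val + w_fnal * fnal_val) / (w_bnl + w_fnal)
combined_err = math.sqrt(1 / (w_bnl + w_fnal))

# Significance of the deviation from the prediction, in standard deviations.
sigma = (combined - sm_val) / math.sqrt(combined_err**2 + sm_err**2)
print(f"combined a_mu = {combined:.0f} (+/- {combined_err:.0f}) x 1e-11")
print(f"deviation from theory: {sigma:.1f} sigma")   # ~4.2 sigma
```

With these inputs the script reproduces the roughly 4.2-sigma figure announced in 2021.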

Potential dampener

So for now, the g-2 anomaly seems to be real. It’s not easy to say if it will continue to be real as physicists further upgrade the Fermilab g-2’s performance.

In fact, there appears to be another potential dampener on the horizon. An independent group of physicists published a paper today arguing that the measured g-2 value is actually in line with the Standard Model’s prediction and that there’s no deviation at all.

This group, called BMW, used a different method to calculate the Standard Model’s value of the number in question – a first-principles numerical technique known as lattice QCD. Aida El-Khadra, a theoretical physicist at the University of Illinois, told Quanta that the Fermilab team had yet to check BMW’s approach, but if it was found to be valid, the team would “integrate it into its next assessment”.

The ‘Fermilab approach’ itself is one physicists have worked with and refined for many decades, so it’s unlikely to be wrong. But if the BMW approach checks out, Quanta reports, the mere fact that two methods lead to different predictions of the same number would itself become a new mystery.

But physicists are excited for now. “It’s almost the best possible case scenario for speculators like us,” Gordan Krnjaic, a theoretical physicist at Fermilab who wasn’t involved in the research, told Scientific American. “I’m thinking much more that it’s possibly new physics, and it has implications for future experiments and for possible connections to dark matter.”

The current result is also important because the other way to look for physics beyond the Standard Model – by looking for heavier or rarer particles – can be harder.

This isn’t simply a matter of building a larger particle collider, powering it up, smashing particles and looking for other particles in the debris. For one, there is a very large number of energy levels at which a particle might form. For another, there are thousands of other particle interactions happening at the same time, generating a tremendous amount of noise. So without knowing what to look for and where, a particle hunt can be like looking for a very small needle in a very large haystack.

The ‘what’ and ‘where’ instead come from the different theories physicists have worked out based on what we already know; experiments are then designed depending on which theory needs testing.

Into the hospital

One popular theory is called supersymmetry: it predicts that every elementary particle in the Standard Model framework has a heavier partner particle, called a supersymmetric partner. It also predicts the energy ranges in which these particles might be found. The Large Hadron Collider (LHC) at CERN, near Geneva, was powerful enough to access some of these energies, so physicists went looking with it over the last decade. They didn’t find anything.

A table showing searches for particles associated with different post-standard-model theories (orange labels on the left). The bars show the energy levels up to which the ATLAS detector at the Large Hadron Collider has not found the particles. Table: ATLAS Collaboration/CERN

Other groups of physicists have also tried to look for rarer particles: ones that occur at an accessible energy but only once in a very large number of collisions. The LHC is a machine at the energy frontier: it probes higher and higher energies. To look for extremely rare particles, physicists instead explore the intensity frontier – using machines specialised in generating very large numbers of collisions.

The third and last is the cosmic frontier, in which scientists look for unusual particles coming from outer space. For example, early last month, researchers reported that they had detected an energetic anti-neutrino (a kind of fundamental particle) coming from outside the Milky Way participating in a rare event that scientists predicted in 1959 would occur if the Standard Model is right. The discovery, in effect, further cemented the validity of the Standard Model and ruled out one potential avenue to find ‘new physics’.

This event also recalls an interesting difference between the 2001 and 2021 announcements. The late British scientist Francis J.M. Farley wrote in 2001, after the Brookhaven result:

… the new muon (g-2) result from Brookhaven cannot at present be explained by the established theory. A more accurate measurement … should be available by the end of the year. Meanwhile theorists are looking for flaws in the argument and more measurements … are underway. If all this fails, supersymmetry can explain the data, but we would need other experiments to show that the postulated particles can exist in the real world, as well as in the evanescent quantum soup around the muon.

Since then, the LHC and other physics experiments have sent supersymmetry ‘to the hospital’ on more than one occasion. If the anomaly continues to hold up, scientists will have to find other explanations. Or, if the anomaly whimpers out, like so many others of our time, we’ll just have to put up with the Standard Model.

Featured image: A storage-ring magnet at Fermilab whose geometry allows for a very uniform magnetic field to be established in the ring. Credit: Glukicov/Wikimedia Commons, CC BY-SA 4.0.

The Wire Science
April 8, 2021

Good writing is an atom

https://twitter.com/HochTwit/status/1174875013708746752

The act of writing well is like an atom, or the universe. There is matter but it is thinly distributed, with lots of empty space in between. Removing this seeming nothingness won’t help, however. Its presence is necessary for things to remain the way they are and work just as well. Similarly, writing is not simply the deployment of words. There is often the need to stop mid-word and take stock of what you have composed thus far and what the best way to proceed could be, even as you remain mindful of the elegance of the sentence you are currently constructing and its appropriate situation in the overarching narrative. In the end, there will be lots of words to show for your effort but you will have spent even more time thinking about what you were doing and how you were doing it. Good writing, like the internal configuration of a set of protons, neutrons and electrons, is – physically speaking – very little about the labels attached to describe them. And good writing, like the vacuum energy of empty space, acquires its breadth and timelessness because it encompasses a lot of things that one cannot directly see.

A new LHC: 10 things to look out for

Through an extraordinary routine, the most powerful machine built by humankind is slowly but surely gearing up for its relaunch in March 2015. The Large Hadron Collider (LHC), straddling the France-Switzerland border, will reawaken after two years of upgrades and fixes to smash protons at nearly twice the energy it did during its first run, which ended in early 2013. Here are 10 things to look out for: five upgrades and five possible exciting discoveries.

Technical advancements

  1. Higher collision energy – In its previous run, each beam of protons destined for collision with other beams was accelerated to 3.5-4 TeV. By May 2015, each beam will be accelerated to 6.5-7 TeV. By nearly doubling the collision energy, scientists hope to be able to observe higher-energy phenomena, such as the production of heavier, more elusive particles.
  2. Higher collision frequency – Each beam has bunches of protons that are collided with oncoming bunches at a fixed frequency. During the previous run, bunches crossed once every 50 nanoseconds. In the new run, they will cross twice as often – once every 25 nanoseconds. With more collisions happening per unit time, rarer phenomena will occur more frequently and become easier to spot.
  3. Higher instantaneous luminosity – This is a measure of how many collisions the machine can deliver per unit area per second. It will be increased by 10 times, to 1 × 10³⁴ per cm² per second; by 2022, engineers will aim to increase it to 7.73 × 10³⁴ per cm² per second. (The sketch after this list shows why this number matters.)
  4. New pixel sensors – An extra layer of pixel sensors, to handle the higher luminosity regime, will be added around the beam pipe within the ATLAS and CMS detectors. While the CMS was built with higher luminosities in mind, ATLAS wasn’t, and its pixel sensors are expected to wear out within a year. As an intermediate solution, a temporary layer of sensors will be added to last until 2018.
  5. New neutron shields – Because of the doubled collision energy and frequency, instruments could be damaged by high-energy neutrons flying out of the beam pipe. To prevent this, advanced neutron shields will be screwed on around the pipe.
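To see why luminosity matters so much, here is a minimal sketch of the standard rate formula: the number of events per second for a process equals the instantaneous luminosity times that process’s cross-section. The cross-section below is an invented, illustrative figure, not an LHC measurement:

```python
# Minimal sketch: expected event rate R = L * sigma, i.e. instantaneous
# luminosity (per cm^2 per second) times the process cross-section (in cm^2).
PICOBARN = 1e-36                      # 1 picobarn expressed in cm^2

luminosity = 1e34                     # planned luminosity, per cm^2 per second
cross_section = 50 * PICOBARN         # assumed cross-section of a hypothetical rare process

rate = luminosity * cross_section     # events per second
print(f"{rate:.2f} events/s, about {rate * 86400:.0f} per day")
# Raising the luminosity tenfold raises the event count tenfold, which is
# what brings very rare processes within reach.
```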

Research advancements

  1. Dark matter – The LHC is adept at finding previously unseen particles, both fundamental and composite. One area of physics desperately looking for a particle of its own is dark matter. It’s only natural for both quests to converge at the collider. A leading candidate particle for dark matter is the WIMP: the weakly interacting massive particle. If the LHC finds it, or finds something like it, it could be the next big thing after the Higgs boson, perhaps bigger.
  2. Dark energy – The universe is expanding at an accelerating pace, driven by a uniform field of energy pervading it throughout, called the dark energy field. The source of dark energy’s potential is thought to be the vacuum of space, where extremely short-lived particles continuously pop in and out of existence. The trouble is that the vacuum potential particle physics predicts is some 10¹²⁰ times larger than what observations of the universe’s expansion show it to be. At the LHC, the study of fundamental particles could drive a better understanding of what the vacuum actually holds and where dark energy’s potential comes from.
  3. Supersymmetry – The Standard Model of particle physics defines humankind’s understanding of the behavior of all known fundamental particles. However, some of their properties are puzzling. For example, some natural forces are too strong for no known reason; some particles are too light. To address this, physicists have a theory of particle interactions called supersymmetry, SUSY for short. SUSY predicts the existence of particles absent from the Model, called supersymmetric partners. These are heavy particles that could show themselves in the LHC’s new higher-energy regime. As with the dark matter WIMPs, finding a SUSY particle could be a Nobel Prize-winner.
  4. Higgs boson – One particle that’s too light in the Standard Model is the Higgs boson. As a result, physicists think it might not be the only Higgs boson out there. Perhaps there are others with the same properties that weigh less or more.
  5. Antimatter reactions – Among the class of particles called mesons, one – designated B0 – holds a clue to answering a question that has stymied astrophysicists for decades: why does the universe have more matter than antimatter if, when it first came into existence, there were equal amounts of both? An older result from the LHC shows the B0 meson decays into more matter particles than antimatter ones. Probing why this is so will be another prominent quest of the LHC’s.

Bonus: Extra dimensions – Many alternate theories of fundamental particles require the existence of extra dimensions. The way to look for them is to create extremely high energies and then look for particles that might pop into one of the three dimensions we occupy from another that we don’t.

EUCLID/ESA: A cosmic vision looking into the darkness

I spoke to Dr. Giuseppe Racca and Dr. Rene Laureijs, both of ESA, regarding the EUCLID mission, which will be the world’s first space telescope launched to study dark energy and dark matter. For ESA, EUCLID will be the centerpiece of its Cosmic Vision program (2015-2025). Dr. Racca is the mission’s project manager while Dr. Laureijs is a project scientist.

Could you explain, in simple terms, what the Lagrange point is, and how being able to study the universe from that vantage point could help the study? 

GR: Sun-Earth Lagrangian point 2 (SEL2) is a point in space about 1.5 million km from Earth in the direction opposite to the Sun, co-rotating with the Earth around the Sun. It is a nice and calm point from which to make observations. It is not disturbed by the heat fluxes from the Earth, but at the same time it is not too far away to send the large amount of observation data back to Earth. The orbit around SEL2 that Euclid will employ is rather large; it is easy to reach (in terms of launcher capability) and not expensive to control (in terms of the fuel required for orbit corrections and maintenance manoeuvres).
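(A back-of-the-envelope check, not part of the interview: the 1.5-million-km figure follows from the Hill-sphere approximation for the Sun-Earth system, with R the Sun-Earth distance and M⊕, M☉ the masses of the Earth and the Sun.)

$$ r \approx R\left(\frac{M_\oplus}{3M_\odot}\right)^{1/3} \approx 1.5\times10^{8}\ \text{km}\times\left(\frac{3\times10^{-6}}{3}\right)^{1/3} \approx 1.5\times10^{6}\ \text{km} $$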

Does Euclid in any way play into a broader program by ESA to delve into the Cosmic Frontier? Are there future upgrades/extensions planned? 

RL: Euclid is the second approved medium class mission of ESA’s Cosmic Vision programme. The first one is Solar Orbiter, which studies the Sun at short distance. The Cosmic Vision programme sets out a plan for Large, Medium and Small size missions in the decade 2015-2025. ESA’s missions Planck, which is presently in operation in L2, and Euclid will study the beginning, the evolution, and the predicted end of our Universe.

GR: A theme of this programme is: “How did the Universe originate and what is it made of?” Euclid is the first mission of this part of Cosmic Vision 2015-2025. There will be other missions, which have not been selected yet.

What’s NASA’s role in all of this? What are the different ways in which they will be participating in the Euclid mission? Is this a mission-specific commitment or, again, is it encompassed by a broader participation agreement?

GR: NASA’s participation in the Euclid mission is very important but rather limited in extent. NASA will provide the near-infrared detectors for one of the two Euclid instruments. In addition, it will contribute to the scientific investigation with a team of about 40 US scientists. Financially speaking, NASA’s contribution is limited to some 3-4% of the total Euclid mission cost.

RL: The Euclid Memorandum of Understanding between ESA and NASA is mission specific and does not involve a broader participation agreement. First of all, NASA will provide the detectors for the infrared instrument. Secondly, NASA will support 40 US scientists to participate in the scientific exploitation of the data. These US scientists will be part of the larger Euclid Consortium, which contains nearly 1000 mostly European scientists.

Do you have any goals in mind? Anything specific or exciting that you expect to find? Who gets the data?

GR: The goals of the Euclid mission are extremely exciting: in a few words, we want to investigate the nature and origin of the unseen Universe: dark matter, five times more abundant than the ordinary matter made of atoms, and dark energy, which is causing the accelerating expansion of the Universe. The “dark Universe” is reckoned today to amount to 95% of the total matter-energy density. Euclid will survey about 40% of the sky, looking back in cosmic time up to 10 billion years. A smaller part (1% of the sky) will look back to when the universe was only a few million years old. This three-dimensional survey will allow us to map the extent and history of dark matter and dark energy. The results of the mission will allow us to understand the nature of dark matter and its place in an extension of the current Standard Model. Concerning dark energy, we will be able to distinguish between the so-called “quintessence” and a modification to current theories of gravity, including general relativity.

RL: Euclid’s goals are to measure the accelerated expansion of the Universe, which tells us about dark energy; to determine the properties of gravity on cosmic scales; to learn about the properties of dark matter; and to refine the initial conditions that led to the Universe we see now. These goals have been chosen carefully, and the instrumentation of Euclid is optimised to reach them as well as possible. The Euclid data also opens the discovery space for many other areas in astronomy: Euclid will literally measure billions of stars and galaxies at visible and infrared wavelengths, with a very high image quality, comparable to that of the Hubble Space Telescope. The most exciting prospect is the availability of these sharp images, which will certainly reveal new classes of objects and new science. The nominal mission will last for six years, but the first year of data will become public 26 months after the start of the survey.

When will the EUCLID data be released?

GR: The Euclid data will be released to the public one year after their collection and will be made available to all researchers in the world.

Rubbernecking at redshifting

The interplay of energy and matter is simply wonderful because, given some intrinsic properties, the results of their encounters can be largely predicted. The presence of smoke indicates fire, the presence of shadows both darkness and light, the presence of winds a pressure gradient, the presence of mass a gravitational potential. And a special radiative extension of the last correlation gives rise to a phenomenon called gravitational redshift.

The wave-particle duality insists that electromagnetic radiation, if conceived as a stream of photons, can also be thought of as propagating as waves. All waves have two fundamental properties: wavelength and frequency. If a wave consists of a crest and a trough, the length of a crest-trough pair is its wavelength, and the number of wavelengths traversed by the wave in a second its frequency. Also, the energy contained in a wave is directly proportional to its frequency.
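In symbols (standard textbook relations, supplied here for reference), for light of wavelength λ and frequency ν, with c the speed of light and h Planck’s constant:

$$ c = \lambda\nu, \qquad E = h\nu = \frac{hc}{\lambda} $$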

A wave undergoes a gravitational redshift when it moves from a region of lower gravitational potential to a region of higher gravitational potential. Such a potential gradient is experienced when one moves away from a massive body, from regions of stronger to weaker gravitational pull (note the inverse variation). So when radiation, such as light, moves from the surface of a star toward a far-away observer, the light gets redshifted. The phenomenon was implicit in the Einstein field equations (EFE) of 1916, which describe Albert Einstein’s general theory of relativity (GR).
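Quantitatively – a standard GR result, not derived in this piece – light emitted at a radius R from a (non-rotating, uncharged) body of mass M and received far away is shifted by

$$ 1 + z = \left(1 - \frac{2GM}{Rc^{2}}\right)^{-1/2} \approx 1 + \frac{GM}{Rc^{2}} $$

For light leaving the surface of the Sun, GM/Rc² works out to about 2 × 10⁻⁶: a tiny but measurable reddening.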

When radiation gets redshifted, its frequency is reduced, shifting it toward the red portion of the electromagnetic spectrum – hence the name. Agreed, the phenomenon is counter-intuitive. Usually, when the leash on an escaping object is loosened, the object speeds up. In the case of a redshift, however, the frequency is lowered: the photons don’t slow down, they simply lose energy as they climb.

The real wonder lies in the predictive power of such physics. It doesn’t matter whence the mass and what the wave: falling into a gravitational well always blueshifts the radiation, and climbing out always redshifts it. Moreover, speaking from an application-oriented perspective, radiation reaching Earth from outer space will always have been redshifted along the way: the waves will have left the gravitational pull of some body behind on their journey toward Earth. So, given some radiation, its source, and thus the radiation’s initial frequency, it becomes possible to calculate how much mass lies between the source and Earth.

An all-sky map of the cosmic microwave background (CMB) radiation as recorded by the Wilkinson Microwave Anisotropy Probe (WMAP) after its launch in 2001. (This map serves as a reliable benchmark against which to compare the locally observed frequencies of the CMB.)

As a naturally available resource, consider the cosmic microwave background (CMB) radiation. The CMB was born when the universe was around 379,000 years old, when the ionic plasma born moments after the Big Bang had cooled to a temperature at which electrons and protons could combine to form hydrogen atoms, leaving the photons decoupled from matter and loosened upon the universe as residual radiation (currently at a temperature of  2.72548 ± 0.00057 K).

In the CMB context, the relevant phenomenon is the Sachs-Wolfe effect – the shift in the photons’ frequency as they traverse gravitational potentials – and it comes in two kinds: integrated and non-integrated. The non-integrated Sachs-Wolfe effect occurs at the surface of last scattering, and the integrated version between the surface of last scattering and Earth. The surface mentioned here can be thought of as an imaginary surface in space where the last matter-radiation decouplings occurred. What we’re interested in is the integrated Sachs-Wolfe effect.

Assuming that the photons have just left a star behind, and been gravitationally redshifted in the process, there is a lot of matter they could still encounter on their way to Earth even if our home planet maintains a clear line-of-sight to the star. This includes dust, stray rocks, gases blown around by stellar winds, and – if it does exist – dark energy.

The iconic Pillars of Creation, snapped by the Hubble Space Telescope on April 1, 1995, show columns of interstellar dust in the midst of star formation, even as they are eroded by starlight and stellar winds from other stars in the neighborhood.

Detecting the presence of dark energy between two points in space should therefore be straightforward, shouldn’t it? All we’d have to do is measure the redshift in radiation coming from a selected region, as detected by a satellite in orbit around Earth, and compare it with a map of that region. Analysing the redshift “left over” after subtracting the redshift due to matter should yield the amount of dark energy! (See also: WMAP)
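In practice, the “leftover” is extracted statistically, by cross-correlating a map of CMB temperatures with a map of the matter along the same lines of sight. Here is a toy sketch of that idea with synthetic numbers – the signal amplitude is invented, and real analyses work with correlated modes on the sphere rather than independent pixels:

```python
import numpy as np

# Toy sketch (not the published analysis): the integrated Sachs-Wolfe signal
# is hunted by cross-correlating a CMB temperature map with a matter map
# covering the same lines of sight. All numbers here are synthetic.
rng = np.random.default_rng(0)
npix = 100_000
matter = rng.standard_normal(npix)       # synthetic matter-density template
isw_amplitude = 0.05                     # assumed (invented) signal strength
cmb = isw_amplitude * matter + rng.standard_normal(npix)  # signal buried in noise

r = np.corrcoef(matter, cmb)[0, 1]       # Pearson correlation coefficient
print(f"cross-correlation: {r:.3f}")     # weakly positive, ~0.05
```

A weak but persistent positive correlation of this sort is exactly what the searches described below were after.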

This procedure was suggested in 1996 by Neil Turok and Robert Crittenden of the Perimeter Institute, Canada. However, after the first evidence of the integrated Sachs-Wolfe effect was detected in 2003, the correlation between the observed data and already-available maps was very low. This led some skeptics to suggest that the effect could instead have been caused by space dust. The possibility of their being right was indeed high – until September 11, 2012, when their skepticism was almost conclusively refuted by a team of scientists from the University of Portsmouth and LMU Munich.

The study, led by Tommaso Giannantonio and Crittenden, lasted two years and established at a confidence level of 5.4 sigma (or 99.996%) that the ’03 observation indeed corresponded to dark energy and not any other source of gravitational potential.

The phenomenological legacy of the redshift derives from its special place in Einstein’s GR: the descriptive EFE first opened up the theoretical possibility of such redshifts and of their applications in astrophysics research.