A gear-train for particle physics

The idea of ‘grand unification’ has come under scrutiny from prominent physicists and thinkers at various times, but it’s not hard to see why it seemed plausible to so many when it was first set out. It was first seriously considered about four decades ago, in 1974, shortly after physicists had realised that two of the four fundamental forces of nature – the electromagnetic and weak forces – were in fact a single unified ‘electroweak’ force if you ramped up the energy at which they acted. The thought that followed was simply logical: what if, at some extremely high energy, like that of the Big Bang, all four forces unified into one?

There has been no direct evidence of such grand unification yet. Physicists don’t know how the electroweak force will unify with the strong nuclear force – let alone with gravity, a problem that birthed one of the most powerful mathematical tools in physics in the attempt to solve it. Nonetheless, they think they know the energy at which such grand unification should occur, if it does: the Planck scale, around 10^19 GeV. This is only about as much energy as is released by burning a full tank of petrol, but it’s stupefyingly large when all of it has to be packed into a particle that’s around 10^-15 metres wide.
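For a sense of scale, here’s a quick back-of-the-envelope check in Python, using standard values for the physical constants; the petrol energy density (~34 MJ per litre) is a typical textbook figure, not a precise one.

```python
import math

# Planck energy: E_P = sqrt(hbar * c^5 / G)
hbar = 1.054_571_817e-34   # reduced Planck constant, J*s
c = 2.997_924_58e8         # speed of light, m/s
G = 6.674_30e-11           # gravitational constant, m^3 kg^-1 s^-2

E_planck_J = math.sqrt(hbar * c**5 / G)         # in joules
E_planck_GeV = E_planck_J / 1.602_176_634e-10   # 1 GeV = 1.602e-10 J

# Compare with the chemical energy in petrol (~34 MJ per litre)
litres_of_petrol = E_planck_J / 34e6

print(f"Planck energy ~ {E_planck_GeV:.1e} GeV ~ {E_planck_J:.1e} J")
print(f"~ the energy from burning {litres_of_petrol:.0f} litres of petrol")
```

Running this gives roughly 1.2 × 10^19 GeV, or about two billion joules – the energy in a car’s tank of petrol, crammed into a single subatomic particle.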

This is where particle accelerators come in. The most powerful of them, the Large Hadron Collider (LHC), uses electric fields to accelerate protons to close to light-speed, and powerful superconducting magnets to keep them on track, until their energy approaches about 7,000 GeV. But the Planck energy is still about a million billion times higher – some 15 orders of magnitude – which means it’s not something we might ever be able to attain on Earth. Nonetheless, physicists’ theories suggest that that’s where all of our physical laws should be created, where the commandments by which all that exists abides should be written.

… Or is it?

There are many outstanding problems in particle physics, and physicists are desperate for a solution. They have to find something wrong with what they’ve already done, find something new, or find a way to reinterpret what they already know. The clockwork theory is of the third kind – and its reinterpretation begins by asking physicists to drop the idea that new physics is born only at the Planck scale. For example, it suggests that the effects of quantum gravity (a quantum-mechanical description of gravity) needn’t become apparent only at the Planck scale but could show up at lower energies too. But even if it then goes on to solve some problems, the theory threatens to present a new one. Consider: if it’s true that new physics isn’t born at the highest energy possible, then wouldn’t the choice of any particular lower energy be arbitrary? And if nothing else, nature is not arbitrary.

To its credit, clockwork sidesteps this issue by simply not trying to find ‘special’ energies at which ‘important’ things happen. Its basic premise is that the forces of nature are like a set of interlocking gears moving against each other, transmitting energy – or rather, potential – from one wheel to the next, magnifying or diminishing the way fundamental particles behave in different contexts. Its supporters at CERN and elsewhere think it can be used to explain some annoying gaps between theory and experiment in particle physics, particularly the naturalness problem.

Before the Higgs boson was discovered, physicists had predicted, based on the properties of other particles and forces, that its mass would be very high. But when the boson’s discovery was confirmed at CERN in early 2013, its measured mass implied that the universe would have to be “the size of a football” – which is clearly not the case. So why is the Higgs boson’s mass so low, so unnaturally low? Scientists have put forward many new theories that try to solve this problem, but their solutions often require the existence of other, hitherto undiscovered particles.
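To get a feel for just how ‘unnatural’ the measured value looks, here’s a deliberately crude numerical illustration. The real quantum corrections are more subtle; the numbers below only convey the scale of the cancellation that would be needed if those corrections ran all the way up to the Planck scale.

```python
# Crude illustration of the naturalness (hierarchy) problem.
# Quantum corrections tend to drag the Higgs mass up towards the highest
# energy scale in the theory; if that scale is the Planck scale, the
# observed mass requires an enormous cancellation.

m_higgs_observed = 125.0     # GeV, the mass measured at the LHC
planck_scale = 1.2e19        # GeV, where the corrections would naively sit

# The relevant quantity is the mass *squared*, so compare squares.
correction_size = planck_scale**2
observed_size = m_higgs_observed**2

fine_tuning = observed_size / correction_size
print(f"Cancellation needed: about 1 part in {1/fine_tuning:.0e}")
# -> roughly 1 part in 10^34
```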

Clockwork’s solution is a way in which the Higgs boson’s interaction with gravity – rather, gravity’s associated energy – is mediated by a chain of effects described in quantum field theory that tamp down the boson’s mass. In technical parlance, the boson’s mass becomes ‘screened’. An explanation that is both physical and accurate is hard to draw up because of the abstractions involved. So, as Université Libre de Bruxelles physicist Daniele Teresi suggests, imagine this product: X = 0.5 × 0.5 × 0.5 × 0.5 × … × 0.5. Even though each step reduces X’s value by only a half, it is already an eighth after three steps; after four, a sixteenth. The effect gets drastic quickly because it’s exponential.
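Teresi’s toy product is easy to play with numerically. The sketch below just multiplies a chain of identical suppression factors together, in the spirit of the gears handing a quantity down the line; the factor of 0.5 and the number of ‘gears’ are illustrative choices, not values from the actual clockwork papers.

```python
# Toy version of the clockwork suppression: each 'gear' in the chain
# passes on only a fraction of the quantity it receives, so the net
# effect falls off exponentially with the number of gears.

factor = 0.5      # suppression per gear (illustrative)
gears = 20        # length of the chain (illustrative)

x = 1.0
for n in range(1, gears + 1):
    x *= factor
    if n in (3, 4, gears):
        print(f"after {n:2d} gears: {x:.2e}")

# after  3 gears: 1.25e-01  (an eighth)
# after  4 gears: 6.25e-02  (a sixteenth)
# after 20 gears: 9.54e-07  (already below a millionth)
```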

And the theory provides a mathematical toolbox that allows all of this to be achieved without adding new particles. This is advantageous because it makes clockwork relatively more elegant than another theory that seeks to solve the naturalness problem: supersymmetry, SUSY for short. Physicists also like SUSY because it allows for a large energy hierarchy: a distribution of particles and processes at energies between electroweak unification and grand unification, instead of leaving the region bizarrely devoid of action the way the Standard Model does. But then SUSY predicts the existence of 17 new particles, none of which have been detected yet.

What’s more, as Matthew McCullough, one of clockwork’s developers, showed at an ongoing conference in Italy, its solutions for a stationary particle in four dimensions exhibit conceptual similarities to Maxwell’s equations for an electromagnetic wave in a conductor. The existence of such analogues is reassuring because it recalls nature’s tendency to be guided by common principles in diverse contexts.

This isn’t to say clockwork theory is the answer. As physicist Ben Allanach has written, it is a “new toy” and physicists are still playing with it to solve different problems. But if it does turn out to have an answer to the naturalness problem – as well as to the question of why dark matter doesn’t decay, for example – it will be notable. Still, is it enough to say that clockwork theory mops up the math cleanly in a bunch of problems? How do we make sure that this is how nature actually works?

McCullough thinks there’s one way to check, using the LHC. Very simplistically: clockwork theory induces small fluctuations in the probabilities with which pairs of high-energy photons are created at certain energies at the LHC. These should be visible as wavy squiggles in a plot with energy on the x-axis and events on the y-axis. If these plots can be obtained and analysed, and the results agree with clockwork’s predictions, then we will have confirmed what McCullough calls an “irreducible prediction of clockwork gravity” – at least in the case where the theory is used to solve the naturalness problem.
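Purely to visualise what ‘wavy squiggles’ on a falling spectrum might look like, the sketch below draws a smooth, steeply falling toy background and the same background modulated by a small oscillation. The functional forms and numbers are invented for illustration; they are not McCullough’s actual prediction.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")          # render to a file, no display needed
import matplotlib.pyplot as plt

# Toy diphoton spectrum: events vs energy (all numbers illustrative).
energy = np.linspace(200, 2000, 500)            # GeV
background = 1e6 * (energy / 200.0) ** -4       # smooth, steeply falling

# The same background with a small oscillatory modulation layered on top,
# standing in for the 'wavy squiggles' described in the text.
squiggles = background * (1 + 0.05 * np.sin(energy / 60.0))

plt.plot(energy, background, label="smooth background (toy)")
plt.plot(energy, squiggles, label="background + oscillation (toy)")
plt.yscale("log")
plt.xlabel("diphoton energy [GeV]")
plt.ylabel("events (arbitrary units)")
plt.legend()
plt.savefig("toy_diphoton_squiggles.png")
```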

To recap: no free parameters (i.e. no new particles), conceptual elegance and familiarity, and finally a concrete and unique prediction. No wonder Allanach thinks clockwork theory inhabits fertile ground. On the other hand, SUSY’s prospects have been bleak since at least 2013 (if not earlier) – even though it is one of the more favoured theories among physicists for explaining physics beyond the Standard Model, physics we haven’t observed yet but generally believe exists. At the same time, and it bears reiterating, clockwork theory will also have to face down a host of challenges before it can be declared a definitive success. Tick tock, tick tock, tick tock…

Hopes for a new particle at the LHC offset by call for more data

At a seminar at CERN on Tuesday, scientists working with the Large Hadron Collider presented the latest results from the particle-smasher at the end of its 2015 run. The results comprise the most detailed measurements to date of the properties of some fundamental particles, made at the highest energies at which humankind has been able to study them.

The data discussed during the seminar originated from observations at two experiments: ATLAS and CMS. And while the numbers were consistent between them, neither experimental collaboration could confirm any of the hopeful rumours doing the rounds – that a new particle might have been found. However, they were able to keep the excitement going by not denying some of the rumours either. All they said was they needed to gather more data.

One rumour that was neither confirmed nor denied was the existence of a particle at an energy of about 750 GeV (roughly 800 times the mass of a proton). That’s a lot of mass for a single particle – the heaviest known elementary particle is the top quark, weighing about 173 GeV. As a result, it’d be extremely short-lived (if it existed) and rapidly decay into a combination of lighter particles, which are then logged by the detectors.

When physicists find such groups of particles, they use statistical methods and simulations to reconstruct the properties of the particle that could’ve produced them in the first place. The reconstruction shows up as a bump in the data where otherwise there’d have been a smooth curve.
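As a concrete (and heavily simplified) picture of what a ‘bump’ is: the sketch below generates a smooth, falling toy spectrum, adds a small Gaussian excess near 750 GeV, and counts how far that excess pokes above the expected statistical noise in the surrounding window. All parameters are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mass spectrum in 20-GeV bins (all numbers illustrative).
edges = np.arange(200, 1601, 20)                     # GeV
centres = 0.5 * (edges[:-1] + edges[1:])
expected_bkg = 2e5 * (centres / 200.0) ** -4         # smooth falling curve

# Inject a Gaussian 'bump' peaking at ~60 extra events per bin near 750 GeV.
signal = 60 * np.exp(-0.5 * ((centres - 750.0) / 30.0) ** 2)

observed = rng.poisson(expected_bkg + signal)        # pseudo-data

# Crude local significance in the window around the bump.
window = (centres > 700) & (centres < 800)
excess = observed[window].sum() - expected_bkg[window].sum()
noise = np.sqrt(expected_bkg[window].sum())          # Poisson fluctuation size
print(f"excess of about {excess:.0f} events, roughly {excess / noise:.1f} sigma locally")
```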

This is the ATLAS plot displaying said bump (look at the region around 750 GeV on the x-axis):

ATLAS result showing a small bump in the diphoton channel at 750 GeV in the run-2 data. Credit: CERN

It was found in the diphoton channel – i.e. the heavier particle decayed into two energetic photons which then impinged on the ATLAS detector. So why aren’t physicists celebrating if they can see the bump?

Because it’s not a significant bump. Its local significance is 3.6σ (i.e. the excess is 3.6 times the typical size of a random fluctuation in the background) – which is pretty significant by itself. But the more important number is the global significance, which accounts for the look-elsewhere effect. As experimental physicist Tommaso Dorigo explains neatly here,

… you looked in many places [in the data] for a possible [bump], and found a significant effect somewhere; the likelihood of finding something in the entire region you probed is greater than it would be if you had stated beforehand where the signal would be, because of the “probability boost” of looking in many places.

The global significance is calculated by correcting for this boost. In the case of the 750-GeV particle, the bump’s global significance stood at a dismal 1.9σ. A minimum of 3σ is required to claim evidence of a particle and 5σ to claim a discovery.
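A very rough way to see how a 3.6σ local bump can deflate to about 1.9σ once the look-elsewhere effect is folded in: treat the search as having scanned some number of roughly independent mass windows and apply a simple trials-factor correction. The number of windows below is a guess chosen to reproduce the quoted figures, not something reported by ATLAS.

```python
from scipy.stats import norm

z_local = 3.6
p_local = norm.sf(z_local)          # one-sided p-value of the local bump

# Crude trials-factor correction: the chance of a fluctuation at least this
# large appearing in *any* of N roughly independent mass windows.
# N is an illustrative guess, not a number from the ATLAS analysis.
N = 180
p_global = 1 - (1 - p_local) ** N
z_global = norm.isf(p_global)       # convert the p-value back to a significance

print(f"local:  {z_local:.1f} sigma  (p = {p_local:.1e})")
print(f"global: {z_global:.1f} sigma  (p = {p_global:.2f}) with N = {N} windows")
```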

A computer’s reconstruction of the diphoton event observed by the ATLAS detector. Credit: ATLAS/CERN

Marumi Kado, the physicist who presented the ATLAS results, added that when the bump was studied across a 45-GeV-wide window (on the x-axis), its significance went up to 3.9σ local and 2.3σ global. Kado is affiliated with the Laboratoire de l’Accélérateur Linéaire, Orsay.

A similar result was reported by James Olsen, of Princeton University, speaking for the CMS team, which saw a telltale bump at 745 GeV. However, the significance was only 2.6σ local and 1.2σ global. Olsen also said the CMS detector had only one-fourth the data that ATLAS had in the same channel.

Where all this leaves us is that the Standard Model – the prevailing theory and set of equations used to describe how fundamental particles behave – isn’t threatened yet. Physicists would very much like it to be: though it has correctly predicted the existence of many particles and fundamental forces, it has been equally unable to explain some findings (like dark matter). And finding a particle weighing ~750 GeV, which the model doesn’t predict, could show physicists what might be broken about the model and pave the way for a ‘new physics’.

However, on the downside, some other new-physics hypotheses didn’t find validation. One of the more prominent among them is supersymmetry, SUSY for short, which requires the existence of heavier partners for the known fundamental particles. Kado and Olsen both reported that no signs of such particles had been observed, nor of heavier versions of the Higgs boson (the particle whose discovery was announced in mid-2012 at the LHC). Thankfully, they also added that the teams weren’t done with their searches and analyses yet.

So, more data FTW – and here’s looking forward to the Rencontres de Moriond conference in March 2016.