Is the universe as we know it stable?

The anthropic principle has been a cornerstone of fundamental physics, used by some physicists to console themselves about why the universe is the way it is: tightly sandwiched between two dangerous states. If the laws and equations that define it had slipped just one way or the other during its formation, humans wouldn’t have existed to observe the universe, let alone conceive the anthropic principle. At least, this is the weak anthropic principle – that we’re talking about the anthropic principle because the universe allowed humans to exist; if it hadn’t, we wouldn’t be here. The strong anthropic principle holds that the universe is duty-bound to conceive life, and that if another universe were created along the same lines as ours, it would conceive intelligent life too, give or take a few billion years.

The principle has been repeatedly resorted to because physicists are at that juncture in history where they can’t tell why some things are the way they are and – worse – why some things aren’t the way they should be. The latest significant addition to this list, and an illustrative example, is the Higgs boson, whose discovery was announced on July 4, 2012, at CERN’s Large Hadron Collider (LHC). The Higgs boson’s existence was predicted by three independently working groups of physicists in 1964. In the intervening decades, from hypothesis to discovery, physicists spent a long time trying to find its mass. The now-shut American particle accelerator Tevatron helped speed up this process, using repeated measurements to steadily narrow down the range of masses in which the boson could lie. It was eventually found at the LHC at 125.6 GeV (a proton weighs about 0.94 GeV).

It was a great moment, the discovery of a particle that completed the Standard Model, the group of theories and equations that governs the behaviour of fundamental particles. It was also a problematic moment for some, who had expected the Higgs boson to weigh much, much more. The mass of the Higgs boson is connected to the energy of the universe (because the Higgs field, which generates the boson, pervades the universe), so by some calculations 125.6 GeV implied that the universe should be the size of a football. Clearly, it isn’t, so physicists got the sense something was missing from the Standard Model that would’ve been able to explain the discrepancy. (In another example, physicists have used the discovery of the Higgs boson to explain why there is more matter than antimatter in the universe even though both were created in equal amounts.)

The energy of the Higgs field also contributes to the scalar potential of the universe. A good analogy lies with the electrons in an atom. Sometimes, an energised electron sees fit to shed its extra energy in the form of a photon and jump to a lower-energy state. At other times, a lower-energy electron can gain some energy and jump to a higher state, a phenomenon commonly observed in metals (where the higher-energy electrons contribute to conducting electricity). Just as electrons can have different energies, the scalar potential defines a sort of energy that the universe can have. It’s calculated from the properties of all the fundamental forces of nature: strong nuclear, weak nuclear, electromagnetic, gravitational and Higgs.
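As a toy numerical version of those jumps – a minimal sketch using the textbook Bohr formula for hydrogen, nothing specific to the Higgs field – consider:

```python
# Energy of hydrogen's n-th electron level via the Bohr formula: E_n = -13.6 eV / n^2.
def level_energy_ev(n: int) -> float:
    return -13.6 / n**2

# An electron dropping from n=3 to n=2 sheds the difference as a photon;
# this particular jump produces hydrogen's red H-alpha line.
photon_ev = level_energy_ev(3) - level_energy_ev(2)
print(f"emitted photon: {photon_ev:.2f} eV")  # ~1.89 eV
```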

For the last 13.8 billion years, the universe has existed in one particular, unchanged way, so we know that it sits at a minimum of the scalar potential. The apt image is of a mountain range, like so:

[Image: a mountain range of peaks and valleys, standing in for the scalar potential]

The point is to figure out whether the universe is lying at the deepest point of the potential – the global minimum – or at a point that’s the deepest in a given range but not the deepest overall – a local minimum. This matters for two reasons. First: the universe will always, always try to get to the lowest energy state. Second: quantum mechanics. By the principles of classical mechanics, if the universe were to get from the local minimum to the global minimum, its energy would first have to be increased so it could surmount the intervening peaks. But by the principles of quantum mechanics, the universe can tunnel through the intervening peaks and sink into the global minimum. And such tunnelling can occur only if the universe is currently in a local minimum.
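To make the picture concrete, here’s a minimal numerical sketch with a made-up one-dimensional ‘potential’ that has both a shallow valley and a deeper one (the real scalar potential is a vastly more complicated object; the function below is purely illustrative):

```python
import numpy as np

def V(phi):
    # A toy double-well potential, tilted so one valley sits lower than the other.
    return (phi**2 - 1)**2 + 0.3 * phi

phi = np.linspace(-2.0, 2.0, 100001)
v = V(phi)

# A minimum is a grid point lower than both of its neighbours.
is_min = (v[1:-1] < v[:-2]) & (v[1:-1] < v[2:])
for p in phi[1:-1][is_min]:
    print(f"minimum at phi = {p:+.3f}, V = {V(p):+.3f}")

# The deeper of the two is the global minimum. A classical universe stuck in
# the shallower (local) minimum stays put; a quantum one can tunnel across.
```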

To find out, physicists try to calculate the shape of the scalar potential in its entirety. This is an intensely complicated mathematical process that takes lots of computing power, but that’s beside the point. The biggest problem is that we don’t know enough about the fundamental forces, and we don’t know anything about what else could be out there at higher energies. For example, it took an accelerator capable of boosting particles to 3,500 GeV and smashing them head-on to discover a particle weighing 125 GeV. Discovering anything heavier – i.e. more energetic – would take ever more powerful colliders costing many billions of dollars to build.

Almost sadistically, theoretical physicists have predicted that there exists an energy level at which the gravitational force unifies with the strong/weak nuclear and electromagnetic forces to become one indistinct force: the Planck scale, 12,200,000,000,000,000,000 GeV. We don’t know the mechanism of this unification, and its rules are among the most sought-after in high-energy physics. Last week, Chinese physicists announced that they were planning to build a supercollider bigger than the LHC, called the Circular Electron-Positron Collider (CEPC), starting 2020. The CEPC is slated to collide particles at 100,000 GeV, more than 7x the energy at which the LHC collides particles now, in a ring 54.7 km long. Given the way we’re building our most powerful particle accelerators, one able to smash particles together at the Planck scale would have to be as large as the Milky Way.

(Note: 12,200,000,000,000,000,000 GeV is the energy produced when 57.2 litres of gasoline are burnt, which is not a lot of energy at all. The trick is to contain so much energy in a particle as big as the proton, whose diameter is 0.000000000000001 m. That is, the energy density is 10⁶⁴ GeV/m³.)
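Those figures are easy to verify with a back-of-the-envelope script (the gasoline figure assumes an energy content of about 34.2 MJ per litre):

```python
import math

GEV_TO_J = 1.602e-10   # joules per GeV
planck_gev = 1.22e19   # the Planck scale, in GeV

joules = planck_gev * GEV_TO_J
litres = joules / 34.2e6                 # ~34.2 MJ per litre of gasoline (assumed)
print(f"{joules:.2e} J, or ~{litres:.0f} L of gasoline")  # ~57 L

# Now cram all of it into a proton-sized sphere, ~1e-15 m across.
volume = (4 / 3) * math.pi * (0.5e-15)**3                 # ~5e-46 m^3
print(f"energy density ~ {planck_gev / volume:.1e} GeV/m^3")  # ~2e64, the 10^64 figure above
```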

We also don’t know how the Standard Model scales from the energy levels it currently inhabits up to the Planck scale. If it changes significantly along the way, the forces’ contributions to the scalar potential will change too. Physicists think that if any new bosons – essentially new forces – appear along the way, the equations defining the scalar potential, our picture of the peaks and valleys, will themselves have to be changed. This is why physicists want to arrive at more precise values of, say, the mass of the Higgs boson.

Or the mass of the top quark. While force-carrying particles are called bosons, matter-forming particles are called fermions. Quarks are a type of fermion; together with force-carriers called gluons, they make up protons and neutrons. There are six kinds, or flavours, of quarks, and the heaviest is called the top quark. In fact, the top quark is the heaviest known fundamental particle. The top quark’s mass is particularly important. All fundamental particles get their mass from interacting with the Higgs field – the more the level of interaction, the higher the mass generated. So a precise measurement of the top quark’s mass indicates the Higgs field’s strongest level of interaction, or “loudest conversation”, with a fundamental particle, which in turn contributes to the scalar potential.
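In the Standard Model, this level of interaction is captured by a number called the Yukawa coupling, y_f: each fermion’s mass follows from its coupling to the Higgs field’s constant background value, v ≈ 246 GeV. As a quick worked relation (the standard textbook form, not a result of the calculations discussed below):

```latex
m_f = \frac{y_f\, v}{\sqrt{2}}
\quad\Rightarrow\quad
y_t = \frac{\sqrt{2}\, m_t}{v} \approx \frac{\sqrt{2} \times 173\ \text{GeV}}{246\ \text{GeV}} \approx 1
```

The top quark’s coupling works out to almost exactly 1, the largest of any known fermion.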

On November 9, a group of physicists from Russia published the results of an advanced scalar-potential calculation to find where the universe really lay: in a local minimum or in a stable global minimum. They found that the universe was in a local minimum. The calculations were “advanced” because they used the best estimates available for the properties of the various fundamental forces, as well as of the Higgs boson and the top quark, to arrive at their results, but they’re still not final because those estimates could still change. Hearteningly enough, the physicists also found that if the real values in the universe shifted by just 1.3 standard deviations from our best estimates of them, our universe would enter the global minimum and become truly stable. In other words, the universe is situated in a shallow valley on one side of a peak of the scalar potential, and right on the other side lies the deepest valley of all, where it could sit forever.

If the Russian group’s calculations are right (though there’s no quick way for us to know if they aren’t), then there could be a distant future – in human terms – where the universe tunnels through from the local to the global minimum and enters a new state. If we’ve assumed that the laws and forces of nature haven’t changed in the last 13.8 billion years, then we can also assume that in the fully stable state, these laws and forces could change in ways we can’t predict now. The changes would sweep over from one part of the universe into others at the speed of light, like a shockwave, redefining all the laws that let us exist. One moment we’d be around and gone the next. For all we know, that breadth of 1.3 standard deviations between our measurements of particles’ and forces’ properties and their true values could be the breath of our lives.

The Wire
November 11, 2015

Why you should care about the mass of the top quark

In a paper published in Physical Review Letters on July 17, 2014, a team of American researchers reported the most precisely measured value yet of the mass of the top quark, the heaviest fundamental particle. Its mass is so high that it can exist only in very-high-energy environments – such as inside powerful particle colliders or in the very early universe – and nowhere else.

Given this, the American team’s efforts to measure its mass might come across as needlessly painstaking. However, there’s an important reason to get as close to the exact value as possible.

That reason is 2012’s possibly most famous discovery. It was drinks-all-round for the particle physics community when the Higgs boson was discovered by the ATLAS and CMS experiments on the Large Hadron Collider (LHC). While the elation lasted a while, serious questions were already being asked about some of the boson’s properties. For one, it was much lighter than was anticipated by some promising areas of theoretical particle physics. Proponents of an idea called naturalness pegged it to be some 17 orders of magnitude higher!

Because the Higgs boson is the particulate residue of an omnipresent energy field called the Higgs field, the boson’s mass has implications for how the universe should be. With the boson being so much lighter, physicists couldn’t explain why the universe wasn’t the size of a football – even though their calculations said it should be.

In the second week of September 2014, Stephen Hawking said the Higgs boson would cause the end of the universe as we know it. Because it was Hawking who said it, and because his statement contained the clause “end of the universe”, the media hype was ridiculous yet to be expected. What he actually meant was that the ‘unnatural’ Higgs mass had placed the universe in a difficult position.

The universe would ideally love to be in its lowest energy state, like you do when you’ve just collapsed into a beanbag with beer, popcorn and Netflix. However, the mass of the Higgs has trapped it on a chair instead. While the universe would still like to be in the lower-energy beanbag, it’s reluctant to get up from the higher-energy yet still comfortable chair.

Someday, according to Hawking, the universe might increase in energy (get out of the chair) and then collapse into its lowest energy state (the beanbag). And that day is trillions of years away.

What does the mass of the top quark have to do with all this? Quite a bit, it turns out. Fundamental particles like the top quark possess their mass in the form of potential energy. They acquire this energy when they move through the Higgs field, which is spread throughout the universe. Some particles acquire more energy than others. How much energy is acquired depends on two parameters: the strength of the Higgs field (which is constant), and the particle’s Higgs charge.

The Higgs charge determines how strongly a particle engages with the Higgs field. It’s the highest for the top quark, which is why it’s also the heaviest fundamental particle. More relevant for our discussion, this unique connection between the top quark and the Higgs boson is also what makes the top quark an important focus of studies.
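A minimal sketch of that connection, assuming the standard relation m = y·v/√2 between a fermion’s mass and its coupling y to the Higgs field’s background value v ≈ 246 GeV (a back-of-the-envelope illustration, not how the measurements themselves were analysed):

```python
import math

HIGGS_VEV_GEV = 246.0  # the Higgs field's constant background value, v

def higgs_coupling(mass_gev: float) -> float:
    """The 'Higgs charge' (Yukawa coupling) implied by a fermion's mass."""
    return math.sqrt(2) * mass_gev / HIGGS_VEV_GEV

# Rough published masses, in GeV
for name, mass_gev in [("top quark", 174.98), ("bottom quark", 4.18), ("electron", 0.000511)]:
    print(f"{name:12s} coupling ~ {higgs_coupling(mass_gev):.6f}")
# The top quark's coupling comes out near 1 -- by far the "loudest
# conversation" any known fundamental particle has with the Higgs field.
```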

Getting the mass of the top quark just right is important to better determine its Higgs charge, ergo the extent of its coupling with the Higgs boson, ergo the properties of the Higgs boson itself. Small deviations in the value of the top quark’s mass could spell drastic changes in when or how our universe will switch from the chair to the beanbag.

If it does, all our natural laws would change. Life would become impossible.

The American team that made the measurements of the top quark used values obtained from the D0 experiment on the Tevatron particle collider at the Fermi National Accelerator Laboratory. The Tevatron was shut down in 2011, so these measurements are the collider’s last word on the top quark’s mass: 174.98 ± 0.76 GeV/c² (the Higgs boson weighs around 126 GeV/c²; a gold atom, considered pretty heavy, weighs around 183 GeV/c²). This is a precision of better than 0.5%, the finest yet. The value is likely to be updated once the LHC restarts early next year.

Featured image: Screenshot from Inception

Where does the Higgs boson come from?

When the Chelyabinsk meteor – dubbed Chebarkul – entered Earth’s atmosphere at around 17 km/s, it started to heat up due to friction. After a point, cracks already present in the 9,000-tonne chunk of rock became licensed to widen, eventually splitting Chebarkul into smaller pieces.

While Chebarkul’s internal structure determined where the cracks widened – and at what temperature and under what other conditions – the rock’s heating was the tipping point. Once it got hot enough, its crystalline structure began to disintegrate in some parts.

Spontaneous symmetry-breaking

About 13.75 billion years ago, this is what happened to the universe. At first, there was a sea of energy, a symmetrically uniform block. Suddenly, this block was rapidly exposed to extreme heat. Once it hit about 10¹⁵ kelvin – 173 billion times hotter than our Sun’s surface – the block disintegrated into smaller packets called particles. Its symmetry was broken. The Big Bang had happened.


The Big Bang splashed a copious amount of energy across the universe, whose residue is perceptible today as the cosmic microwave background radiation (CMBR).

Quickly, the high temperature fell off, but the particles couldn’t return to their original state of perfect togetherness. The block was broken forever, and the particles now had to fend for themselves. There were disturbances, or perturbations, in the system, and forces started to act. Physicists today call this the Nambu-Goldstone (NG) mode, named for Jeffrey Goldstone and Yoichiro Nambu.

In the tradition of particle physics treating everything in terms of particles, the forces in the NG mode were characterised in terms of NG bosons. The exchange of these bosons between two particles meant they were exchanging forces. Since each boson is also a particle, a force can be thought of as the exchange of energy between two particles or bodies.

This is just like the concept of phonons in condensed matter physics: when the atoms of a perfectly arranged array vibrate, physicists know they contain some extra energy that makes them restless. They isolate this surplus in the form of a particle called a phonon, and address the entire array’s surplus in terms of multiple phonons. So, as this restlessness moves through the solid, it behaves like a sound wave moving through it. It simplifies the math.
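A minimal sketch of that bookkeeping, using a toy one-dimensional chain of identical atoms joined by identical springs (all values here are arbitrary assumptions):

```python
import numpy as np

N = 10          # atoms in a 1D chain with fixed ends
k_over_m = 1.0  # spring constant / atomic mass, arbitrary units

# Newton's equations for the chain reduce to x'' = -D x, with D tridiagonal:
# each atom is pulled by its two neighbours.
D = k_over_m * (2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1))

# The square roots of D's eigenvalues are the chain's normal-mode
# frequencies -- one "phonon" per mode.
omega = np.sqrt(np.linalg.eigvalsh(D))
print("phonon frequencies:", np.round(omega, 3))

# Any jiggling of the array is a superposition of these modes, which is why
# the array's "restlessness" can be booked as a count of phonons.
```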

Anyway, the symmetry-breaking also gave rise to some fundamental forces. They’re called ‘fundamental’ because of their primacy, and because they’re still around. They were born because the disturbances in the energy block, encapsulated as the NG bosons, were interacting with an all-pervading background field called the Higgs field.

The Higgs field has four components, two charged and two uncharged. Another, more common example of a field is the electric field, which has two components: a strength at every point and a direction at that point. Components of the Higgs field perturbed the NG bosons in a particular way to give rise to four fundamental forces, one for each component.

So, just like in Chebarkul’s case, where its internal structure dictated where the first cracks would appear, in the block’s case, the heating had disturbed the energy block to awaken different “cracks” at different points.

The Call of Cthulhu

The first such “crack” to be born was the electroweak force. As the surroundings of these particles continued to cool, the electroweak force split into two: electromagnetic (eM) and weak forces.

The force-carrier for the eM force is called a photon. Photons can exist at different energies, and at each energy-level, they have a corresponding frequency. If a photon happens to be in the “visible range” of energy-levels, then each frequency shows itself as a colour. And so on…
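For instance, here’s a quick sketch converting photon energies to wavelengths with the standard relation E = hc/λ (the sample energies are arbitrary picks from the visible range):

```python
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def wavelength_nm(energy_ev: float) -> float:
    """Wavelength of a photon with the given energy, via E = h*c/lambda."""
    return H * C / (energy_ev * EV) * 1e9

# Photons between roughly 1.6 and 3.1 eV are visible to the human eye:
for e in (1.8, 2.3, 3.0):
    print(f"{e} eV -> {wavelength_nm(e):.0f} nm")  # red, green, violet
```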

The force-carriers of the weak force are the W+, W-, and Z bosons. At the time the first W/Z bosons came to life, they were massless. We know, because of Einstein’s mass-energy equivalence, that this means the bosons had no rest energy. How were they particulate, then?

Imagine an auditorium where an important lecture’s about to be given. You get there early, your friend is late, and you decide to reserve a seat for her. Then, your friend finally arrives 10 minutes after the lecture’s started and takes her seat. In this scenario, after your arrival, the seat was there all along as ‘friend’s seat’, even though your friend took her time to get there.

Similarly, the W/Z bosons, which became quite massive later on, were initially massless. They had to have existed when the weak force came to life, if only to account for a new force that had been born. The debut of massiveness happened when they “ate” the NG bosons – the disturbed block’s surplus energy – and became very heavy.

Unfortunately for them, their snacking was irreversible. The W/Z bosons couldn’t regurgitate the NG bosons, so they were doomed to be forever heavy and, consequently, short-ranged. That’s why the force that they mediate is called the weak force: because it acts over very small distances.

You’ll notice that the W+, W-, and Z bosons account for only three components of the Higgs field. What about the fourth component?

Enter: Higgs boson

That’s the Higgs boson. And now, getting closer to pinning down the Higgs boson means we’re also getting closer to validating the Higgs mechanism, the quantum mechanical formulation within which we understand the behaviours of these particles and forces – and, with it, the larger framework it belongs to, called the Standard Model.

(This blog post first appeared at The Copernican on March 8, 2013.)

What’s allowed and disallowed in the name of SUSY

The International Conference on High Energy Physics (ICHEP) is due to begin on July 7 in Melbourne. This is the 26th edition of the most prestigious scientific conference on particle physics. In keeping with its stature, scientists from the ATLAS and CMS collaborations at the LHC plan to announce the results of preliminary tests conducted to look for the Higgs boson on July 4. Although speculation will still run rife within the high-energy and particle physics communities, it will be subdued; after all, nobody wants to be involved in another OPERAtic fiasco.

Earlier this year, CERN announced that the beam energy at the LHC would be increased from 3.5 TeV/beam to 4 TeV/beam. This means the collision energy will see a jump from 7 TeV to 8 TeV, increasing the chances of recreating the elusive Higgs boson, the “God particle”, and confirming if the Standard Model is able to explain the mechanism of mass formation in this universe. While this was the stated goal when the LHC was being constructed, another particle physics hypothesis was taking shape that lent itself to the LHC’s purpose.

In 1981, Howard Georgi and Savas Dimopoulos proposed a correction to the Standard Model to solve what is called the hierarchy problem. Specifically, the question is why the weak force (mediated by the W± and Z bosons) is 10³² times stronger than gravity. Both forces are characterised by natural constants: Fermi’s constant for the weak force and, for gravity, Newton’s constant. However, when the Standard Model is used to compute quantum corrections to Fermi’s constant, its value starts to deviate from the measured one toward something much, much higher.

Savas Dimopoulos (L) and Howard Georgi

Even by the late 1960s, the propositions of the Standard Model were cemented strongly enough into the psyche of mathematicians and scientists the world over: it had predicted most naturally occurring processes with remarkable accuracy, and had predicted the existence of other particles, too, discovered later at experiments such as the Tevatron, ATLAS, CMS, and ZEUS. In other words, it was inviolable. At the same time, it had no provisions to correct for the deviation, indicating that there could be certain entities – particles and forces – yet to be discovered that could solve the hierarchy problem, and perhaps explain the nature of dark matter, too.

So, the 1981 Georgi-Dimopoulos solution was called the Minimal Supersymmetric Standard Model (MSSM), a special formulation of supersymmetry, first proposed in 1966 by Hironari Miyazawa, that paired particles of half-integer spin with those of integer spin and vice versa. (The spin of a particle is the quantum mechanical equivalent of its orbital angular momentum, although one has never been representative of the other. Expressed in multiples of the reduced Planck’s constant, particle spin is denoted in natural units as simply an integer or half-integer.)

Particles of half-integer spin are called fermions and include leptons and quarks. Particles with integer spin are called bosons and comprise photons, the W± and Z bosons, eight gluons, and the hypothetical, scalar boson named after co-postulator Peter Higgs. The principle of supersymmetry (SUSY) states that for each fermion, there is a corresponding boson, and for each boson, there is a corresponding fermion. Also, if SUSY is assumed to possess an unbroken symmetry, then a particle and its superpartner will have the same mass. The superpartners are yet to be discovered, and if anyone has a chance of finding them, it has to be at the LHC.

MSSM solved the hierarchy problem – which can be restated as the mass of the Higgs boson being much lower than the mass at which new physics appears (the Planck mass) – by exploiting the effects of what is called the spin-statistics theorem (SST). SST implies that the quantum corrections to the Higgs mass-squared will be positive if they come from a boson and negative if they come from a fermion. Under MSSM, however, because every particle has a superpartner, the net contribution to the correction, Δm²_H, is zero. This result leaves the Higgs mass lower than the Planck mass.
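Schematically – this is the generic one-loop form found in textbooks, not a result from the MSSM papers themselves – a fermion with coupling λ_f and a scalar with coupling λ_S pull the correction in opposite directions:

```latex
\Delta m_H^2 \;=\; -\,\frac{|\lambda_f|^2}{8\pi^2}\,\Lambda^2 \;+\; \frac{\lambda_S}{16\pi^2}\,\Lambda^2 \;+\; \dots
```

Here Λ is the energy up to which the Standard Model is assumed to hold; with two scalar superpartners per fermion and λ_S = |λ_f|², the Λ² pieces cancel exactly, which is the zero net correction described above.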

The existence of extra dimensions has also been proposed to explain the hierarchy problem. However, the law of parsimony, at least so long as SUSY remains a viable candidate, prevents physicists from turning so radical.

MSSM didn’t just stabilize the weak scale: in turn, it necessitated the existence of more than one Higgs field for mass-coupling since the Higgs boson would have a superpartner, the fermionic Higgsino. For all other particles, though, particulate doubling didn’t involve an invocation of special fields or extrinsic parameters and was fairly simple. The presence of a single Higgsino in the existing Higgs field would supply an extra degree of freedom (DoF), leaving the Higgs mechanism theoretically inconsistent. However, the presence of two Higgsinos instead of one doesn’t lead to this anomaly (called the gauge anomaly).

The necessity of a second Higgs field was reinforced by another aspect of the Higgs mechanism: mass-coupling. The Higgs boson binds more strongly to heavier particles, which means there must be a coupling constant to describe the proportionality. This constant was named after Hideki Yukawa, a Japanese theoretical physicist, and termed λ_f. When a Higgs boson couples with an up-quark, λ_f = +1/2; when it couples with a down-quark, λ_f = -1/2. SUSY, however, prohibits this switch to the value’s complex conjugate (a mass-reducing move), and necessitates a second Higgs field to describe the interactions.

A “quasi-political” explanation of the Higgs mechanism surfaced in 1993, likening the process to a political leader entering a room full of party members. As she moved through the room, the members moved out of their evenly spaced “slots” and towards her, forming a cluster around her. The leader’s speed was restricted because there was always a knot of people around her; she was slowed down, like a heavy particle. Finally, as she moved away, the members returned to their original positions in the room.

The MSSM-predicted superpartners are thought to have masses 100- to 1,000-times that of the proton, and require extremely large energies to be recreated in a hadronic collision. The sole, unambiguous way to validate the MSSM theory is to spot the particles in a laboratory experiment (such as those conducted at CERN, not in a high-school chemistry lab). Even as the LHC prepares for that, however, there are certain aspects of MSSM that aren’t understood even theoretically.

The first is the mu problem, which arises in describing the superpotential, or mass, of the Higgsino. Mu appears in the term μH_uH_d, and in order to correctly reproduce the Higgsino’s vacuum expectation value after electroweak symmetry breaking – again, the Higgsino’s mass – μ’s value must be of an order of magnitude close to the electroweak scale. (As an analogue of electroweak symmetry breaking, MSSM also introduces a soft SUSY-breaking, the terms of which must also be of the order of magnitude of the electroweak scale.) The question is whence these large differences in magnitudes, whether they are natural, and if they are, then how.

The second is the problem of flavour mixing. Neutrinos and quarks exhibit a property called flavour, which they seem to change through a mechanism called flavour-mixing. Since no instances of this phenomenon have been observed outside the ambit of the Standard Model, the new terms introduced by MSSM must not interfere with it. In other words, MSSM must be flavour-invariant and, by an extension of the same logic, CP-invariant.

Because of its involvement in determining which particle has how much mass, MSSM plays a central role in clarifying our understanding of gravity as well as, it has been theorized, in unifying gravity with the other fundamental forces. Even though it exists only in the theoretical realm, and even though physicists are attracted to it because its consequences seem like favourable solutions, the mathematics of MSSM does explain many of the anomalies that threaten the Standard Model. To wit, dark matter is hypothesized to be the superpartner of the graviton – the particle that mediates the gravitational force – and is given the name gravitino. (A paper from 2007 attempts to explain the thermal production of gravitinos in the early universe.)

While the beam energies were increased in pursuit of the Higgs boson after CERN’s landmark December 13, 2011, announcement, let’s hope that the folks at ATLAS, CMS, ALICE, and the other detectors have something to say about opening the next big chapter in particle physics – one that will bring humankind a giant leap closer to understanding the universe and the stuff we’re made of.