US experiments find hint of a break in the laws of physics

At 9 pm India time on April 7, physicists at an American research facility delivered a shot in the arm to efforts to find flaws in a powerful theory that explains how the building blocks of the universe work.

Physicists are looking for flaws in it because the theory doesn’t have answers to some questions – like “what is dark matter?”. They hope to find a crack or a hole that might reveal the presence of a deeper, more powerful theory of physics that can lay unsolved problems to rest.

The story begins in 2001, when physicists performing an experiment at Brookhaven National Laboratory, New York, found that fundamental particles called muons weren’t behaving the way they were supposed to in the presence of a magnetic field. This was called the g-2 anomaly (after a number called the gyromagnetic factor).

An incomplete model

Muons are subatomic and can’t be seen with the naked eye, so it could’ve been that the instruments the physicists were using to study the muons indirectly were glitching. Or it could’ve been that the physicists had made a mistake in their calculations. Or, finally, what the physicists thought they knew about the behaviour of muons in a magnetic field was wrong.

In most stories we hear about scientists, the first two possibilities are true more often: they didn’t do something right, so the results weren’t what they expected. But in this case, the physicists were hoping they were wrong. This unusual wish was the product of working with the Standard Model of particle physics.

According to physicist Paul Kyberd, the fundamental particles in the universe “are classified in the Standard Model of particle physics, which theorises how the basic building blocks of matter interact, governed by fundamental forces.” The Standard Model has successfully predicted numerous properties and behaviours of these particles. However, it’s also been clearly wrong about some things. For example, Kyberd has written:

When we collide two fundamental particles together, a number of outcomes are possible. Our theory allows us to calculate the probability that any particular outcome can occur, but at energies beyond those we have so far achieved, it predicts that some of these outcomes occur with a probability of greater than 100% – clearly nonsense.

The Standard Model also can’t explain what dark matter is, what dark energy could be or if gravity has a corresponding fundamental particle. It predicted the existence of the Higgs boson, but its estimate of the particle’s mass was off by a factor of 100 quadrillion.

All these issues together imply that the Standard Model is incomplete, that it could be just one piece of a much larger ‘super-theory’ that works with more particles and forces than we currently know. To look for these theories, physicists have taken two broad approaches: to look for something new, and to find a mistake with something old.

For the former, physicists use particle accelerators, colliders and sophisticated detectors to look for heavier particles thought to exist at higher energies, and whose discovery would prove the existence of a physics beyond the Standard Model. For the latter, physicists take some prediction the Standard Model has made with a great degree of accuracy and test it rigorously to see if it holds up. Studies of muons in a magnetic field are examples of this.

According to the Standard Model, a number associated with the way a muon swivels in a magnetic field should be very slightly greater than 2, thanks to a tiny correction of 0.00116591804 (with some give or take). This minuscule addition is the handiwork of fleeting quantum effects in the muon’s immediate neighbourhood, which make it wobble. (For a glimpse of how hard these calculations can be, see this description.)
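In the notation physicists prefer (a standard definition; the numerical values are the ones quoted in this article), that correction is the muon’s anomalous magnetic moment:

  a = (g − 2) / 2 ≈ 0.00116591804, so that the g-factor itself is g = 2 × (1 + a)

The deviations discussed below – 0.00000000346 at Brookhaven and 0.00000000251 at Fermilab – are differences between the measured and predicted values of this number, written more compactly as 3.46 × 10^-9 and 2.51 × 10^-9.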

Fermilab result

In the early 2000s, the Brookhaven experiment measured this number to be slightly higher than the model’s prediction. Though the difference was small – about 0.00000000346 – the context made it a big deal. Scientists know that the Standard Model has a habit of being really right, so when it’s wrong, the wrongness becomes very important. And because we already know the model is wrong about other things, there’s a possibility that the two things could be linked. It’s a potential portal into ‘new physics’.

“It’s a very high-precision measurement – the value is unequivocal. But the Standard Model itself is unequivocal,” Thomas Kirk, an associate lab director at Brookhaven, had told Science in 2001. The disagreement between the values implied “that there must be physics beyond the Standard Model.”

This is why the results physicists announced today are important.

The Brookhaven experiment that ascertained the g-2 anomaly wasn’t sensitive enough to say with a meaningful amount of confidence that its measurement was really different from the Standard Model prediction, or if there could be a small overlap.

Science writer Brianna Barbu has likened the mystery to “a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was – and still is – whether the presence of the hair is just a coincidence, or whether it is actually an important clue.”

So to go from ‘maybe’ to ‘definitely’, physicists shipped the 50-foot-wide, 15-tonne magnet that the Brookhaven facility used in its Muon g-2 experiment to Fermilab, the US’s premier high-energy physics research facility in Illinois, and built a more sensitive experiment there.

The new result comes from tests at this facility: the observed value differs from the Standard Model’s prediction by 0.00000000251 (give or take a bit).

The Fermilab results are expected to become a lot better in the coming years, but even now they represent an important contribution. The statistical significance of the Brookhaven result alone fell short of the threshold at which scientists can claim a discovery, but the combined significance of the two results – about 4.2 standard deviations – is considerably stronger, though still shy of that mark.

Potential dampener

So for now, the g-2 anomaly seems to be real. It’s not easy to say if it will continue to be real as physicists further upgrade the Fermilab g-2’s performance.

In fact there appears to be another potential dampener on the horizon. An independent group of physicists has had a paper published today saying that the Fermilab g-2 result is actually in line with the Standard Model’s prediction and that there’s no deviation at all.

This group, called BMW (after Budapest, Marseille and Wuppertal, where its members work), used a different way to calculate the Standard Model’s value of the number in question than the Fermilab folks did. Aida El-Khadra, a theoretical physicist at the University of Illinois, told Quanta that the Fermilab team had yet to check BMW’s approach, but if it was found to be valid, the team would “integrate it into its next assessment”.

The ‘Fermilab approach’ itself is something physicists have worked with for many decades, so it’s unlikely to be wrong. But if the BMW approach checks out, then, according to Quanta, the mere fact that two approaches lead to different predictions of the number’s value would itself be a new mystery.

But physicists are excited for now. “It’s almost the best possible case scenario for speculators like us,” Gordan Krnjaic, a theoretical physicist at Fermilab who wasn’t involved in the research, told Scientific American. “I’m thinking much more that it’s possibly new physics, and it has implications for future experiments and for possible connections to dark matter.”

The current result is also important because the other way to look for physics beyond the Standard Model – by looking for heavier or rarer particles – can be harder.

This isn’t simply a matter of building a larger particle collider, powering it up, smashing particles and looking for other particles in the debris. For one, there is a very large number of energy levels at which a particle might form. For another, there are thousands of other particle interactions happening at the same time, generating a tremendous amount of noise. So without knowing what to look for and where, a particle hunt can be like looking for a very small needle in a very large haystack.

The ‘what’ and ‘where’ instead come from different theories that physicists have worked out based on what we know already; physicists then design experiments depending on which theory they need to test.

Into the hospital

One popular theory is called supersymmetry: it predicts that every elementary particle in the Standard Model framework has a heavier partner particle, called a supersymmetric partner. It also predicts the energy ranges in which these particles might be found. The Large Hadron Collider (LHC) at CERN, near Geneva, was powerful enough to access some of these energies, so physicists used it and went looking last decade. They didn’t find anything.

A table showing searches for particles associated with different post-standard-model theories (orange labels on the left). The bars show the energy levels up to which the ATLAS detector at the Large Hadron Collider has not found the particles. Table: ATLAS Collaboration/CERN

Other groups of physicists have also tried to look for rarer particles: ones that occur at an accessible energy but only once in a very large number of collisions. The LHC is a machine at the energy frontier: it probes higher and higher energies. To look for extremely rare particles, physicists explore the intensity frontier – using machines specialised in generating very large numbers of collisions.

The third and last is the cosmic frontier, in which scientists look for unusual particles coming from outer space. For example, early last month, researchers reported that they had detected an energetic anti-neutrino (a kind of fundamental particle) coming from outside the Milky Way participating in a rare event that scientists predicted in 1959 would occur if the Standard Model is right. The discovery, in effect, further cemented the validity of the Standard Model and ruled out one potential avenue to find ‘new physics’.

This event also recalls an interesting difference between the 2001 and 2021 announcements. The late British scientist Francis J.M. Farley wrote in 2001, after the Brookhaven result:

… the new muon (g-2) result from Brookhaven cannot at present be explained by the established theory. A more accurate measurement … should be available by the end of the year. Meanwhile theorists are looking for flaws in the argument and more measurements … are underway. If all this fails, supersymmetry can explain the data, but we would need other experiments to show that the postulated particles can exist in the real world, as well as in the evanescent quantum soup around the muon.

Since then, the LHC and other physics experiments have sent supersymmetry ‘to the hospital’ on more than one occasion. If the anomaly continues to hold up, scientists will have to find other explanations. Or, if the anomaly whimpers out, like so many others of our time, we’ll just have to put up with the Standard Model.

Featured image: A storage-ring magnet at Fermilab whose geometry allows for a very uniform magnetic field to be established in the ring. Credit: Glukicov/Wikimedia Commons, CC BY-SA 4.0.

The Wire Science
April 8, 2021

The Large Hadron Collider is back online, ready to shift from the “what” of reality to “why”

The world’s single largest science experiment will restart on March 23 after a two-year break. Scientists and administrators at the European Organization for Nuclear Research – known by its French acronym CERN – have announced the status of the agency’s upgrades on its Large Hadron Collider (LHC) and its readiness for a new phase of experiments running from now until 2018.

Before the experiment was shut down in early 2013, the LHC became famous for helping discover the elusive Higgs boson, a fundamental (that is, indivisible) particle that gives other fundamental particles their mass through a complicated mechanism. The find earned two of the physicists who thought up the mechanism in 1964, Peter Higgs and Francois Englert, a Nobel Prize in 2013.

Though the LHC had fulfilled one of its more significant goals by finding the Higgs boson, its purpose is far from complete. In its new avatar, the machine boasts the energy and technical agility necessary to answer questions that current theories of physics are struggling to make sense of.

As Alice Bean, a particle physicist who has worked with the LHC, said, “A whole new energy region will be waiting for us to discover something.”

The finding of the Higgs boson laid to rest speculations of whether such a particle existed and what its properties could be, and validated the currently reigning set of theories that describe how various fundamental particles interact. This is called the Standard Model, and it has been successful in predicting the dynamics of those interactions.

From the what to the why

But having assimilated all this knowledge, what physicists don’t know, but desperately want to, is why those particles’ properties have the values they do. They have realized the implications are numerous and profound: ranging from the possible existence of more fundamental particles we are yet to encounter to the nature of the substance known as dark matter, which makes up a great proportion of the matter in the universe even though we know next to nothing about it. These mysteries were first conceived to plug gaps in the Standard Model but they have only been widening since.

With an experiment now able to better test theories, physicists have started investigating these gaps. For the LHC, the implication is that in its second edition it will not be looking for something as much as helping scientists decide where to look to start with.

As Tara Shears, a particle physicist at the University of Liverpool, told Nature, “In the first run we had a very strong theoretical steer to look for the Higgs boson. This time we don’t have any signposts that are quite so clear.”

Higher energy, luminosity

The upgrades to the LHC that would unlock new experimental possibilities were evident as early as 2012.

The machine works by using powerful electric currents and magnetic fields to accelerate two trains, or beams, of protons in opposite directions, within a ring 27 km long, to almost the speed of light and then colliding them head-on. The result is a particulate fireworks of such high energy that the most rare, short-lived particles are brought into existence before they promptly devolve into lighter, more common particles. Particle detectors straddling the LHC at four points on the ring record these collisions and their effects for study.

So, to boost its performance, upgrades to the LHC were of two kinds: increasing the collision energy inside the ring and increasing the detectors’ abilities to track more numerous and more powerful collisions.

The collision energy has been nearly doubled in its second life, from 7-8 TeV to 13-14 TeV. The frequency of collisions has also been doubled, from one set every 50 nanoseconds (a nanosecond is a billionth of a second) to one every 25 nanoseconds. Steve Myers, CERN’s director for accelerators and technology, had said in December 2012, “More intense beams mean more collisions and a better chance of observing rare phenomena.”
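For a sense of scale (simple arithmetic on the numbers above, not an official CERN figure), halving the bunch spacing doubles the crossing rate:

  1 / (50 × 10^-9 s) = 20 million bunch crossings per second
  1 / (25 × 10^-9 s) = 40 million bunch crossings per second

Each crossing can yield several proton-proton collisions, so the collision rate climbs in step.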

The detectors have received new sensors, neutron shields to protect from radiation damage, cooling systems and superconducting cables. An improved fail-safe system has also been installed to forestall accidents like the one in 2008, when failing to cool a magnet led to a shut-down for eight months.

In all, the upgrades cost approximately $149 million, and will increase CERN’s electricity bill by 20% to $65 million. A “massive debugging exercise” was conducted last week to ensure all of it clicked together.

Going ahead, these new specifications will be leveraged to tackle some of the more outstanding issues in fundamental physics.

CERN listed a few – presumably primary – focus areas. They include investigating if the Higgs boson could betray the existence of undiscovered particles, the particles dark matter could be made of, why the universe today has much more matter than antimatter, and if gravity is so much weaker than other forces because it is leaking into other dimensions.

Stride forward in three frontiers

Physicists are also hopeful for the prospects of discovering a class of particles called supersymmetric partners. The theory that predicts their existence is called supersymmetry. It builds on some of the conclusions of the Standard Model, and offers predictions that plug its holes with such mathematical elegance that it has many of the world’s leading physicists enamored. These predictions involve the existence of new partner particles.

In a neat infographic in Nature, Elizabeth Gibney explains that the easiest partner to detect will be the ‘stop squark’, as it is the lightest and can show itself in lower-energy collisions.

In all, the LHC’s new avatar marks a big stride forward not just in the energy frontier but also in the intensity and cosmic frontiers. With its ability to produce and track more collisions per second as well as chart the least explored territories of the ancient cosmos, it’d be foolish to think this gigantic machine’s domain is confined to particle physics and couldn’t extend to fuel cells, medical diagnostics or achieving systems-reliability in IT.

Here’s a fitting video released by CERN to mark this momentous occasion in the history of high-energy physics.

Featured image: A view of the LHC. Credit: CERN

Update: After engineers spotted a short-circuit glitch in a cooled part of the LHC on March 21, its restart was postponed from March 23 by a few weeks. However, CERN has assured that it’s a fully understood problem and that it won’t detract from the experiment’s goals for the year.

HESS telescopes discover new source of gamma rays called a superbubble

Optical image of the Milky Way and a multi-wavelength (optical, Hα) zoom into the Large Magellanic Cloud with superimposed H.E.S.S. sky maps. (Milky Way image: © H.E.S.S. Collaboration, optical: SkyView, A. Mellinger; LMC image: © H.E.S.S. Collaboration, Hα: R. Kennicutt, J.E. Gaustad et al. (2001), optical (B-band): G. Bothun)

Astronomers using the HESS telescopes have discovered a new source of high-energy gamma rays. Dubbed a superbubble, it appears to be a massive shell of gas and dust 270 light-years in diameter being blown outward by the radiation from multiple stars and supernovas. HESS also discovered two other gamma-ray sources, each a giant of its kind. One is a powerful supernova remnant and the other a pulsar wind nebula. All three objects are located in the Large Magellanic Cloud, a small satellite galaxy orbiting the Milky Way at a distance of 170,000 ly. These objects are not only among the most luminous gamma-ray sources of their kind discovered to date but also the first such sources discovered outside the Milky Way.

Gamma-rays are emitted when very energetic charged particles collide with other particles, such as in a cloud of gas. Therefore, gamma radiation in the sky is often used as a proxy for high-energy phenomena. And astronomers have long known that the Large Magellanic Cloud houses many such clusters of frenzied activity: relative to the mass of its stars, the Cloud’s supernova rate is five times that of the Milky Way. It also hosts the Tarantula Nebula, which is the most active star-forming region in the Local Group of galaxies (which includes the Milky Way, Andromeda, the Cloud and more than 50 others).

Super-luminous sources

It is in this environment that the superbubble – designated 30 Dor C – thrives. According to the HESS team’s notice, it “appears to have been created by several supernovae and strong stellar winds”. In the data, it is visible as a strong source of gamma-rays because it is filled with highly energetic particles. The notice adds that this freak of nature “represents a new class of sources in the very high-energy regime.”

The other two super-luminous sources are familiar to astronomers. Pulsars, especially, are the extremely dense remnants of stars that have run out of hydrogen to fuse and imploded, resulting in a rapidly spinning core composed of neutrons and wound by fierce magnetic fields. They emit jets of energetic particles from polar points on their surface that form nebula-like clouds. One such cloud is N 157B, powered by the pulsar PSR J0537−6910. According to the HESS team, N 157B outshines the Crab Nebula in gamma-rays. The Crab Nebula is the Milky Way’s most famous and most powerful source of gamma-rays.

The third is a supernova remnant: the rapidly expanding shell of gas that a once-heavy dying star blows away as its core collapses. The shell can be expelled at more than a thousand times the speed of sound, resulting in a shockwave that can accelerate nearby particles and heat up upstream gas clouds to millions of kelvin. The resulting glow can last for thousands of years – but the one HESS has seen in the Cloud seems to have been going strong for 2,500-6,000 years, much longer than astronomers thought possible. It’s called N132D.

“Obviously, the high star formation rate of the LMC causes it to breed very extreme objects,” said Chia Chun Lu, a student at the Max Planck Institute for Astronomy in Heidelberg who analyzed the data for her thesis.

Imaging Cherenkov radiation

Detecting these gamma-rays is no easy task because it requires the imaging of Cherenkov radiation. Just as a jet flying through air faster than the speed of sound produces a sonic boom, a charged particle travelling through a medium faster than light does in that medium produces a shockwave of energy called Cherenkov radiation. This typically lasts a few billionths of a second and requires extremely sensitive cameras to capture.
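To put a number on ‘faster than light in that medium’ (a textbook relation, not specific to HESS): a charged particle of speed v emits Cherenkov light in a medium of refractive index n only if v > c/n, and the light spreads out in a cone of half-angle θ given by

  cos θ = c / (n × v)

In air, n ≈ 1.0003, so the threshold is about 99.97% of the speed of light in vacuum.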

When high-energy particles collide with the upper strata of Earth’s atmosphere, they percolate through while triggering the release of Cherenkov radiation. The five ground-based HESS telescopes – whose name stands for High Energy Stereoscopic System – quickly capture the bluish flashes before they disappear, and reconstruct the energy of their sources from the flashes’ properties. So, while gamma-rays can be a proxy for high-energy phenomena in the distant reaches of the cosmos, Cherenkov radiation in the upper atmosphere is a proxy for the gamma radiation itself.

Very-high-energy gamma-rays, of the order emitted by the Crab pulsar at the center of its nebula, are often the result of events that have made astronomers redefine what they consider anomalous. A good example is GRB 080916C, a gamma-ray burst spotted in September 2008 about 12 billion ly from Earth. It was the result of a star collapsing into a black hole, with a consequent ‘burp’ of energy lasting for a whopping 23 minutes. Valerie Connaughton, of the University of Alabama, Huntsville, and one of the members of the team studying the burst, said of its energy: “… it would be equivalent to 4.9 times the mass of the sun being converted to gamma rays in a matter of minutes”.

Natural particle accelerators

Such profuse sources can behave like natural particle accelerators, often reaching energies the Large Hadron Collider can only dream of. They give scientists the opportunity to study particles as well as the vacuum of space in conditions closer to those prevalent at the time of the Big Bang, in effect rendering the telescopes that study them probes of fundamental physics. In the case of GRB 080916C, for example, low-energy gamma-rays dominated the first five seconds of emissions, followed by the high-energy gamma-rays for the next twenty minutes. As astronomy-blogger Paul Gilster interpreted this,

They might also give us a read on theories of quantum gravity that suggest empty space is actually a froth of quantum foam, one that would allow lighter, lower-energy gamma rays to move more quickly than their higher-energy cousins. Future observations to study unusual time lags like these should help us pin down a plausible explanation.

The Fermi orbiting telescope that spotted the burst is also used to look for dark matter. When certain hypothetical particles of dark matter annihilate or decay, they yield high-energy antielectrons that could then annihilate upon colliding with electrons and yield gamma-rays. These are measured by Fermi. Astronomers then use preexisting data as a filter to sift out anomalous observations and use them to inform their theories of dark matter.

In this sense, the HESS telescopes are important observers of the universe. The array comprises five telescopes: four, each 12 meters in diameter, sit on the corners of a square of side 120 m, with a fifth telescope, 28 m in diameter, at the center. Networked by computers to work as one big telescope, the array is located in Namibia and is capable of observing gamma-ray fluxes in the range 30 GeV to 100 TeV. In 2015, in fact, construction of the more impressive $268-million Cherenkov Telescope Array will start. Upon completion, it will be able to study gamma-ray fluxes of 100 TeV but with a wider angle of observation and a much larger collecting area.

Whether or not the CTA can pinpoint the existence of dark matter, it will likely allow astronomers to discover more superbubbles, pulsar wind nebulae, supernova remnants and gamma-ray bursts, each more revealing than the last about the universe’s deepest secrets.

Possible first signal of dark matter detected

A simulated map of the universe’s dark matter (in blue) compiled after extensive observations with the Hubble space telescope. Credit: J.-P. Kneib (Observatoire Midi-Pyrenees, Caltech) et al., ESA, NASA

Dark matter is thought to make up more than 80% of the matter in our universe. However, it is relatively difficult to detect, for two main reasons:

  1. Scientists don’t know what the constituent particles of dark matter are, or how much they could weigh. There are various theories – each of them describes a different particle with distinct properties. Various observatories around the world and in space have been looking for them, with little success.
  2. Dark matter interacts with normal matter only through the force of gravity. And of the four known fundamental forces in nature, gravity is the weakest even though it acts across the largest distances. Moreover, plenty of other objects in the universe exert gravitational forces of their own, so filtering out a gravitational signal coming solely from dark matter is difficult.

Thankfully, many of these theories postulate other ways to find dark matter. One of them predicts that the particles of dark matter are sterile neutrinos. Neutrinos are a class of fundamental particles that have the lowest nonzero mass in nature (the photon, in comparison, is massless) and interact excruciatingly rarely with normal matter.

Those interactions happen only through the gravitational force and the weak force. Sterile neutrinos are unique, however, because they interact only through the gravitational force.

The sterile neutrino

When a sterile neutrino decays, it yields one photon and one normal neutrino, according to the sterile neutrino model. Because of the configuration of masses, the photon is detectable as an X-ray emission. Moreover, if the dark matter particle has a mass in the keV region, the X-ray photon should have an energy of a few keV.
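The arithmetic behind that expectation is simple (a standard kinematic result, assuming the sterile neutrino decays essentially at rest and the ordinary neutrino’s mass is negligible): the photon carries away half the parent particle’s rest energy,

  E(photon) ≈ m(sterile neutrino) × c² / 2

so a 3.5 keV X-ray line would point to a sterile neutrino weighing about 7 keV.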

Precisely this emission line was detected by two groups of astrophysicists who were studying the Perseus cluster of galaxies, located in the constellation of Perseus, in 2012. It is one of the most massive objects in the universe, and is thought to contain 190 million trillion trillion trillion kilograms of dark matter. This vast quantity means that even if the dark matter decay rate is slow – with lifetimes of 10^21 years – there are still about 10^77 sterile neutrinos of keV mass decaying into X-ray photons and neutrinos in the Perseus cluster.
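As a rough sanity check of that count (our own back-of-the-envelope arithmetic, using the figures above and a 7 keV particle): 190 million trillion trillion trillion kg is 1.9 × 10^44 kg, and a 7 keV sterile neutrino weighs about 1.25 × 10^-32 kg, so

  N ≈ (1.9 × 10^44 kg) / (1.25 × 10^-32 kg) ≈ 10^76

particles – the same ballpark as the figure above.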

One group, led from the Institute for Theoretical Physics at the University of Leiden, the Netherlands, used the ESA XMM-Newton X-ray observatory to measure an X-ray emission of 3.5 keV coming from the Perseus cluster. Another group, led from the Harvard-Smithsonian Center for Astrophysics (CfA), USA, used the NASA Chandra X-ray observatory to observe the same emissions. XMM-Newton and Chandra are space-based observatories.

Both groups published their papers in December 2014, and both groups were able to measure the emission with confidence levels more than 99.999%. However, their observations need further confirmation before they can graduate from being scientific evidence to knocking on the doors of scientific fact.

The nature of these confirmations will be twofold. On the one hand, scientists will have to gather more evidence to assert that the X-ray emission is indeed from dark matter decays. To this end, they will need to see if the emission intensity varies according to how the density of dark matter varies through space. They will also look for a Doppler effect, which would make the emission look like a smear in a spectrograph.

On the other hand, to deny that the X-ray emission could be from other sources will require a thorough knowledge of other sources in the same volume of space that could emit X-ray photons and their behavior over time. Fortunately, this knowledge already exists – it was on its strength that the two groups that made the observation were able to postulate that the emission was from dark matter decays. Nevertheless, more detailed descriptions of the gases, elements, compounds and other objects in the area will be sought.

The Astro-H telescope

Even so, there’s one more problem: if the observation was made with high confidence, why was the signal weak? If this is indeed the dark matter signature that scientists have sought for over six decades, why isn’t the X-ray emission line more pronounced?

Alexey Boyarsky from the University of Leiden notes,

The [dark matter] decay line is much narrower than the spectral resolution of the present day X-ray telescopes and, as previous searches have shown, should be rather weak.

As if by coincidence, Esra Bulbul from the CfA highlights a solution that’s on the horizon: the planned 14-meter-long Astro-H X-ray telescope to be launched in 2015 by the Japanese Aerospace Exploration Agency. As Bulbul writes,

The future high-resolution Astro-H observations will be able to measure the broadening of the line, which will allow us to measure its velocity dispersion. To detect a dark matter decay line [that is weaker than other lines] will require a significantly long exposure.

The excitement these discoveries have set off is palpable, and deservedly so. Bulbul had told NASA in 2012, “After we submitted the paper, theoreticians came up with about 60 different dark matter types which could explain this line. Some particle physicists have jokingly called this particle a ‘bulbulon’.”

Apart from trying to assert the findings and invalidate competing theories, scientists will – rather, could – also look for sterile neutrinos through neutrino experiments on Earth. Although some particles similar to them have been detected in the past, experiments looking for sterile neutrinos henceforth will also have to focus on the keV mass scale implied by the 3.5 keV line.

A new LHC: 10 things to look out for

Through an extraordinary routine, the most powerful machine built by humankind is slowly but surely gearing up for its relaunch in March 2015. The Large Hadron Collider (LHC), straddling the national borders of France and Switzerland, will reawaken after two years of upgrades and fixes to smash protons at nearly twice the energy it did during its first run, which ended in early 2013. Here are 10 things to look out for: five upgrades and five possible exciting discoveries.

Technical advancements

  1. Higher collision energy – In its previous run, each beam of protons destined for collision with other beams was accelerated to 3.5-4 TeV. By May 2015, each beam will be accelerated to 6.5-7 TeV. By doubling the collision energy, scientists hope to be able to observe higher-energy phenomena, such as heavier, more elusive particles.
  2. Higher collision frequency – Each beam has bunches of protons that are collided with other oncoming bunches at a fixed frequency. During the previous run, this frequency was once every 50 nanoseconds. In the new run, this will be doubled to once every 25 nanoseconds. With more collisions happening per unit time, rarer phenomena will happen more frequently and become easier to spot.
  3. Higher instantaneous luminosity – This is a measure of how many collisions the machine can deliver per unit area per second. It will be increased by 10 times, to 1 × 10^34 per cm² per second. By 2022, engineers will aim to increase it to 7.73 × 10^34 per cm² per second. (See the worked example after this list.)
  4. New pixel sensors – An extra layer of pixel sensors, to handle the higher luminosity regime, will be added around the beam pipe within the ATLAS and CMS detectors. While the CMS was built with higher luminosities in mind, ATLAS wasn’t, and its pixel sensors are expected to wear out within a year. As an intermediate solution, a temporary layer of sensors will be added to last until 2018.
  5. New neutron shields – Because of the doubled collision energy and frequency, instruments could be damaged by high-energy neutrons flying out of the beam pipe. To prevent this, advanced neutron shields will be screwed on around the pipe.
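To get a feel for what instantaneous luminosity buys (a back-of-the-envelope sketch; the cross-section figure is an approximate published value, not a number from this article): the rate at which a given process occurs is its cross-section multiplied by the luminosity, R = L × σ. Higgs boson production at 13 TeV has a cross-section of roughly 50 picobarns, or 5 × 10^-35 cm², so at L = 1 × 10^34 per cm² per second:

  R ≈ (1 × 10^34) × (5 × 10^-35) ≈ 0.5 Higgs bosons per second

– about one every two seconds, though only a small fraction decay in ways the detectors can cleanly pick out.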

Research advancements

  1. Dark matter – The LHC is adept at finding previously unseen particles, both fundamental and composite. One area of physics desperately looking for a particle of its own is dark matter. It’s only natural for both quests to converge at the collider. A leading candidate particle for dark matter is the WIMP: the weakly-interacting massive particle. If the LHC finds it, or finds something like it, it could be the next big thing after the Higgs boson, perhaps bigger.
  2. Dark energy – The universe is expanding at an accelerating pace. There is a uniform field of energy pervading it throughout that is causing this expansion, called the dark energy field. The source of dark energy’s potential is the vacuum of space, where extremely short-lived particles continuously pop in and out of existence. But calculations of the vacuum’s potential yield a value some 10^120 times what observations show it to be. At the LHC, the study of fundamental particles could drive better understanding of what the vacuum actually holds and where dark energy’s potential comes from.
  3. Supersymmetry – The Standard Model of particle physics defines humankind’s understanding of the behavior of all known fundamental particles. However, some of their properties are puzzling. For example, some natural forces are too strong for no known reason; some particles are too light. For this, physicists have a theory of particulate interactions called supersymmetry, SUSY for short. And SUSY predicts the existence of some particles that don’t exist in the Model yet, called supersymmetric partners. These are heavy particles that could show themselves in the LHC’s new higher-energy regime. Like with the dark matter WIMPs, finding a SUSY particle could be a Nobel Prize-winner.
  4. Higgs boson – One particle that’s too light in the Standard Model is the Higgs boson. As a result, physicists think it might not be the only Higgs boson out there. Perhaps there are others with the same properties that weigh less or more.
  5. Antimatter reactions – Among the class of particles called mesons, one – designated B0 – holds the clue to answering a question that has astrophysicists stymied for decades: Why does the universe have more matter than antimatter if, when it first came into existence, there were equal amounts of both? An older result from the LHC shows the B0 meson decays into more matter particles than antimatter ones. Probing further about why this is so will be another prominent quest of the LHC’s.

Bonus: Extra dimensions – Many alternate theories of fundamental particles require the existence of extra dimensions. The way to look for them is to create extremely high energies and then look for particles that might pop into one of the three dimensions we occupy from another that we don’t.

The hunt for supersymmetry: Reviewing the first run

What do dark matter, Higgs bosons, the electron dipole moment, topological superconductors and quantum loops have in common? These are exotic entities that scientists have been using to solve some longstanding problems in fundamental physics. Specifically, by studying these entities, they expect to discover new ingredients of the universe that will help them answer why it is the way it is. These ingredients could come in a staggering variety, so it is important for scientists to narrow down what they’re looking for – which brings us to the question of why these entities are being studied. A brief summary:

  1. Dark matter is an exotic form of matter that is thought to interact with normal matter only through the gravitational force. Its existence was hypothesized in 1932-1933. Its exact nature is yet to be understood.
  2. Quantum loops refer to an intricate way in which some heavier particles decay into sets of lighter particles, involving the exchange of some other extremely short-lived particles. They have been theoretically known to exist for many decades.
  3. Topological superconductors are exotic materials that, under certain conditions, behave like superconductors on their surface and as insulators underneath. They were discovered fairly recently, around 2007, and how they manage to be this way is not fully understood.
  4. The Higgs boson‘s discovery was announced in July 2012 (and verified by March-June 2013). Math worked out on paper predicts that its mass ought to have been very high – but it was found to be much lower.
  5. The electron dipole moment is a measure of how spherical the electron is. Specifically, the EDM denotes how evenly the electron’s negative charge is distributed around it. While the well-understood laws of nature don’t prevent the charge from being uneven, they restrict the amount of unevenness to a very small value. The most precise measurement of this value to date was published in December 2013.

Clearly, these are five phenomena whose identities are incomplete. But more specifically, scientists have found a way to use advanced mathematics to complete all these identities with one encompassing theory called Supersymmetry (Susy). Unfortunately for them, the mathematics refuses to become real, i.e. scientists have failed to find evidence of Susy in experiments. Actually, that might be an overstatement: different experiments are at different stages of looking for Susy at work in giving these ‘freaks of nature’ a physical identity. On the other hand, it has been a few years since some of these experiments commenced – some of them are quite powerful indeed – and the only positive results they have had have been to say Susy cannot be found in this or that range.

But if signs of Susy are found, then the world of physics will be in tumult – in a good way, of course. It will get to replace an old theory called the Standard Model of particle physics. The Standard Model is the set of mathematical tools and techniques used to understand how fundamental particles make up different objects, what particles our universe is made of, how quantum loops work, how the Higgs boson could have formed, etc. But it has no answers for why there is dark matter, why the electron is allowed to have that small dipole moment, why topological superconductors work the way they do, why the Higgs boson’s mass is so low, etc.

Early next year, physicists will turn to the Large Hadron Collider (LHC) – which helped discover the Higgs boson in 2012 – after it wakes up from its two-year slumber to help find Susy, too. This LHC of 2015 will be way more powerful than the one that went dormant in early 2013 thanks to a slew of upgrades. Hopefully it will not disappoint, building on what it has managed to deliver for Susy until now. In fact, on April 28, 2014, two physicists from CERN submitted a preprint paper to the arXiv server summarizing the lessons for Susy from the LHC after the first run.

The hunt for supersymmetry: Is a choke on the cards?

The Copernican
April 28, 2014

“So irrelevant is the philosophy of quantum mechanics to its use that one begins to suspect that all the deep questions are really empty…”

— Steven Weinberg, Dreams of a Final Theory: The Search for the Fundamental Laws of Nature (1992)

On a slightly humid yet clement January evening in 2013, a theoretical physicist named George Sterman was in Chennai to attend a conference at the Institute of Mathematical Sciences. After the last talk of the day, he had strolled out of the auditorium and was mingling with students when I managed to get a few minutes with him. I asked for an interview and he agreed.

After some coffee, we seated ourselves at a kiosk in the middle of the lawn; the sun was setting and mosquitoes abounded. Sterman was a particle physicist, so I opened with the customary question about the Higgs boson and expected him to swat it away with snowclones of the time like “fantastic”, “tribute to 50 years of mathematics” and “long-awaited”. He did say those things, but then he also expressed some disappointment.

George Sterman is distinguished for his work in quantum chromodynamics (QCD), for which he won the prestigious J.J. Sakurai Prize in 2003. QCD is a branch of physics that deals with particles that have a property called colour charge. Quarks and gluons are examples of such particles; these two together with electrons are the proverbial building blocks of matter. Sterman has been a physicist since the 1970s, the early years as far as experimental particle physics research is concerned.

The Standard Model disappoints

Over the last four or so decades, remarkable people like him have helped construct the model of laws, principles and theories on which the rigours of this field rest, called the Standard Model of particle physics. And it was the reason Sterman was disappointed.

According to the Standard Model, Sterman explained, “if we gave any reasonable estimate of what the mass of the Higgs particle should be, it should by all rights be huge! It should be as heavy as what we call the Planck mass.”

But it isn’t. The Higgs mass is around 125 GeV (GeV being a unit of energy that corresponds to certain values of a particle’s mass) – compare it with the proton, which weighs 0.938 GeV. The Planck mass, on the other hand, is 10^19 GeV. Seventeen orders of magnitude lie in between. According to Sterman, this isn’t natural. The question is why there has to be such a big difference between what we can say the mass could be and what we find it to be.
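The arithmetic behind ‘seventeen orders of magnitude’ (a back-of-the-envelope step using the numbers above):

  10^19 GeV / 125 GeV ≈ 8 × 10^16 ≈ 10^17

so the Planck mass is about 10^17 times the measured Higgs mass.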

Martinus Veltman, a Dutch theoretical physicist who won the Nobel Prize for physics in 1999 for his work in particle physics, painted a starker picture in an interview to Nature in 2013: “Since the energy of the Higgs [field] is distributed all over the universe, it should contribute to the curvature of space; if you do the calculation, the universe would have to curve to the size of a football.”

Evidently, the Standard Model has many loose ends, and explaining the mass of the Higgs boson is only one of them. Another is that it has no answer for what dark matter is and why it behaves the way it does. Yet another is why the four fundamental forces of nature are not of the same order of magnitude.

An alternative

Thanks to the Standard Model, some mysteries have been solved, but other mysteries have come and are coming to light – in much the same way Isaac Newton’s ideas struggled to remain applicable in the troubled world of physics in the early 20th century. It seems history repeats itself through crises.

Fortunately, physicists in 1971-1972 had begun to piece together an alternative theory called supersymmetry, Susy for short. At the time, it was an alternative way of interpreting how emerging facts could be related to each other. Today, however, Susy is a more encompassing successor to the throne that the Standard Model occupies, a sort of mathematical framework in which the predictions of the Model still hold but no longer have those loose ends. And Susy’s USP is… well, that it doesn’t disappoint Sterman.

“There’s a reason why so many people felt so confident about supersymmetry,” he said. “It wasn’t just that it’s a beautiful theory – which it is – or that it engages and challenges the most mathematically oriented among physicists, but in another sense in which it appeared to be necessary. There’s this subtle concept that goes by the name of naturalness…”

And don’t yet look up ‘naturalness’ on Wikipedia because, for once, here is something so simple, so elegant, that it is precisely what its name implies. Naturalness is the idea that, for example, the Higgs boson is so lightweight because something out there is keeping it from being heavy. Naturalness is the idea that, in a given setting, the forces of nature all act in equal measure. Naturalness is the idea that causes seem natural, and logically plausible, without having to be fine-tuned in order to explain their effects. In other words, Susy, through its naturalness, makes possible a domesticated world, one without sudden, unexpected deviations from what common sense (a sophisticated one, anyway) would dictate.

To understand how it works, let us revisit the basics. Our observable universe plays host to two kinds of fundamental particles, which are packets of some well-defined amount of energy. The fermions, named for Enrico Fermi, are the matter particles. Things are made of them. The bosons, named for Satyendra Bose, are the force particles. Things interact with each other by using them as messengers. The Standard Model tells us how bosons and fermions will behave in a variety of situations.

However, the Model has no answers for why bosons and fermions weigh as much as they do, or come in as many varieties as they do. These are deeper questions that go beyond simply what we can observe. These are questions whose answers demand that we interpret what we know, that we explore the wisdom of nature that underlies our knowledge of it. To know this why, physicists investigated phenomena that lie beyond the Standard Model’s jurisdiction.

The search

One such place is actually nothingness, i.e. the quantum vacuum of deep space, where particles called virtual particles continuously wink in and out of existence. But even with their brief life-spans, they play a significant role in mediating the interactions between different particles. You will remember having studied in class IX that like charges repel each other. What you probably weren’t told is that the repulsive force between them is mediated by the exchange of virtual photons.

Curiously, these “virtual interactions” don’t proliferate haphazardly. Virtual particles don’t continuously “talk” to the electron or clump around the Higgs boson. If this happened, mass would accrue at a point out of thin air, and black holes would be popping up all around us. Why this doesn’t happen, physicists think, is because of Susy, whose invisible hand could be staying chaos from dominating our universe.

The way it does this is by invoking quantum mechanics, and conceiving that there is another dimension called superspace. In superspace, the bosons and fermions in the dimensions familiar to us behave differently, the laws conceived such that they restrict the random formation of black holes, for starters. In the May 2014 issue of Scientific American, Joseph Lykken and Maria Spiropulu describe how things work in superspace:

“If you are a boson, taking one step in [superspace] turns you into a fermion; if you are a fermion, one step in [superspace] turns you into a boson. Furthermore, if you take one step in [superspace] and then step back again, you will find that you have also moved in ordinary space or time by some minimum amount. Thus, motion in [superspace] is tied up, in a complicated way, with ordinary motion.”

The presence of this dimension implies that all bosons and fermions have a corresponding particle called a superpartner particle. For each boson, there is a superpartner fermion called a bosino; for each fermion, there is a superpartner boson called a sfermion (why the confusing titles, though?).

Physicists are hoping this supersymmetric world exists. If it does, they will have found tools to explain the Higgs boson’s mass, the difference in strengths of the four fundamental forces, what dark matter could be, and a swarm of other nagging issues the Standard Model fails to resolve. Unfortunately, this is where Susy’s credit-worthiness runs into trouble.

No signs

“Experiment will always be the ultimate arbiter, so long as it’s science we’re doing.”

— Leon Lederman & Christopher Hill, Beyond the Higgs Boson (2013)

Since the first pieces of the Standard Model were brought together in the 1960s, researchers have run repeated tests to check if what it predicts were true. Each time, the Model has stood up to its promise and yielded accurate results. It withstood the test of time – a criterion it shares with the Nobel Prize for physics, which physicists working with the Model have won at least 15 times since 1957.

Susy, on the other hand, is still waiting for confirmation. The Large Hadron Collider (LHC), the world’s most powerful particle physics experiment, ran its first round of experiments from 2009 to 2012, and found no signs of sfermions or bosinos. If anything, it succeeded in narrowing the gaps in the Standard Model where Susy could be found. While the non-empty emptiness of the quantum vacuum opened a small window into the world of Susy, a window through which we could stick a mathematical arm out and say “This is why black holes don’t just pop up”, the Model has persistently puttied every other crack we’ve hounded after.

An interesting quote comes to mind about Susy’s health. In November 2012, at the Hadron Collider Physics Symposium in Kyoto, Japan, physicists presented evidence of a particle decay that happens so rarely that only the LHC could have spotted it. The Standard Model predicts that every time the B_s (pronounced “Bee-sub-ess”) meson decays into a set of lighter particles, there is a small chance – about three in a billion – that it decays into two muons. The steps in which this happens are intricate, involving a process called a quantum loop.

What next?

“SUSY has been expected for a long time, but no trace has been found so far… Like the plot of the excellent movie ‘The Lady Vanishes’ (Alfred Hitchcock, 1938)”

— Andy Parker, Cambridge University

Susy predicts that some supersymmetric particles should show themselves during the quantum loop, but no signs of them were found. On the other hand, the rate of B_s decays into two muons was consistent with the Model’s predictions. Prof. Chris Parkes, a British physicist, had then told BBC News: “Supersymmetry may not be dead but these latest results have certainly put it into hospital.” Why not? Our peek into the supersymmetric universe eludes us, and if the LHC can’t find it, what will?

Then again, it took us many centuries to find the electron, and then many decades to find anti-particles. Why should we hurry now? After all, as Dr. Rahul Sinha from the Institute of Mathematical Sciences told me after the Symposium had concluded, “a conclusive statement cannot be made as yet”. At this stage, even waiting for many years might not be necessary. The LHC is set to reawaken around January 2015 after a series of upgrades that will let the machine deliver 10 times more particle collisions per second per unit area. Mayhap a superpartner particle can be found lurking in this profusion by, say, 2017.

There are also plans for other more specialised colliders, such as Project X in the USA, which India has expressed interest in formally cooperating with. Project X, proposed to be built at the Fermi National Accelerator Laboratory, Illinois, will produce high-intensity proton beams to investigate a variety of hitherto unexplored realms. One of them is to produce heavy short-lived isotopes of elements like radium or francium, and use them to study if the electron has a dipole moment, or a pronounced negative charge along one direction, which Susy allows for.

(Moreover, if Project X is realised it could prove extra-useful for India because it makes possible a new kind of nuclear reactor design, called the accelerator-driven sub-critical reactor, which operates without a core of critical-mass radioactive fuel, rendering impossible accidents like Chernobyl and Fukushima, while also being capable of inducing fission reactions using lighter fuel like thorium.)

Yet another avenue to explore Susy would be looking for dark matter particles using highly sensitive particle detectors such as LUX, XENON1T and CDMS. According to some supersymmetric models, the lightest Susy particles could actually be dark matter particles, so if a few are spotted and studied, they could buoy this theory’s sagging credence.

… which serves to remind us that this excitement could cut the other way, too. What if the LHC in its advanced avatar is still unable to find evidence of Susy? In fact, the Advanced Cold Molecule Electron group at Harvard University announced in December 2013 that it had found no electron dipole moment at the highest precision attained to date. After such results, physicists will have to try and rework the theory, or perhaps zero in on other aspects of it that can be investigated by the LHC or Project X or other colliders.

But at the end of the day, there is also the romance of it all. It took George Sterman many years to find a theory as elegant and straightforward as Susy – an island of orderliness in the insane sea of quantum mechanics. How quickly would he give it up?

O Hunter, snare me his shadow!
O Nightingale, catch me his strain!
Else moonstruck with music and madness
I track him in vain!

— Oscar Wilde, In The Forest

Why do we need dark matter?

The first thing that goes wrong whenever a new discovery is reported, an old one is invalidated, or some vaguely important scientific result is announced often has to do with misrepresentation in the mainstream media. Right now, we’re in the aftermath of one such event: the October 30 announcement of results from a very sensitive dark matter detector. The detector, called the Large Underground Xenon Experiment (LUX), is installed in the Black Hills of South Dakota and operated by the Sanford Underground Research Facility.

Often, what gets scientists excited may not excite the layman, unless the media wants it to. So also with the announcement of results from LUX:

  • The detector hasn’t found dark matter
  • It hasn’t found a particular particle that some scientists thought could be dark matter in a particular energy range
  • It hasn’t ruled out that some other particles could be dark matter.

Unfortunately, as Matt Strassler noted, the BBC gave its report on the announcement a very misleading headline. We’re not so much figuring out what dark matter is as figuring out what it isn’t. Both these aspects are important because once we know dark matter isn’t something, we can fix our theories and start looking for something else. As for what dark matter is… here goes.

What is dark matter?

Dark matter is a kind of matter that is thought to make up a little more than 80 per cent of the matter in this universe.

Why is it called ‘dark matter’?

This kind of matter’s name has to do with a property that scientists believe it should have: it does not absorb or emit light, remaining (optically) dark to our search for it.

What is dark matter made of?

We don’t know. Scientists think it could be composed of strange particles. Some other scientists think it could be composed of known particles that are for some reason behaving differently. At the moment, the leading candidate is a particle called the WIMP (weakly interacting massive particle), just like particles called electrons are an indicator of there being an electric field and particles called Higgs bosons are an indicator of there being a Higgs field. A WIMP gets its name because it interacts with other matter particles only feebly – through the weak force and the gravitational force.

We don’t know how heavy or light WIMPs are, or even what each WIMP’s mass could be. So, using different detectors, scientists are combing through different mass-ranges. And by ‘combing’, what they’re doing is using extremely sensitive instruments hidden thousands of feet under rocky terrain (or orbiting the planet in a satellite) in an environment so clean that even undesired particles cannot interact with the detector (to some extent). In this state, the detector remains on ‘full alert’ to note the faintest interactions its components have with certain particles passing through – such as WIMPs.

The LUX detector team, in its October 30 announcement, ruled out the existence of WIMPs in the ~10 GeV/c² mass range (its components stayed silent while listening for particles in that range). This is important because results from some other detectors around the world had suggested that a WIMP could be found in this range.

Can we trust LUX’s result?

Pretty much, but not entirely – as with most measurements in particle physics experiments. Physicists announcing these results are only saying it’s very unlikely that other entities were masquerading as what they were looking for. It’s a chance, never really 100 per cent. But you’ve got to draw the line at some point. Even if there’s always going to be a 0.000…01 per cent chance of something happening, the quantity of observations and the quality of the detector should give you an idea of when to move on.
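
To get a feel for where that line is usually drawn, here’s a minimal, purely illustrative sketch (in Python, using the scipy library) of how a ‘sigma’ threshold translates into a probability; the 3σ (‘evidence’) and 5σ (‘discovery’) conventions are standard in particle physics, though the snippet itself is just a back-of-the-envelope aid:

    # Illustrative only: convert n-sigma thresholds into one-sided
    # tail probabilities of a Gaussian - the convention physicists
    # use when deciding how unlikely a fluke must be before they
    # claim evidence or discovery.
    from scipy.stats import norm

    for n_sigma in (3, 5):
        p = norm.sf(n_sigma)  # survival function: 1 - CDF
        print(f"{n_sigma} sigma -> chance of a fluke ~ {p:.1e}")

    # 3 sigma ('evidence')  -> ~1.3e-03, about 1 in 740
    # 5 sigma ('discovery') -> ~2.9e-07, about 1 in 3.5 million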

Where are the other detectors looking for dark matter?

Some are in orbit, some are underground. Check out Fermi-LAT, the Alpha Magnetic Spectrometer, PAMELA (the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics), XENON100, CDMS, the Large Hadron Collider, CoGeNT, etc.

So how was BBC wrong with its headline?

We’re not nearing the final phase of the search for dark matter. We’re only starting to consider the possibility that WIMPs might not be the dark matter candidates we should be looking for. Time to look at other candidates, like axions. Of course, it wasn’t just the BBC: CBS and Popular Science got it wrong, too, together with a sprinkling of other news websites.

Why do we need dark matter?

We haven’t been able to directly detect it, we think it has certain (unverified) properties that explain why it evades detection, we don’t know what it’s made of, and we don’t really know where to look even when we think we know what it’s made of. Why, then, do we still cling to the idea of there being dark matter in the universe – and in amounts overwhelming ‘normal’ matter by almost five times?

Answer: Because it’s the simplest explanation we can come up with to explain certain anomalous phenomena that existing theories of physics can’t.
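
A rough consistency check for the ‘five times’ figure, using the cosmological density parameters reported by the Planck mission earlier in 2013 (Ω_dm for dark matter, Ω_b for ordinary baryonic matter; the exact values shift slightly between analyses):

    \[ \frac{\Omega_{dm}}{\Omega_b} \approx \frac{0.27}{0.05} \approx 5.4, \qquad \frac{\Omega_{dm}}{\Omega_{dm}+\Omega_b} \approx 0.84 \]

which is also where the ‘little more than 80 per cent of all matter’ figure above comes from.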

Phenomenon #1

When the universe was created in the Big Bang, matter was released into it, and sound waves propagated through it as ripples. The early universe was very, very hot, and electrons hadn’t yet become bound to nuclei to form atoms. These free electrons scattered radiation, whose intensity was also affected by the sound waves around it.

About 380,000 years after the Bang, the universe cooled enough for electrons to bind into atoms. After this event, the radiation pervading the universe was left behind like a residue – the cosmic microwave background, observable to this day. When scientists used their knowledge of these events and their properties to work backwards to the time of the Bang, they found that the amount of matter that should’ve carried all that sound didn’t match up with what we can account for today.

They attributed the rest to what they called dark matter.

Phenomenon #2

Another way this mass deficiency manifests is in observations of gravitational lensing. When light from a distant object passes near a massive object, such as a galaxy or a cluster of galaxies, the latter’s gravitational pull bends the light around it. When this bent beam reaches an observer on Earth, the image it carries appears larger because it will have undergone angular magnification. If these clusters didn’t contain dark matter, physicists would observe much weaker lensing than they actually do.
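
The strength of the effect is tied directly to the lensing mass. For a compact lens, the textbook result is that a perfectly aligned source is smeared into a ring whose angular radius (the Einstein radius) grows with the square root of the mass:

    \[ \theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_{LS}}{D_L D_S}} \]

where D_L, D_S and D_LS are the distances to the lens, to the source, and between the two. Measuring θ_E effectively weighs the cluster – and the weight routinely comes out several times larger than what the visible matter can supply.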

Phenomenon #3

That’s not all. The stars in a galaxy rotate around the galactic centre, where most of its visible mass is located. According to theory, the velocity of the stars should drop off the farther they get from the centre. However, observations have revealed that, instead of dropping off, the velocity stays almost constant even as one gets farther from the centre. So something is pulling the outermost stars inward, holding them together and keeping them from flying outward and away from the galaxy. Astrophysicists think this inward force could be the gravitational pull of dark matter.
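
The expectation follows from little more than Newtonian gravity. For a star of mass m circling at radius r around an enclosed mass M(r), balancing gravity against the centripetal force gives:

    \[ \frac{m v^2}{r} = \frac{G m M(r)}{r^2} \quad\Longrightarrow\quad v(r) = \sqrt{\frac{G M(r)}{r}} \]

If nearly all the mass sat at the centre, M(r) would be constant and v would fall off as 1/√r. A flat rotation curve instead demands M(r) ∝ r – the enclosed mass must keep growing far beyond the visible disk, which is exactly the job assigned to the dark matter halo.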

So… what next?

LUX is a very high-sensitivity dark matter detector – the most sensitive in existence, actually. However, its sensitivity is attuned to look for low-mass WIMPs, and its first results rule out anything in the 5-20 GeV/c² range. WIMPs of a higher mass are still a possibility and, who knows, might be found by experiments that work with the CERN collider.

Moreover, agreement between various detectors about the mass of WIMPs has been iffy. For example, detectors like CDMS and CoGeNT have hinted that a ~10 GeV/c² WIMP should exist. LUX has only now ruled this out; the XENON100 detector, on the other hand, has been around since 2008 and has never found WIMPs in this mass-range – and it’s more sensitive than CDMS or CoGeNT.

What’s next is some waiting, letting LUX carry on with its surveys. In fact, LUX has its peak sensitivity at 33 GeV/c². Maybe there’s something there. Another thing to keep in mind is that we’ve only just started looking for dark matter particles. Remember how long it took us to figure out ‘normal’ matter particles? Perhaps future, more sensitive detectors (like XENON1T and LUX-ZEPLIN) will have something for us.

(This post first appeared at The Copernican on November 3, 2013.)

EUCLID/ESA: A cosmic vision looking into the darkness

I spoke to Dr. Giuseppe Racca and Dr. Rene Laureijs, both of the ESA, regarding the Euclid mission, the world’s first space telescope launched to study dark energy and dark matter. For the ESA, Euclid will be a centerpiece of its Cosmic Vision programme (2015-2025). Dr. Racca is the mission’s project manager while Dr. Laureijs is a project scientist.

Could you explain, in simple terms, what the Lagrange point is, and how being able to study the universe from that vantage point could help the study? 

GR: The Sun-Earth Lagrangian point 2 (SEL2) is a point in space about 1.5 million km from Earth in the direction opposite to the Sun, co-rotating with the Earth around the Sun. It is a nice and calm point from which to make observations. It is not disturbed by heat fluxes from the Earth, but at the same time it is not so far away as to prevent sending the large amount of observation data back to Earth. The orbit around SEL2 that Euclid will employ is rather large, easy to reach (in terms of launcher capability) and not expensive to control (in terms of fuel required for orbit corrections and maintenance manoeuvres).
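
For the curious: that 1.5 million km figure isn’t arbitrary. It falls out of a back-of-the-envelope (Hill-sphere) estimate of where the Earth’s and Sun’s pulls conspire to let a spacecraft co-rotate with the Earth, with a the Earth-Sun distance, M⊕ the Earth’s mass and M☉ the Sun’s:

    \[ d \approx a \left( \frac{M_\oplus}{3 M_\odot} \right)^{1/3} \approx 1.5\times10^{8}\,\mathrm{km} \times \left( \frac{3\times10^{-6}}{3} \right)^{1/3} \approx 1.5\times10^{6}\,\mathrm{km} \]

The same estimate locates L1 at an equal distance on the sunward side.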

Does Euclid in any way play into a broader program by ESA to delve into the Cosmic Frontier? Are there future upgrades/extensions planned? 

RL: Euclid is the second approved medium-class mission of ESA’s Cosmic Vision programme. The first one is Solar Orbiter, which will study the Sun at short distance. The Cosmic Vision programme sets out a plan for Large, Medium and Small missions in the decade 2015-2025. ESA’s missions Planck, which is presently in operation at L2, and Euclid will study the beginning, the evolution, and the predicted end of our Universe.

GR: A theme of this programme is: “How did the Universe originate and what is it made of?” Euclid is the first mission of this part of Cosmic Vision 2015-2025. There will be other missions, which have not been selected yet.

What’s NASA’s role in all of this? What are the different ways in which they will be participating in the Euclid mission? Is this a mission-specific commitment or, again, is it encompassed by a broader participation agreement?

GR: The NASA participation in the Euclid mission is very important but rather limited in extent. NASA will provide the near-infrared detectors for one of the two Euclid instruments. In addition, it will contribute to the scientific investigation with a team of about 40 US scientists. Financially speaking, NASA’s contribution is limited to some 3-4% of the total Euclid mission cost.

RL: The Euclid Memorandum of Understanding between ESA and NASA is mission specific and does not involve a broader participation agreement. First of all, NASA will provide the detectors for the infrared instrument. Secondly, NASA will support 40 US scientists to participate in the scientific exploitation of the data. These US scientists will be part of the larger Euclid Consortium, which contains nearly 1000 mostly European scientists.

Do you have any goals in mind? Anything specific or exciting that you expect to find? Who gets the data?

GR: The goals of the Euclid mission are extremely exciting: in a few words, we want to investigate the nature and origin of the unseen Universe – the dark matter, five times more abundant than the ordinary matter made of atoms, and the dark energy, which is causing the accelerating expansion of the Universe. The “dark Universe” is reckoned today to amount to 95% of the total matter-energy density. Euclid will survey about 40% of the sky, looking back in cosmic time up to 10 billion years. A smaller part (1% of the sky) will look back to when the universe was only a few million years old. This three-dimensional survey will allow us to map the extent and history of dark matter and dark energy. The results of the mission will allow us to understand the nature of dark matter and its place in an extension of the current standard model. Concerning dark energy, we will be able to distinguish between the so-called “quintessence” and a modification of current theories of gravity, including General Relativity.

RL: Euclid’s goals are to measure the accelerated expansion of the universe, which tells us about dark energy; to determine the properties of gravity on cosmic scales; to learn about the properties of dark matter; and to refine the initial conditions that led to the Universe we see now. These goals have been chosen carefully, and the instrumentation of Euclid is optimised to reach them as best as possible. The Euclid data will also open the discovery space for many other areas of astronomy: Euclid will literally measure billions of stars and galaxies at visible and infrared wavelengths, with a very high image quality, comparable to that of the Hubble Space Telescope. The most exciting prospect is the availability of these sharp images, which will certainly reveal new classes of objects and new science. The nominal mission will last for 6 years, but the first year of data will already become public 26 months after the start of the survey.

When will the EUCLID data be released?

GR: The Euclid data will be released to the public one year after their collection and will be made available to all researchers in the world.

What’s allowed and disallowed in the name of SUSY

The International Conference on High Energy Physics (ICHEP) is due to begin on July 7 in Melbourne. This is the 26th edition of the most prestigious scientific conference in particle physics. In keeping with its stature, scientists from the ATLAS and CMS collaborations at the LHC plan to announce the results of preliminary tests conducted in the search for the Higgs boson on July 4. Although speculation will still run rife within the high-energy and particle physics communities, it will be subdued; after all, nobody wants to be involved in another OPERAtic fiasco.

Earlier this year, CERN announced that the beam energy at the LHC would be increased from 3.5 TeV/beam to 4 TeV/beam. This means the collision energy will see a jump from 7 TeV to 8 TeV, increasing the chances of recreating the elusive Higgs boson, the “God particle”, and confirming if the Standard Model is able to explain the mechanism of mass formation in this universe. While this was the stated goal when the LHC was being constructed, another particle physics hypothesis was taking shape that lent itself to the LHC’s purpose.

In 1981, Howard Georgi and Savas Dimopoulos proposed a correction to the Standard Model to solve what is called the hierarchy problem. Specifically, the question is why the weak force (mediated by the W± and Z bosons) is some 10^32 times stronger than gravity. Each force is characterised by a natural constant: Fermi’s constant for the weak force and Newton’s constant for gravity. However, when the Standard Model’s machinery is used to compute quantum corrections to Fermi’s constant, its value is dragged away from the measured one toward something much, much higher – up toward the scale set by Newton’s constant.
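
To put rough numbers on that mismatch (in natural units, where both constants carry dimensions of inverse energy squared – a sketch of the scales involved, not a rigorous statement of the hierarchy problem):

    \[ G_F \approx 1.2\times10^{-5}\ \mathrm{GeV}^{-2}, \qquad G_N \approx 6.7\times10^{-39}\ \mathrm{GeV}^{-2}, \qquad \frac{G_F}{G_N} \sim 10^{33} \]

a gulf of 32-33 orders of magnitude, depending on exactly how the comparison is set up.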

[Photo: Savas Dimopoulos (left) and Howard Georgi]

Even by the late 1960s, the propositions of the Standard Model were cemented strongly enough in the psyche of mathematicians and scientists the world over: it had predicted with remarkable accuracy most naturally occurring processes and the existence of new particles, too, discovered later at experiments such as the Tevatron, ATLAS, CMS and ZEUS. In other words, it seemed inviolable. At the same time, it had no provision to correct for the deviation, indicating that there could be certain entities – particles and forces – yet to be discovered that could solve the hierarchy problem, and perhaps explain the nature of dark matter, too.

So, the 1981 Georgi-Dimopoulos solution was called the Minimal Supersymmetric Standard Model (MSSM), a special formulation of supersymmetry – an idea first proposed in 1966 by Hironari Miyazawa – that pairs particles of half-integer spin with those of integer spin and vice versa. (The spin of a particle is its intrinsic quantum mechanical angular momentum, distinct from its orbital angular momentum. Expressed in multiples of the reduced Planck’s constant, particle spin is denoted in natural units as simply an integer or half-integer.)

Particles of half-integer spin are called fermions and include the leptons and quarks. Particles of integer spin are called bosons and comprise the photon, the W± and Z bosons, eight gluons, and the hypothetical scalar boson named after co-postulator Peter Higgs. The principle of supersymmetry (SUSY) states that for each fermion there is a corresponding boson, and for each boson a corresponding fermion. Also, if SUSY is an unbroken symmetry, a particle and its superpartner will have the same mass. The superpartners are yet to be discovered, and if anyone has a chance of finding them, it has to be at the LHC.
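
To make the pairing concrete – each superpartner differs from its partner by half a unit of spin, and the standard naming convention adds an ‘s’ prefix to fermions’ partners and an ‘-ino’ suffix to bosons’:

  • fermions (spin ½) → sfermions (spin 0): electron → selectron, quark → squark
  • bosons (spin 1) → bosinos (spin ½): photon → photino, gluon → gluino, W/Z → wino/zino
  • Higgs (spin 0) → Higgsino (spin ½); graviton (spin 2) → gravitino (spin 3/2)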

The MSSM solved the hierarchy problem – which can be restated as the puzzle of why the mass of the Higgs boson is much lower than the mass at which new physics appears (the Planck mass) – by exploiting the effects of what is called the spin-statistics theorem (SST). The SST implies that quantum corrections to the Higgs mass-squared are positive if they come from a boson and negative if they come from a fermion. In the MSSM, because every particle has a superpartner, these contributions pair off and the net correction, Δm²H, is zero. This result leaves the Higgs mass far below the Planck mass.
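
Schematically, following the standard one-loop treatment (Λ is the energy cut-off up to which the Standard Model is trusted): a fermion f with Yukawa coupling λf drags the Higgs mass-squared down, while a scalar S with coupling λS pushes it up:

    \[ \Delta m_H^2\big|_{\rm fermion} = -\frac{|\lambda_f|^2}{8\pi^2}\,\Lambda^2 + \cdots, \qquad \Delta m_H^2\big|_{\rm scalar} = +\frac{\lambda_S}{16\pi^2}\,\Lambda^2 + \cdots \]

Supersymmetry supplies two scalars for every fermion, with λS = |λf|², so the Λ² pieces cancel identically – which is the MSSM’s entire trick.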

The existence of extra dimensions has also been proposed to explain the hierarchy problem. However, the law of parsimony, so long as SUSY remains a viable and testable alternative, keeps physicists from turning that radical.

The MSSM didn’t just stabilise the weak scale: in turn, it necessitated the existence of more than one Higgs field for mass-coupling, since the Higgs boson would have a fermionic superpartner, the Higgsino. For all other particles, the doubling didn’t involve invoking special fields or extrinsic parameters and was fairly simple. But the presence of a single Higgsino in the existing Higgs field would leave the theory inconsistent (through what is called a gauge anomaly); the presence of two Higgsinos, with opposite charges, cancels the anomaly and avoids the problem.

The necessity of a second Higgs field was reinforced by another aspect of the Higgs mechanism: mass-coupling. The Higgs boson binds more strongly to heavier particles, which means there must be a coupling constant describing the proportionality. This constant is named for the Japanese theoretical physicist Hideki Yukawa and denoted λf. In the Standard Model, a single Higgs field can serve both up-type and down-type quarks, because the couplings to one of the two types can use the field’s complex conjugate. SUSY, however, prohibits this switch to the complex conjugate, and so necessitates a second Higgs field to describe the interactions.
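
Schematically, and suppressing generation indices (this is the textbook statement of the constraint, with H̃ = iσ₂H* the conjugate doublet; the underlying reason is that the supersymmetric superpotential W must be holomorphic, i.e. free of conjugated fields):

    \[ \text{SM: } -\mathcal{L}_{\rm Yuk} = \lambda_d\,\bar{Q}_L H\, d_R + \lambda_u\,\bar{Q}_L \tilde{H}\, u_R + \text{h.c.}, \qquad \text{MSSM: } W \supset \lambda_u\, Q\!\cdot\!H_u\,\bar{u} + \lambda_d\, Q\!\cdot\!H_d\,\bar{d} \]

One field, H_u, gives mass to the up-type quarks; the other, H_d, to the down-type quarks and the charged leptons.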

A “quasi-political” explanation of the Higgs mechanism surfaced in 1993, likening the process to a political leader entering a room full of party members. As she moved through the room, the members left their evenly spaced “slots” and clustered around her. The leader’s progress was restricted because there was always a knot of people around her: she was slowed, like a heavy particle. Finally, as she moved away, the members returned to their original positions in the room.

The MSSM-predicted superpartners are thought to have masses 100- to 1,000-times that of the proton, and require extremely large energies to be recreated in a hadronic collision. The sole, unambiguous way to validate the MSSM theory is to spot the particles in a laboratory experiment (such as those conducted at CERN, not in a high-school chemistry lab). Even as the LHC prepares for that, however, there are certain aspects of MSSM that aren’t understood even theoretically.

The first is the mu problem, which arises in describing the superpotential – and hence the mass – of the Higgsino. The parameter mu appears in the term μH_uH_d, and in order to correctly reproduce the vacuum expectation values of the Higgs fields after electroweak symmetry breaking, mu’s value must be of an order of magnitude close to the electroweak scale. (Analogously, the MSSM also introduces soft SUSY-breaking terms, whose magnitudes must likewise be close to the electroweak scale.) The question is whence these particular magnitudes, whether they are natural, and if they are, then how.
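
The offending term sits alongside the Yukawa couplings in the superpotential (again schematic, with generation indices suppressed):

    \[ W = \mu\, H_u\!\cdot\!H_d + \lambda_u\, Q\!\cdot\!H_u\,\bar{u} + \lambda_d\, Q\!\cdot\!H_d\,\bar{d} + \lambda_e\, L\!\cdot\!H_d\,\bar{e} \]

μ is the only dimensionful parameter in the supersymmetric part of the theory, so nothing within the theory explains why it should sit near the electroweak scale (~100 GeV) rather than near the Planck scale – that is the mu problem in one line.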

The second is the problem of flavour mixing. Neutrinos and quarks exhibit a property called flavour, which they change through a mechanism called flavour-mixing. Since no instances of this phenomenon have been observed beyond the rates the Standard Model predicts, the new terms introduced by the MSSM must not interfere with it. In other words, the MSSM must be flavour-invariant and, by an extension of the same logic, CP-invariant.

Because of its involvement in determining which particle has how much mass, the MSSM plays a central role in clarifying our understanding of gravity as well as, it has been theorised, in unifying gravity with the quantum theories of the other forces. Even though it exists only in the theoretical realm, and even though physicists are attracted to it because its consequences seem like favourable solutions, the mathematics of the MSSM does explain many of the anomalies that threaten the Standard Model. To wit, dark matter has been hypothesised to be the superpartner of the graviton – the particle thought to mediate the gravitational force – and is given the name gravitino. (Here’s a paper from 2007 that attempts to explain the thermal production of gravitinos in the early universe.)

The beam energies were increased in pursuit of the Higgs boson after CERN’s landmark December 13, 2011 announcement. Let’s hope the folks at ATLAS, CMS, ALICE and the other detectors have something to say about opening the next big chapter in particle physics – one that will bring humankind a giant leap closer to understanding the universe and the stuff we’re made of.