The INO story

A longer story about the India-based Neutrino Observatory that I’d been wanting to do since 2012 was finally published today (to be clear, I hit the ‘Publish’ button today) on The Wire. Apart from myself, four people worked on it: two amazing reporters, one crazy copy-editor and one illustrator. I don’t mean to diminish the illustrator’s role, especially in setting the piece’s mood so well, but the reporters and the copy-editor did a stupendous job of getting the story from 0 to 1. After all, all I’d had was an idea.

The INO’s is a great story, but at the moment it stands, unfortunately, to become a depressing parable – the biggest bug yet in a spider’s web spun of bureaucracy and misinformation. As told on The Wire, the INO is India’s most badass science experiment yet, but its inherent sophistication has become both its strength and its weakness: a strength for being able to yield cutting-edge science, a weakness for being the ideal target of stubborn activism, unreason and, consequently and understandably, fatigue on the part of the physicists.

From here on out, it doesn’t look like the INO will get built by 2020, and it doesn’t look like it will be the same thing it started out as when it does get built. Am I disappointed by that? Of course – but that’s a bad question. Am I rooting for the experiment? I’m not sure – and that’s a much better question. In the last few years, as the project’s plans gained momentum, some unreasonable activists were able to cash in on the Department of Atomic Energy’s generally cold-blooded way of dealing with disagreement (the DAE is funding the INO). At the same time, the INO collaboration wasn’t as diligent as it ought to have been with the environmental impact assessment report (getting it compiled by a non-accredited agency). Finally, the DAE itself just stood back and watched as the scientists and activists battled it out.

Who lost? Take a guess. I hope the next Big Science experiment fares better (I’m probably not referring to LIGO because it has a far stronger global/American impetus while the INO is completely indigenously motivated).

Discussing some motivations behind a particle physics FAQ

First, there is information. From information, people distill knowledge, and from knowledge, wisdom. Information is available on a lot of topics and in varying levels of detail. Knowledge on topics is harder to find – and harder still is wisdom. This is because knowledge and wisdom require work (to fact-check and interpret) on information and knowledge, respectively. And people can be selective about what they choose to work on. One popular consequence of such choices is that most people are more aware of business information, business knowledge and business wisdom than they are of scientific information, scientific knowledge and scientific wisdom. This graduated topical awareness is reflected in how we produce and consume the news.


News articles written on business issues rarely see fit to delve into historical motivations or explainer-style elucidations because the audience is understood to be better aware of what business is about. Business information and knowledge are widespread and so is, to some extent, business wisdom, and articles can take advantage of conclusions made in each sphere, jumping between them to tease out more information, knowledge and wisdom. On the other hand, articles written on some topics of science – such as particle physics – have to start from the informational level before wisdom can be presented. This places strong limits on how the article can be structured or even styled.

There are numerous reasons why this is so, especially for topics like particle physics, which I regularly (try to) write on. I’m drawn toward three of them in particular: legacy, complexity and pacing. Legacy is the size of the body of work that is directly related to the latest developments in that work. So the legacy of the LHC stretches back to include the invention of the cyclotron in 1932 – and the legacy of the Higgs boson stretches back to 1961. Complexity is just what its name suggests, and becomes more meaningful in the context of pacing.

A consequence of business developments being reported on fervently is that there is at least some (understandable) information in the public domain about all stages of this epistemological evolution. In other words, the news reports keep pace with new information, new knowledge, new wisdom. With particle physics, they don’t – they can’t. The reports are separated by some time, according to when the bigger developments occurred, and in the intervening span new information/knowledge/wisdom will have arisen that the reports have to accommodate. And how much has to be accommodated can be exacerbated by the complexity of what has come before.


But there is a catch here – at least as far as particle physics is concerned, because the field is in a quandary these days. It is wide open because physicists have realised two things: first, that their theoretical understanding of physics is far, far ahead of what their experiments are capable of testing (a realisation dating to the 1970s and 1980s); second, that there are inconsistencies within the theories themselves (since the late 1990s). Resolving these issues is going to take a bit of time – a decade or so at least (although we’re likely in the middle of such a decade) – and that presents a fortunate upside for communicators: it’s a break. Let’s use it to catch up on all that we’ve missed.

The break (or a rupture?) can also be utilised for what it signifies: a gap in information/knowledge. All the information/knowledge/wisdom that has come before is abruptly discontinued at this point, allowing communicators to collect them in one place, compose them and disseminate them in preparation for whatever particle physics will unearth next. And this is exactly what motivated me to write a ‘particle physics FAQ’, published on The Wire, as something anyone who’s graduated from high-school can understand. I can’t say if it will equip them to read scientific papers – but it will definitely (and hopefully) set them on the road to asking more questions on the topic.

Hopes for a new particle at the LHC offset by call for more data

At a seminar at CERN on Tuesday, scientists working with the Large Hadron Collider provided the latest results from the particle-smasher at the end of its operations for 2015. The results make up the most detailed measurements of the properties of some fundamental particles made to date at the highest energy at which humankind has been able to study them.

The data discussed during the seminar originated from observations at two experiments: ATLAS and CMS. And while the numbers were consistent between them, neither experimental collaboration could confirm any of the hopeful rumours doing the rounds – that a new particle might have been found. However, they were able to keep the excitement going by not denying some of the rumours either. All they said was they needed to gather more data.

One rumour that was neither confirmed nor denied was the existence of a particle at an energy of about 750 GeV (that’s about 750x the mass of a proton). That’s a lot of mass for a single particle – the heaviest known elementary particle is the top quark, weighing 175 GeV. As a result, it’d be extremely short-lived (if it existed) and rapidly decay into a combination of lighter particles, which are then logged by the detectors.

When physicists find such groups of particles, they use statistical methods and simulations to reconstruct the properties of the particle that could’ve produced them in the first place. The reconstruction shows up as a bump in the data where otherwise there’d have been a smooth curve.
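In rough outline – a toy sketch with invented numbers, not the collaborations’ actual analysis chain – the ‘bump hunt’ works something like this:

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy diphoton-style spectrum: a smoothly falling background plus a
# small Gaussian bump injected at 750 GeV (all numbers are made up).
rng = np.random.default_rng(0)
mass = np.linspace(200, 1200, 100)                      # GeV
background = 1e4 * np.exp(-mass / 150.0)                # smooth falling curve
signal = 40 * np.exp(-0.5 * ((mass - 750) / 30) ** 2)   # the injected bump
observed = rng.poisson(background + signal)             # counts fluctuate

# Fit a background-only model, then look for an excess over it.
def bkg_model(m, a, b):
    return a * np.exp(-m / b)

popt, _ = curve_fit(bkg_model, mass, observed, p0=(1e4, 150.0))
expected = bkg_model(mass, *popt)
local_sigma = (observed - expected) / np.sqrt(expected)  # naive per-bin significance

# With this setup the injected bump should stand out around 750 GeV.
print("largest excess near", mass[np.argmax(local_sigma)], "GeV")
```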

This is the ATLAS plot displaying said bump (look in the area over 750 GeV on the x-axis):

ATLAS result showing a small bump in the diphoton channel at 750 GeV in the run-2 data. Credit: CERN

It was found in the diphoton channel – i.e. the heavier particle decayed into two energetic photons which then impinged on the ATLAS detector. So why aren’t physicists celebrating if they can see the bump?

Because it’s not a significant bump. Its local significance is 3.6σ (that’s 3.6 times more than the average size of a fluctuation) – which is pretty significant by itself. But the more important number is the global significance that accounts for the look-elsewhere effect. As experimental physicist Tommaso Dorigo explains neatly here,

… you looked in many places [in the data] for a possible [bump], and found a significant effect somewhere; the likelihood of finding something in the entire region you probed is greater than it would be if you had stated beforehand where the signal would be, because of the “probability boost” of looking in many places.

The global significance is calculated by subtracting the effect of this boost. In the case of the 750-GeV particle, the bump stood at a dismal 1.9σ. A minimum of 3 is required to claim evidence and 5 for a discovery.
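To get a feel for these numbers, here’s a minimal sketch using scipy. The ‘trials factor’ below is a crude stand-in for the full look-elsewhere correction (which the collaborations compute with simulations), and N is reverse-engineered to reproduce the reported deflation, not a number from ATLAS:

```python
from scipy.stats import norm

# One-sided p-value corresponding to a local significance of 3.6 sigma
p_local = norm.sf(3.6)               # ~1.6e-4

# Crude look-elsewhere correction: if the search effectively scanned
# N independent mass windows, a fluke somewhere becomes more likely.
N = 180                              # illustrative guess, not ATLAS's figure
p_global = 1 - (1 - p_local) ** N

# Convert the corrected p-value back into a significance
print(round(norm.isf(p_global), 1))  # ~1.9 sigma
```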

A computer’s reconstruction of the diphoton event observed by the ATLAS detector. Credit: ATLAS/CERN

Marumi Kado, the physicist who presented the ATLAS results, added that when the bump was studied across a 45 GeV swath (on the x-axis), its significance went up to 3.9σ local and 2.3σ global. Kado is affiliated with the Laboratoire de l’Accelerateur Lineaire, Orsay.

A similar result was reported by James Olsen, of Princeton University, speaking for the CMS team with a telltale bump at 745 GeV. However, the significance was only 2.6σ local and 1.2σ global. Olsen also said the CMS detector had only one-fourth the data that ATLAS had in the same channel.

Where all this leaves us is that the Standard Model, which is the prevailing theory + equations used to describe how fundamental particles behave, isn’t threatened yet. Physicists would much like it to be: though it’s been able to correctly predict the existence of many particles and fundamental forces, it’s been equally unable to explain some findings (like dark matter). And finding a particle weighing ~750 GeV, which the model hasn’t predicted so far, could show physicists what could be broken about the model and pave the way for a ‘new physics’.

However, on the downside, some other new-physics hypotheses didn’t find validation. One of the more prominent among them is called supersymmetry, SUSY for short, and it requires the existence of some heavier fundamental particles. Kado and Olsen both reported that no signs of such particles had been observed, nor of heavier versions of the Higgs boson, whose discovery was announced in mid-2012 at the LHC. Thankfully, they also added that the teams weren’t done with their searches and analyses yet.

So, more data FTW – as well as looking forward to the Rencontres de Moriond (conference) in March 2016.

New LHC data has more of the same but could something be in the offing?

Dijet mass (TeV) v. no. of events. Source: ATLAS/CERN

Looks intimidating, doesn’t it? It’s also very interesting because it contains an important result acquired at the Large Hadron Collider (LHC) this year, a result that could disappoint many physicists.

The LHC reopened earlier this year after receiving multiple performance-boosting upgrades over the previous 18 months. In its new avatar, the particle-smasher explores nature’s fundamental constituents at the highest energies yet, almost twice as high as in its first run. By Albert Einstein’s mass-energy equivalence (E = mc²), the proton’s mass corresponds to an energy of almost 1 GeV (giga-electron-volt). The LHC’s beam energy, for comparison, was 3,500 GeV and is now 6,500 GeV.
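As a back-of-the-envelope illustration (my arithmetic, not the article’s): dividing the beam energy by the proton’s rest energy gives the Lorentz factor, which tells you how close to the speed of light the protons travel:

$$\gamma = \frac{E}{m_p c^2} \approx \frac{6500\ \text{GeV}}{0.938\ \text{GeV}} \approx 6930, \qquad v = c\sqrt{1 - 1/\gamma^2} \approx 0.99999999\,c$$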

At the start of December, it concluded data-taking for 2015. That data is being steadily processed, interpreted and published by the multiple topical collaborations working on the LHC. Two collaborations in particular, ATLAS and CMS, were responsible for plots like the one shown above.

This is CMS’s plot showing the same result:

Source: CMS/CERN

When protons are smashed together at the LHC, a host of particles erupt and fly off in different directions, showing up as streaks in the detectors. These streaks are called jets. The plots above look particularly at pairs of jets produced by quarks, anti-quarks or gluons in the proton-proton collisions (these are in fact the smaller particles that make up protons).

The sequence of black dots in the ATLAS plot shows the number of events (i.e. jet pairs) observed at different energies. The red line shows the predicted number of events. They both match, which is good… to some extent.

One of the biggest, and certainly among the most annoying, problems in particle physics right now is that the prevailing theory that explains it all is unsatisfactory – mostly because it has some really clunky explanations for some things. The theory is called the Standard Model and physicists would like to see it disproved, broken in some way.

In fact, those physicists will have gone to work today to be proved wrong – and be sad at the end of the day if they weren’t.

Maintenance work underway at the CMS detector, the largest of the five that straddle the LHC. Credit: CERN

The annoying problem at its heart

The LHC chips in by providing two kinds of opportunities: extremely sensitive particle-detectors that can take precise measurements of fleeting readings, and extremely high collision energies so physicists can explore how some particles behave in thousands of scenarios in search of a surprising result.

So, the plots above show three things. First, the predicted event-count and the observed event-count are a match, which is disappointing. Second, the biggest deviation from the predicted count is highlighted in the ATLAS plot (look at the red columns at the bottom between the two blue lines). It’s small, corresponding to two standard deviations (symbol: σ) from the normal. Physicists need at least three standard deviations (3σ) from the normal for license to be excited.

But this is the most important result (an extension of the first): the predicted event-count and the observed event-count match across the full 6,000 GeV range. In other words: physicists are seeing no cause for joy, and all cause for revalidating a section of the Standard Model, in a wide swath of scenarios.

The section in particular is called quantum chromodynamics (QCD), which deals with how quarks, antiquarks and gluons interact with each other. As theoretical physicist Matt Strassler explains on his blog,

… from the point of view of the highest energies available [at the LHC], all particles in the Standard Model have almost negligible rest masses. QCD itself is associated with the rest mass scale of the proton, with mass-energy of about 1 GeV, again essentially zero from the TeV point of view. And the structure of the proton is simple and smooth. So QCD’s prediction is this: the physics we are currently probing is essentially scale-invariant.

Scale-invariance is the idea that two particles will interact the same way no matter how energetic they are. To be sure, the ATLAS/CMS results suggest QCD is scale-invariant in the 0-6,000 GeV range. There’s a long way to go – in terms of energy levels and future opportunities.

Something in the valley

The folks analysing the data are helped along by previous results at the LHC as well. For example, with the collision energy having been ramped up, one would expect to see particles of higher energies manifesting in the data. However, the heavier the particle, the wider the bump in the plot and the more focusing that’ll be necessary to really tease out the peak. This is one of the plots that led to the discovery of the Higgs boson:

Source: ATLAS/CERN

That bump between 125 and 130 GeV is what was found to be the Higgs, and you can see it’s more of a smear than a spike. For heavier particles, that smear’s going to be wider, with longer tails on the sides. So any particle that weighs a lot – a few thousand GeV – and is expected to be found at the LHC would have a tail showing in the lower-energy LHC data. But no such tails have been found, ruling out heavier stuff.

And because many replacement theories for the Standard Model involve the discovery of new particles, analysts will tend to focus on particles that could weigh less than about 2,000 GeV.

In fact that’s what’s riveted the particle physics community at the moment: rumours of a possible new particle in the range 1,900-2,000 GeV. A paper uploaded to the arXiv preprint server on December 10 shows a combination of ATLAS and CMS data logged in 2012, and highlights a deviation from the normal that physicists haven’t been able to explain using information they already have. This is the relevant plot:

Source: arXiv:1512.03371v1

The ones in the middle and on the right are particularly relevant. They each show the probability of the occurrence of an event (observed as a bump in the data, not shown here) in which some heavier clump of energy decays into two different final states: of a W and a Z boson (WZ), and of two Z bosons (ZZ). Bosons are a type of fundamental particle and carry forces.

The middle chart implies that the mysterious event is at least 1,000-times less likely to occur than normal, and the one on the right implies the event is at least 10,000-times less likely to occur than normal. And both readings are at more than 3σ significance, so people are excited.

The authors of the paper write: “Out of all benchmark models considered, the combination favours the hypothesis of a [particle or its excitations] with mass 1.9-2.0 [thousands of GeV] … as long as the resonance does not decay exclusively to WW final states.”

But as physicist Tommaso Dorigo points out, these blips could also be a fluctuation in the data, which does happen.

Although the fact that the two experiments see the same effect … is suggestive, that’s no cigar yet. For CMS and ATLAS have studied dozens of different mass distributions, and a bump could have appeared in a thousand places. I believe the bump is just a fluctuation – the best fluctuation we have in CERN data so far, but still a fluke.

There’s a seminar due to happen today at the LHC Physics Centre at CERN where data from the upgraded run is due to be presented. If something really did happen in those ‘valleys’, which were filtered out of a collision energy of 8,000 GeV (basically twice the beam energy, where each beam is a train of protons), then those events would’ve happened in larger quantities during the upgraded run and so been more visible. The results will be presented at 1930 IST. Watch this space.

Featured image: Inside one of the control centres of the collaborations working on the LHC at CERN. Each collaboration handles an experiment, or detector, stationed around the LHC tunnel. Credit: CERN.

A new particle to break the Standard Model?

The Wire
July 2, 2015

Scientists at the Large Hadron Collider particle-smasher have unearthed data from an experiment conducted in 2012 that shows signs of a new particle. If confirmed, its discovery could herald a new period of particle physics research.

On June 2, members of the ATLAS detector collaboration uploaded a paper to the arXiv pre-print server discussing the possible sighting of a new particle, which hasn’t been named yet. If the data is to be believed, it weighs as much as about 2,000 protons, making it 12-times heavier than the heaviest known fundamental particle, the top quark. It was spotted in the first place when scientists found an anomalous number of ‘events’ recorded by ATLAS at a particular energy scale, more than predicted by the Standard Model set of theories.

Actually, the Standard Model is more like a collection of principles and rules that dictate the behaviour of fundamental particles. Since the 1960s, it has dominated particle physics research but of late has revealed some weaknesses by not being able to explain the causes behind some of its own predictions. For example, two physicists – Peter Higgs and Francois Englert – used the Standard Model to predict the existence of a Higgs boson in 1964. The particle was found at the LHC in 2012. However, the model has no explanation for why the particle is much lighter than it was thought to be.

If its existence is confirmed, the new probable-particle sighted by ATLAS could force the Standard Model to pave the way for a more advanced, and comprehensive, theory of physics and ultimately of nature. However, proving that it exists could take at least a year.

The scientists found the probable-particle in data that was recorded by a detector trained to look for the decays of W and Z bosons. These are two fundamental particles that mediate the weak nuclear force, which is responsible for radioactivity. A particle’s mass is equivalent to its energy, which every particle wants to shed if it has too much of it. So heavier particles often break down into smaller clumps of energy, which manifest as lighter particles. And so, at the 2 TeV energy scale, scientists spotted a more-than-predicted clumping of energy in the W/Z channel – often the sign of a new particle.

The chance of the telltale spike in the data being due to a fluke or impostor event, on the other hand, was 0.00135 (with 0 being ‘no chance’ and 1, certainty) – enough to claim evidence but insufficient to claim a discovery. For the latter, the chance will have to be reduced to at least 0.000000287. In the future, this is what scientists intent on zeroing in on the particle will be gunning for.
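Those two cut-offs are simply the one-sided tail probabilities of a normal distribution at 3σ and 5σ – quick to check, assuming scipy is available:

```python
from scipy.stats import norm

print(norm.sf(3))  # ~0.00135      -> 3 sigma, 'evidence'
print(norm.sf(5))  # ~0.000000287  -> 5 sigma, 'discovery'
```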

The LHC shut in early 2013 for upgrades, waking up in May 2015 to smash protons together at almost twice the energy and detect them with twice the sensitivity as before. The ATLAS data about the new particle was gathered in 2012, when the LHC was still smashing protons at a collision energy of 8 TeV (more than 8,000 proton-masses). In its new avatar, it will be smashing them at 13 TeV and with increased intensity as well. As a result, rarer events like this probable-particle’s formation could happen more often, making it easier for scientists to spot and validate them.

If unfortunately the probable-particle is found to have been something else, particle physicists will be disappointed. Since the LHC kicked off in 2009, physicists have been eager to find some data that will “break” the Standard Model and expose cracks in its foundations – cracks that could be taken advantage of to build a theory that can explain the Higgs boson’s mass or why gravity is so much weaker than the other three fundamental forces.

The ATLAS team acknowledges a paper from members of the CMS collaboration, also at the LHC, from last year that found similar but weaker signs of the same particle.

All goes well on LHC 2.0’s first day back in action

It finally happened! The particle-smasher known as the Large Hadron Collider is back online after more than two years, during which its various components were upgraded to make it even meaner. A team of scientists and engineers gathered at the collider’s control room at CERN over the weekend – giving up Easter celebrations at home – to revive the giant machine so it could resume feeding its four detectors with high-energy collisions of protons.

Before the particles enter the LHC itself, they are pre-accelerated to 450 GeV by the Super Proton Synchrotron. At 11.53 am (CET), the first beam of pre-accelerated protons was injected into the LHC at Point 2 (see image), starting a clockwise journey. By 11.59 am, it’d been reported crossing Point 3, and at 12.01 pm, it was past Point 5. The anxiety in the control room was palpable when an update was posted in the live-blog: “The LHC operators watching the screen now in anticipation for Beam 1 through sector 5-6”.

Beam 1 going from Point 2 to Point 3 during the second run of the Large Hadron Collider’s first day in action. Credit: CERN

Finally, at 12.12 pm, the beam had crossed Point 6. By 12.27 pm, it had gone full circle around the LHC’s particle pipeline, signalling that the pathways were defect-free and ready for use. Along the way, whenever the beam snaked through a detector without glitches, some protons were smashed into static targets, producing a so-called splash of particles like sparks, and groups of scientists erupted in cheers.

Both Rolf-Dieter Heuer, the CERN Director-General, and Frederick Bordry, Director for Accelerators and Technology, were present in the control room. Earlier in the day, Heuer had announced that another beam of protons – going anti-clockwise – had passed through the LHC pipe without any problems, providing the preliminary announcement that all was well with the experiment. In fact, CERN’s scientists were originally supposed to have run these beam-checks a week ago, when an electrical glitch spotted at the last minute thwarted them.

In its new avatar, the LHC sports almost double the energy it ran at, before it shut down for upgrades in early-2013, as well as more sensitive collision detectors and fresh safety systems. For the details of the upgrades, read this. For an ‘abridged’ version of the upgrades together with what new physics experiments the new LHC will focus on, read this. Finally, here’s to another great year for high-energy physics!

The Large Hadron Collider is back online, ready to shift from the “what” of reality to “why”

The world’s single largest science experiment will restart on March 23 after a two-year break. Scientists and administrators at the European Organization for Nuclear Research – known by its French acronym CERN – have announced the status of the agency’s upgrades on its Large Hadron Collider (LHC) and its readiness for a new phase of experiments running from now until 2018.

Before the experiment was shut down in early 2013, the LHC became famous for helping discover the elusive Higgs boson, a fundamental (that is, indivisible) particle that gives other fundamental particles their mass through a complicated mechanism. The find earned two of the physicists who thought up the mechanism in 1964, Peter Higgs and Francois Englert, a Nobel Prize in 2013.

Though the LHC had fulfilled one of its more significant goals by finding the Higgs boson, its purpose is far from complete. In its new avatar, the machine boasts of the energy and technical agility necessary to answer questions that current theories of physics are struggling to make sense of.

As Alice Bean, a particle physicist who has worked with the LHC, said, “A whole new energy region will be waiting for us to discover something.”

The finding of the Higgs boson laid to rest speculations of whether such a particle existed and what its properties could be, and validated the currently reigning set of theories that describe how various fundamental particles interact. This is called the Standard Model, and it has been successful in predicting the dynamics of those interactions.

From the what to the why

But having assimilated all this knowledge, what physicists don’t know, but desperately want to, is why those particles’ properties have the values they do. They have realized the implications are numerous and profound: ranging from the possible existence of more fundamental particles we are yet to encounter to the nature of the substance known as dark matter, which makes up a great proportion of matter in the universe while we know next to nothing about it. These mysteries were first conceived to plug gaps in the Standard Model but they have only been widening since.

With an experiment now able to better test theories, physicists have started investigating these gaps. For the LHC, the implication is that in its second edition it will not be looking for something as much as helping scientists decide where to look to start with.

As Tara Shears, a particle physicist at the University of Liverpool, told Nature, “In the first run we had a very strong theoretical steer to look for the Higgs boson. This time we don’t have any signposts that are quite so clear.”

Higher energy, luminosity

The upgrades to the LHC that would unlock new experimental possibilities were evident in early 2012.

The machine works by using powerful electric currents and magnetic fields to accelerate two trains, or beams, of protons in opposite directions, within a ring 27 km long, to almost the speed of light and then colliding them head-on. The result is a particulate fireworks of such high energy that the most rare, short-lived particles are brought into existence before they promptly devolve into lighter, more common particles. Particle detectors straddling the LHC at four points on the ring record these collisions and their effects for study.

So, to boost its performance, upgrades to the LHC were of two kinds: increasing the collision energy inside the ring and increasing the detectors’ abilities to track more numerous and more powerful collisions.

The collision energy has been nearly doubled in its second life, from 7-8 TeV to 13-14 TeV. The frequency of collisions has also been doubled from one set every 50 nanoseconds (billionth of a second) to one every 25 nanoseconds. Steve Myers, CERN’s director for accelerators and technology, had said in December 2012, “More intense beams mean more collisions and a better chance of observing rare phenomena.”

The detectors have received new sensors, neutron shields to protect from radiation damage, cooling systems and superconducting cables. An improved fail-safe system has also been installed to forestall accidents like the one in 2008, when failing to cool a magnet led to a shut-down for eight months.

In all, the upgrades cost approximately $149 million, and will increase CERN’s electricity bill by 20% to $65 million. A “massive debugging exercise” was conducted last week to ensure all of it clicked together.

Going ahead, these new specifications will be leveraged to tackle some of the more outstanding issues in fundamental physics.

CERN listed a few – presumably primary – focus areas. They include investigating if the Higgs boson could betray the existence of undiscovered particles, what particles dark matter could be made of, why the universe today has much more matter than antimatter, and if gravity is so much weaker than other forces because it is leaking into other dimensions.

Stride forward in three frontiers

Physicists are also hopeful for the prospects of discovering a class of particles called supersymmetric partners. The theory that predicts their existence is called supersymmetry. It builds on some of the conclusions of the Standard Model, and offers predictions that plug its holes as well with such mathematical elegance that it has many of the world’s leading physicists enamored. These predictions involve the existence of new particles called partners.

In a neat infographic in Nature, Elizabeth Gibney explains that the partner that will be easiest to detect is the ‘stop squark’, as it is the lightest and can show itself in lower-energy collisions.

In all, the LHC’s new avatar marks a big stride forward not just in the energy frontier but also in the intensity and cosmic frontiers. With its ability to produce and track more collisions per second as well as chart the least explored territories of the ancient cosmos, it’d be foolish to think this gigantic machine’s domain is confined to particle physics and couldn’t extend to fuel cells, medical diagnostics or achieving systems-reliability in IT.

Here’s a fitting video released by CERN to mark this momentous occasion in the history of high-energy physics.

Featured image: A view of the LHC. Credit: CERN

Update: After engineers spotted a short-circuit glitch in a cooled part of the LHC on March 21, its restart was postponed from March 23 by a few weeks. However, CERN has assured that it’s a fully understood problem and that it won’t detract from the experiment’s goals for the year.

Why you should care about the mass of the top quark

In a paper published in Physical Review Letters on July 17, 2014, a team of American researchers reported the most precisely measured value yet of the mass of the top quark, the heaviest fundamental particle. Its mass is so high that it can exist only in very high energy environments – such as inside powerful particle colliders or in the very-early universe – and not anywhere else.

Given that, the American team’s efforts to measure its mass might come across as needlessly painstaking. However, there’s an important reason to get as close to the exact value as possible.

That reason is 2012’s possibly most famous discovery. It was drinks-all-round for the particle physics community when the Higgs boson was discovered by the ATLAS and CMS experiments on the Large Hadron Collider (LHC). While the elation lasted awhile, there were already serious questions being asked about some of the boson’s properties. For one, it was much lighter than is anticipated by some promising areas of theoretical particle physics. Proponents of an idea called naturalness pegged it to be some 17 orders of magnitude higher!

Because the Higgs boson is the particulate residue of an omnipresent energy field called the Higgs field, the boson’s mass has implications for how the universe should be. The boson being so light, physicists couldn’t explain why it didn’t predicate a universe the size of a football – even though their calculations said it should.

In the second week of September 2014, Stephen Hawking said the Higgs boson will cause the end of the universe as we know it. Because it was Hawking who said it, and because his statement contained the clause “end of the universe”, the media hype was ridiculous yet to be expected. What he actually meant was that the ‘unnatural’ Higgs mass had placed the universe in a difficult position.

The universe would ideally love to be in its lowest energy state, like you do when you’ve just collapsed into a beanbag with beer, popcorn and Netflix. However, the mass of the Higgs has trapped it on a chair instead. While the universe would still like to be in the lower-energy beanbag, it’s reluctant to get up from the higher-energy yet still comfortable chair.

Someday, according to Hawking, the universe might increase in energy (get out of the chair) and then collapse into its lowest energy state (the beanbag). And that day is trillions of years away.

What does the mass of the top quark have to do with all this? Quite a bit, it turns out. Fundamental particles like the top quark possess their mass in the form of potential energy. They acquire this energy when they move through the Higgs field, which is spread throughout the universe. Some particles acquire more energy than others. How much energy is acquired depends on two parameters: the strength of the Higgs field (which is constant), and the particle’s Higgs charge.

The Higgs charge determines how strongly a particle engages with the Higgs field. It’s the highest for the top quark, which is why it’s also the heaviest fundamental particle. More relevant for our discussion, this unique connection between the top quark and the Higgs boson is also what makes the top quark an important focus of studies.
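In the Standard Model’s bookkeeping, this ‘Higgs charge’ is a number called the Yukawa coupling. In rough terms – a textbook relation, not something specific to the measurement discussed here – a fermion’s mass is

$$m_f = \frac{y_f\,v}{\sqrt{2}}$$

where v ≈ 246 GeV is the constant strength of the Higgs field and y_f is the particle’s coupling to it. Plugging in the top quark’s mass of about 175 GeV gives y_t ≈ 1, by far the largest coupling of any known fermion.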

Getting the mass of the top quark just right is important to better determine its Higgs charge, ergo the extent of its coupling with the Higgs boson, ergo the properties of the Higgs boson itself. Small deviations in the value of the top quark’s mass could spell drastic changes in when or how our universe will switch from the chair to the beanbag.

If it does, all our natural laws would change. Life would become impossible.

The American team that made the measurements of the top quark used values obtained from the D0 experiment on the Tevatron particle collider, at the Fermi National Accelerator Laboratory. The Tevatron was shut in 2011, so their measurements are the collider’s last words on the top quark’s mass: 174.98 ± 0.76 GeV/c² (the Higgs boson weighs around 126 GeV/c²; a gold atom, considered pretty heavy, weighs around 210 GeV/c²). This is a precision of better than 0.5%, the finest yet. This value is likely to be updated once the LHC restarts early next year.

Featured image: Screenshot from Inception

The hunt for supersymmetry: Is a choke on the cards?

The Copernican
April 28, 2014

“So irrelevant is the philosophy of quantum mechanics to its use that one begins to suspect that all the deep questions are really empty…”

— Steven Weinberg, Dreams of a Final Theory: The Search for the Fundamental Laws of Nature (1992)

On a slightly humid yet clement January evening in 2013, a theoretical physicist named George Sterman was in Chennai to attend a conference at the Institute of Mathematical Sciences. After the last talk of the day, he had strolled out of the auditorium and was mingling with students when I managed to get a few minutes with him. I asked for an interview and he agreed.

After some coffee, we seated ourselves at a kiosk in the middle of the lawn; the sun was setting and mosquitoes abounded. Sterman was a particle physicist, so I opened with the customary question about the Higgs boson and expected him to swat it away with snowclones of the time like “fantastic”, “tribute to 50 years of mathematics” and “long-awaited”. He did say those things, but then he also expressed some disappointment.

George Sterman is distinguished for his work in quantum chromodynamics (QCD), for which he won the prestigious J.J. Sakurai Prize in 2003. QCD is a branch of physics that deals with particles that have a property called colour charge. Quarks and gluons are examples of such particles; these two together with electrons are the proverbial building blocks of matter. Sterman has been a physicist since the 1970s, the early years as far as experimental particle physics research is concerned.

The Standard Model disappoints

Over the last four or so decades, remarkable people like him have helped construct the model of laws, principles and theories that sustains the rigours of this field: the Standard Model of particle physics. And it was the reason Sterman was disappointed.

According to the Standard Model, Sterman explained, “if we gave any reasonable estimate of what the mass of the Higgs particle should be, it should by all rights be huge! It should be as heavy as what we call the Planck mass.”

But it isn’t. The Higgs mass is around 125 GeV (GeV being a unit of energy that corresponds to certain values of a particle’s mass) – compare that with the proton, which weighs 0.938 GeV. The Planck mass, on the other hand, is 10^19 GeV. Seventeen orders of magnitude lie in between. According to Sterman, this isn’t natural. The question is why there has to be such a big difference between what we can say the mass should be and what we find it to be.

Martinus Veltman, a Dutch theoretical physicist who won the Nobel Prize for physics in 1999 for his work in particle physics, painted a starker picture in an interview to Nature in 2013: “Since the energy of the Higgs [field] is distributed all over the universe, it should contribute to the curvature of space; if you do the calculation, the universe would have to curve to the size of a football.”

Evidently, the Standard Model has many loose ends, and explaining the mass of the Higgs boson is only one of them. Another is that it has no answer for what dark matter is and why it behaves the way it does. Yet another is why the four fundamental forces of nature are not of the same order of magnitude.

An alternative

Thanks to the Standard Model, some mysteries have been solved, but other mysteries have come and are coming to light – in much the same way Isaac Newton’s ideas struggled to remain applicable in the troubled world of physics in the early 20th century. It seems history repeats itself through crises.

Fortunately, physicists in 1971-1972 had begun to piece together an alternative theory called supersymmetry, Susy for short. At the time, it was an alternative way of interpreting how emerging facts could be related to each other. Today, however, Susy is a more encompassing successor to the throne that the Standard Model occupies, a sort of mathematical framework in which the predictions of the Model still hold but no longer have those loose ends. And Susy’s USP is… well, that it doesn’t disappoint Sterman.

“There’s a reason why so many people felt so confident about supersymmetry,” he said. “It wasn’t just that it’s a beautiful theory – which it is – or that it engages and challenges the most mathematically oriented among physicists, but in another sense in which it appeared to be necessary. There’s this subtle concept that goes by the name of naturalness…”

And don’t yet look up ‘naturalness’ on Wikipedia because, for once, here is something so simple, so elegant, that it is precisely what its name implies. Naturalness is the idea that, for example, the Higgs boson is so lightweight because something out there is keeping it from being heavy. Naturalness is the idea that, in a given setting, the forces of nature all act in equal measure. Naturalness is the idea that causes seem natural, and logically plausible, without having to be fine-tuned in order to explain their effects. In other words, Susy, through its naturalness, makes possible a domesticated world, one without sudden, unexpected deviations from what common sense (a sophisticated one, anyway) would dictate.

To understand how it works, let us revisit the basics. Our observable universe plays host to two kinds of fundamental particles, which are packets of some well-defined amount of energy. The fermions, named for Enrico Fermi, are the matter particles. Things are made of them. The bosons, named for Satyendra Bose, are the force particles. Things interact with each other by using them as messengers. The Standard Model tells us how bosons and fermions will behave in a variety of situations.

However, the Model has no answers for why bosons and fermions weigh as much as they do, or come in as many varieties as they do. These are deeper questions that go beyond simply what we can observe. These are questions whose answers demand that we interpret what we know, that we explore the wisdom of nature that underlies our knowledge of it. To get at these whys, physicists have investigated phenomena that lie beyond the Standard Model’s jurisdiction.

The search

One such place is actually nothingness, i.e. the quantum vacuum of deep space, where particles called virtual particles continuously wink in and out of existence. But even with their brief life-spans, they play a significant role in mediating the interactions between different particles. You will remember having studied in class IX that like charges repel each other. What you probably weren’t told is that the repulsive force between them is mediated by the exchange of virtual photons.

Curiously, these “virtual interactions” don’t proliferate haphazardly. Virtual particles don’t continuously “talk” to the electron or clump around the Higgs boson. If this happened, mass would accrue at a point out of thin air, and black holes would be popping up all around us. Why this doesn’t happen, physicists think, is because of Susy, whose invisible hand could be staying chaos from dominating our universe.

The way it does this is by invoking quantum mechanics, and conceiving that there is another dimension called superspace. In superspace, the bosons and fermions in the dimensions familiar to us behave differently, the laws conceived such that they restrict the random formation of black holes, for starters. In the May 2014 issue of Scientific American, Joseph Lykken and Maria Spiropulu describe how things work in superspace:

“If you are a boson, taking one step in [superspace] turns you into a fermion; if you are a fermion, one step in [superspace] turns you into a boson. Furthermore, if you take one step in [superspace] and then step back again, you will find that you have also moved in ordinary space or time by some minimum amount. Thus, motion in [superspace] is tied up, in a complicated way, with ordinary motion.”

The presence of this dimension implies that all bosons and fermions have a corresponding particle called a superpartner particle. For each boson, there is a superpartner fermion called a bosino; for each fermion, there is a superpartner boson called a sfermion (why the confusing titles, though?).

Physicists are hoping this supersymmetric world exists. If it does, they will have found tools to explain the Higgs boson’s mass, the difference in strengths of the four fundamental forces, what dark matter could be, and a swarm of other nagging issues the Standard Model fails to resolve. Unfortunately, this is where Susy’s credit-worthiness runs into trouble.

No signs

“Experiment will always be the ultimate arbiter, so long as it’s science we’re doing.”

— Leon Lederman & Christopher Hill, Beyond the Higgs Boson (2013)

Since the first pieces of the Standard Model were brought together in the 1960s, researchers have run repeated tests to check if what it predicts were true. Each time, the Model has stood up to its promise and yielded accurate results. It withstood the test of time – a criterion it shares with the Nobel Prize for physics, which physicists working with the Model have won at least 15 times since 1957.

Susy, on the other hand, is still waiting for confirmation. The Large Hadron Collider (LHC), the world’s most powerful particle physics experiment, ran its first round of experiments from 2009 to 2012, and found no signs of sfermions or bosinos. In fact, it has instead succeeded in narrowing the gaps in the Standard Model where Susy could be hiding. While the non-empty emptiness of the quantum vacuum opened a small window into the world of Susy – a window through which we could stick a mathematical arm out and say “This is why black holes don’t just pop up” – the Model has persistently puttied every other crack we hound after.

An interesting quote comes to mind about Susy’s health. In November 2012, at the Hadron Collider Physics Symposium in Kyoto, Japan, physicists presented evidence of a particle decay that happens so rarely that only the LHC could have spotted it. The Standard Model predicts that every time the B_s (pronounced “Bee-sub-ess”) meson decays into a set of lighter particles, there is a small chance that it decays into two muons. The steps in which this happens are intricate, involving a process called a quantum loop.

What next?

“SUSY has been expected for a long time, but no trace has been found so far… Like the plot of the excellent movie ‘The Lady Vanishes’ (Alfred Hitchcock, 1938)”

— Andy Parker, Cambridge University

Susy predicts that some supersymmetric particles should show themselves during the quantum loop, but no signs of them were found. On the other hand, the rate of B_s decays into two muons was consistent with the Model’s predictions. Prof. Chris Parkes, a British physicist, had then told BBC News: “Supersymmetry may not be dead but these latest results have certainly put it into hospital.” And why not? A peek into the supersymmetric universe continues to elude us, and if the LHC can’t find it, what will?

Then again, it took us many centuries to find the electron, and then many decades to find anti-particles. Why should we hurry now? After all, as Dr. Rahul Sinha from the Institute of Mathematical Sciences told me after the Symposium had concluded, “a conclusive statement cannot be made as yet”. At this stage, even waiting for many years might not be necessary. The LHC is set to reawaken around January 2015 after a series of upgrades that will let the machine deliver 10 times more particle collisions per second per unit area. Mayhap a superpartner particle can be found lurking in this profusion by, say, 2017.

There are also plans for other, more specialised colliders, such as Project X in the USA, which India has expressed interest in formally cooperating with. Project X, proposed to be built at the Fermi National Accelerator Laboratory, Illinois, will produce high-intensity proton beams to investigate a variety of hitherto unexplored realms. One of them is to produce heavy, short-lived isotopes of elements like radium or francium, and use them to study if the electron has a dipole moment, or a pronounced negative charge along one direction, which Susy allows for.

(Moreover, if Project X is realised it could prove extra-useful for India because it makes possible a new kind of nuclear reactor design, called the accelerator-driven sub-critical reactor, which operates without a core of critical-mass radioactive fuel, rendering impossible accidents like Chernobyl and Fukushima, while also being capable of inducing fission reactions using lighter fuel like thorium.)

Yet another avenue to explore Susy would be looking for dark matter particles using highly sensitive particle detectors such as LUX, XENON1T and CDMS. According to some supersymmetric models, the lightest Susy particles could actually be dark matter particles, so if a few are spotted and studied, they could bolster this theory’s sagging credence.

… which serves to remind us that this excitement could cut the other way, too. What if the LHC in its advanced avatar is still unable to find evidence of Susy? In fact, the Advanced Cold Molecule Electron group at Harvard University announced in December 2013 that it had been able to experimentally rule out a dipole moment for the electron to the highest precision attained to date. After such results, physicists will have to try and rework the theory, or perhaps zero in on other aspects of it that can be investigated by the LHC or Project X or other colliders.

But at the end of the day, there is also the romance of it all. It took George Sterman many years to find a theory as elegant and straightforward as Susy – an island of orderliness in the insane sea of quantum mechanics. How quickly would he give it up?

O Hunter, snare me his shadow!
O Nightingale, catch me his strain!
Else moonstruck with music and madness
I track him in vain!

— Oscar Wilde, In The Forest

An elusive detector for an elusive particle

(This article originally appeared in The Hindu on March 31, 2014.)

In the late 1990s, a group of Indian physicists pitched the idea of building a neutrino observatory in the country. The product of that vision is the India-based Neutrino Observatory (INO) slated to come up near Theni district in Tamil Nadu, by 2020. According to the 12th Five Year Plan report released in October 2011, it will be built at a cost of Rs.1,323.77 crore, borne by the Departments of Atomic Energy (DAE) and Science & Technology (DST).

By 2012, these government agencies, with the help of 26 participating institutions, were able to obtain environmental clearance, and approvals from the Planning Commission and the Atomic Energy Commission. Any substantial flow of capital will happen only with Cabinet approval, which has still not been given after more than a year.

If this delay persists, the Indian scientific community will face greater difficulty in securing future projects involving foreign collaborators because we can’t deliver on time. Worse still, bright Indian minds that have ideas to test will prioritise foreign research labs over local facilities.

‘Big science’ is international

This month, the delay acquired greater urgency. On March 24, the Institute of High Energy Physics, Beijing, announced that it was starting construction on China’s second major neutrino research laboratory — the Jiangmen Underground Neutrino Observatory (JUNO), to be completed at a cost of $350 million (Rs. 2,100 crore) by 2020.

Apart from the dates of completion, what Indian physicists find more troubling is that, once ready, both INO and JUNO will pursue a common goal in fundamental physics. Should China face fewer roadblocks than India does, our neighbour could even beat us to some seminal discovery. This is not a jingoistic concern for a number of reasons.

All “big science” conducted today is international in nature. The world’s largest scientific experiments involve participants from scores of institutions around the world and hundreds of scientists and engineers. In this paradigm, it is important for countries to demonstrate to potential investors that they’re capable of delivering good results on time and sustainably. The same paradigm also allows investing institutions to choose whom to support.

India is a country with prior experience in experimental neutrino physics. Neutrinos are extremely elusive fundamental particles whose many unmeasured properties hold clues about why the universe is the way it is.

In the 1960s, a neutrino observatory located at the Kolar Gold Fields in Karnataka became one of the world’s first experiments to observe neutrinos in the Earth’s atmosphere, produced as a by-product of cosmic rays colliding with its upper strata. However, the laboratory was shut in the 1990s because the mines were being closed.

However, Japanese physicist Masatoshi Koshiba and collaborators built on this observation with a larger neutrino detector in Japan, and went on to make a discovery that (jointly) won him the Nobel Prize for Physics in 2002. If Indian physicists had been able to keep the Kolar mines open, by now we could have been on par with Japan, which hosts the world-renowned Super-Kamiokande neutrino observatory involving more than 900 engineers.

Importance of time, credibility

In 1998, physicists from the Institute of Mathematical Sciences (IMSc), Chennai, were examining a mathematical parameter of neutrinos called theta-13. As far as we know, neutrinos come in three types, and spontaneously switch from one type to another (Koshiba’s discovery).

The frequency with which they engage in this process is influenced by their masses and sources, and theta-13 is an angle that determines the nature of this connection. The IMSc team calculated that it could measure at most 12°. In 2012, the Daya Bay neutrino experiment in China found that it was 8-9°, reaffirming the IMSc results and drawing attention from physicists because the value is particularly high. In fact, INO will leverage this “largeness” to investigate the masses of the three types of neutrinos relative to each other.
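For the mathematically curious: in the simplified two-flavour picture that reactor experiments like Daya Bay fit their data against – a standard textbook formula, not one specific to the IMSc calculation – the probability that an electron-antineutrino of energy E survives unchanged over a distance L is

$$P \approx 1 - \sin^2(2\theta_{13})\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)$$

so a larger theta-13 means a deeper, more easily measured dip in the neutrino count.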

So, while the Indian scientific community is ready to work with an indigenously designed detector, the delay of a go-ahead from the Cabinet becomes demoralising because we automatically lose time and access to resources from potential investors.

“This is why we’re calling it an India-based observatory, not an Indian observatory, because we seek foreign collaborators in terms of investment and expertise,” says G. Rajasekaran, former joint director of IMSc, who is involved in the INO project.

On the other hand, China appears to have been both prescient and focussed on its goals. It purchased companies manufacturing the necessary components in the last five years, developed the detector technology in the last 24 months, and was confident enough to announce completion in barely six years. Thanks to its Daya Bay experiment holding it in good stead, JUNO is poised to be an international collaboration, too. Institutions from France, Germany, Italy, the U.S. and Russia have evinced interest in it.

Beyond money, there is also a question of credibility. Once Cabinet approval for INO comes through, it is estimated that digging the vast underground cavern to contain the principal neutrino detector will take five years, and the assembly of components, another year more. We ought to start now to be ready in 2020.

Because neutrinos are such elusive particles, any experiments on them will yield correspondingly “unsure” results that will necessitate corroboration by other experiments. In this context, JUNO and INO could complement each other. Similarly, if INO is delayed, JUNO is going to look for confirmation from experiments in Japan, South Korea and the U.S.

It is notable that the INO laboratory’s design permits it to also host a dark-matter decay experiment, in essence accommodating areas of research that are demanding great attention today. But if what can only be called an undue delay on the government’s part continues, we will again miss the bus.