Risky transfers

This update is 6 days old, but it hasn’t made any more sense with time. Perhaps it’s the way it was written, but in my opinion the stress on the financial benefits of offsetting local plutonium storage with monetary compensation is alarming. That Germany will pay the UK to store this ridiculously dangerous material, that the UK will risk political backlash because the “financial benefits from the title transfer will exceed the long-term costs of the material’s safe storage and management”, that France will then supply processed MOX fuel for use in German reactors, and that the UK will then argue that it is glad to have been spared the trouble of shipping plutonium while implying that it is comfortable being the site of nuclear waste storage: these are all alarming developments.

Why? Because, even though I’m pro-nuclear, the backlash that could arise from this deal could negate years of progress in developing MOX-processing technologies and embedding them in the energy policies of three countries. One problem is already foreseeable: Germany’s reluctance to continue its reliance on nuclear power is simply short-sighted. If it requires more power in the future, it will have to purchase it from France, which, despite the knee-jerk shutdown of NPPs worldwide after the Fukushima incident, is, surprisingly, displaying enough sense to keep relying on NPPs. By then, I hope, monetary advantages will not suffice to mask the reality that Germany would be paying to have France purchase its troubles. Unless, of course, there is some other agreeable form of risk-transfer.

Just ugly.

Signs of a slowdown

The way ahead for particle physics seems dully lit after CERN’s fourth-of-July firecracker. The Higgs announcement got everyone in the physics community excited – and spurred a frenzied submission of pre-prints all rushing to explain the particle’s properties. However, that excitement quickly died out after ICHEP ’12 was presented with nothing significant – nothing even a fraction as significant as the ATLAS/CMS results.

(L-R) Gianotti, Heuer & Incandela

Even so, I suppose we must wait at least another three months before a conclusive Higgs-centric theory emerges that completely integrates the Higgs mechanism with the extant Standard Model.

The spotting of the elusive boson – or an impostor – closes a decades-old chapter in particle physics, but it does almost nothing to point the way ahead apart from verifying the process of mass-formation. Even theoretically, the quadratic divergences the Standard Model introduces into the mass of the Higgs boson prove a resilient barrier to correct. How the Higgs field will be used as a tool in detecting other particles and the properties of other entities is altogether unclear.

The tricky part lies in working out the intricacies of the hypotheses that promise to point the way ahead. The most dominant amongst them is supersymmetry (SUSY). In fact, hints of the existence of supersymmetric partners were recorded when the LHCb detector at the LHC spotted evidence of CP-violation in muon-decay events (at 3.9σ). At the same time, the physicists I’m in touch with at IMS point out that stringent restrictions have been placed on the discovery of sfermions and bosinos.

The energies at which these partners could be found lie beyond the LHC’s reach, to say nothing of the luminosity required. Moreover, any favourable-looking ATLAS/CMS SUSY results – which are simply interpretations of strange events – are applicable only in narrow and very special scenarios. Such a condition is inadmissible when we’re actually hunting for frameworks that could explain grander phenomena. Like the link itself says,

“The searches leave little room for SUSY inside the reach of the existing data.”

Despite this bleak outlook, there is still a possibility that SUSY may stand verified in the future. Right now, the question is whether SUSY could be masked behind general gauge mediation, R-parity violation or gauge-mediated SUSY-breaking (GMSB) – the last being the scenario in which some hidden sector breaks SUSY and communicates the breaking to the SM via messenger fields. Also, ZEUS/DESY results (generated by e-p DIS studies) are currently being interpreted.

However, everyone knows that between now and a future in which SUSY stands verified, hundreds of financial appeals stand in the way. 😀 This is a typical time of slowdown – a time we must use for open-minded hypothesizing, discussion, careful verification, and, importantly, honest correction.

A dilemma of the auto-didact

If publishers never imagined that there are people who could teach themselves particle physics, why produce cheap preliminary textbooks and ridiculously expensive advanced ones? Learning vector physics for classical mechanics costs Rs. 245, while progressing from there to analytical mechanics costs Rs. 4,520. Does the cost barrier exist because the knowledge is more specialized? If so, such books should have become cheaper over time. They have not: Analytical Mechanics, which a good friend recommended, has stayed in the vicinity of $75 for the last three years (right now, it’s $78.67 for the original paperback and $43 for a used copy). This is just a handy example. There are a host of textbooks that detail concepts in advanced physics and cost a fortune: all you have to do is look for those that contain “hadron”, “accelerator”, “QCD”, etc., in their titles.

Getting to the point where a student is capable of understanding these subjects is cheap. In other words, the cost of aspiration is low while the price of execution is prohibitive.

Sure, alternatives exist, such as libraries and university archives. However, that misses the point: it seems the books are priced high precisely to prevent their ubiquitous consumption. No other reason seems evident, although I am loath to reach this conclusion. If you, the publisher, want me to read such books only in universities, then you are effectively requiring me either to abstain from reading them, irrespective of my interests, if my professional life lies elsewhere, or to depend on universities and university-publisher relationships, rather than on myself, for my progress in advanced physics. The resulting gap between the layman and the specialist eventually evades spanning, leading to ridiculous results ranging from not understanding the “God” in “God particle” to questioning the necessity of the LHC without quite understanding what it does and how that helps mankind.

The Indian Bose in the universal boson

Read this article.

Do you think Indians are harping too much on the lack of mention of Satyendra Nath Bose’s name in the media coverage of the CERN announcement last week? The articles in Hindustan Times and Economic Times seemed to take things too far with anthropological analyses that have nothing to do with Bose’s work. The boson was so named around 1945 by the great Paul Dirac in commemoration of Bose’s work with Einstein. Much has happened since; why would we want to celebrate the Bose in the boson again and again?

Dr. Satyendra Nath Bose

The stage now belongs to the ATLAS and CMS collaborations, and to Higgs, Kibble, Englert, Brout, Guralnik, and Hagen, and to physics itself as a triumph of worldwide cooperation in the face of many problems. Smarting because an Indian’s mention was forgotten is jejune. Then again, this is mostly the layman and the media: the physicists I met last week seemed to fully understand Bose’s contribution to the field itself instead of counting the frequency of his name’s mention.

Priyamvada Natarajan, as she writes in the Hindustan Times, is wrong (and the Economic Times article’s heading is just irritating). That Bose is not a household name the way Einstein is has little to do with post-colonialism – the exceptions are abundant enough to show as much – and much to do with the fact that we place too much faith in a name instead of remembering what the man behind the name did for physics.

Ramblings on partons

When matter and anti-matter meet, they annihilate each other in a “flash” of energy. Usually, this release of energy is in the form of high-energy photons, or gamma rays, which are then detected, analysed, and interpreted to understand more of the collision’s other properties. In nature, however, matter/anti-matter collisions are ultra-rare if not altogether non-existent because of the unavailability of anti-matter.

Such annihilation processes are important not just to supplement our understanding of particle physics but also because they play a central role in the design of hadron colliders. Such colliders use strongly interacting particles (the superficial definition of hadrons), such as protons and neutrons, and smash them into each other. The target particles, depending on experimental necessities, may be stationary – in which case the collider is said to employ a fixed target – or moving. The Large Hadron Collider (LHC) is the world’s largest and most powerful hadron collider, and it uses moving targets, i.e., both the incident and target hadrons are moving toward each other.

Currently, it is known that a hadronic collision is explicable in terms of the hadrons’ constituent particles, quarks and gluons. Quarks are the snowcloned fundamental building blocks of all matter, and gluons are particles that allow two quarks to “stick” together, behaving like glue. More specifically, gluons mediate the strong force (one of the four fundamental forces of nature): in other words, quarks interact by exchanging gluons.

Parton distribution functions

Earlier, before the quark-gluon model was known, a hadronic collision was broken down in terms of hypothetical particles called partons, an idea suggested by Richard Feynman in 1969. At very high energies – such as the ones at which collisions occur at the LHC – the parton model, which approximates the hadrons as collections of point-like targets, is described by parton-distribution functions (PDFs). PDFs, in turn, allow for the prediction of the composition of the debris resulting from the collisions. Theoretical calculations pertaining to different collision environments and outcomes are used to derive different PDFs for each process, which are then used by technicians to design hadron colliders accordingly.

(If you can work on FORTRAN, here are some PDFs to work with.)
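For readers who would rather poke at a PDF set from Python than FORTRAN, here is a minimal sketch using the LHAPDF library’s Python bindings. The library, the “CT10nlo” set and the kinematic values are my own choices for illustration, not anything prescribed by the links above.

```python
# A minimal sketch of querying a parton distribution function (PDF) in Python,
# assuming the LHAPDF library and its Python bindings are installed and a PDF
# set (here "CT10nlo", chosen only as an example) has been downloaded.
import lhapdf

pdf = lhapdf.mkPDF("CT10nlo", 0)   # member 0 = central fit of the set

x, Q = 0.01, 100.0                 # momentum fraction x, scale Q in GeV

# xfxQ(pid, x, Q) returns x*f(x, Q) for the parton with the given PDG ID:
# 21 = gluon, 2 = up quark, 1 = down quark, negative IDs = anti-quarks.
for pid, name in [(21, "gluon"), (2, "up"), (1, "down"), (-2, "anti-up")]:
    print(f"x*f_{name}(x={x}, Q={Q} GeV) = {pdf.xfxQ(pid, x, Q):.4f}")
```

The numbers it prints are the momentum-weighted densities x·f(x, Q) of each parton species at that x and scale.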

Once the quark-gluon model was in place, there were no significant deviations from the parton model. At the same time, because quarks have a corresponding anti-matter “form”, anti-quarks, a model had to be developed that could study quark/anti-quark collisions during the course of a hadronic collision, especially one that could factor in the production of pairs of leptons during such collisions. Such a model was developed by Sidney Drell and Tung-Mow Yan in 1970, and was called the Drell-Yan (DY) process; it was further complemented by a phenomenon called Bjorken scaling (Bsc).

(In Bsc, when the energy of an incoming lepton is sufficiently high during a collision process, the cross-section available for collision becomes independent of the electron’s momentum. In other words, at very high energies the lepton, say an electron, interacts with a hadron not as if the latter were a single particle but as if it were composed of point-like targets called partons.)

In a DY process, a quark from one hadron collides with an anti-quark from another hadron, and the two annihilate each other to produce a virtual photon (γ*). The γ* then decays to form a dilepton pair, which, if we treat it as one entity instead of as a pair of two, can be said to have a mass M.

Now, if M is large, then Heisenberg’s uncertainty principle tells us that the time of interaction between the quark/anti-quark pair must have been small, essentially limiting its interaction with any other partons in the colliding hadrons. Meanwhile, on a timescale that is long in comparison to that of the annihilation, the other spectator-partons rearrange themselves into the resultant hadrons. However, in most cases, it is the dilepton that is detected and momentum-analysed, not the properties of the outgoing hadrons. The DY process results in the production of dilepton pairs at finite energies, but these energies are very closely spaced, defining an energy band, or continuum, within which a dilepton pair might be produced.
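Since it is the dilepton pair that gets momentum-analysed, the mass M above is simply the invariant mass of the two measured lepton four-momenta. Here is a toy calculation; the four-momenta are made-up numbers, purely to show the arithmetic.

```python
# A toy calculation of the dilepton invariant mass M in a Drell-Yan-like event.
# The four-momenta below are invented numbers purely for illustration;
# units are GeV and the metric convention is (+, -, -, -).
import numpy as np

def invariant_mass(p1, p2):
    """Invariant mass of two particles given (E, px, py, pz) four-vectors."""
    E, px, py, pz = p1 + p2
    m2 = E**2 - (px**2 + py**2 + pz**2)
    return np.sqrt(max(m2, 0.0))

lepton     = np.array([45.1,  20.3, -35.2,  15.8])   # e.g. a muon
antilepton = np.array([48.7, -18.9,  37.5, -20.1])   # its anti-particle

print(f"M(l+ l-) = {invariant_mass(lepton, antilepton):.1f} GeV")
```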

In quantum chromodynamics and quark-parton transitions

Quark/anti-quark annihilation is of special significance in quantum chromodynamics (QCD), which studies the colour force – the force between gluons, quarks and anti-quarks inside hadrons. The strong field that gluons mediate is, in quantum mechanical terms, called the colour field. Unlike in QED (quantum electrodynamics) or classical mechanics, QCD allows for two strange kinds of behaviour from quarks and gluons. The first, called confinement, holds that the force between two interacting quarks does not diminish as they are separated. This doesn’t mean that quarks are strongly interacting at large distances! No, it means that once two quarks have come together, no amount of energy can take them apart. The second, called asymptotic freedom (AF), holds that quarks and gluons interact weakly at high energies.

(If you think about it, colour-confinement implies that gluons can emit gluons, and as the separation between two quarks increases, so does the rate of gluon emission. Conversely, as the separation decreases – that is, as the relative four-momentum squared increases – the force holding the quarks together decreases monotonically in strength, leading to asymptotic freedom.)
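To put a rough number on asymptotic freedom: at leading order, the strong coupling runs as α_s(Q²) = 12π / [(33 − 2n_f) ln(Q²/Λ²)]. The sketch below plugs in rough values for Λ_QCD and n_f (my assumptions) just to show the coupling falling as the energy scale grows.

```python
# A leading-order sketch of asymptotic freedom: the one-loop running of the
# strong coupling alpha_s with the energy scale Q. The values of Lambda_QCD
# (~0.2 GeV) and n_f (number of active quark flavours) are rough assumptions.
import math

def alpha_s(Q, n_f=5, lambda_qcd=0.2):
    """One-loop strong coupling at scale Q (in GeV)."""
    return 12 * math.pi / ((33 - 2 * n_f) * math.log(Q**2 / lambda_qcd**2))

for Q in [1.0, 10.0, 100.0, 1000.0]:
    print(f"alpha_s({Q:7.1f} GeV) = {alpha_s(Q):.3f}")
# The coupling falls as Q grows: quarks and gluons interact ever more weakly
# at high energies, which is what 'asymptotic freedom' means.
```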

The definitions of both properties are deeply rooted in experiment: colour-confinement was chosen to explain the consistent failure of free-quark searches, while asymptotic freedom yields no phase-transition line between the high- and low-energy scales even as it describes a change in behaviour between the two. Therefore, the DY process seemed well poised to provide some indirect proof of the experimental validity of QCD if some relation could be found between the collision cross-section and the particles’ colour charge, and this is just what was done.

The QCD factorization theorem can be read as:

$$ F_i(x, Q^2) \;=\; \sum_a \int_x^1 \frac{d\xi}{\xi}\, f_a(\xi, \mu)\, \hat{\sigma}_i^{\,a}\!\left(\frac{x}{\xi}, Q^2, \alpha_s(\mu)\right) $$

Here, α_s(μ) is the effective chromodynamic (quark-gluon-quark) coupling at a factorization scale μ. Further, f_a(ξ, μ) gives the probability of finding a parton a within a nucleon, carrying the momentum fraction ξ (the Bjorken scaling variable), at the scale μ. Also, σ̂_i^a is the hard-scattering cross-section of the electroweak vector boson on that parton. The physical implication is that the nucleonic structure function F_i is obtained by convolving the probability of finding each kind of parton inside the nucleon with the corresponding parton-level cross-section, and then summing over all the partons within the nucleon.
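To make the structure of that convolution concrete, here is a toy numerical version with invented functional forms for the parton density and the hard cross-section; only the shape of the integral mirrors the theorem, none of the functions are realistic.

```python
# A toy numerical rendering of the factorization convolution above. The parton
# density f(xi) and hard cross-section sigma_hat(z) below are invented,
# schematic functions; only the structure of the integral mirrors the theorem.
import numpy as np
from scipy.integrate import quad

def f_parton(xi):
    """Made-up parton density, vaguely valence-like: peaked at small xi."""
    return xi**-0.5 * (1 - xi)**3

def sigma_hat(z):
    """Made-up hard-scattering cross-section for the parton-level process."""
    return 1.0 / (1.0 + z)

def structure_function(x):
    """F(x) = int_x^1 dxi/xi * f(xi) * sigma_hat(x/xi), single parton species."""
    integrand = lambda xi: f_parton(xi) * sigma_hat(x / xi) / xi
    value, _ = quad(integrand, x, 1.0)
    return value

for x in [0.01, 0.1, 0.3, 0.5]:
    print(f"F({x:4.2f}) = {structure_function(x):.3f}")
```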

This scaling behaviour enabled by QCD makes possible predictions about future particle phenomenology.

Putting particle physics research to work

In the whole gamut of comments regarding the Higgs boson, a depressingly large number decry the efforts of the ATLAS and CMS collaborations. Why? Because a lot of people think the Large Hadron Collider (LHC) is a yawning waste of time and money, an investment that serves mankind no practical purpose.

Well, here and here are some cases in point that demonstrate the practical good the LHC has made possible in the material sciences. Another big area of application is medical diagnostics: making the point are one article about hunting for the origin of Alzheimer’s and another about the very similar technology used in particle accelerators and in medical imaging devices, meteorology, VLSI, large-scale networking, cryogenics, and X-ray spectroscopy.

Moving on to more germane applications: arXiv has reams of papers that discuss the deployment of

… amongst others.

The LHC, above all else, is the brainchild of the European Organization for Nuclear Research, popularly known as CERN. These are the folks who invented the World Wide Web, developed some of the first touch-screen devices, and pioneered the earliest high-energy medical imaging techniques.

With experiments like those being conducted at the LHC, it’s easy to forget every development in such laboratories apart from the discovery of much-celebrated particles. All the applications I’ve linked to in this post were conceived by scientists working with the LHC, if only to show that everyone, from the man whose tax money pays for these giant labs to the man who uses that money to work in them, is mindful of practical concerns.

Gunning for the goddamned: ATLAS results explained

Here are some of the photos from the CERN webcast yesterday (July 4, Wednesday), with an adjoining explanation of the data presented in each one and what it signifies.

This first image shows the data accumulated post-analysis of the diphoton decay mode of the Higgs boson. In simpler terms, physicists first put together all the data they had that resulted from previously known processes. This constituted what’s called the background. Then, they looked for signs of any particle that seemed to decay into two energetic photons, or gamma rays, in a specific energy window; in this case, 100-160 GeV.

Finally, knowing how the number of events would vary in a scenario without the Higgs boson, a curve was plotted to fit that background expectation: the number of events at each energy level vs. the energy level at which it was tracked. This way, a bump above the curve during measurement would mean there was a particle previously unaccounted for causing an excess of diphoton decay events at a particular energy.
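A synthetic sketch of that procedure: generate a smoothly falling “background” in the 100-160 GeV window, inject a small Gaussian bump near 125 GeV, and fit a background-plus-bump model to the counts. Every number here is invented; only the logic mimics the analysis described above.

```python
# A synthetic illustration of the 'background + bump' fit described above:
# a smoothly falling diphoton background with a small Gaussian excess injected
# near 125 GeV. All numbers are invented; this only mimics the procedure.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
edges = np.linspace(100, 160, 61)                # 1-GeV bins, 100-160 GeV window
centres = 0.5 * (edges[:-1] + edges[1:])

background = 500 * np.exp(-(centres - 100) / 25)             # falling background
signal     = 60 * np.exp(-0.5 * ((centres - 125) / 1.8)**2)  # injected bump
counts = rng.poisson(background + signal)                    # 'observed' events

def model(m, norm, slope, amp, mass, width):
    return norm * np.exp(-(m - 100) / slope) + amp * np.exp(-0.5 * ((m - mass) / width)**2)

popt, _ = curve_fit(model, centres, counts,
                    p0=[500, 25, 50, 125, 2], sigma=np.sqrt(counts + 1))
print(f"fitted bump position: {popt[3]:.1f} GeV, amplitude: {popt[2]:.0f} events/bin")
```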

This is a plot of the mass of the particle being looked for (x-axis) versus the confidence level with which it has (or has not, depending on how you look at it) been excluded as an event to focus on. The dotted horizontal line, corresponding to 1μ, marks off a 95% exclusion limit: any events registered above the line can be claimed as having been observed with “more than 95% confidence” (colloquial usage).

Toward the top-right corner of the image are some numbers. 7 TeV and 8 TeV are the values of the total energy going into each collision before and after March 2012, respectively. The beam energy was driven up to increase the incidence of decay events corresponding to Higgs-boson-like particles, which, given the extremely high energy at which they exist, are viciously short-lived. In experiments run between March and July, physicists at CERN reported an increase of almost 25-30% in such events.

The two other numbers indicate the particle accelerator’s integrated luminosity. In particle physics, luminosity is a measure of the number of particles passing through a unit of area per second; the integrated luminosity is the same value accumulated over a period of time. In the case of the LHC, after the collision energy was ramped up, the luminosity, too, had to be increased: from about 4.7 fb-1 to 5.8 fb-1. You’ll want to Wiki the unit of area called the barn. Some lighthearted physics talk there.
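Integrated luminosity is what converts a cross-section into an expected event count, via N = σ × L_int. A back-of-the-envelope sketch, with a made-up 20 pb cross-section standing in for whatever process you care about:

```python
# A back-of-the-envelope use of integrated luminosity: expected event count
# N = cross-section x integrated luminosity. The 20 pb cross-section below is
# a made-up placeholder, not an ATLAS/CMS number.
sigma_pb = 20.0          # hypothetical production cross-section, in picobarns
lumi_fb  = 5.8           # integrated luminosity quoted above, in inverse femtobarns

sigma_fb = sigma_pb * 1000.0          # 1 pb = 1000 fb
expected_events = sigma_fb * lumi_fb  # fb * fb^-1 -> a pure number

print(f"expected events before efficiencies and branching ratios: {expected_events:.0f}")
```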

In this plot, the y-axis on the left shows the chance of error (the p-value), and the corresponding statistical significance is on the right. When the chance of error stands at 1, the results are not statistically significant at all because every observation is an error! But wait a minute, does that make sense? How can all results be errors? Well, when looking for one particular type of event, any event that is not this event is an error.

Thus, as we move toward the ~125 GeV mark, the number of statistically significant results shoots up drastically. Looking closer, we see two results registered just beyond the 5-sigma mark, where the chance of error is 1 in 3.5 million. This means that if the physicists recreated just those conditions that resulted in this >5σ (five-sigma) observation 3.5 million times, only once would a random fluctuation play impostor.

Also, notice how the gap between successive levels of statistical significance grows with increasing significance? In terms of the chance of error, each extra sigma shrinks it by an ever larger factor: the step from 4σ to 5σ buys far more than the step from 1σ to 2σ. This means that the closer physicists get to a discovery, the exponentially more precise they must be!
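The conversion between sigma levels and the chance of error is just the one-sided tail of a standard normal distribution, which is easy to check:

```python
# The one-sided tail probability (p-value) for each sigma level, using the
# standard normal distribution. Note how each extra sigma shrinks the chance
# of a fluctuation by a growing factor -- the 'exponential precision' above.
from scipy.stats import norm

for n_sigma in range(1, 6):
    p = norm.sf(n_sigma)             # P(fluctuation >= n_sigma), one-sided
    print(f"{n_sigma} sigma: p = {p:.2e}  (about 1 in {1/p:,.0f})")
# 5 sigma works out to roughly 1 in 3.5 million, as quoted for the Higgs search.
```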

OK, this is a graph showing the mass distribution for the four-lepton decay mode, referred to as a channel by those working on the ATLAS and CMS collaborations (because there are separate channels of data-taking for each decay mode). The plotting parameters are the same as in the first plot in this post except for the scale of the x-axis, which goes all the way from 0 to 250 GeV. Now, between 120 GeV and 130 GeV, there is an excess of events (light blue). Physicists know it is an excess, and not on par with expectations, because theoretical calculations made after discounting a Higgs-boson-like decay event show that, in that 10 GeV window, only around 5.3 events are to be expected, as opposed to the 13 that turned up.
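As a rough plausibility check of that excess, one can ask how likely it is to see 13 or more events when only about 5.3 are expected, treating it as a single Poisson counting experiment. This ignores systematics and the look-elsewhere effect, so it is not the collaborations’ quoted significance, just a sanity check of the numbers.

```python
# A naive Poisson check of the four-lepton excess quoted above: the chance of
# seeing 13 or more events when only ~5.3 background events are expected.
# This ignores systematics and the look-elsewhere effect, so it is only a
# rough plausibility check, not the collaborations' actual significance.
from scipy.stats import poisson, norm

expected, observed = 5.3, 13
p_value = poisson.sf(observed - 1, expected)   # P(N >= 13 | mean = 5.3)
z_score = norm.isf(p_value)                    # equivalent one-sided sigma

print(f"P(N >= {observed} | {expected} expected) = {p_value:.4f}  (~{z_score:.1f} sigma)")
```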

After the Higgs-boson-like particle, what’s next?

This article, as written by me, appeared in print in The Hindu on July 5, 2012.

The ATLAS (A Toroidal LHC Apparatus) collaboration at CERN has announced the sighting of a Higgs boson-like particle in the energy window of 125.3 ± 0.6 GeV. The observation has been made with a statistical significance of 5 sigma. This means the chances of error in their measurements are 1 in 3.5 million, sufficient to claim a discovery and publish papers detailing the efforts in the hunt.

Rolf-Dieter Heuer, Director General of CERN since 2009, said at the special conference called by CERN in Geneva, “It was a global effort, it is a global effort. It is a global success.” He expressed great optimism and concluded the conference saying this was “only the beginning.”

With this result, collaborations at the Large Hadron Collider (LHC), the atom-smashing machine, have vastly improved on their previous announcement on December 13, 2011, where the chance of an error was 1-in-50 for similar sightings.

A screenshot from the Dec 13, 2011, presentation by Fabiola Gianotti, leader of the ATLAS collaboration, that shows a global statistical significance of 2.3 sigma, which translates to a 1-in-50 chance of the result being erroneous.

Another collaboration, called CMS (Compact Muon Solenoid), announced the mass of the Higgs-like particle with a 4.9 sigma result. While insufficient to claim a discovery, it does indicate only a one-in-two-million chance of error.

Joe Incandela, CMS spokesman, added, “We’re reaching into the fabric of the universe at a level we’ve never done before.”

The LHC will continue to run its experiments so that the results revealed on Wednesday can be revalidated before it shuts down at the end of the year for maintenance. Even so, scientists such as Dr. Rahul Sinha, a participant in the Belle Collaboration in Japan, are confident that a conclusive result will be out by 2013.

“The LHC has the highest beam energy in the world now. The experiment was designed to yield quick results. With its high luminosity, it quickly narrowed down the energy-ranges. I’m sure that by the end of the year, we will have a definite word on the Higgs boson’s properties,” he said.

However, even though the Standard Model, the framework of all fundamental particles and the dominant explanatory model in physics today, predicted the particle’s existence, slight deviations have been observed from the particle’s predicted mass. Even more: zeroing in on the mass of the Higgs-like particle doesn’t mean the model is complete when, in fact, it is far from it.

While an answer to the question of mass formation took 50 years to be reached, physicists are yet to understand many phenomena. For instance, why aren’t the four fundamental forces of nature equally strong?

The weak, strong, electromagnetic, and gravitational forces were born in the first few moments succeeding the Big Bang 13.75 billion years ago. Of these, the weak force is, for some reason, almost 1 billion, trillion, trillion times stronger than the gravitational force! Called the hierarchy problem, this discrepancy evades a Standard Model explanation.

In response, many theories were proposed. One, called supersymmetry (SUSY), proposed that all fermions, which are particles with half-integer spin, were paired with corresponding bosons, which are particles with integer spin. Particle spin is the term quantum mechanics uses for a particle’s intrinsic angular momentum, loosely pictured as rotation around an axis.

Technicolor was the second framework. It rejects the Higgs mechanism, a process through which the Higgs field couples more strongly with some particles and more weakly with others, making them heavier and lighter, respectively.

Instead, it proposes a new form of interaction with initially-massless fermions. The short-lived particles required to certify this framework are accessible at the LHC. Now, with a Higgs-like particle having been spotted with a significant confidence level, the future of Technicolor seems uncertain.

However, “significant constraints” have been imposed on the validity of these and such theories, labeled New Physics, according to Prof. M.V.N. Murthy of the Institute of Mathematical Sciences (IMS), whose current research focuses on high-energy physics.

Some other important questions include why there is more matter than antimatter in this universe, why fundamental particles manifest in three generations and not more or fewer, and what the masses of the weakly interacting neutrinos are. State-of-the-art technology worldwide has helped physicists design experiments to study each of these problems better.

For example, the India-based Neutrino Observatory (INO), under construction in Theni, will house the world’s largest static particle detector to study atmospheric neutrinos. Equipped with its giant iron-calorimeter (ICAL) detector, physicists aim to discover which neutrinos are heavier and which lighter.

The LHC currently operates at the Energy Frontier, with high energy being the defining constraint on experiments. Two other frontiers, Intensity and Cosmic, are also seeing progress. Project X, a proposed proton accelerator at Fermilab in Chicago, Illinois, will push the boundaries of the Intensity Frontier by looking for ultra-rare processes. On the Cosmic Frontier, dark matter holds the greatest focus.