The hunt for supersymmetry: Reviewing the first run

What do dark matter, Higgs bosons, the electron dipole moment, topological superconductors and quantum loops have in common? These are exotic entities that scientists have been using to solve some longstanding problems in fundamental physics. Specifically, by studying these entities, they expect to discover new ingredients of the universe that will help them answer why it is the way it is. These ingredients could come in a staggering variety, so it is important for scientists to narrow down what they’re looking for – which brings us to the question of why these entities are being studied. A brief summary:

  1. Dark matter is an exotic form of matter that is thought to interact with normal matter only through the gravitational force. Its existence was hypothesized in 1932-1933. Its exact nature is yet to be understood.
  2. Quantum loops refer to an intricate way in which some heavier particles decay into sets of lighter particles, involving the exchange of some other extremely short-lived particles. They have been theoretically known to exist for many decades.
  3. Topological superconductors are exotic materials that, under certain conditions, behave like superconductors on their surface and as insulators underneath. They were discovered fairly recently, around 2007, and how they manage to be this way is not fully understood.
  4. The Higgs boson’s discovery was announced in July 2012 (and verified by March-June 2013). Math worked out on paper predicts that its mass ought to have been very high – but it was found to be much lower.
  5. The electron dipole moment is a measure of how spherical the electron is. Specifically, the EDM denotes how evenly the electron’s negative charge is distributed around it. While the well-understood laws of nature don’t prevent the charge from being uneven, they restrict the amount of unevenness to a very small value. The most precise measurement of this value to date was published in December 2013.

Clearly, these are five phenomena whose identities are incomplete. But more specifically, scientists have found a way to use advanced mathematics to complete all these identities with one encompassing theory called Supersymmetry (Susy). Unfortunately for them, the mathematics refuses to become real, i.e. scientists have failed to find evidence of Susy in experiments. Actually, that might be an overstatement: different experiments are at different stages of looking for Susy at work in giving these ‘freaks of nature’ a physical identity. On the other hand, it has been a few years since some of these experiments commenced – some of them are quite powerful indeed – and the only positive results they have had have been to say Susy cannot be found in this or that range.

But if signs of Susy are found, then the world of physics will be in tumult – in a good way, of course. It will get to replace an old theory called the Standard Model of particle physics. The Standard Model is the set of mathematical tools and techniques used to understand how fundamental particles make up different objects, what particles our universe is made of, how quantum loops work, how the Higgs boson could have formed, etc. But it has no answers for why there is dark matter, why the electron is allowed to have that small dipole moment, why topological superconductors work the way they do, why the Higgs boson’s mass is so low, etc.

Early next year, physicists will turn to the Large Hadron Collider (LHC) – which helped discover the Higgs boson in 2012 – after it wakes up from its two-year slumber to help find Susy, too. This LHC of 2015 will be way more powerful than the one that went dormant in early 2013 thanks to a slew of upgrades. Hopefully it will not disappoint, building on what it has managed to deliver for Susy until now. In fact, on April 28, 2014, two physicists from CERN submitted a preprint paper to the arXiv server summarizing the lessons for Susy from the LHC after the first run.

The hunt for supersymmetry: Is a choke on the cards?

The Copernican
April 28, 2014

“So irrelevant is the philosophy of quantum mechanics to its use that one begins to suspect that all the deep questions are really empty…”

— Steven Weinberg, Dreams of a Final Theory: The Search for the Fundamental Laws of Nature (1992)

On a slightly humid yet clement January evening in 2013, a theoretical physicist named George Sterman was in Chennai to attend a conference at the Institute of Mathematical Sciences. After the last talk of the day, he had strolled out of the auditorium and was mingling with students when I managed to get a few minutes with him. I asked for an interview and he agreed.

After some coffee, we seated ourselves at a kiosk in the middle of the lawn; the sun was setting and mosquitoes abounded. Sterman was a particle physicist, so I opened with the customary question about the Higgs boson and expected him to swat it away with snowclones of the time like “fantastic”, “tribute to 50 years of mathematics” and “long-awaited”. He did say those things, but then he also expressed some disappointment.

George Sterman is distinguished for his work in quantum chromodynamics (QCD), for which he won the prestigious J.J. Sakurai Prize in 2003. QCD is a branch of physics that deals with particles that have a property called colour charge. Quarks and gluons are examples of such particles; these two together with electrons are the proverbial building blocks of matter. Sterman has been a physicist since the 1970s – the early years of the Standard Model era of particle physics research.

The Standard Model disappoints

Over the last four or so decades, remarkable people like him have helped construct a model of laws, principles and theories that sustains the rigours of this field, called the Standard Model of particle physics. And it was the reason Sterman was disappointed.

According to the Standard Model, Sterman explained, “if we gave any reasonable estimate of what the mass of the Higgs particle should be, it should by all rights be huge! It should be as heavy as what we call the Planck mass.”

But it isn’t. The Higgs mass is around 125 GeV (GeV being a unit of energy that corresponds to certain values of a particle’s mass) – compare it with the proton, which weighs 0.938 GeV. On the other hand, the Planck mass is 10^19 GeV. Seventeen orders of magnitude lie in between. According to Sterman, this isn’t natural. The question is why there has to be such a big difference between what the mass could be on paper and what we find it to be.
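For readers who want to see the arithmetic, here is a minimal sketch of why that gap is considered unnatural. The cutoff-scale estimate below is a standard textbook illustration written in LaTeX; the order-one coefficient c is an assumption made for the sake of the example, not a number from Sterman or from the CERN preprint.

    % The bare hierarchy between the two mass scales:
    \frac{m_{\text{Planck}}}{m_H} \approx \frac{10^{19}\ \text{GeV}}{125\ \text{GeV}} \approx 10^{17}

    % Quantum (loop) corrections drag the Higgs mass-squared up towards the
    % cutoff scale \Lambda at which the Standard Model stops being valid:
    m_H^2 \;\approx\; m_{H,\text{bare}}^2 + c\,\Lambda^2, \qquad c \sim \mathcal{O}(1)

    % If \Lambda is as large as the Planck mass, keeping m_H at 125 GeV needs
    % the two terms on the right to cancel to about one part in 10^{34},
    % which is the kind of fine-tuning physicists find uncomfortable.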

Martinus Veltman, a Dutch theoretical physicist who won the Nobel Prize for physics in 1999 for his work in particle physics, painted a starker picture in an interview to Nature in 2013: “Since the energy of the Higgs [field] is distributed all over the universe, it should contribute to the curvature of space; if you do the calculation, the universe would have to curve to the size of a football.”

Evidently, the Standard Model has many loose ends, and explaining the mass of the Higgs boson is only one of them. Another is that it has no answer for what dark matter is or why it behaves the way it does. Yet another is why the four fundamental forces of nature differ so greatly in strength.

An alternative

Thanks to the Standard Model, some mysteries have been solved, but other mysteries have come and are coming to light – in much the same way Isaac Newton’s ideas struggled to remain applicable in the troubled world of physics in the early 20th century. It seems history repeats itself through crises.

Fortunately, physicists in 1971-1972 had begun to piece together an alternative theory called supersymmetry, Susy for short. At the time, it was an alternative way of interpreting how emerging facts could be related to each other. Today, however, Susy is a more encompassing successor to the throne that the Standard Model occupies, a sort of mathematical framework in which the predictions of the Model still hold but no longer have those loose ends. And Susy’s USP is… well, that it doesn’t disappoint Sterman.

“There’s a reason why so many people felt so confident about supersymmetry,” he said. “It wasn’t just that it’s a beautiful theory – which it is – or that it engages and challenges the most mathematically oriented among physicists, but in another sense in which it appeared to be necessary. There’s this subtle concept that goes by the name of naturalness…”

And don’t yet look up ‘naturalness’ on Wikipedia because, for once, here is something so simple, so elegant, that it is precisely what its name implies. Naturalness is the idea that, for example, the Higgs boson is so lightweight because something out there is keeping it from being heavy. Naturalness is the idea that, in a given setting, the forces of nature all act in equal measure. Naturalness is the idea that causes seem natural, and logically plausible, without having to be fine-tuned in order to explain their effects. In other words, Susy, through its naturalness, makes possible a domesticated world, one without sudden, unexpected deviations from what common sense (a sophisticated one, anyway) would dictate.

To understand how it works, let us revisit the basics. Our observable universe plays host to two kinds of fundamental particles, which are packets of some well-defined amount of energy. The fermions, named for Enrico Fermi, are the matter particles. Things are made of them. The bosons, named for Satyendra Bose, are the force particles. Things interact with each other by using them as messengers. The Standard Model tells us how bosons and fermions will behave in a variety of situations.

However, the Model has no answers for why bosons and fermions weigh as much as they do, or come in as many varieties as they do. These are deeper questions that go beyond simply what we can observe. These are questions whose answers demand that we interpret what we know, that we explore the wisdom of nature that underlies our knowledge of it. To get at these whys, physicists investigated phenomena that lie beyond the Standard Model’s jurisdiction.

The search

One such place is actually nothingness, i.e. the quantum vacuum of deep space, where particles called virtual particles continuously wink in and out of existence. But even with their brief life-spans, they play a significant role in mediating the interactions between different particles. You will remember having studied in class IX that like charges repel each other. What you probably weren’t told is that the repulsive force between them is mediated by the exchange of virtual photons.

Curiously, these “virtual interactions” don’t proliferate haphazardly. Virtual particles don’t continuously “talk” to the electron or clump around the Higgs boson. If this happened, mass would accrue at a point out of thin air, and black holes would be popping up all around us. Why this doesn’t happen, physicists think, is because of Susy, whose invisible hand could be staying chaos from dominating our universe.

The way it does this is by invoking quantum mechanics, and conceiving that there is another dimension called superspace. In superspace, the bosons and fermions in the dimensions familiar to us behave differently, the laws conceived such that they restrict the random formation of black holes, for starters. In the May 2014 issue of Scientific American, Joseph Lykken and Maria Spiropulu describe how things work in superspace:

“If you are a boson, taking one step in [superspace] turns you into a fermion; if you are a fermion, one step in [superspace] turns you into a boson. Furthermore, if you take one step in [superspace] and then step back again, you will find that you have also moved in ordinary space or time by some minimum amount. Thus, motion in [superspace] is tied up, in a complicated way, with ordinary motion.”

The presence of this dimension implies that all bosons and fermions have a corresponding particle called a superpartner particle. For each boson, there is a superpartner fermion called a bosino; for each fermion, there is a superpartner boson called a sfermion (why the confusing titles, though?).

Physicists are hoping this supersymmetric world exists. If it does, they will have found tools to explain the Higgs boson’s mass, the difference in strengths of the four fundamental forces, what dark matter could be, and a swarm of other nagging issues the Standard Model fails to resolve. Unfortunately, this is where Susy’s credit-worthiness runs into trouble.

No signs

“Experiment will always be the ultimate arbiter, so long as it’s science we’re doing.”

— Leon Lederman & Christopher Hill, Beyond the Higgs Boson (2013)

Since the first pieces of the Standard Model were brought together in the 1960s, researchers have run repeated tests to check if what it predicts were true. Each time, the Model has stood up to its promise and yielded accurate results. It withstood the test of time – a criterion it shares with the Nobel Prize for physics, which physicists working with the Model have won at least 15 times since 1957.

Susy, on the other hand, is still waiting for confirmation. The Large Hadron Collider (LHC), the world’s most powerful particle physics experiment, ran its first round of experiments from 2009 to early 2013, and found no signs of sfermions or bosinos. What it has succeeded in doing, in fact, is narrowing the gaps in the Standard Model where Susy could be found. While the non-empty emptiness of the quantum vacuum opened a small window into the world of Susy, a window through which we could stick a mathematical arm out and say “This is why black holes don’t just pop up”, the Model has persistently puttied every other crack we hound after.

An interesting quote comes to mind about Susy’s health. In November 2012, at the Hadron Collider Physics Symposium in Kyoto, Japan, physicists presented evidence of a particle decay that happens so rarely that only the LHC could have spotted it. The Standard Model predicts that every time the B_s (pronounced “Bee-sub-ess”) meson decays into a set of lighter particles, there is a small chance that it decays into two muons. The steps in which this happens are intricate, involving a process called a quantum loop.

What next?

“SUSY has been expected for a long time, but no trace has been found so far… Like the plot of the excellent movie ‘The Lady Vanishes’ (Alfred Hitchcock, 1938)”

— Andy Parker, Cambridge University

Susy predicts that some supersymmetric particles should show themselves during the quantum loop, but no signs of them were found. On the other hand, the rate of B_s decays into two muons was consistent with the Model’s predictions. Prof. Chris Parkes, a British physicist, had then told BBC News: “Supersymmetry may not be dead but these latest results have certainly put it into hospital.” And why not? Our peek into the supersymmetric universe eludes us, and if the LHC can’t find it, what will?

Then again, it took us many centuries to find the electron, and then many decades to find anti-particles. Why should we hurry now? After all, as Dr. Rahul Sinha from the Institute of Mathematical Sciences told me after the Symposium had concluded, “a conclusive statement cannot be made as yet”. At this stage, even waiting for many years might not be necessary. The LHC is set to reawaken around January 2015 after a series of upgrades that will let the machine deliver 10 times more particle collisions per second per unit area. Mayhap a superpartner particle can be found lurking in this profusion by, say, 2017.

There are also plans for other, more specialised colliders, such as Project X in the USA, which India has expressed interest in formally cooperating with. X, proposed to be built at the Fermi National Accelerator Laboratory (Fermilab), Illinois, will produce high-intensity proton beams to investigate a variety of hitherto unexplored realms. One of them is to produce heavy, short-lived isotopes of elements like radium or francium, and use them to study if the electron has a dipole moment, or a pronounced negative charge along one direction, which Susy allows for.

(Moreover, if Project X is realised it could prove extra-useful for India because it makes possible a new kind of nuclear reactor design, called the accelerator-driven sub-critical reactor, which operates without a core of critical-mass radioactive fuel, rendering impossible accidents like Chernobyl and Fukushima, while also being capable of inducing fission reactions using lighter fuel like thorium.)

Yet another avenue to explore Susy would be looking for dark matter particles using highly sensitive particle detectors such as LUX, XENON1T and CDMS. According to some supersymmetric models, the lightest Susy particles could actually be dark matter particles, so if a few are spotted and studied, they could bolster this theory’s sagging credence.

… which serves to remind us that this excitement could cut the other way, too. What if the LHC in its advanced avatar is still unable to find evidence of Susy? In fact, the Advanced Cold Molecule Electron group at Harvard University announced in December 2013 that it had found no sign of an electron dipole moment, setting the most precise limit on it to date. After such results, physicists will have to try and rework the theory, or perhaps zero in on other aspects of it that can be investigated by the LHC or Project X or other colliders.

But at the end of the day, there is also the romance of it all. It took George Sterman many years to find a theory as elegant and straightforward as Susy – an island of orderliness in the insane sea of quantum mechanics. How quickly would he give it up?

O Hunter, snare me his shadow!
O Nightingale, catch me his strain!
Else moonstruck with music and madness
I track him in vain!

— Oscar Wilde, In The Forest

Feeling the pulse of the space-time continuum

The Copernican
April 17, 2014

Haaaaaave you met PSR B1913+16? The first three letters of its name indicate it’s a pulsating radio source, an object in the universe that gives off energy as radio waves at very specific periods. More commonly, such sources are known as pulsars, a portmanteau of pulsating stars.

When heavy stars run out of hydrogen to fuse into helium, they undergo a series of processes that sees them stripped of their once-splendid upper layers, leaving behind a core of matter called a neutron star. It is extremely dense, extremely hot, and spinning very fast. When it emits electromagnetic radiation in flashes, it is called a pulsar. PSR B1913+16 is one such pulsar, discovered in 1974, located in the constellation Aquila some 21,000 light-years from Earth.

Finding PSR B1913+16 earned its discoverers the Nobel Prize for physics in 1993 because this was no ordinary pulsar: it was the first of its kind to be discovered, a pulsar in a binary system. It is locked in an epic pirouette with a nearby neutron star, the two spinning around each other in an orbit whose total diameter spans one to five times that of our Sun.

Losing energy but how?

The discoverers were Americans Russell Alan Hulse and Joseph Hooton Taylor, Jr., of the University of Massachusetts Amherst, and their prize-winning discovery didn’t culminate with just spotting the binary pulsar that has come to be named after them. Further, they found that the pulsar’s orbit was shrinking, meaning the system as a whole was losing energy. They found that they could also predict the rate at which the orbit was shrinking using the general theory of relativity.

In other words, PSR B1913+16 was losing energy as gravitational energy while providing a direct (natural) experiment to verify Albert Einstein’s monumental theory from a century ago. (That a human was able to intuit how two neutron stars orbiting each other trillions of miles away could lose energy pays homage to the uniformity of the laws of physics. Through the vast darkness of space, we can strip away with our minds any strangeness of its farthest reaches because what is available on a speck of blue is what is available there, too.)

While gravitational energy, and gravitational waves with it, might seem like an esoteric concept, it is easily intuited as the gravitational analogue of electromagnetic energy (and electromagnetic waves). Electromagnetism and gravitation are the two most accessible of the four fundamental forces of nature. When a system of charged particles accelerates, it lets off electromagnetic energy and so becomes less energetic over time. Similarly, when a system of massive objects accelerates, it lets off gravitational energy… right?

“Yeah. Think of mass as charge,” says Tarun Souradeep, a professor at the Inter-University Centre for Astronomy and Astrophysics, Pune, India. “Electromagnetic waves come with two charges that can make up a dipole. But the conservation of momentum prevents gravitational radiation from having dipoles.”

According to Albert Einstein and his general theory of relativity, gravitation is a force born due to the curvature, or roundedness, of the space-time continuum: space-time bends around massive objects (an effect very noticeable during gravitational lensing). When massive objects accelerate through the continuum, they set off waves in it that travel at the speed of light. These are called gravitational waves.

“The efficiency of energy conversion – from the bodies into gravitational waves – is very high,” Prof. Souradeep clarifies. “But they’re difficult to detect because they don’t interact with matter.”

Albie’s still got it

In 2004, Joseph Taylor, Jr., and Joel Weisberg published a paper analysing 30 years of observations of PSR B1913+16, and found that general relativity was able to explain the rate of orbit contraction to within an error of 0.2 per cent. You could argue that the binary system might be losing its energy in many different ways, but that general relativity explains the rate so accurately implies that the mechanism it predicts, gravitational waves, is the one at work.
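The rate they compared against comes, at leading order, from the quadrupole formula (dipole radiation being forbidden, as Prof. Souradeep notes above). Here is a minimal sketch in LaTeX for two masses in a circular orbit; the real pulsar’s orbit is quite eccentric, which multiplies the result by a sizeable correction factor, so treat this as an illustration of the mechanism rather than the actual computation in the Taylor-Weisberg paper.

    % Power radiated as gravitational waves by masses m_1 and m_2
    % in a circular orbit of separation a (quadrupole approximation):
    P_{\text{GW}} \;=\; \frac{32}{5}\,\frac{G^4}{c^5}\,\frac{(m_1 m_2)^2\,(m_1 + m_2)}{a^5}

    % The orbital energy E = -\frac{G m_1 m_2}{2a} must fall as this power is
    % radiated away, so the separation a, and with it the orbital period,
    % shrinks. That steadily shrinking period is what the pulsar's radio
    % pulses let astronomers measure.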

Prof. Souradeep says, “According to Newtonian gravity, the gravitational pull of the Sun on Earth was instantaneous action at a distance. But now we know light takes eight minutes to come from the Sun to Earth, which means the star’s gravitational pull must also take eight minutes to affect Earth. This is why we have causality, with gravitational waves in a radiative mode.”

And this is proof that the waves exist, at least definitely in theory. They provide a simple, coherent explanation for a well-defined problem – like a hole in a giant jigsaw puzzle that we know only a certain kind of piece can fill. The fundamental particles called neutrinos were discovered through a similar process.

These particles, like gravitational waves, hardly interact with matter and are tenaciously elusive. Their existence was predicted by the physicist Wolfgang Pauli in 1930. He needed such a particle to explain how the heavier neutron could decay into the lighter proton, the remaining mass (or energy) being carried away by an electron and a neutrino antiparticle. And the team that first observed neutrinos in an experiment, in 1956, did find them under these circumstances.
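For completeness, here is the decay described above written out in LaTeX. This is textbook bookkeeping, not a detail taken from the original papers.

    % Beta decay of the neutron: the electron antineutrino carries away
    % the energy that would otherwise appear to go missing.
    n \;\to\; p \;+\; e^{-} \;+\; \bar{\nu}_e

    % Energy balance (rest-mass energies plus the kinetic energy shared
    % between the decay products):
    m_n c^2 \;=\; m_p c^2 + m_e c^2 + E_{\text{kin}} + E_{\bar{\nu}}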

Waiting for a direct detection

On March 17, radio-astronomers from the Harvard-Smithsonian Centre for Astrophysics (CfA) announced a more recent finding that points to the existence of gravitational waves, albeit in a more powerful and ancient avatar. Using a telescope called BICEP2 located at the South Pole, they found the waves’ unique signature imprinted on the cosmic microwave background, a dim field of energy leftover from the Big Bang and visible to this day.

At the time, Chao-Lin Kuo, a co-leader of the BICEP2 collaboration, had said, “We have made the first direct image of gravitational waves, or ripples in space-time across the primordial sky, and verified a theory about the creation of the whole universe.”

Spotting the waves themselves, directly, in our human form is impossible. This is why the CfA discovery and the orbital characteristics of PSR B1913+16 are about as direct as detections get. In fact, finding one concise theory that explains actions and events in varied settings is a good way to surmise that the thing it invokes really exists.

For instance, there is another experiment whose sole purpose has been to find gravitational waves, using lasers. Its name is LIGO (Laser Interferometer Gravitational-wave Observatory). Its first phase operated from 2002 to 2010, and found no conclusive evidence of gravitational waves to report. Its second phase is due to start this year, in 2014, in an advanced form. On April 16, the LIGO collaboration put out a 20-minute documentary titled Passion for Understanding, about the “raw enthusiasm and excitement of those scientists and researchers who have dedicated their professional careers to this immense undertaking”.

The laser pendula

LIGO works like a pendulum to try and detect gravitational waves. With a pendulum, there is a suspended bob that goes back and forth between two points with a constant rhythm. Now, imagine there are two pendulums swinging parallel to each other but slightly out of phase, between two parallel lines 1 and 2. So when pendulum A reaches line 1, pendulum B hasn’t got there just yet, but it will soon enough.

When gravitational waves, comprising peaks and valleys of gravitational energy, surf through the space-time continuum, they induce corresponding crests and troughs that distort the metrics of space and passage of time in that area. When the two super-dense neutron stars that comprise PSR B1913+16 move around each other, they must be letting off gravitational waves in a similar manner, too.

When such a wave passes through the area where we are performing our pendulum experiment, it is likely to distort the pendulums’ arrival times at lines 1 and 2. Such a delay can be observed and recorded by sensitive instruments.

Analogously, LIGO uses beams of light generated by a laser at one point to bounce back and forth between mirrors for some time, and reconvene at a point. And instead of relying on the relatively clumsy mechanisms of swinging pendulums, scientists leverage the wave properties of light to make the measurement of a delay more precise.

At the beach, you’ll remember having seen waves forming in the distance, building up in height as they reach shallower depths, and then crashing in a spray of water on the shore. You might also have seen waves becoming bigger by combining. That is, when the crests of waves combine, they form a much bigger crest; when a crest and a trough combine, the effect is to cancel each other. (Of course this is an exaggeration. Matters are far less exact and pronounced on the beach.)

Similarly, the waves of laser light in LIGO are tuned such that, in the absence of a gravitational wave, what reaches the detector – an interferometer – is one crest and one trough, cancelling each other out and leaving no signal. In the presence of a gravitational wave, there is likely to be one crest and another crest, too, leaving behind a signal.
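To get a feel for how small the effect is, here is a minimal back-of-the-envelope sketch in Python. The numbers for arm length, laser wavelength and strain are representative values assumed for illustration; the real detector folds the light back and forth many times inside its arms, which boosts the response well beyond this naive estimate.

    # Rough estimate of the phase shift a passing gravitational wave would
    # induce in a simple Michelson-style interferometer like LIGO's.
    import math

    arm_length = 4.0e3      # metres; LIGO's arms are about 4 km long
    wavelength = 1.064e-6   # metres; an infrared laser wavelength (assumed)
    strain = 1.0e-21        # dimensionless strain h of a plausible wave (assumed)

    # A strain h changes the difference between the two arm lengths by about h * L.
    delta_length = strain * arm_length

    # The light travels down an arm and back, doubling the path difference;
    # every metre of path difference is worth 2*pi/wavelength of phase.
    delta_phase = 2 * (2 * math.pi / wavelength) * delta_length

    print(f"Differential arm-length change: {delta_length:.2e} m")
    print(f"Phase difference at the detector: {delta_phase:.2e} rad")

Numbers like these, of the order of 10^-18 metres and 10^-11 radians, are why the instrument needs such extraordinary isolation and stability.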

A blind spot

In an eight-year hunt for this signal, LIGO hasn’t found it. However, this isn’t the end because, like all waves, gravitational waves should also have a frequency, and it can be anywhere in a ginormous band if theoretical physicists are to be believed (and they are to be): between 10^-7 and 10^11 hertz. LIGO will help humankind figure out which frequency ranges can be ruled out.

In 2014, the observatory will also reawaken after four years of being dormant and receiving upgrades to improve its sensitivity and accuracy. According to Prof. Souradeep, the latter now stands at 10^-20 m. One more way in which LIGO is being equipped to find gravitational waves is by creating a network of LIGO-like detectors around Earth. There are already two in the US, one in Europe, and one in Japan (although the Japanese detector uses a different technique).

But though the network improves our ability to detect gravitational waves, it presents another problem. “These detectors are on a single plane, making them blind to a few hundred degrees of the sky,” Prof. Souradeep says. This means the detectors will experience the effects of a gravitational wave but if it originated from a blind spot, they won’t be able to get a fix on its source: “It will be like trying to find MH370!” Fortunately, since 2010, there have been many ways proposed to solve this problem, and work on some of them is under way.

One of them is called eLISA, for Evolved Laser Interferometer Space Antenna. It will attempt to detect and measure gravitational waves by monitoring the locations of three spacecraft arranged in an equilateral triangle moving in a Sun-centric orbit. eLISA is expected to be launched only two decades from now, although a proof-of-concept mission has been planned by the European Space Agency for 2015.

Another solution is to install a LIGO detector on ground and outside the plane of the other three – such as in India. According to Prof. Souradeep, LIGO-India will reduce the size of the blind spot to a few tens of degrees – an order of magnitude improvement. The country’s Planning Commission has given its go-ahead for the project as a ‘mega-science project’ in the 12th Five Year Plan, and the Department of Atomic Energy, which is spearheading the project, has submitted a note to the Union Cabinet for approval. With the general elections going on in the country, physicists will have to wait until at least June or July to expect to get this final clearance.

Once cleared, of course, it will prove a big step forward not just for the Indian scientific community but also for the global one, marking the next big step – and possibly a more definitive one – in a journey that started with a strange pulsar 21,000 light-years away. As we get better at studying these waves, we have access to a universe visible not just in visible light, radio-waves, X-rays or neutrinos but also through its gravitational susurration – like feeling the pulse of the space-time continuum itself.

Ambivalent promises for S&T in the BJP manifesto

The Copernican
April 7, 2014

Even though it hasn’t been in power for the last decade, the Bharatiya Janata Party (BJP) offers no concrete assurances for science & technology in the country in its manifesto ahead of the 2014 Lok Sabha polls. However, these subjects are geared to be utilised for the benefit of other sectors, in which specific promises feature aplenty. Indeed, the party’s S&T section of the manifesto reads like a bulleted list of the most popular problems for scientific research in India and the world, although the fact that the party has taken cognizance of this-and-that is heartening.

The BJP makes no mention of increasing India’s spending on S&T, while the Indian National Congress promises to raise it to 2% of GDP, a long-standing demand. On the upside, however, both parties mention that they would like to promote private sector involvement in certain areas like agriculture, education, transportation and public infrastructure, but only the BJP mentions it in the context of scientific research.

As things stand, private sector involvement in scientific research in India is very low. A DST report from May 2013 claims that it would like to achieve 50-50 investment from public and private participants by 2017, while the global norm stands at 66-34 in favour of private. It is well-documented that higher private sector involvement, together with more interdisciplinary research, reduces the time for commercialization of technologies – which the BJP aspires to in its manifesto. However, the party doesn’t mention the sort of fiscal and policy benefits it will be willing to use to stimulate the private sector.

Apart from this, there are other vague aspirations, too. Sample the following.

  • Promotion of innovation by creating a comprehensive national system of innovation
  • Set [up] an institute of Big data and Analytics for studying the impact of big data across sectors for predictive science
  • Establish an Intellectual Property Rights Regime

Climate change

There is also mention of tackling climate change, with a bias toward the Himalayan region. Under the S&T section, there’s a point about establishing a “Central University dedicated to Himalayan technology”. With respect to conservation efforts, BJP proposes to “launch ‘National Mission on Himalayas’ as a unique programme of inter-governmental partnership, in coordinated policy making and capacity building across states and sectors”, not to mention promote tourism as well.

The BJP also says it would like to make the point of tackling climate change a part of its foreign policy. However, its proposed power generation strategy does also include coal, natural gas and oil, apart from wanting to maximise the potential of renewable energy sources. Moreover, it also promotes the use of carbon credits, which is an iffy idea as this is a very malleable system susceptible to abuse, especially by richer agents operating across borders.

“Take steps to increase the domestic coal exploration and production, to bridge the demand and supply gap. Oil and gas explorations would also be expedited in the country. This will also help to reduce the import bill.”

Up to this point, not much is different from what the Congress is already promising, albeit under different names.

The BJP appears to be very pro-nuclear. Under its ‘Cultural Heritage’ section, the manifesto mentions Ram Setu in the context of its vast thorium deposits. How this is part of our cultural heritage, I’m not sure. The party also proposes to build “world class, regional centres of excellence of scientific research” for nanotechnology, material sciences, “thorium technology” and brain research. Sure, India has thorium reserves, but the design for a thorium-based nuclear power plant came out only in February 2014, and an operational system is only likely to be ready by the end of this decade.

Troubling stuff

If spending doesn’t increase, these promises are meaningless. Moreover, there are also some pending Bills in the Lok Sabha concerning the setting up of new universities, as well as a materials science initiative named ISMER pending from 2011. With no concrete promises, will those initiatives set forth by the INC but not really followed through see the light of day?

In fact, two things trouble me.

  1. No mention of scientific research that is not aimed at improving the quality of life in a direct way, i.e. our space program, supercomputing capabilities, fundamental research, etc.
  2. How the private sector is likely to be motivated to invest in government-propelled R&D, to what extent, and if it will be allowed to enter sensitive areas like power generation.

Clearly, the manifesto is a crowd-pleaser, and to that end it has endeavoured to bend science to its will. In fact, there is nothing more troubling in the entire document than the BJP’s intention to “set up institutions and launch a vigorous program to standardize and validate the Ayurvedic medicine”. I get that they’re trying to preserve our historical traditions, etc., but this sounds like an agenda of the Minitrue to me.

And before this line comes the punchline: “We will start integrated courses for Indian System of Medicine (ISM) and modern science and Ayurgenomics.”

The nonsense in Wockhardt’s reply to the FDA

The Copernican
April 7, 2014

I love the nonsense that companies put in their replies to accusations of willful negligence. Consider this from Wockhardt after a US FDA inspection found piss, mold and samples tested “into compliance” [emphasis mine] at its Chikalthana manufacturing plant in Aurangabad, India.

We are also leveraging technology and deploying enterprise-wide software that will streamline the entire quality and compliance system. This is backed by a comprehensive compliance training program for all personnel responsible for manufacturing and quality control.

Over the last few days, speculation has been rife that Wockhardt will be able to reach a quick resolution with the FDA. Earlier today, Sun Pharma announced that it will be fully acquiring Ranbaxy Labs – two other companies that have come under fire from the FDA for maintaining unsanitary working conditions.

What Wockhardt has rambled on about in its reply should’ve happened before the plant received a license to manufacture drugs (goes to show how terrible India’s regulatory measures are). One of the drugs is a variant of metoprolol, a beta-blocker used to treat some cardiovascular diseases, hypertension and angina pectoris. It finds mention in a WHO factsheet of essential medicines.

In the US alone, 27 million prescriptions for metoprolol are filled yearly according to a US National Library of Medicine assessment. After the FDA find, exports to the US are likely to be stopped from Wockhardt. For India, I couldn’t find the exact amount of consumption, but according to many manufacturers, it’s a ‘high growth trajectory’ drug and its consumption through multiple variants could easily be in the tens of millions.

Of course, Wockhardt is not alone in this – its Waluj plant has also come under scrutiny. Last month, Sun Pharma’s Gujarat plant was barred from exporting to the USA as was, in 2013, Ranbaxy’s Toansa plant. Incidentally, a Bangalore-based facility of Canadian manufacturer Apotex, Inc., has also been banned. This goes to show that, even though the insignificant exports may not hit us financially, our regulations are sparse enough to allow both foreign and domestic players to operate in shoddy conditions and release their products into the domestic market.

That three companies manufacturing such an important drug were awarded a license to manufacture with what appears to be a lack of even customary inspections is startling, especially given the contrast between Wockhardt’s swanky Mumbai corporate office and the conditions in its manufacturing units strewn around suburban and rural India. A Reuters report mentions that,

There are just 1,500 drug inspectors responsible for more than 10,000 factories in India, where one in every 22 locally made samples was of sub-standard quality according to a study carried out two years ago.

Here’s FDA’s letter to Wockhardt CEO Habil Khorakiwala from November 2013 warning about the Chikalthana plant’s working conditions, and another letter about the Waluj plant’s from July 2013.

The Indian medical devices industry stays foreign

India has a burgeoning medical tourism industry which, according to some estimates, is going to be worth Rs.9,500 crore in 2015 and Rs.54,000 crore in 2020. This industry evidently relies on medical imaging and diagnostics. According to an article published in 2013 by the Center for the Advanced Study of India at the University of Pennsylvania, over 75 per cent of India’s medical imaging equipment is imported, constituting a Rs.18,000-crore industry in 2011 and growing at a compounded rate of 16 per cent in 2010-2015.

There is an import duty on fully-finished devices averaging 10 per cent, which consumers pay. What is worse is that if device components are imported individually and assembled in India, there is an additional excise duty and VAT on the components, increasing the device cost. Therefore, taxation works against domestic production – and, by extension, against exports.

Another funny thing is that disposable medical equipment, which is technologically non-intensive, comprises less than 10% of our imports – i.e., we largely produce it locally. Technology-intensive equipment makes up more than half of our imports, with the exception of X-ray imaging devices, which comprise 25% of our exports. These are numbers from the Annual Survey of Industry, CMIE and the Department of Commerce (GoI).

The more some devices remain import-intensive, the more they could inflate healthcare costs in a country where only around 20% of healthcare is publicly funded.

This seems a weird position to be in. On the one hand, we plan to expand our public healthcare system to cover more than 500 million people by 2020; on the other, we don’t reduce the costs of the devices that will form the spine of this system. There was a situation in 2010, ahead of the presentation of the Union Budget, when the Department of Pharma sought a cut in the customs duty on some medical devices to facilitate imports while the Association of Indian Medical Device Industry sought a hike in the customs duty to promote domestic innovation.

Thanks to our population, per capita expenditure on medical technology is a frugal $2-2.5. This is an important figure because it highlights how lucrative the Indian market must seem to giants like Siemens and GE. Further on the downside, urban centers are the primary consumers of high-quality, ‘high-technology’, high-price medical imaging/diagnostic equipment and implants. A July 2010 report from Deloitte explained this well:

One example to illustrate low penetration is sales of pacemakers. At 18,000 units per year, India’s pacemaker penetration is just 1% of western levels. According to Dinesh Puri, CEO, MediVed, India should be selling a million pacemakers a year, considering heart disease is a major killer in India.

‘Poverty first, Mars next’ is a non-idea

The Copernican
April 4, 2014

“I am on nobody’s side because nobody is on my side.” – Treebeard, Lord of the Rings

Thanks to two wonderful pieces in the April 3 edition of The Hindu (by D. Balasubramanian and R. Prasad) talking about how scientific enterprise in India has been constantly undermined, it’s pretty clear that there is a perception schism between the fantasies of and the reality of publicly funded scientific development in the country. The underminers in question have been bureaucracy and, periodically, ignorance by the Indian polity – of late, in the form of political manifestos choosing to leave out scientific agendas in favour of more populist schemes.

But with bureaucracy, that is only to be expected. What is not is that, beyond a circle of scientists and science communicators, people seem to be okay with it, too. And this exclusion from the scheme of things has become two-pronged. Among the people, science has been malleated into the form of an unpredictable tool to further our developmental goals. Among the politicians, science has become a thing whose fundamentals can be called into question to pander to political expediency.

Sadly, scientific research and development has been instrumental to India’s progress since even the British Raj, when the construction of factories, transportation routes and communication lines (including what is still one of the world’s largest railway networks) helped dismantle feudalism. After Independence, however, a series of unfortunate mistakes have come together to knock the scientific temperament out of its rightful place in governance.

As Dr. Mathai Joseph told The Hindu, “The fact that scientific departments are modelled on the rest of the bureaucracy has turned out to be a big mistake. That’s because bureaucracy is not designed to encourage innovation.”

Who runs the science?

In August 2012, Colin Macilwain had touched on a similar topic with a piece in Nature titled ‘What matters for science is who runs the country’. Working on the reasonable assumptions that a) researchers would want someone in the government to further their interests, and b) a government would want a scientist on its side to hone policies, Macilwain suggested that the role of a Scientific Adviser was to bridge the political and scientific classes.

Over the years, however, the Indian chief SA’s role, though continuing to attempt to bridge this divide, has become steadily less effectual. At least as far as C.N.R. Rao is concerned: he set up the IISERs and the Science and Engineering Research Board (SERB), which serve important goals in their own right but also fall prey to the effects of a bureaucratic administration. Moreover, though there has been a growing demand from the scientific community to get the Indian government to spend more than 1% of its GDP on R&D, there is no concerted call from either side to establish a mechanism to ensure that grants are allocated purely on merit, and thereafter to ensure accountability in spending.

In the Vote of Accounts presented by FM P. Chidambaram on February 17, point #74 did propose something remedial (albeit as a tax-redemption measure): “I … propose to set up a Research Funding Organisation [RFO] that will fund research projects selected through a competitive process. Contributions to that organisation will be eligible for tax benefits. This will require legislative changes which can be introduced at the time of the regular Budget.”

Incidentally, when Rao helped set up the SERB in 2008, its stated aim was to promote research in the basic sciences and provide financial assistance to those who engaged in it. Detrimentally, its Board is chaired by a secretary to the Government of India, and 7 of its 16 other Board members are government agents. As for how likely the next government is to pursue the RFO: I don’t know, but I don’t have my hopes up. For as long as grant-allocation and the government remain strongly coupled, not much is likely to change.

In fact, the government’s involvement is not limited to grants but also extends to issues of autonomy, such as in the appointment of Chancellors or Vice-chancellors, all of which together directly affect the quality and direction of research undertaken. And the situation is only likely to worsen, as D. Balasubramanian mentions in his article, when educational institutions like IITs and IIMs are proposed to be set up to make political amends.

I write all of this, of course, keeping in mind the following lines from the April 3 Speaking of Science column in The Hindu: “The central finance ministry, with one stroke of a pen, has cut the operating budget of all science departments by almost 30 per cent of the originally sanctioned amounts. As a result, the science ministries and departments have defaulted in their grant payments and in some instances even salaries. Many young research students are yet to be paid their monthly fellowship money.”

Good idea, bad implementation

Simultaneously, it would seem the government has acquired a bias over the years about the sectors it considers strategic and those it considers available for politically expedient manipulation. The former section accommodates areas like social policies, domestic policies, defence, PDS, employment, etc. The latter accommodates areas like scientific research – but not all of it.

Consider how areas like telecommunication and nuclear physics have received substantial monetary and infrastructural support from the government, while astronomy and materials science lag behind. This divisive addressing of different disciplines has also resulted in a fractious working environment for scientists: collaborations are too few and far between, and interdisciplinary R&D is stifled. If the words of Luiz Davidovich, a Brazilian researcher speaking at the World Science Forum in Rio de Janeiro, are to be believed, this is a problem plaguing the world’s emerging powers. Perhaps this is one of the most important lessons we should be learning from the USA and the EU.

The government, in its choice of subjects, has also been limited by its own middling knowledge of how likely these enterprises are to elevate sections of the Indian population out of poverty and toward better access to the basic amenities (if not to further vested interests, of course). This is again an instance of expediency and is not sustainable for the scientific community because it implies a support-structure that requires scientists to submit to the government’s agenda. The ideal situation would have the roles well balanced, to see scientific research blossom to improve the quality of all walks of life.

Now, any meaningful scientific output geared at improving the quality of life in the country is being poisoned by government mismanagement. For instance, while many countries have been able to engender a healthy debate on whether a nuclear power plant should be built or if GM crop seeds should be sold, a pall of negativity has descended on these subjects in India because we are unable to separate the DAE from nuclear power generation and the DBT from genetic modification. We must thank a stubborn lack of transparency for this.

Scientific research as an industry

If the fantasy of government funding fully decoupled from government control were to be realised, and the screen of bureaucracy lifted from our institutions, we would have the chance to be better organised with our research interests. Put practically, we wouldn’t have to fund a fusion project in France because we’d have the temperament to develop a low-cost alternative in India itself (where labour continues to be cheap).

Those in power should know that science, as an organised articulation of human curiosity, is capable of developing products, services and technologies that go beyond alerting farmers of approaching storms or reducing the cost of a smartphone to less than that of one plumbed toilet. Scientific research can also found industries (opening up the thousands of jobs that campaigning politicians promise to the marginalised sections of the electorate), engage graduating scholars (the number of research degrees awarded increased by over 50% between 2008 and 2011, to 16,093, according to a UGC report), elevate the quality of education in the country, promote innovation (by reducing the time taken for a prototype engineered in the lab to become a mass-produced product – an important mechanism for labs to prove useful in the eyes of the tax-payer), and cure diseases (did you hear about the Foldscope?).

In fact, those who clamour that India should be alleviating poverty before launching satellites to Mars should shed a sadly prevalent impression of scientific research and technological development, one that precludes incentives such as job-creation and technology-transfer. Scientific R&D is an industry – rather, it can be – like any other. By launching a satellite to Mars (hopefully Mangalyaan will make it), technicians at ISRO now have the capability to coordinate such sophisticated programs. They could also possibly bring in revenue in the future by offering high-load launch vehicles like the GSLV to developing countries that can’t cough up for the American/European coffers. And in the midst of all this, we must not over-celebrate the frugal budget with which we achieved this feat but use it as an opportunity to ask for incrementally more funding.

In another example, India designed and manufactured some of the superconducting magnets, accelerator heater protection systems and cryogenic facilities used to operate the Large Hadron Collider in Europe. Such components are also commonly used in medical imaging and diagnostics, and India already has a burgeoning medical tourism industry which, according to some estimates, is going to be worth Rs.9,500 crore in 2015. Thus, it seems we also stand to gain if only we could leverage local talent in devising products tailored for the Indian consumer.

As Rahul Sinha, a professor at the Institute of Mathematical Sciences, Chennai, remarked: “Physics is a technology developer.” So this schism between ‘blue sky’ scientific research and India’s developmental hurdles is one that, in an ideal world, doesn’t exist. That it does in our country is thanks only to a government’s mismanagement of its powers.

An elusive detector for an elusive particle

(This article originally appeared in The Hindu on March 31, 2014.)

In the late 1990s, a group of Indian physicists pitched the idea of building a neutrino observatory in the country. The product of that vision is the India-based Neutrino Observatory (INO) slated to come up near Theni district in Tamil Nadu, by 2020. According to the 12th Five Year Plan report released in October 2011, it will be built at a cost of Rs.1,323.77 crore, borne by the Departments of Atomic Energy (DAE) and Science & Technology (DST).

By 2012, these government agencies, with the help of 26 participating institutions, were able to obtain environmental clearance, and approvals from the Planning Commission and the Atomic Energy Commission. Any substantial flow of capital will happen only with Cabinet approval, which has still not been given after more than a year.

If this delay persists, the Indian scientific community will face greater difficulty in securing future projects involving foreign collaborators because we can’t deliver on time. Worse still, bright Indian minds that have ideas to test will prioritise foreign research labs over local facilities.

‘Big science’ is international

This month, the delay acquired greater urgency. On March 24, the Institute of High Energy Physics, Beijing, announced that it was starting construction on China’s second major neutrino research laboratory — the Jiangmen Underground Neutrino Observatory (JUNO), to be completed at a cost of $350 million (Rs. 2,100 crore) by 2020.

Apart from the dates of completion, what Indian physicists find more troubling is that, once ready, both INO and JUNO will pursue a common goal in fundamental physics. Should China face fewer roadblocks than India does, our neighbour could even beat us to some seminal discovery. This is not a jingoistic concern for a number of reasons.

All “big science” conducted today is international in nature. The world’s largest scientific experiments involve participants from scores of institutions around the world and hundreds of scientists and engineers. In this paradigm, it is important for countries to demonstrate to potential investors that they’re capable of delivering good results on time and sustainably. The same paradigm also allows investing institutions to choose whom to support.

India is a country with prior experience in experimental neutrino physics. Neutrinos are extremely elusive fundamental particles whose many unmeasured properties hold clues about why the universe is the way it is.

In the 1960s, a neutrino observatory located at the Kolar Gold Fields in Karnataka became one of the world’s first experiments to observe neutrinos in the Earth’s atmosphere, produced as a by-product of cosmic rays colliding with its upper strata. However, the laboratory was shut in the 1990s because the mines were being closed.

However, Japanese physicist Masatoshi Koshiba and collaborators built on this observation with a larger neutrino detector in Japan, and went on to make a discovery that (jointly) won him the Nobel Prize for Physics in 2002. If Indian physicists had been able to keep the Kolar mines open, by now we could have been on par with Japan, which hosts the world-renowned Super-Kamiokande neutrino observatory involving more than 900 engineers.

Importance of time, credibility

In 1998, physicists from the Institute of Mathematical Sciences (IMSc), Chennai, were examining a mathematical parameter of neutrinos called theta-13. As far as we know, neutrinos come in three types, and spontaneously switch from one type to another (Koshiba’s discovery).

The frequency with which they engage in this process is influenced by their masses and sources, and theta-13 is an angle that determines the nature of this connection. The IMSc team calculated that it could at most measure 12°. In 2012, the Daya Bay neutrino experiment in China found that it was 8-9°, reaffirming the IMSc results and drawing attention from physicists because the value is particularly high. In fact, INO will leverage this “largeness” to investigate the masses of the three types of neutrinos relative to each other.
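For the record, here is the standard two-flavour approximation in which theta-13 shows up at a reactor experiment like Daya Bay. This is textbook notation in LaTeX (natural units), not the IMSc team’s specific calculation.

    % Survival probability of a reactor electron antineutrino after
    % travelling a distance L with energy E:
    P(\bar{\nu}_e \to \bar{\nu}_e) \;\approx\;
        1 - \sin^2(2\theta_{13})\,\sin^2\!\left(\frac{\Delta m^2_{31}\,L}{4E}\right)

    % A larger \theta_{13} means a deeper dip in this probability, which is
    % what Daya Bay measured, and the "largeness" INO hopes to exploit when
    % comparing the three neutrino masses with each other.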

So, while the Indian scientific community is ready to work with an indigenously designed detector, the delay of a go-ahead from the Cabinet becomes demoralising because we automatically lose time and access to resources from potential investors.

“This is why we’re calling it an India-based observatory, not an Indian observatory, because we seek foreign collaborators in terms of investment and expertise,” says G. Rajasekaran, former joint director of IMSc, who is involved in the INO project.

On the other hand, China appears to have been both prescient and focussed on its goals. It purchased companies manufacturing the necessary components in the last five years, developed the detector technology in the last 24 months, and was confident enough to announce completion in barely six years. Thanks to its Daya Bay experiment holding it in good stead, JUNO is poised to be an international collaboration, too. Institutions from France, Germany, Italy, the U.S. and Russia have evinced interest in it.

Beyond money, there is also a question of credibility. Once Cabinet approval for INO comes through, it is estimated that digging the vast underground cavern to contain the principal neutrino detector will take five years, and the assembly of components, another year more. We ought to start now to be ready in 2020.

Because neutrinos are such elusive particles, any experiments on them will yield correspondingly “unsure” results that will necessitate corroboration by other experiments. In this context, JUNO and INO could complement each other. Similarly, if INO is delayed, JUNO is going to look for confirmation from experiments in Japan, South Korea and the U.S.

It is notable that the INO laboratory’s design permits it to also host a dark-matter decay experiment, in essence accommodating areas of research that are demanding great attention today. But if what can only be called an undue delay on the government’s part continues, we will again miss the bus.

Forget me. I’m there.

You don’t have to walk up to stand next to me, you don’t have to hug me. You don’t have to want to kiss me. You just have to look at me in the eye, Stranger, when you walk past. You needn’t smile either. You just have to acknowledge that I exist. That’s all I need.

You just have to drive your car in front of mine and switch on your indicator when you’re taking a turn. Even if there’s no other car on the road except ours and it’s dusk. Turn on your indicator all for me and I’m yours. Tell me you’re closing up for the day just when I’m about to step in your store. Don’t bring the shutters down on my face without a warning. Tell me you’re sorry without meaning it but just because I’m there about to enter your store. Tell me and I’m all yours.

Share an umbrella with your friend when it rains and whisper into her ear about how I’m getting wet, standing in the middle of the road like that. Giggle behind my back about the fool I look and I will thank you. Be annoyed when I set my glass of orange juice on your glass table without a coaster and I’ll know you know I’m here.

Fix the automated doors at the mall to open when I’m approaching them and I will kiss one goodbye. Flash a marquee on the TV asking me to stay indoors because a storm’s coming and I’ll die happy that night. Give me a dial tone when I pick up the phone because I don’t want you to assume nobody’s listening. Somebody’s listening, somebody’s listening all the time. I think that’s me.

So… don’t walk up to me to shake my hand. Don’t bump into me and then act like you’ve forgotten me. Forget me, but when you see me, smile.

Lord of the Rings Day

Today is Lord of the Rings Day. On this day, in the year 3019 of the Third Age, Frodo Baggins and Samwise Gamgee reach the Sammath Naur and cast the One Ring into Orodruin, in whose fires the ring was first forged. Thus the ring is destroyed, leading to the downfall of Sauron, the Dark Lord. However, this doesn’t mark the end of the War of the Ring (although it does in the movies) – that happens when Saruman is defeated in the Battle of Bywater by the hobbits on November 3 of the same year.

Why do I still remember the date? I don’t know. Tolkien’s books were good, three of the best, in fact, and much better than the tropes that came after. There were a few notable exceptions, but nothing as original has come into being until, I’d say, GRRM and Erikson. I was briefly excited by Robert Jordan but his more classical narrative combined with a droning style bored me. It was never the length, because one of my enduring favourites is Steven Erikson’s Malazan Book of the Fallen series, which has seen 10 books and one part of a trilogy already out (all kickass – you should check them out).

Nevertheless, reading Lord of the Rings in 2003 was an important part of my life. In the years since, I have taken away different morals from the book – which, thankfully, aren’t as mundane as Jordan’s nor as multi-hued as Erikson’s (or as gruesome as Martin’s or as juvenile as Feist’s). Beyond the immediate take-away that is good-versus-evil, there are tales of friendships, sacrifices, trust, humility and leadership. And what a great epic all of it made! As it happens, Lord of the Rings Day is actually Tolkien Reading Day. So if you haven’t already read the trilogy, or its adorable prequel The Hobbit (or Silmarillion, for that matter), grab a copy and start. It’s never too late.