Neuromorphic hype

We all know there’s a difference between operating an Indica Diesel car and a WDP 4 diesel locomotive. The former has two cylinders and the latter 16. But that doesn’t mean the WDP 4 simply has eight times as many components as the Indica. This is what comes to mind when I come across articles that trumpet an achievement without paying any attention to its context.

In an example from yesterday, IEEE Spectrum published an article with the headline ‘Nanowire Synapses 30,000x Faster Than Nature’s’. An artificial neural network is a network of small data-processing components called neurons. Once the neurons are fed data, they work together to analyse it and solve problems (like spotting the light from one star in a picture of a galaxy). The network also iteratively adjusts the connections between neurons, called synapses, so that the neurons cooperate more efficiently. The architecture and the process broadly mimic the way the human brain works, so they’re also collected under the label ‘neuromorphic computing’.
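To make that description concrete, here’s a minimal sketch – my own illustration in Python, not anything from the article or the study – of what ‘iteratively adjusting synapses’ looks like: a toy network whose connection weights are nudged after every pass so its outputs drift closer to the answers we want.

```python
# A toy network whose synapses (weights) are adjusted iteratively so the
# output gets closer to a target -- a bare-bones illustration of the
# learning loop, not the superconducting circuit in the study.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: inputs and the outputs we want the network to learn.
X = rng.normal(size=(100, 2))               # 100 examples, 2 input features
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # a simple rule to learn

w = rng.normal(size=2)   # "synaptic" weights
b = 0.0                  # bias
lr = 0.1                 # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    out = sigmoid(X @ w + b)          # the neuron's response to each example
    err = out - y                     # how wrong it was
    w -= lr * X.T @ err / len(y)      # nudge each synapse a little
    b -= lr * err.mean()

accuracy = ((sigmoid(X @ w + b) > 0.5) == y).mean()
print(f"accuracy after training: {accuracy:.2f}")
```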

Now consider this excerpt:

“… a new superconducting photonic circuit … mimics the links between brain cells—burning just 0.3 percent of the energy of its human counterparts while operating some 30,000 times as fast. … the synapses are capable of [producing output signals at a rate] exceeding 10 million hertz while consuming roughly 33 attojoules of power per synaptic event (an attojoule is 10⁻¹⁸ of a joule). In contrast, human neurons have a maximum average [output] rate of about 340 hertz and consume roughly 10 femtojoules per synaptic event (a femtojoule is 10⁻¹⁵ of a joule).”
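For what it’s worth, the headline ratios do follow from the numbers quoted in the excerpt – a quick back-of-the-envelope check, using only those figures:

```python
# Back-of-the-envelope check of the ratios in the excerpt above,
# using only the numbers quoted there.
synapse_rate_hz = 10e6     # artificial synapse: >10 million output events per second
neuron_rate_hz = 340       # human neuron: ~340 per second

synapse_energy_j = 33e-18  # ~33 attojoules per synaptic event
neuron_energy_j = 10e-15   # ~10 femtojoules per synaptic event

print(f"speed ratio:  {synapse_rate_hz / neuron_rate_hz:,.0f}x")  # ~29,412x, i.e. "some 30,000 times as fast"
print(f"energy ratio: {synapse_energy_j / neuron_energy_j:.1%}")  # ~0.3% "of the energy"
```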

The article, however, skips the fact that the researchers operated only four circuit blocks in their experiment – while there are 86 billion neurons on average in the human brain working at the ‘lower’ efficiency. When such a large assemblage functions together, there are emergent problems that aren’t present when a smaller assemblage is at work, like removing heat and clearing cellular waste. (The human brain also contains “85 billion non-neuronal cells”, including the glial cells that support neurons.) The energy efficiency of the neurons must be seen in this context, instead of being directly compared to a bespoke laboratory setup.

Philip W. Anderson’s ‘more is different’ argument offers a more insightful objection to such reductive thinking. In a 1972 essay, Anderson, a theoretical physicist, wrote:

“The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more the elementary particle physicists tell us about the nature of the fundamental laws the less relevance they seem to have to the very real problems of the rest of science, much less to those of society.”

He contended that the constructionist hypothesis – the idea that you can start from first principles and arrive straightforwardly at a cutting-edge discovery in a given field – “breaks down when confronted with the twin difficulties of scale and complexity”. That is, things that operate on a larger scale and with more individual parts are physically greater than the sum of those parts. (I like to think of Anderson’s insight as the spatial analogue of L.P. Hartley’s statement of the same nature about time: “The past is a foreign country: they do things differently there.”)

So let’s not celebrate something because it’s “30,000x faster than” the same thing in nature – as the Spectrum article’s headline goes – but because it represents good innovation in and of itself. Indeed, the researchers who conducted the new study and are quoted in the article don’t make the comparison themselves but focus on the leap forward their innovation portends in the field of neuromorphic computing.

Faulty comparisons, on the other hand, could inflate readers’ expectations of what future innovations might achieve, and when those achievements (almost) inevitably fall behind nature’s, the unmet expectations could seed disillusionment. We’ve already had this happen with quantum computing. Spectrum’s choice could have been motivated by a wish to pique readers’ interest, which is a fair thing to aspire to, but the fact remains that the headline leaned on a clichéd comparison with nature instead of expending more effort to frame the idea right.

The constructionist hypothesis and expertise during the pandemic

Now that COVID-19 cases are rising again in the country, the trash talk against journalists has been rising in tandem. The Indian government was unprepared and hapless last year, and it is this year as well, if only in different ways. In this environment, journalists have come under criticism along two equally unreasonable lines. First, many people, typically supporters of the establishment, either don’t or can’t see the difference between good journalism and contrarianism, and don’t or can’t acknowledge the need for expertise in the practice of journalism.

Second, the recognition of expertise itself has been sorely lacking across the board. Just like last year, when lots of scientists dropped what they were doing and started churning out disease-transmission models, each one more ridiculous than the last, this time – in response to a more complex ‘playing field’ involving new and more numerous variants, intricate immunity-related mechanisms and labyrinthine clinical trial protocols – too many people have been shooting their mouths off, and getting most of it wrong. All of these misfires have reminded us, again and again, of two things: that expertise matters, and that unless you’re an expert on something, you’re unlikely to know how deep it runs. The latter isn’t trivial.

There’s what you know you don’t know, and what you don’t know you don’t know. The former is the birthplace of learning. It’s the perfect place from which to ask questions and fill gaps in your knowledge. The latter is the verge of presumptuousness — a very good place from which to make a fool of yourself. Of course, this depends on your attitude: you can always be mindful of the Great Unknown, such as it is, and keep quiet.

As these tropes have played out in the last few months, I have been reminded of an article written by the physicist Philip Warren Anderson, called ‘More is Different’, and published in 1972. His idea here is simple: that the statement “if everything obeys the same fundamental laws, then the only scientists who are studying anything really fundamental are those who are working on those laws” is false. He goes on to explain:

“The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a ‘constructionist’ one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. … The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behaviour of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviours requires research which I think is as fundamental in its nature as any other.”

The seemingly endless intricacies that beset the interaction of a virus, a human body and a vaccine are proof enough that the “twin difficulties of scale and complexity” are present in epidemiology, immunology and biochemistry as well – and testament to the foolishness of any claims that the laws of conservation, thermodynamics or motion can help us say, for example, whether a particular variant infects people ‘better’ because it escapes the immune system better or because the immune system’s protection is fading.

But closer to my point: not even all epidemiologists, immunologists and/or biochemists can meaningfully comment on every form or type of these interactions at all times. I’m not 100% certain, but at least from what I’ve learnt reporting topics in physics (and conceding happily that covering biology seems more complex), scale and complexity work not just across but within fields as well. A cardiologist may be able to comment meaningfully on COVID-19’s effects on the heart in some patients, or a neurologist on the brain, but they may not know how the infection got there even if all these organs are part of the same body. A structural biologist may have deciphered why different mutations change the virus’s spike protein the way they do, but she can’t be expected to comment meaningfully on how epidemiological models will have to be modified for each variant.

To people who don’t know better, a doctor is a doctor and a scientist is a scientist, but as journalists plumb the deeper, more involved depths of a new yet specific disease, we bear from time to time a secret responsibility to be constructive and not reductive, and this is difficult. It becomes crucial for us to draw on the wisdom of the right experts, who wield the right expertise, so that we’re moving as much and as often as possible away from the position of what we don’t know we don’t know, even as we ensure we’re not caught in the traps of what experts don’t know they don’t know. The march away from complete uncertainty, towards uncertainties we can at least name, is precarious.

Equally, at this time, to make our own jobs that much easier, or at least less acerbic, it’s important for everyone else to know this as well – that more is vastly different.