New management at Nautilus

When an email landed in my inbox declaring that the beleaguered science communication magazine Nautilus would be “acquired by ownership group of super-fans”, I thought it was going to become a cooperative. It was only when I read the extended statement that I realised the magazine was undergoing a transformation that wasn’t at all new to the global media landscape.

A super-group of investors has come to Nautilus’s rescue, bearing assurances that publisher John Steele repeats in the statement without any penitence for having stiffed the magazine’s contributors for months on end, in some cases for over a year, for pieces already published: “Together we will work even harder to expand the public’s knowledge and understanding of fundamental questions of scientific inquiry, as well as their connection to human culture.” Steele also appears to be blind to the irony of his optimism when the “craven shit-eating” of private equity just sank the amazing Deadspin (to quote from a suitably biting obituary by Alex Shephard).

The statement doesn’t mention whether the new investment covers pending payments and by when. In fact, the whole statement is obsessed with Nautilus’s commitment to science in a tone that verges on cheerleading – and now and then crosses over, too – which is bizarre because Nautilus is a science communication magazine, not a science magazine, so its cause, to use the term loosely, is to place science in the right context and on occasion even interrogate it. But the statement mentions an accompanying public letter entitled ‘Science Matters’. According to Steele,

The letter is a public commitment by the Nautilus team, its staff, advisors, and its contributors; leading thinkers, researchers, teachers, and businesspeople; and the public at large to tirelessly advance the cause of science in America and around the world.

Huh?

By themselves, such commitments don’t bode well (they’re awfully close to scientism), but they assume a frightening level of plausibility when read together with the list of investors. The latter includes Larry Summers, his wife Elisa New, and Nicholas White. One of the others, Fraser Howie, is listed as an “author” but according to his bio in the Nikkei Asian Review, “He has worked in China’s capital markets since 1992.” His authorship probably refers to his three books, but they’re all about the Chinese financial system.

Everyone here is a (white) capitalist, most of them men. Call me cynical but something about this doesn’t sit well. For all the details in the statement of the investors’ institutional affiliations, it’s hard to imagine them sitting around a table and agreeing that Nautilus needs to be critical of, instead of sympathetic to, science – especially since the takeover will also transform the magazine from a non-profit to a for-profit endeavour.

The private festival

I used to think I lived in a wonderful part of Bangalore: in Malleshwaram, and not just in Malleshwaram but in a gated apartment complex with great access to greenery and lots of eateries, safe walking areas, recreational spaces, and a balcony on the fourth floor that offered a lovely view of the city on rainy evenings.

But of late the wealthy residents of this complex – most of them Hindus – have become markedly louder in their celebrations of religious and traditional occasions, installing giant speakers in the common areas to blare Bollywood music, undertaking processions along the perimeter to the accompaniment of drums and other instruments, even going door to door to angrily demand residents attend a flag-hoisting ceremony on Independence Day.

Each occasion only seems to be louder than the last, with more ‘attractions’ thrown in. The complex’s sole notice board is located in the basement and the owners’ association isn’t in the habit of asking for permission before organising loud celebrations. Everyone is simply expected to have a good time (much like the obnoxious presumption among many Hindi-speakers that everyone speaks Hindi).

A few minutes ago, I had to shout to make my father hear me over the din of a procession downstairs marking Karnataka Rajyotsava Day. A small group of decked-up men and women were swaying to the rhythms of a dhol-playing band in the anterior plaza, whose inner edge runs right up against houses on the ground floor, with little thought for the people within. Its outer edge, on the other hand, extends a few score metres before ending behind a 20-ft-high metal gate guarded by four or five security personnel.

These aren’t just dutiful assertions of one’s religious identity; they’re altogether a ridiculous display of elitism that – even in its most sensational avatar – would much rather stay indoors and away from the hoi polloi.

The virtues of local travel

Here’s something I wish I’d read before overtourism and flygskam removed the pristine gloss of desirability from the selfies, 360º panoramas and videos the second-generation elites posted every summer on social media:

It’s ok to prioritize friendships, community, and your mental health over travelling.

Amir Salihefendic, the head of a tech company, writes this after having moved from Denmark to Taiwan for a year, and reflects on the elements of working remotely, the toll it inevitably takes, and how the companies (and the people) that champion this mode of work often neglect to mention its unglamorous side.

Remote work works only if the company’s management culture is cognisant of it. It doesn’t work if one employee of a company that ‘extracts’ work by seating its people in physical proximity, such as in offices or even co-working spaces, chooses to work from another location. This is because, setting aside the traditional reasons for which people work in the presence of other people, offices are also designed to institute conditions that maximise productivity and, ideally, minimise stress or mental turbulence.

But what Salihefendic wrote is also true of travelling itself, which in his case meant moving from Denmark to Taiwan. Travel – in the form practised by those who sustain the distinction between a place to work, or experience pain, and a place in which to experience pleasure – renders the long-distance kind a class aspiration, and the ‘opposing’ short-distance kind a ‘lesser’ thing for not maintaining the same social isolation that our masculine cities do.

This is practically the Protestant ethic that Max Weber described in his analysis of the origins of capitalism, and which Silicon Valley dudebros dichotomised as ‘work hard, party harder’. And for once, it’s a good thing that this kind of living is out of reach of nearly 99% of humankind.

Exploring neighbourhood sites is more socio-economically and socio-culturally (and not just economically and just culturally) productive. Instead of creating distinct centres of pain and pleasure, of value creation and value dispensation, local travel can reduce the extent and perception of urban sprawl, contribute to hyperlocal economic development, birth social knowledge networks that enhance civic engagement, and generally defend against the toll of extractive capitalism.

For example, in Bengaluru, I would like to travel from Malleshwaram to Yelahanka, or – in Chennai – from T Nagar to Kottivakkam, or – in Delhi – from Jor Bagh to Vasant Kunj, for a week or two at a time, in each case exploring a different part of the city that might as well be a different city, characterised by a unique demographic distribution, public spaces, cuisine and civic issues. And when I do, I will still have my friends and access to my community and to the social support I need to maintain my mental health.

The calculus of creative discipline

Every moment of a science fiction story must represent the triumph of writing over world-building. World-building is dull. World-building literalises the urge to invent. World-building gives an unnecessary permission for acts of writing (indeed, for acts of reading). World-building numbs the reader’s ability to fulfil their part of the bargain, because it believes that it has to do everything around here if anything is going to get done. Above all, world-building is not technically necessary. It is the great clomping foot of nerdism.

Once I’m awake and have had my mug of tea, and once I’m done checking Twitter, I can quote these words of M. John Harrison from memory: not because they’re true – I don’t believe they are – but because they rankle. I haven’t read any of Harrison’s writing; I can’t remember the names of any of his books. Sometimes I don’t even remember his name, only that there was this man who uttered these words. Perhaps it is to Harrison’s credit that he’s clearly touched a nerve, but I’m reluctant to concede any more than this.

His (partial) quote reflects a narrow view of a wider world, and it bothers me because I remain unable to extend the conviction that he’s seeing only a part of the picture to the conclusion that he lacks imagination; he’s a writer of not inconsiderable repute, at least according to Wikipedia, so I doubt he has any trouble imagining things.

I’ve written about the virtues of world-building before (notably here), and I intend to make another attempt in this post. I should mention that what both attempts, both defences, have in common is that they’re not prescriptive. They’re not recommendations to others; they’re non-generalisable. They’re my personal reasons to champion the act, even art, of world-building; my specific loci of resistance to Harrison’s contention. But at the same time, I don’t view them – and neither should you – as inviolable or as immune to criticism, although I suspect this display of a willingness to reason may not go far in terms of eliminating subjective positions from this exercise. So make of it what you will.

There’s an idea in mathematical analysis called smoothness. Let’s say you’ve got a curve drawn on a graph, between the x- and y-axes, shaped like the letter ‘S’. Let’s say you’ve got another curve drawn on a second graph, shaped like the letter ‘Z’. According to one definition, the S-curve is smoother than the Z-curve because it has fewer sharp edges. A diligent high-schooler might take recourse to differential calculus to explain the idea. Say the Z-curve on the graph is the result of a function Z(x) = y. At the point on the x-axis where the Z-curve makes a sharp turn, the derivative Z'(x) doesn’t exist: the slope jumps abruptly from one value to another. Points where the derivative of a function is zero or doesn’t exist are called critical points. The S-curve doesn’t have any critical points (except at the ends, but let’s ignore them); L- and T-curves have one critical point each; P- and D-curves have two critical points each; and an E-curve has three critical points.
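Here’s a minimal sketch of the same idea in code, assuming SymPy is available (my choice of tool, nothing the high-schooler needs): a smooth, monotone S-like curve has no interior critical points, while a curve with a sharp corner – modelled here by |x| – has a point where the left- and right-hand slopes disagree, so the derivative simply doesn’t exist.

```python
# A minimal sketch (SymPy assumed) of critical points: where the derivative of a
# function is zero or doesn't exist.
import sympy as sp

x = sp.symbols('x', real=True)

s_curve = x + x**3                          # smooth and strictly increasing, roughly S-shaped
print(sp.solve(sp.diff(s_curve, x), x))     # [] -> the slope never vanishes: no interior critical points

# The sharp turn of a Z-like bend, modelled by |x| at x = 0: the one-sided slopes
# disagree, so the derivative doesn't exist there, and x = 0 is a critical point.
corner = sp.Abs(x)
print(sp.limit(corner / x, x, 0, dir='-'))  # -1
print(sp.limit(corner / x, x, 0, dir='+'))  # 1
```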

With the help of a loose analogy, you could say a well-written story is smooth à la an S-curve (excluding the terminal points): it has an unambiguous beginning and an ending, and it flows smoothly between the two. While I admire Steven Erikson’s Malazan Book of the Fallen series for many reasons, its first instalment is like a T-curve, where three broad plot-lines abruptly end at a point in the climax that the reader has been given no reason to expect. The curves of the first three books of J.K. Rowling’s Harry Potter series resemble the tangent function (from trigonometry: tan(x) = sin(x)/cos(x)): they’re individually somewhat self-consistent but the reader is resigned to the hope that their beginnings and endings must be connected at infinity.

You could even say Donald Trump’s presidency hasn’t been smooth at all because there have been so many critical points.

Where world-building “literalises the urge to invent” to Harrison, it spatialises the narrative to me, and automatically spotlights the importance of the narrative smoothness it harbours. World-building can be just as susceptible to non-sequiturs and deus ex machinae as writing itself, all the way to the hubris Harrison noticed, of assuming it gives the reader anything to do, even enjoy themselves. Where he sees the “clomping foot of nerdism”, I see critical points in a curve some clumsy world-builder invented as they went along. World-building can be “dull” – or it can choose to reveal the hand-prints of a cave-dwelling people preserved for thousands of years, and the now-dry channels of once-heaving rivers that nurtured an ancient civilisation.

My principal objection to Harrison’s view is directed at the false dichotomy of writing and world-building, which he seems to want to impose in place of the more fundamental and more consequential need for creative discipline. Let me borrow here from philosophy of science 101, specifically the importance of contending with contradictory experimental results. You’ve probably heard of the replication crisis: when researchers tried to reproduce the results of older psychology studies, their efforts came a cropper. Many – if not most – studies didn’t replicate, and scientists are currently grappling with the consequences of overturning decades’ worth of research and research practices.

This is on the face of it an important reality check but to a philosopher with a deeper view of the history of science, the replication crisis also recalls the different ways in which the practitioners of science have responded to evidence their theories aren’t prepared to accommodate. The stories of Niels Bohr v. classical mechanics, Dan Shechtman v. Linus Pauling, and the EPR paradox come first to mind. Heck, the philosophers Karl Popper, Thomas Kuhn, Imre Lakatos and Paul Feyerabend are known for their criticisms of each other’s ideas on different ways to rationalise the transition from one moment containing multiple answers to the moment where one emerges as the favourite.

In much the same way, the disciplined writer should challenge themself instead of presuming the liberty to totter over the landscape of possibilities, zig-zagging between one critical point and the next until they topple over the edge. And if they can’t, they should – like the practitioners of good science – ask for help from others, pressing the conflict between competing results into the service of scouring the rust away to expose the metal.

For example, since June this year, I’ve been participating, at my friend Thomas Manuel’s initiative, in his effort to compose an underwater ‘monsters’ manual’. It’s effectively a collaborative world-building exercise in which we take turns to populate different parts of a large planet – with sizeable oceans, seas, lakes and numerous rivers – with creatures, habitats and ecosystems. We broadly follow the same laws of physics and harbour substantially overlapping views of magic, but we enjoy the things we invent because they’re forced through the grinding wheels of each other’s doubts and curiosities, and the implicit expectation of one creator to make adequate room for the creations of the other.

I see it as the intersection of two functions: at first, their curves will criss-cross at a point, and the writers must then fashion a blending curve so a particle moving along one can switch to the other without any abruptness, without any of the tired melodrama often used to mask criticality. So the Kularu people are reminded by their oral traditions to fight for their rivers, so the archaeologists see through the invading Gezmin’s benevolence and into the heart of their imperialist ambitions.
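For the curious, here’s a toy sketch in code of what such a ‘blending curve’ might look like – my own construction, to illustrate the analogy, and not anything we actually use for the manual: a smoothstep weight carries you from one storyline-function to another over a window, so that neither the value nor the slope jumps at the seam.

```python
# A toy 'blending curve' (my own illustrative construction): equal to f before a,
# to g after b, and a gradual mix in between, with no jump in value or slope.
def smoothstep(t: float) -> float:
    """Cubic ramp: 0 at t <= 0, 1 at t >= 1, with zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def blend(f, g, a: float, b: float):
    """Return a function that follows f before a, g after b, and blends smoothly between."""
    def h(x: float) -> float:
        w = smoothstep((x - a) / (b - a))
        return (1.0 - w) * f(x) + w * g(x)
    return h

if __name__ == "__main__":
    f = lambda x: 2.0 * x          # one plot line
    g = lambda x: x * x - 3.0      # the other
    h = blend(f, g, a=1.0, b=2.0)  # the hand-over happens between x = 1 and x = 2
    for x in (0.5, 1.0, 1.5, 2.0, 2.5):
        print(f"x = {x:.1f}, h(x) = {h(x):+.3f}")
```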

IBT’s ice-nine effect on Newsweek

In his 1963 novel Cat’s Cradle, Kurt Vonnegut describes a fictitious substance called ice-nine: a crystalline form of water that converts all the liquid water it comes into contact with into more ice-nine. This is the sort of effect the International Business Times had on Newsweek, which, as Daniel Tovrov writes in the Columbia Journalism Review, went, within the last decade of its eighty-year (and counting) existence, from being one of the ‘big three’ American news magazines to a lesser entity that can’t say why it exists.

One big reason, apart from Newsweek editors’ continuing preference for page-views over informed reportage, is IBT’s ownership of the magazine from 2012 to 2018. IBT is a business, not a journalism organisation; it made its money through ads on its pages, and it got people to come see those ads and maybe click on a few by publishing a large volume of articles with clickbait headlines.

It’s certainly not alone in adopting this business model but what Tovrov leaves unsaid is that Google’s and Facebook’s – but especially Google’s – decisions to make this model profitable have allowed businesses like IBT to assume ownership of journalism organisations like Newsweek, running them aground. Like the ice-nine in Cat’s Cradle, it isn’t just that IBT was shot to hell but that Google empowered its employees – who are to blame here as much as Google itself – to consign other organisations it came into contact with to the same fate.

There’s even a distressing self-symmetry to this story; to quote Tovrov:

… Jeffrey Rothfeder, our Editor-in-Chief, said that the clickbait would bring in revenue while hard-news reporting would build our reputation. Much of Newsweek’s current disorder was incubated in those early days of IBT, when we were still figuring out how digital journalism would work. We quickly learned that the patience of the owners, who own Newsweek today, was short. I witnessed incredible journalists lose their jobs over inconsistent traffic, despite editors’ best efforts to save them by shifting them from desk to desk to avoid detection.

There’s a ‘moral of the story’ moment tucked away here about a causal link – which wasn’t so obvious until BuzzFeed’s famous failure came along in January this year – between the gambler’s conceit of adopting the CPM model and the eventual ruin the model brings to newsroom practices. The best safeguard would be to have editors empowered to hit the brakes, but by that time the organisation has likely changed in ways that make that too much to ask for.

Many of us adopted the strategy of using a pseudonym to [cook up stories] when we needed quick hits. The owners and editors were fine with this, but a CMS update created automated bylines and ended the practice. It was in this era that, due to a contagious morale problem, IBT management added a carrot to go along with the stick: traffic bonuses.

It seems Newsweek – of all the publications possible – today exemplifies the worst of what happens when publishers sink more money into the ads-based CPM model of generating revenue: the newsroom becomes yet another late-capitalism enterprise whose employees fight for a sliver of the pie while their work lands significant chunks of it in the hands of its owners. It’s also a sign of how dependent the magazine is on Google that (a part of) Newsweek’s existing staff is optimistic Google’s new changes to its ranking algorithm, to prioritise original in-depth reportage over recycled material, will make their jobs more enjoyable.

Why are the Nobel Prizes still relevant?

Note: A condensed version of this post has been published in The Wire.

Around this time last week, the world had nine new Nobel Prize winners in the sciences (physics, chemistry and medicine), all but one of whom were white and none were women. Before the announcements began, Göran Hansson, the Swede-in-chief of these prizes, had said the selection committee has been taking steps to make the group of laureates more racially and gender-wise inclusive, but it would seem they’re incremental measures, as one editorial in the journal Nature pointed out.

Hansson and co. seem to find tenable the argument that the Nobel Prizes award achievements from a time when there weren’t many women in science, when in fact it distracts from the selection committee’s bizarre oversight of such worthy names as Lise Meitner, Vera Rubin, Chien-Shiung Wu, etc. But Hansson needs to understand that the only meaningful change is change that happens right away because, even with this significant flaw – one that should by all means have diminished the prizes to a contest of, for and by men – the Nobel Prizes have only marginally declined in reputation.

Why do they matter when they clearly shouldn’t?

For example, the most common comments received in response to The Wire’s articles shared on Twitter and Facebook – always from men – argue that the prizes reward excellence, and that excellence should brook no reservation, whether by caste or gender. As is likely obvious to many readers, this view of scholastic achievement resembles a blade of grass: long, sprouting from the ground (the product of strong roots but out of sight, out of mind), rising straight up and culminating in a sharp tip.

However, achievement is more like a jungle: the scientific enterprise – encompassing research institutions, laboratories, the scientific publishing industry, administration and research funding, social security, availability of social capital, PR, discoverability and visibility, etc. – incorporates many vectors of bias, discrimination and even harassment towards its more marginalised constituents. Your success is not your success alone; and if you’re an upper-caste, upper-class, English-speaking man, you should ask yourself, as many such men have been prompted to in various walks of life, who you might have displaced.

This isn’t a witch-hunt as much as an opportunity to acknowledge how privilege works and what we can do to make scientific work more equal, equitable and just in future. But the idea that research is a jungle and research excellence is a product of the complex interactions happening among its thickets hasn’t found meaningful purchase, and many people still labour with a comically straightforward impression that science is immune to social forces. Hansson might be one of them if his interview to Nature is anything to go by, where he says:

… we have to identify the most important discoveries and award the individuals who have made them. If we go away from that, then we’ve devalued the Nobel prize, and I think that would harm everyone in the end.

In other words, the Nobel Prizes are just going to look at the world from the top, and probably from a great distance too, so the jungle has been condensed to a cluster of pin-pricks.

Another reason why the Nobel Prizes haven’t been easy to sideline is that the sciences’ ‘blade of grass’ impression is strongly historically grounded, with help from notions like the idea that scientific knowledge spreads from the Occident to the Orient.

Who’s the first person that comes to mind when I say “Nobel Prize for physics”? I bet it’s Albert Einstein. He was so great that his stature as a physicist has over the decades transcended his human identity and stamped the Nobel Prize he won in 1921 with an indelible mark of credibility. Now, to win a Nobel Prize in physics is to stand alongside Einstein himself.

This union between a prize and its laureate isn’t unique to the Nobel Prize or to Einstein. As I’ve said before, prizes are elevated by their winners. When Margaret Atwood wins the Booker Prize, it’s better for the prize than it is for her; when Isaac Asimov won a Hugo Award in 1963, near the start of his career, it was good for him, but it was good for the prize when he won it for the sixth time in 1992 (the year he died). The Nobel Prizes also accrued a substantial amount of prestige this way at a time when it wasn’t much of a problem, apart from the occasional flareup over ignoring deserving female candidates.

That their laureates have almost always been from Europe and North America further cemented the prizes’ impression that they’re the ultimate signifier of ‘having made it’, paralleling the popular undercurrent among postcolonial peoples that science is a product of the West and that they’re simply its receivers.

That said, the prize-as-proxy issue has contributed considerably as well to preserving systemic bias at the national and international levels. Winning a prize (especially a legitimate one) accords the winner’s work a modicum of credibility and the winner, a modicum of prestige. Depending on how the winners of a prize’s future editions are selected, such credibility and prestige could be potentiated to skew the prize in favour of people who have already won other prizes.

For example, a scientist-friend ranted to me about how, at a conference he had recently attended, another scientist on stage had introduced himself to his audience by mentioning the impact factors of the journals he’d had his papers published in. The impact factor deserves to die because, among other reasons, it attempts to condense multi-dimensional research efforts and the vagaries of scientific publishing into a single number that stands for some kind of prestige. But its users should be honest about its actual purpose: it was designed so evaluators could take one look at it and decide what to do about a candidate to whom it corresponded. This isn’t fair – but expeditiousness isn’t cheap.

And when evaluators at different rungs of the career-advancement ladder privilege the impact factor, scientists with more papers published earlier in their careers in journals with higher impact factors become exponentially likelier to be recognised for their efforts (probably even irrespective of their quality, given the unique failings of high-IF journals, discussed here and here) over time than others.

Brian Skinner, a physicist at Ohio State University, recently presented a mathematical model of this ‘prestige bias’, whose amplification depends in a unique way, according to him, on a factor he called the ‘examination precision’. He found that the more ambiguously defined the barrier to advancement is, the more pronounced the prestige bias could get. Put another way, people who have the opportunity to maintain systemic discrimination simultaneously have an incentive to make the points of entry into their club as vague as possible. Sound familiar?
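Here’s a toy simulation of that idea – my own simplified setup, not Skinner’s actual model – assuming two equally able candidates, an evaluator whose prior says the ‘prestigious’ one is better, and an exam that reports ability plus Gaussian noise; the evaluator ranks them by the standard Gaussian posterior mean. The noisier (i.e. less precise) the exam, the more the prior, which is to say prestige, decides who gets picked.

```python
# A toy simulation (not Skinner's model; a simplified stand-in) of prestige bias
# amplified by imprecise examination. Both candidates draw ability from N(0, 1),
# but the evaluator's prior grants the 'prestigious' candidate an edge of prior_edge.
import random

def prestigious_pick_rate(exam_noise_sd: float, prior_edge: float = 0.5,
                          n_trials: int = 50_000) -> float:
    picks = 0
    for _ in range(n_trials):
        ability = [random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)]   # [prestigious, other]
        exam = [a + random.gauss(0.0, exam_noise_sd) for a in ability]
        prior_mean = [prior_edge, 0.0]
        s2 = exam_noise_sd ** 2
        # Posterior mean for a N(prior_mean, 1) prior and an exam with variance s2:
        # the noisier the exam, the closer this estimate sits to the prior.
        posterior = [(exam[i] + s2 * prior_mean[i]) / (1.0 + s2) for i in range(2)]
        picks += posterior[0] > posterior[1]
    return picks / n_trials

for sd in (0.1, 0.5, 1.0, 2.0, 4.0):
    print(f"exam noise sd = {sd:.1f}: prestigious candidate picked "
          f"{prestigious_pick_rate(sd):.0%} of the time")
```

With a precise exam the pick rate hovers near 50%, as it should for equally able candidates; as the exam gets noisier, even this perfectly ‘rational’ evaluator defers more and more to pedigree.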

One might argue that the Nobel Prizes are awarded to people at the end of their careers – the average age of a physics laureate is in the late 50s; John Goodenough won the chemistry prize this year at 97 – so the prizes couldn’t possibly increase the likelihood of a future recognition. But the sword cuts both ways: the Nobel Prizes are likelier than not to be the products of prestige-bias amplification themselves, and are therefore not the morally neutral symbols of excellence Hansson and his peers seem to think they are.

Fourth, the Nobel Prizes are an occasion to speak of science. This implies that those who would deride the prizes but at the same time hold them up are equally to blame, but I would agree only in part. This exhortation to try harder is voiced more often than not by those working in the West, at publications with better resources and typically higher purchasing power. On principle I can’t deride the decisions reporters and editors make in the process of building an audience for science journalism, with the hope that it will be profitable someday, all in a resource-constrained environment, even if some of those choices might seem irrational.

(The story of Brian Keating, an astrophysicist, could be illuminating at this juncture.)

More than anything else, what science journalism needs to succeed is a commonplace acknowledgement that science news is important – whether it’s for the better or the worse is secondary – and the Nobel Prizes do a fantastic job of drawing people’s attention to scientific ideas and endeavours. If anything, journalists should seize the opportunity in October every year to also speak about how the prizes are flawed and present their readers with a fuller picture.

Finally, and of course, we have capitalism itself – implicated in the quantum of prize money accompanying each Nobel Prize (9 million Swedish kronor, Rs 6.56 crore or $0.9 million).

Then again, this figure pales in comparison to the amounts that academic institutions know they can rake in by instrumentalising the prestige in the form of donations from billionaires, grants and fellowships from the government, fees from students presented with the tantalising proximity to a Nobel laureate, and in the form of press coverage. L’affaire Epstein even demonstrated how it’s possible to launder a soiled reputation by investing in scientific research because institutions won’t ask too many questions about who’s funding them.

The Nobel Prizes are money magnets, and this is also why winning a Nobel Prize is like winning an Academy Award: you don’t get on stage without some lobbying. Each blade of grass has to mobilise its own PR machine, supported in all likelihood by the same institute that submitted their candidature to the laureates selection committee. The Nature editorial called this out thus:

As a small test case, Nature approached three of the world’s largest international scientific networks that include academies of science in developing countries. They are the International Science Council, the World Academy of Sciences and the InterAcademy Partnership. Each was asked if they had been approached by the Nobel awarding bodies to recommend nominees for science Nobels. All three said no.

I believe those arguments that serve to uphold the Nobel Prizes’ relevance must take recourse to at least one of these reasons, if not all of them. It’s also abundantly clear that the Nobel Prizes are important not because they present a fair or useful picture of scientific excellence but in spite of the fact that they don’t.

Review: ‘Salam – The First ****** Nobel Laureate’ (2018)

Awards are elevated by their winners. For all of the Nobel Prizes’ flaws and shortcomings, they are redeemed by what their laureates choose to do with them. To this end, the Pakistani physicist and activist Abdus Salam (1926-1996) elevates the prize a great deal.

Salam – The First ****** Nobel Laureate is a documentary on Netflix about Salam’s life and work. The stars in the title stand for ‘Muslim’. The label has been censored because Salam belonged to the Ahmadiyya sect, whose members are forbidden by law in Pakistan to call themselves Muslims.

After riots against this sect broke out in Lahore in 1953, Salam was forced to leave Pakistan, and he settled in the UK. His departure weighed heavily on him even though he could do very little to prevent it. He would return only in the early 1970s to assist Zulfiqar Ali Bhutto with building Pakistan’s first nuclear bomb. However, Bhutto would soon let the Pakistani government legislate against the Ahmadiyya sect to appease his supporters. It’s not clear what surprised Salam more: the timing of India’s underground nuclear test or the loss of Bhutto’s support that demoted him to a second-class citizen in his home country – the two coming within months of each other.

In response, Salam became more radical and reasserted his Muslim identity with more vehemence than he had before. He resigned from his position as scientific advisor to the president of Pakistan, took a break from physics and focused his efforts on protesting the construction of nuclear weapons everywhere.

It makes sense to think that he was involved. Someone will know. Whether we will ever get convincing evidence… who knows? If the Ahmadiyyas had not been declared a heretical sect, we might have found out by now. Now it is in no one’s interest to say he was involved – either his side or the government’s side. “We did it on our own, you know. We didn’t need him.”

Tariq Ali

Whether or not it makes sense, Salam himself believed he wouldn’t have solved the problems he did that won him the Nobel Prize if he hadn’t identified as Muslim.

If you’re a particle physicist, you would like to have just one fundamental force and not four. … If you’re a Muslim particle physicist, of course you’ll believe in this very, very strongly, because unity is an idea which is very attractive to you, culturally. I would never have started to work on the subject if I was not a Muslim.

Abdus Salam

This conviction unified at least in his mind the effects of the scientific, cultural and political forces acting on him: to use science as a means to inspire the Pakistani youth, and Muslim youth in general, to shed their inferiority complex, and his own longstanding desire to do something for Pakistan. His idea of success included the creation of more Muslim scientists and their presence in the ranks of the world’s best.

[Weinberg] How proud he was, he said, to be the first Muslim Nobel laureate. … [Isham] He was very aware of himself as coming from Pakistan, a Muslim. Salam was very ambitious. That’s why I think he worked so hard. You couldn’t really work for 15 hours a day unless you had something driving you, really. His work always hadn’t been appreciated, shall we say, by the Western world. He was different, he looked different. And maybe that also was the reason why he was so keen to get the Nobel Prize, to show them that … to be a Pakistani or a Muslim didn’t mean that you were inferior, that you were as good as anybody else.

The documentary isn’t much concerned with Salam’s work as a physicist, and for that I’m grateful because the film instead offers a view of his life that his identity as a figure of science often sidelines. By examining Pakistan’s choices through Salam’s eyes, we get a glimpse of a prominent scientist’s political and religious views as well – something that so many of us have become more reluctant to acknowledge.

As with Srinivasa Ramanujan, one of whose theorems was incidentally the subject of Salam’s first paper, physicists saw a genius in Salam but couldn’t tell where he was getting his ideas from. Salam himself – like Ramanujan – attributed his prowess as a physicist to the almighty.

It’s possible the production was conceived to focus on the political and religious sides of a science Nobel laureate, but it puts itself at some risk of whitewashing his personality by consigning the opinions of most of the women and subordinates in his life to the very end of its 75-minute runtime. Perhaps it bears noting that Salam was known to be impatient and dismissive, sometimes even manipulative. He would get angry if he wasn’t being understood. His singular focus on his work forced his first wife to bear the burden of all household responsibilities, and he had difficulty apologising for his mistakes.

The physicist Chris Isham says in the documentary that Salam was always brimming with ideas, most of them bizarre, and that Salam could never tell the good ideas apart from the sillier ones. Michael Duff continues that being Salam’s student was a mixed blessing because 90% of his ideas were nonsensical and 10% were Nobel-Prize-class. Then, the producers show Salam onscreen talking about how physicists intend to understand the rules that all inanimate matter abides by:

To do this, what we shall most certainly need [is] a complete break from the past and a sort of new and audacious idea of the type which Einstein has had in the beginning of this century.

Abdus Salam

This echoes interesting but not uncommon themes in the reality of India since 2014: the insistence on certainty, the attacks on doubt and the declining freedom to be wrong. There are of course financial requirements that must be fulfilled (and Salam taught at Cambridge) but ultimately there must also be a political maturity to accommodate not just ‘unapplied’ research but also research that is unsure of itself.

With the exception of maybe North Korea, it would be safe to say no country has thus far stopped theoretical physicists from working on what they wished. (Benito Mussolini in fact set up a centre that supported such research in the late 1920s, and Enrico Fermi worked there for a time.) However, notwithstanding an assurance I once received from a student at JNCASR that theoretical physicists need only a pen and paper to work, explicit prohibition isn’t the only threat. Some scientists have expressed anxiety that, if the Hindutvawadis have their way, the day will come when even the fruits of honest, well-directed efforts are ridden with guilt, and non-applied research becomes implicitly disfavoured and discouraged.

Salam got his first shot at winning a Nobel Prize when he thought to question an idea that many physicists until then took for granted. He would eventually be vindicated but only after he had been rebuffed by Wolfgang Pauli, forcing him to drop his line of inquiry. It was then taken up and carried to its logical conclusion by two Chinese physicists, Tsung-Dao Lee and Chen-Ning Yang, who won the Nobel Prize for physics in 1957 for their efforts.

Whenever you have a good idea, don’t send it for approval to a big man. He may have more power to keep it back. If it’s a good idea, let it be published.

Abdus Salam

Salam would eventually win a Nobel Prize in 1979, together with Steven Weinberg and Sheldon Glashow – the same year in which Gen. Zia-ul-Haq had Bhutto hanged to death after a controversial trial and set Pakistan on the road to Islamisation, hardening its stance against the Ahmadiyya sect. But since the general was soon set to court the US for the conflict against the Russians in Afghanistan, he attempted to cast himself as a liberal figure by decorating Salam with the government’s Nishan-e-Imtiaz award.

Such political opportunism contrived until the end to keep Salam out of Pakistan even if, according to one of his sons, it “never stopped communicating with him”. This seems like an odd place to be in for a scientist of Salam’s stature, who – if not for the turmoil – could have been Pakistan’s Abdul Kalam, helping direct national efforts towards technological progress while also striving to be close to the needs of the people. Instead, as Pervez Hoodbhoy remarks in the documentary:

Salam is nowhere to be found in children’s books. There is no building named after him. There is no institution except for a small one in Lahore. Only a few have heard of his name.

Pervez Hoodbhoy

In fact, the most prominent institute named for him is the one he set up in Trieste, Italy, in 1964 (when he was 38): the Abdus Salam International Centre for Theoretical Physics. Salam had wished to create such an institution after the first time he had been forced to leave Pakistan because he wanted to support scientists from developing countries.

Salam sacrificed a lot of possible scientific productivity by taking on that responsibility. It’s a sacrifice I would not make.

Steven Weinberg

He also wanted the scientists to have access to such a centre because “USA, USSR, UK, France, Germany – all the rich countries of the world” couldn’t understand why such access was important, so refused to provide it.

When I was teaching in Pakistan, it became quite clear to me that either I must leave my country, or leave physics. And since then I resolved that if I could help it, I would try to make it possible for others in my situation that they are able to work in their own countries while still [having] access to the newest ideas. … What Trieste is trying to provide is the possibility that the man can still remain in his own country, work there the bulk of the year, come to Trieste for three months, attend one of the workshops or research sessions, meet the people in his subject. He had to go back charged with a mission to try to change the image of science and technology in his own country.

In India, almost everyone has heard of Rabindranath Tagore, C.V. Raman, Amartya Sen and Kailash Satyarthi. One reason our memories are so robust is that Jawaharlal Nehru – and “his insistence on scientific temper” – was independent India’s first prime minister. Another is that India has mostly had a stable government for the last seven decades. We also keep remembering those Nobel laureates because of what we think of the Nobel Prizes themselves. This perception is ill-founded at least as it currently stands: of the prizes as the ultimate purpose of human endeavour and as an institution in and of themselves – when in fact they are just one recognition, a signifier of importance sustained by a bunch of Swedish men, one that has been as susceptible to bias and oversight as any other historically significant award.

However, as Salam (the documentary) so effectively reminds us, the Nobel Prize is also why we remember Abdus Salam, and not the many, many other Ahmadi Muslim scientists that Pakistan has disowned over the years, has never communicated with again and to whom it has never awarded the Nishan-e-Imtiaz. If Salam hadn’t won the Nobel Prize, would we think to recall the work of any of these scientists? Or – to adopt a more cynical view – would we have focused so much of our attention on Salam instead of distributing it evenly between all disenfranchised Ahmadi Muslim scholars?

One way or another, I’m glad Salam won a Nobel Prize. And one way or another, the Nobel Committee should be glad it picked Salam, too, for he elevated the prize to a higher place.

Note: The headline originally indicated the documentary was released in 2019. It was actually released in 2018. I fixed the mistake on October 6, 2019, at 8.45 am.

Two sides of the road and the gutter next to it

I have a mid-October deadline for an essay so obviously when I started reading up on the topic this morning, I ended up on a different part of the web – where I found this: a piece by a journalist talking about the problems with displaying one’s biases. Its headline:

It’s a straightforward statement until you start thinking about what bias is, and according to whom. On 99% of occasions when a speaker uses the word, she means it as a deviation from the view from nowhere. But the view from nowhere seldom exists. It’s almost always a view from somewhere even if many of us don’t care to acknowledge that, especially in stories where people are involved.

It’s very easy to say Richard Feynman and Kary Mullis deserved to win their Nobel Prizes in 1965 and 1993, respectively, and stake your claim to being objective, but the natural universe is little like the anthropological one. For example, it’s nearly impossible to separate your opinion of Feynman’s or Mullis’s greatness from your opinions about how they treated women, which leads to the question of whether the prizes Feynman and Mullis won might have been awarded to others – perhaps to women who would’ve stayed in science if not for these men and made the discoveries they did.

One way or another, we are all biased. Those of us who are journalists writing articles involving people and their peopleness are required to be aware of these biases and eliminate them according to the requirements of each story. Only those of us who are monks can hope to be rid of biases entirely (if at all).

It’s important to note here that the Poynter article makes a simpler mistake. It narrates the story of two reporters: one, Omar Kelly, doubted an alleged rape victim’s story because the woman in question had reported the incident many months after it happened; the other, the author herself, didn’t express such biases publicly, allowing her to be approached by another victim (from a different incident) to have her allegations brought to a wider audience.

Do you see the problem here? Doubting the victim or blaming the victim for what happened to her in the event of a sexual crime is not bias. It’s stupid and insensitive. Poynter’s headline should’ve been “Reporters who are stupid and insensitive fail their sources – and their profession”. The author of the piece further writes about Kelly:

He took sides. He acted like a fan, not a journalist. He attacked the victim instead of seeking out the facts as a journalist should do.

Doubting the victim is not a side; if it is, then seeking the facts would be a form of bias. It’s like saying a road has two sides: the road itself and the gutter next to it. Elevating unreason and treating it on par with reasonable positions on a common issue is what has brought large chunks of our industry to their current moment – when, for example, the New York Times looks at Trump and sees just another American president, or when Swarajya looks at Surjit Bhalla and sees just another economist.

Indeed, many people have demonised the idea of a bias by synonymising it with untenable positions better described (courteously) as ignorant. So when the moment comes for us to admit our biases, we become wary, maybe even feel ashamed, when in fact they are simply preferences that we engender as we go about our lives.

Ultimately, if the expectation is that bias – as in its opposition to objectivity, a.k.a. the view from nowhere – shouldn’t exist, then the optimal course of action is to eliminate our specious preference for objectivity (different from factuality) itself, and replace it with honesty and a commitment to reason. I, for example, don’t blame people for their victimisation; I also subject an article exhorting agricultural workers to switch to organic farming to more scrutiny than I would an article about programmes to sensitise farmers about issues with pesticide overuse.

Hard sci-fi

Come November, I will be at the Bangalore Literary Festival in conversation with Sri Lankan sci-fi author Navin Weeraratne. I am told Navin – “like you,” according to one of the organisers – is a proponent of hard sci-fi, the science fiction subgenre that draws upon legitimate scientific ideas and principles.

A less obsessive reader might not mind the difference, especially if the author’s invitation to suspend disbelief is smooth. But I draw a thick line between hard and soft sci-fi because science is more than – indeed quite different from – technology, and I believe the ‘sci-fi’ label is warranted only if the principles of science are carried over as well, into everything from world-building to character-building. Heck, the act – and art – of deriving consequences from a finite set of first principles in a different universe and for a set of fictitious characters could be the point of a book in itself.

Soft sci-fi, on the other hand, is quite fond of inventing technologies to depict fantastic landscapes and cultures and is closer to fantasy fiction than to sci-fi.

Admittedly these are only lines in the sand but I believe the virtues of sci-fi could be extended to include many kinds of storytelling that the typical sci-fi author, usually dabbling in the softer parts of the subgenre, may not be inclined to explore.

Now, while I’ve expressed this view in public on a few occasions of late, I don’t know enough about the subgenre and its literary, historical and philosophical virtues – certainly not enough to speak to Navin Weeraratne on stage. The man has nine books to his name! Fortunately the event is over a month away and I have time to prepare. I dearly hope I don’t make a fool of myself onstage, in a room full of the ‘literary types’.

The mission that was 110% successful

Caution: Satire.

On October 2, Kailash S., the chairman of the Indian Wonderful Research Organisation (IWRO), announced that the Moonyaan mission had become a 110% success. At an impromptu press conference organised inside the offices of India Day Before Yesterday, he said that the orbiter was performing exceptionally well and that a focus on its secondary scientific mission could only diminish the technological achievement that it represented.

Shortly after the lander, carrying a rover plus other scientific instruments, crashed on the Moon’s surface two weeks ago, Kailash had called the mission a “90-95% success”. One day after it became clear Moonyaan’s surface mission had ended for good, and well after IWRO had added that the orbiter was on track to be operational for over five years, Kailash revised his assessment to 98%.

On the occasion of Gandhi Jayanti, Kailash upgraded his score because, despite the lander’s failure to touch down, it had been able to descend from an altitude of 120 km to 2.1 km before a supposed thruster anomaly caused it to plummet instead of brake. “We have been analysing the mission in different ways and we have found that including this partially successful descent in our calculations provides a more accurate picture of Moonyaan’s achievement,” Kailash said to journalists.

When a member of a foreign publication prodded him saying that space doesn’t exactly reward nearness, Kailash replied, “I dedicate this mission to the Swachh Bharat mission, which has successfully ended open defecation in India today.” At this moment, Prime Minister A. Modern Nadir, who was sitting in front of him, turned around and hugged Kailash.

When another journalist, from BopIndia, had a follow-up question about whether the scientific mission of Moonyaan was relevant at all, Kailash responded that given the givens, the payloads onboard the orbiter had a responsibility to “work properly” or “otherwise they could harm the mission’s success and bring its success rate down to the anti-national neighbourhood of 100%”.

On all three occasions – September 7, September 22 and October 2 – India became the first country in the world as well as in history to achieve the success rates that it did in such a short span of time, in the context of a lunar mission. Thus, mission operators have their fingers crossed that the instruments won’t embarrass what has thus far been a historic technological performance with a corresponding scientific performance with returns of less than 110%.

Finally, while Moonyaan has elevated his profile, Kailash revealed his plan to take it even higher when he said the Heavenyaan mission would be good to go in the next 30 months. Heavenyaan is set to be India’s first human spaceflight programme and will aim to launch three astronauts to low-Earth orbit, have them spend a few days there, conducting small experiments, and return safely to Earth in a crew capsule first tested in 2014.

IWRO has already said it will test semi-cryogenic engines – to increase the payload capacity of its largest rocket so it can launch the crew capsule into space – it purchased from an eastern European nation this year. Considering all other components are nearly ready, including the astronauts who have managed with the nation’s help to become fully functioning adults, Heavenyaan is already 75% successful. Only 35% remains, Kailash said.

In financial terms, Heavenyaan is more than 10-times bigger than Moonyaan. Considering there has been some speculation that the latter’s lander couldn’t complete its descent because mission operators hadn’t undertaken sufficiently elaborate tests on Earth that could have anticipated the problem, observers have raised concerns about whether IWRO will skip tests and cut corners for Heavenyaan as well as for future interplanetary missions.

When alerted to these misgivings, Nadir snatched the mic and said, “What is testing? I will tell you. Testing is ‘T.E.S.T.’. ‘T’ stands for ‘thorough’. ‘E’ for ‘effort’. ‘S’ for ‘sans’. ‘T’ for ‘testing’. So what is ‘test’? It is ‘thorough effort sans testing’. It means that when you are building the satellite, you do it to the best of your ability without thinking about the results. Whatever will happen will happen. This is from the Bhagavad Gita. When you build your satellite to the best of your ability, why should you waste money on testing? We don’t have to spend money like NASA.”

Nadir’s quip was met with cheers in the hall. At this point, the presser concluded and the journalists were sent away to have tea and pakodas*.

*Idea for pakodas courtesy @pradx.