Failing Sarabhai

We are convinced that if we are to play a meaningful role nationally, and in the community of nations, we must be second to none in the application of advanced technologies to the real problems of man and society.

Normally, this kind of comment would be a platitude. In fact, it still is one, except that there’s quite a bit here that can be interpreted differently as the times change. This comment – rather, quote – was tweeted by ISRO on August 10 and attributed to its founding scientist, Vikram Sarabhai. (The text looks like it was copied and pasted from a PDF – or maybe it was intended as a poem.)

The following terms are subjective: “meaningful role”, “advanced technologies” and “real problems”. From ISRO’s POV, they likely stand for the commitment of space technologies to resolving the day-to-day issues faced by terrestrial enterprises – more specifically, using space-borne assets to assist safety and rescue operations, map resources, track animals and land use, forecast the weather, etc. The terms are also somewhat dangerous because Sarabhai doesn’t specify who decides what they mean. 😉

For example, the BJP government at the Centre believes “real problems” are technological problems, not scientific ones, and has in fact discouraged small-scale exploratory efforts. M.S. Santhanam penned an article in The Hindu discussing this issue when this year’s Economic Survey was released. I do not think the article received as much attention as it deserved; it is worth bookmarking.

Given that Sarabhai’s words seem to lend themselves to various other, and broader, contexts, it would seem disingenuous of ISRO to expect to be judged on its existing efforts and not on the ones it is failing at. For one, I choose to interpret the tweet as an admission of failure on ISRO’s part to play a “meaningful role” in communicating its research and dispelling the attendant fake news – a “real problem” by any yardstick – using “advanced technologies” like Twitter and Facebook, which allow scientists to take charge of the narrative from their desks, lab benches or wherever.

In this light, Sarabhai’s quote well illustrates a battle – joined in the realms of language and memory – that few pay attention to. Against a government bent on normalising majoritarian authority, we need to fight to reclaim what “real problems” and “meaningful roles” mean, or can mean, wrenching them away from the justification of “what most people think” and towards “what is justified by reason” – and not abandon the latter just because it is harder to do.

High-temp. superconductivity, hype cycles and peer review

TL;DR – Journalists are already more accountable than the publishers of scientific journals are. If scientists find journalistic review of the scientific literature lacking, they should help journalists improve instead of stopping them from doing it altogether.

***

The recent conversation about preprints, motivated by Tom Sheldon’s article in Nature News, focused on whether access to preprint manuscripts is precipitating bad or wrong articles in the popular science journalism press. The crux of Sheldon’s argument was that preprints aren’t peer-reviewed, which leaves journalists with the onerous task of validating their results when, in fact, that has been the traditional responsibility of independent scientists hired by journals to which the papers have been submitted. I contested this view because it is in essence a power struggle, with the scientific journal assuming the role of a knowledge-hegemon.

An interesting example surfaced in relation to this debate quite recently, when two researchers from the Indian Institute of Science, Bangalore, uploaded a preprint paper to the arXiv repository claiming they had detected signs of superconductivity at room temperature in a silver-gold nanostructure. They simultaneously submitted their paper to Nature, where it remains under embargo; in the meantime, public discussions of the paper’s results have been centred on information available in the preprint. Science journalists around the world have been reserved in their coverage of this development, sensational though it seems to be, likely because, as Sheldon noted, it hasn’t been peer-reviewed yet.

At the same time, The Hindu published an atypically uninquisitive article highlighting the study. For its part, The Wire (i.e. I) commissioned an article – since published here on August 6 – discussing the preprint paper in greater detail, with comments from various materials scientists around the country. The article’s overwhelming conclusion seemed to be that the results looked neat to the theorists but needed more work according to the experimentalists, and that we should wait for Nature’s ‘verdict’ before passing judgment. Nonetheless, the article found it fit, and rightly so based on the people quoted, to be optimistic.

Earlier today, there emerged a twist in the plot. Brian Skinner, a physicist at the Massachusetts Institute of Technology, uploaded a public comment to the arXiv repository discussing, in brief, a curious feature of the IISc preprint. He had found that two plots representing independent measurements displayed in the manuscript showed very similar, if not exact, noise patterns. Noise is supposed to be random; if two measurements are really independent, their respective noise patterns cannot, must not, look the same. However, the IISc preprint showed the exact opposite. To Skinner – or in fact to any observer engaged in experimental studies – this suggests that the data in one of the two plots, or both, was fabricated.
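The test implied here is simple enough to sketch. The following toy example – with made-up data, not the IISc measurements – shows how reused noise betrays itself as a near-perfect correlation between the residuals of two supposedly independent runs:

```python
import numpy as np

# Toy demonstration: subtract the underlying signal from each measurement
# and correlate the leftover noise. Independent runs should correlate ~0;
# a reused noise trace correlates ~1.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * x)

shared_noise = rng.normal(0, 0.05, x.size)
run_a = signal + shared_noise                        # measurement 1
run_b = 0.9 * signal + shared_noise                  # suspicious: same noise reused
run_c = 0.9 * signal + rng.normal(0, 0.05, x.size)   # genuinely independent

def residual(y, model):
    return y - model

print(np.corrcoef(residual(run_a, signal), residual(run_b, 0.9 * signal))[0, 1])  # ~1.0
print(np.corrcoef(residual(run_a, signal), residual(run_c, 0.9 * signal))[0, 1])  # ~0.0
```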

An image from Skinner’s preprint paper showing the similar noise patterns from two different plots (overlapped as blue and green dots for comparison). Source: arXiv:1808.02929v1

This is obviously a serious allegation. Skinner himself has not attempted to reach any conclusion and has stopped at pointing out the anomaly. At this juncture, let’s reintroduce the science journalist: what should she do?

In a world without preprints, this paper would not have fed a story until after a legitimate journal had published it, in which case the science journalist’s article’s legitimacy would have banked on the peer-reviewers’ word. More importantly, in a world without preprints, this would have been a single story – à la the discovery of the Higgs boson or of gravitational waves from colliding neutron stars. In a world with preprints, it has become an evolving story even though, excluding the “has been submitted to a journal for review” component, the study itself is not dynamic. (Contrast this with, for example, the search for dark matter: it is ongoing, etc.)

Against this context, the arguments Sheldon et al. have put forth assume a new clarity. What they’re saying is that the story is not supposed to be evolving, and that science journalists forced to write their stories based only on peer-reviewed papers would have produced a single narrative of an event fixed at one point in space and time. In short: had journalists waited for the paper to be peer-reviewed, they would have been able to deliver to the people a more finished tale, one whose substance and contours enjoy greater consensus within the scientific community.

This may seem like a compelling reason to not allow journalists to write articles based on preprints until you stop to consider some implicit assumptions that favour peer-review.

First off, peer-review is always viewed as a monolithic institution whereas the people quoted in an article are viewed as individuals – despite the fact that both groups are (supposed to be) composed of peers acting independently. As a result, the former appears to be indemnified. In The Wire article, the people quoted were Vijay Shenoy, T.V. Ramakrishnan, Ganapathy Baskaran, Pushan Ayyub and an unnamed experimentalist. The author, R. Ramachandran, also cited multiple studies and historical events for the necessary context and reminded the reader on two occasions that the analysis was preliminary. What the people get out of peer-review, on the other hand, is a ‘yes’/‘no’ answer that, in the journal’s interests, is to be considered final.

In fact, should review – peer- or journalistic – fail, journalism affords various ways to deal with the fallout. The scientists quoted may have spoken on the record, and their contact details will be easily findable; the publication’s editor can be contacted and a correction or retraction asked for; in some cases (including The Wire‘s), a reader’s editor* acting independently of the editorial staff can be petitioned to set the public record straight. With a journal, however, the peer-reviewers are protected behind a curtain of secrecy, and the people and the scientists alike will have to await a decision that is often difficult to negotiate with. The numerous articles published by Retraction Watch are ready examples.

Second, it is believed that peer-reviewers perform checks that science journalists never can. But where do you draw the line? Do peer-reviewers check for all potential problems with a paper before green-flagging it? More pertinently, are they always more thorough in their checks than good science journalists can be? In fact, there is another group of actors here that science journalists can depend on: scientists who publicly critique studies on their Twitter or Facebook pages and their blogs. I mention this here to cite the examples of Katie Mack, Adam Falkowski, Emily Lakdawalla, etc. – and, most of all, of Elisabeth Bik, a microbiologist. Bik has been carefully documenting the incidence of duplicated or manipulated images in published papers.

Circling back to peer-review’s being viewed as a monolith: many of the papers Bik has identified were published by journals after they were declared ‘good to go’ by review panels. So by casting their verdict as final, by describing each scientific paper as being fixed at a point in time and space, journals are effectively proclaiming that what they have published need not be revisited or revised. This is a questionable position. On the other hand, by casting the journalistic enterprise as the documentation of a present that is being constantly reshaped, journalists have access to a storytelling space that many scientific journals don’t afford the very scientists that they profit from.

Where this enterprise turns risky, or even potentially unreliable, is when it becomes dishonest about its intentions – rather, isn’t explicitly honest enough. That is, to effect change in what journalism stands for, we also have to change a little bit of how journalism does what it does. For example, in The Wire‘s article, Ramachandran was careful to note that (i) only the paper’s publication (or rejection) can answer some questions and perhaps even settle the ongoing debate, (ii) some crucial details of the IISc experiment are missing from the preprint (and likely will be from the submitted manuscript as well), (iii) the article’s discussion is based on conversations with materials scientists in India and (iv) the paper’s original authors have refused to speak until they have heard from Nature. Most of all, the article itself does not editorialise.

These elements, together with an informed readership, are necessary to stave off hype cycles – unnecessary news cycles typically composed of two stories, one making a mountain of a molehill and the next declaring that the matter at hand has been found to be a molehill. The simplest way to sidestep this fallacy, at least in my mind, is to remember at all stages of the editorial process that all stories will evolve irrespective of what those promoting it have to say. Of course, facts don’t evolve, but what conclusion a collection of facts lends itself to will. And so will opinions, implications, suggestions and whatnot. This is why attempting to call out science journalists who respect these terms of their enterprise will not work – because doing so also passively condones hype. What will work is to knock on the doors of those unquestioning journalists who pander to hype above all else.

This prescription is tied to one for ourselves: as much as science journalists want to reform the depiction of the scientific enterprise, moving it away from the idea that scientists find remarkable results with 100% confidence all the time (which is the impression journals give), they – rather, we – should also work towards reforming what journalism stands for in the people’s eyes. Inasmuch as science and journalism are both bound by the pursuit of truth(s), it is important for all stakeholders to remember, and to be reminded, that – to adapt what the historian of science Patrick McCray tweeted – it’s about consensus, not certainty. Should they have a problem with journalists running a story based on a preprint instead of a published paper, journals can provide a way out (for reasons described here) by being more open about peer-review: what kind of issues reviewers check for and how journalists can perform the same checks.

*The Wire’s reader’s editor is Pamela Philipose, reachable at publiceditor at thewire.in.

Featured image credit: Verena Yunita Yapi/Unsplash.

Ghost’s Koenig editor

The new content editor in the Ghost CMS, named Koenig, is quite good – much smoother to use than WordPress’s new Gutenberg. Ghost’s latest iteration, v1.25.3 (as of today), has a full-screen minimal layout and bold serif type, brought together with Markdown goodness to yield a seamless combination of new and old: software that stays quiet and out of the way, and Ghost’s original promise that writers won’t have to take their hands off the keyboard (which is mostly true).

(I’m personally not a big fan of the big font but I fixed that with a Chrome bookmarklet.)

I’ve been using WordPress for over 10 years and was recently looking forward to the release of Gutenberg, Automattic’s new writing experience to be introduced in v5.0. A plugin they’d released a short while ago allowed WP.org users to install and preview Gutenberg, and I’d tried it out as well and was quite pleased with it.

However, Koenig makes Gutenberg feel clumsy simply because Koenig does more with less. Ghost is already tailored specifically to writers, unlike WordPress, which promises a lot to a lot of people. Gutenberg carries this forward by offering a stunning number of block types to edit, but this means the experience for the simple writer once again feels quite cluttered.

Compared to Koenig, Gutenberg is certainly a letdown, which means I have four options going forward: stay on WP.com and use Gutenberg; self-host WP and install the classic editor; switch to Ghost on a VPS to use Koenig; or write in an independent text editor and use WP.com to publish. I’ve decided to go with the last option for now.

Of course, Ghost isn’t without its flaws either. For example, a recent pricing revision means the lowest tier of Ghost.org hosting costs $36/mo ($29/mo for annual subscriptions) – a crap deal for bloggers when I can get first-class managed WP hosting on Flywheel for $15/mo or LightningBase for $10/mo.

Second, the self-hosted version requires some coding; while Ghost’s instructions make it simple, the tougher part is in securing the server you’re going to host your blog on. (Digital Ocean’s tutorials help.)

Third, while the Ghost CMS has improved in the last three years, its front-end is still very magazine-y, effectively placing it in a niche and not in a position to displace WordPress as the CMS of choice for bloggers who just want to blog.

To be fair, on the flip side Ghost is easier to customise once you’ve got it going, especially if you start from themes available outside its small marketplace.

Weekend getaway

I just spent the weekend at Suntikoppa, a small village to the northwest of Mysore, in Kodagu, a.k.a. Coorg, district. I was there with my relatives – six of us in all – and we were put up at a homestay in the middle of a coffee and pepper estate. It was my first vacation in three years and although it was only 48 hours long, it was splendid. As travellers to the region will know, there are lots of places to visit in the area. We ourselves visited the KRS dam, Brindavan gardens, Namdroling monastery, Mallapalli falls and the Dubare elephant camp. However, the highlight for me was the homestay itself and its situation in what city-dwellers would know as the Middle of Nowhere.

[Image: IMG_0103.JPG]

Our homestay is to the left of this picture. The path here leads to the quarters’ front door. The structure visible ahead is the kitchen, and to its right, the staff quarters. Around the perimeter encompassing these three structures, a coffee plantation 65 acres in area stretches on all sides. Our homestay was managed by Mr Cariappa, the owner of the plantation. We had a cook to attend to our food needs and another person who helped keep the place clean and running.

Mr Cariappa was also there frequently, talking to us about the area, discussing must-see places nearby and making sure we were comfortable. His house, where he stayed with his mother, brother and a dachshund, was located a few hundred metres away, right of the kitchen in the photo above.

To get to our house, you had to take an arterial road out of Kushalnagar (I’m not familiar with the exact route) and at one point take a left, plunging into a cluster of coffee and pepper plantations. Then you had to keep going for a few kilometres until you got to Emerald Estate – Mr Cariappa’s holding – and near its gate take another left and drive on for a few hundred metres. The video above is what you see on this drive: trees upon trees on either side of a narrow road, the ceaseless chatter of numerous insects, bird calls, and – if the sun has set – the occasional feeling that you’re hopelessly lost.

[Image: IMG_0108.JPG]

It’s peak rainy season this time of the year in these parts, so what we saw was flora and fauna at their most profligate, the soil wet and bursting with ideas about what to incubate. The insects in particular never gave up, as if they had all congregated around our home. We saw the biggest mosquitoes and crickets we had ever seen. We heard calls at night that we thought at first were from some kind of perimeter alarm, designed to drive animals away. We then thought they were produced by thousands of frogs sitting in the pond a few score metres outside our front door. We finally found out they were from insects – friggin’ bugs – which was improbable in our minds because of the sheer amplitude of their nocturnal opera.

[soundcloud url="https://api.soundcloud.com/tracks/481434468?secret_token=s-Xhwxv" params="color=#00aabb&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&show_teaser=true" width="100%" height="166" iframe="true" /]

[soundcloud url="https://api.soundcloud.com/tracks/481434471?secret_token=s-8DeLm" params="color=#00aabb&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&show_teaser=true" width="100%" height="166" iframe="true" /]

My mum took to one bug in particular – although we never figured out the specific identities of the racketeers – that voiced a five-note trill of increasing pitch (I think it’s audible in recording #7 below). It was comical to the ear because a similar sound is played during comedy scenes in many Tamil movies.

[soundcloud url="https://api.soundcloud.com/tracks/481434474?secret_token=s-Vjf5b" params="color=#00aabb&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&show_teaser=true" width="100%" height="166" iframe="true" /]

[soundcloud url="https://api.soundcloud.com/tracks/481434477?secret_token=s-ecRVN" params="color=#00aabb&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&show_teaser=true" width="100%" height="166" iframe="true" /]

In the coming week, I intend to identify all the birds and insects making the sounds in these recordings by myself.

[soundcloud url="https://api.soundcloud.com/tracks/481434480?secret_token=s-1sDIi" params="color=#00aabb&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&show_teaser=true" width="100%" height="166" iframe="true" /]

[soundcloud url="https://api.soundcloud.com/tracks/481434483?secret_token=s-G1c0C" params="color=#00aabb&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&show_teaser=true" width="100%" height="166" iframe="true" /]


On the first day of our two-day trip, we visited KRS dam. Access to the top of the dam itself had been cut off, according to some police personnel stationed nearby, after heightened tensions between Karnataka and Tamil Nadu about sharing the Cauvery water last year. In 2018, however, such tensions have been staved off thanks to excess rainfall in and around Kodagu, causing the river to swell and the dam to fill up. Only half a week before our visit, water had been released from the KRS dam because it had rained nonstop for three days, according to Mr Cariappa.

[Image: IMG_0064.JPG]

The way to the dam’s walls is via the Brindavan gardens.

[Image: IMG_0067.JPG]

[Image: IMG_0072.JPG]

After Brindavan, our driver and local guide, Mr Shivanand, took us to the Namdroling Monastery, a shrine dedicated to Guru Rinpoche, an eighth-century Indian Buddhist widely regarded as the ‘Second Buddha’. The object of attraction at the monastery was the Golden Temple, a big hall that houses three statues – of Guru Rinpoche (a.k.a. Padmasambhava) and two forms of the Buddha, Amitabha and Amitayus. Of them, Guru Rinpoche’s and Amitayus’s statues are each 58 feet high, and Amitabha’s, 60 feet high. They are all made of copper and plated with gold. The hall’s walls are also decorated with paintings, as shown below:

[Image: IMG_0093]

Videography was explicitly prohibited inside the hall, and signage at various locations asked visitors to maintain silence and not treat the temple as a place of entertainment. However, nobody seemed to be paying any attention to these requests: people spoke loudly to each other, shot videos, grazed their hands along the walls, etc. This was very sad, and at times infuriating, to witness, as if people were eager to exercise their freedoms mindless of their responsibilities.


After KRS dam, we had stopped briefly at a small Hindu temple nearby, surrounded on three sides by the backwaters of the Cauvery. Parts of the temple were still under construction but visitors thronged its corridors. There were signboards here as well asking people to keep quiet, but the temple’s administrators knew what they were dealing with. They had stationed officious-looking people at various points inside the temple charged only with walking up to loud dumbasses and asking them to shut up. It worked very well.


[Image: IMG_0095]

I learned later that Namdroling Monastery is the largest teaching centre of the Nyingma lineage, founded by Guru Rinpoche, in the world. It was established in 1963, given the name Namdroling by the Dalai Lama, and hosts over 4,000 monks and almost 1,000 nuns.

As soon as you enter the monastery, you walk into a giant plaza lined on two adjacent sides by small souvenir shops and residential quarters. The other two sides are bare. If you walked right across the plaza on a diagonal, you’d reach a gate of sorts opening onto a path. Walk straight down it and you reach the Zangdog Palri temple, whose backside is visible in the picture above, crowned by the chakra. Behind the temple is the Padmasambhava Buddhist Vihara that houses the statues.

[Image: IMG_0094]

Unlike the other sites we visited on our tour, Namdroling was peopled. So I don’t know how the other visitors were comfortable pulling out their phones and taking pictures of themselves inside, as well as of a place that housed other people, let alone behaving as if they were at the beach. I for one was confused about why the monastery even opened itself up to visitors: to let people experience the wonder of Namdroling or to provide a source of amusement for its monks and nuns. Who was the more amused was anyone’s guess.

To be continued…

 

Pro-preprints around the world

After I published my rebuttal to the Nature anti-preprints article, a scientist in the US wrote to me saying he was on the journal’s side and that he expects preprints to be done away with in the future. It was dispiriting to hear.

But in the last five days, something interesting has been happening in my Twitter notifications section. Many of the article’s readers have been compelled enough to tweet/retweet the link (thank you!), and from these tweets/retweets, it appears as if the article has been slowly but surely snaking its way through audiences around the world. Here’s a summary of how it proceeded based on a subset of the notifications.

Vishu Guttal at the IISc tweeted a link to the article on July 28. On the same day, A.J. Sánchez-Padial from Madrid retweeted the link to my blog, where I’d republished the same article. On July 29, Justin Sègbédji Ahinon from Bénin retweeted the article. Shortly after, Gregor Kalinkat and Pawel Romanczuk retweeted from Berlin and Leonid Schneider from Frankfurt. Kirsten Sandberg from New York and Vicky Hellon from London followed on July 30. Hellon was retweeted by Carlos Blondel in Boston, Michael Markie in London and Nigel Temperton in Kent. Both Hellon and Markie work with the OA publisher F1000. Earlier today, it was favourited by Iryna Kuchma in Ukraine, Marlène Delhaye in Aix-en-Provence and Björn Brembs in Bavaria, and tweeted by @Aisa_OA in Trento.

Watching this journey unfold, and finding out that my views have found agreement in all these countries – esp. Bénin as well as India – has been very satisfying. The dominating presence of Europe and the absence of East Asia and South America are notable; however, some readers in my notifications – I suspect from both these regions – didn’t mention their location.

That said, obviously those who benefit the most from the existence of preprints are those who confront steep access costs to the scientific literature every day and those who are mindful of its potential to help reform performance evaluation in academia, among various other opportunities. Their probable deleterious effects on science journalism should be of least concern.

Non-ergodicity and diversity

Ergodicity is the condition wherein a sample is representative of the whole vis-à-vis some statistical parameter. An ergodic system is one that visits all possible states of its existence as it evolves. Conversely, a non-ergodic system is one that does not. Stuart A. Kauffman, a scientist at the University of Calgary, wrote on Edge a year ago:

… the evolution of life in our biosphere is profoundly “non-ergodic” and historical. The universe will not create all possible life forms. This, together with heritable variation, is the substantial basis for Darwin, without yet specifying the means of heritable variation, whose basis Darwin did not know.

This is a very elegant description of history, one that employs a dynamism commonly encountered in physics and the language of physics. If the past encapsulated everything that could ever happen, it would be an uninteresting object of study because its peculiarities would all cancel out, leaving a statistical flatland in their wake. Instead, if the past contained only a specific set of events connected to each other in unique ways – i.e. exhibiting a distinctly uncommon variation – then it becomes worthy of study, as to why it is what it is and not something else. As Kauffman says, “Non-ergodicity gives us history.”
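For the record, here is the standard statement of ergodicity in the physicists’ notation – a sketch with symbols of my choosing, not Kauffman’s. For an observable $f$ of a system with trajectory $x(t)$ and invariant measure $\mu$:

$$\lim_{T \to \infty} \frac{1}{T} \int_0^T f(x(t))\, \mathrm{d}t \;=\; \int f \, \mathrm{d}\mu$$

That is, following one trajectory for long enough is equivalent to sampling every accessible state. In a non-ergodic system, the time average on the left depends on where the trajectory started – which is exactly what gives the system a history worth studying.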

https://twitter.com/johncarlosbaez/status/1023252524248137728

Though today I know that the concept is called ‘non-ergodicity’, I encountered its truth in a different context many years ago, when I wrote an article that appeared in Quartz arguing that Venus could harbour life and that this should encourage us to look for life on Titan as well. I had quoted the following lines from a 2004 paper to strengthen my point:

The universe of chemical possibilities is huge. For example, the number of different proteins 100 amino acids long, built from combinations of the natural 20 amino acids, is larger than the number of atoms in the cosmos. Life on Earth certainly did not have time to sample all possible sequences to find the best. What exists in modern Terran life must therefore reflect some contingencies, chance events in history that led to one choice over another, whether or not the choice was optimal.
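The claim’s arithmetic is easy to check with a quick back-of-the-envelope computation (the atom count is the commonly cited order-of-magnitude estimate for the observable universe):

```python
# 100-residue proteins built from 20 amino acids, vs. atoms in the cosmos.
protein_sequences = 20 ** 100   # possible sequences
atoms_in_cosmos = 10 ** 80      # common order-of-magnitude estimate

print(f"{protein_sequences:.2e}")           # 1.27e+130
print(protein_sequences > atoms_in_cosmos)  # True, by ~50 orders of magnitude
```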

Somehow, and fortunately, these lines have stayed with me to this day four years on, and I hope and believe they will for longer. They present a simple message whose humility seems only to grow with time. They suggest that even life on Earth may not be the best (e.g. most efficient) it can be after billions of years of evolution. Imagine the number of evolutionary states that the whole universe has available to sample – the staggeringly large product of all the biospheres on all the planets in all the time…

The search for a ‘perfect lifeform’ is not a useful way to qualify humankind’s quest. Against such cosmic non-ergodicity, every single alien species we discover could, and likely will, stand for its own world of contingencies just as peoples of different cultures on Earth do. Perhaps then our xenophobia will finally become meaningless.

A new discrimination

An article in KurzweilAI begins,

Plants could soon provide our electricity.

Why would anyone take this seriously? More than excitement, this line rouses a discerning reader to suspicion, and that suspicion is bound to centre on the word “soon”, implying the near future, imminently. I’m not sure which timescales people are thinking on, but I’m sure we can agree 10 years sounds reasonable here. Will plants power your home in 10 years? Heck, in 50 years? It is stupendously unlikely. The suggestion itself – as embodied in that line – is disingenuous because it 1) overestimates feasibility at scale and 2) underestimates the amount of conviction, work and coordination it will take to dislodge the fossil-fuel, nuclear and renewable energy industries.

Indeed, the line that “plants could soon provide our electricity” begins to make sense only when its words are assessed individually instead of being beheld with the seductive possibilities the whole sentence offers. Could? Of course, they already do through the technology described in the article, called Plant-e. Plants? I don’t see why not; they are batteries of sorts, too. Provide? Plants are terrestrial, ubiquitous, very accessible, well understood and seldom dangerous. Our? Who else’s is it, eh. Electricity? Again, Plant-e has demonstrated this already, in the Netherlands, where it was pioneered. But cognise the sentence as a whole and you’re left with gibberish.

The article then claims:

An experimental 15 square meter model can produce enough energy to power a computer notebook. Plant-e is working on a system for large scale electricity production in existing green areas like wetlands and rice paddy fields. … “On a bigger scale it’s possible to produce rice and electricity at the same time, and in that way combine food and energy production.”

The emphasised bit (my doing) sounds off: it implies a couple dozen kilowatts at best, whereas the article’s last line says, “In the future, bio-electricity from plants could produce as much as 3.2 watts per square meter of plant growth.” Either way, a solar panel with a tenth of the surface area produces about 250 W – several times what a notebook needs and, at roughly 165 W per square metre, about 50 times the claimed bio-electricity density. People around the world are already concerned that the world may not have enough nickel, cadmium and lithium to build the batteries to store this energy, and may not have enough land to build all the solar cells necessary to “provide our electricity”.
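A quick sanity check of that comparison, using only the figures quoted above:

```python
# Power-density comparison using the article's own numbers.
plant_w_per_m2 = 3.2                 # projected bio-electricity density
panel_area_m2 = 15 / 10              # "a tenth of the surface area" of the 15 m^2 setup
panel_output_w = 250                 # typical rated output of such a panel

solar_w_per_m2 = panel_output_w / panel_area_m2   # ~167 W/m^2
print(solar_w_per_m2 / plant_w_per_m2)            # ~52: solar is ~50x denser
```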

In this scenario, why should anyone give a fuck about Plant-e as an alternative worth one’s time? It is interesting and exciting that scientists were able to create this technology but its billing as a reasonable substitute for the commonly known sources of energy, and “soon”, suggests that this is certainly hype, and that the people behind this announcement seem to be okay with disguising an elitist solution as a sustainable one.

Second, said billing also suggests that there is – less certainly, but plausibly – a misguided, white-skinned belief at work here: that, notwithstanding details about the intraday variability of power generation, soil conditions and such, agriculture and power consumption in the Netherlands are similar to those elsewhere in the world. But the social, economic and technological gap between these endeavours as they happen in Northwest Europe and in Southeast Asia is so large as to suggest the article’s authors have no clue about the socioeconomics of electric power, or are at ease with wilfully disregarding it.

Announcements like this don’t harm anyone, but they certainly offend the sensibilities of those forced to grow, grow, grow while on the brink of the worst of climate change. It is crucial that we keep innovating, finding new, better, more considerate ways of surviving impending disasters as well as reducing our deleterious footprint on this planet. Let us do this without suggesting that a nascent, untested (at scale) and currently infeasible technology may provide a crucial part of the answer where numerous governments have failed.

Through this exercise, let us also awaken our minds to a new form of discrimination in the Anthropocene epoch – lazy, short-sighted, selfish thinking – and call it out.

Preprints don’t promote confusion – so taking them away won’t fix anything

In response to my Twitter thread against Tom Sheldon’s anti-preprints article in Nature, I received more responses in support of Sheldon’s view than I expected. So I wrote an extended takedown for my blog and, of course, The Wire, pasted below.


In 1969, Franz J. Ingelfinger articulated a now-famous rule named after him in an attempt to keep the New England Journal of Medicine (NEJM), which he edited at the time, in a position to give its readers original and fully vetted research. Ingelfinger stated that the NEJM wouldn’t consider publishing a paper if it had already been publicised before submission or had been submitted to another journal at the same time. The Ingelfinger rule symbolised a journal’s attempt to recognise its true purpose and reorganise its way of functioning to stay true to it.

Would we say this is true of all scientific journals? In fact, what is a scientific journal’s actual purpose? First, it performs peer review, by getting the submissions it receives scrutinised by a panel of independent experts to determine the study’s veracity. Second, the journal publicises research. Third, it creates and maintains a record of the section of the scientific literature it is responsible for. In the historical context, these three functions have been dominant. In a more modern, economic and functional sense, scientific journals are also tasked with making profits, improving their impact metrics and making research more accessible.

As it happens, peer review is no longer the exclusive domain of the journal – nor is it considered to be an infallible institution. Second, journals still play an important part in publicising research, especially via embargoes that create hype, pointing journalists towards papers that they might otherwise not have noticed, as well as preparing and distributing press releases, multimedia assets, etc. Of course, there are still some flaws here. And third, the final responsibility of maintaining the scientific record continues to belong to the journal.

Too much breathing space

Pressures on the first two fronts are forcing journals to stay relevant in newer ways. A big source of such pressure is the availability of preprints – i.e. manuscripts of papers made available by their authors in the public domain before they have been peer-reviewed.

Preprint repositories like arXiv and bioRxiv have risen in prominence over the last few years, especially the former. They are run by groups of scientists – like volunteers pruning the garden of Wikipedia – who ensure the formatting and publishing requirements are met, remove questionable manuscripts and generally, as they say, keep things going. Scientific journals typically justify their access costs by claiming that they have to spend on peer review and printing. Preprints evade this problem because they are free to access online and are not peer-reviewed the way ‘published’ papers are. In turn, the reader who wishes to read a preprint must bear this caveat in mind.

This week, the journal Nature published a (non-scientific) article headlined ‘Preprints could promote confusion and distortion’. Authored by Tom Sheldon, a senior press manager at the Science Media Centre, London, it advanced a strange idea: that bad science was published in the press because journalists did not have “enough time and breathing space” to evaluate it. While Sheldon then urges scientists “to be part of these debates – with their eyes open to how the media works”, the more forceful language elsewhere in the article suggests that preprints should go and that that will fix the problem.

There are numerous questionable judgments embedded here. Principal among them is that embargoes are the best way to cover research – and this may seem obvious from the journal’s point of view because an embargo functions like a pair of blinders, keeping a journalist focused on a journal-approved story and reminding her that she must contact a scientist because a deadline is approaching, after which all publications will ‘break’ the story. Of course, embargoes aren’t the norm; the Ingelfinger rule says that the journal will be responsible for ensuring that whatever it publishes is good to go.

But with a preprint, there are no deadlines; there are no pointers about which papers are good or bad; and there is no list of people to contact. The journal fears that the journalist will fumble, be overcome with agoraphobia and, as Sheldon writes, “rushing to be the first to do so … end up misleading millions, whether or not that was the intention of the authors.”

It is obvious that the Ingelfinger + embargo way of covering research will produce more legitimate reportage more often – but these rules are not the reasons why the papers are reported the way they are.

High-profile cases in which peer-review failed to disqualify bad and/or wrong papers, and in which papers’ results were included in the scientific canon only for replication studies to completely overturn them later, are proof that journals, together with the publishing culture in which they are embedded, aren’t exactly perfect.

Some scientists have even argued that embargoes should be done away with because the hype they create often misrepresents the modesty of the underlying science. Others focused their attention on universities, which often feed on the hype created by embargoes to pump out press releases making far-fetched claims about what the scientists on their payrolls have accomplished.

In turn, journalists have been finding that good journalism is the outcome only when good journalism is also the process. Give a good journalist a preprint to work with and the same level of due diligence will be applied. Plonk a bad journalist in front of an embargoed news release and a preprint, and you will get shoddy work both times. It is not as if journalists suspend their fact-checking process when they work with embargoed papers curated by journals and reinstate it when dealing with preprints. A publication that covers science well will quite likely cover other subjects with the same regard and sensitivity – not because of the Ingelfinger rule but because of the overall newsroom culture.

Last line of defence

Moreover, an infuriating presumption in the Nature article is that the preprint flows as if by magic from the repository where it was uploaded into the hands of the “millions” now misled by it. Indeed, though it is annoying that the phrasing makes no room for a functional journalist who can step in, write about the paper and arrange for it to be publicised, it is simply frustrating that the journalistic establishment remains invisible to Sheldon’s eye even when we’re talking about an extra-journal agent messing up along the way.

It is the product of this invisibility – rather, a choice to not acknowledge evident work – that suggests to the scientific journal that it must take responsibility for ensuring all that it publishes is good and right. As a pathway to accruing more relevance, this can only be good for the journal; however, it is also a way to accrue more power, so it must not be allowed to happen. This is ultimately why taking preprints away makes no sense: journals must share knowledge, not withhold it.

By taking preprints away from journalists, Sheldon proposes to force us to subsist on journal-fed knowledge – knowledge that is otherwise impossible to access for millions in the developing world, knowledge that is carefully curated according to the journal’s interests and, most of all, knowledge that pushes the idea that the journal knows what is best for us.

But journals are not the last line of defence, even though they would like to think so; journalists are. That is how journalism is structured, how it functions, how it is managed as a sector, how it is perceived as an industry. If you take control away from journalists to move beyond papers approved by a journal, we lose our ability to question the journal itself.

The only portion of the Nature article that elicits a real need for concern is when Sheldon refers to embargoes as a means of safeguarding novelty for news publishers. He quotes Tom Whipple, science editor of The Times, saying that it is impossible to compete with the BBC because the BBC’s army of reporters are able to pick up on news faster. The alternative, he implies, is to preserve embargoes because they keep the results of a paper new until a given deadline – letting journalists from publishers small and large cover it at the same time.

In fact, if it is reform that we are desperate for, this is the first of three courses of action: to keep removing the barriers themselves instead of merely making access more equitable. The second is to fix university press releases. The third is to stop interrogating preprints and start questioning publishing practices. For example, is it not curious that both Nature and NEJM, as well as many other ‘prestigious’ titles, rank almost as highly on the impact index as they do on the retraction index?

Update: The following correction was made to the Nature article on July 25 (h/t @kikisandberg). I guess that’s that now.

[Image: Screen Shot 2018-07-30 at 07.16.31]

A detector for electron ptychography

Anyone who writes about physics research must have a part of their headspace currently taken up by assessing a new and potentially groundbreaking claim out of the IISc: the discovery of superconductivity at ambient pressure and temperature in a silver nanostructure embedded in a matrix of gold. Although The Hindu has already reported it, I suspect there’s more to be said about the study than is visible at first glance. I hope peer review will help the dust settle a little, but we all know post-publication peer-review is where the real action is. Until then, other physics news beckons…


Unlike room-temperature superconductivity, odds are you haven’t heard of ptychography. In the field of microscopy, ptychography is a solution to the so-called phase problem. When you take a selfie, the photographic sensor in your phone captures the intensity of light waves scattering off your face to produce a picture. In more sophisticated experiments, however, information about the intensity of light alone doesn’t suffice.

This is because light waves have another property called phase. When light scatters off your face, the phase change doesn’t embody any useful information about the selfie you’re taking. But if physicists are studying, say, atoms, then the phase change can tell them about the distribution of electrons around the nucleus. The phase problem comes to life when microscopes can’t capture phase information, leaving scientists with only a part of the picture.
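To make the problem concrete – a generic statement, not specific to any one instrument – a wave arriving at the detector can be written as

$$\psi = A\, e^{i\varphi},$$

but the detector records only the intensity $I = |\psi|^2 = A^2$. The phase $\varphi$, and whatever structural information it encodes, is lost; recovering it from intensity measurements alone is the phase problem.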

Sadly, this constraint only exacerbates electron microscopy’s woes. Scientists in various fields use electron microscopy to elucidate structures of matter that are much smaller than the distances across which photons can act as probes. Thanks to their shorter wavelength, electrons are used to study the structure of proteins and the arrangement of atoms in solids, and even to aid in the construction of complex nanostructured materials.

However, the technique’s usefulness in studying individual atoms is limited by how well scientists are able to focus the electron beams onto their samples. To achieve atomic-scale resolution, scientists use a technique called high-angle annular dark-field imaging (ADF), wherein the electrons are scattered at high angles off the sample to produce an incoherent image.

For ADF to work better, the electrons need to possess more momentum, so scientists typically use sophisticated lenses to adjust the electron beam while boosting the signal strength to take stronger readings. This is not desirable: if the object of study is fragile, the stronger beam can partially or fully disintegrate it. Thus, the high-angle ADF resolution for scanning transmission electron microscopy has been chained to the 0.05-nm mark, worsening to 0.1 nm for more fragile structures.

Ptychography solved the phase problem for X-ray crystallography in 1969. The underlying technique is simple. When X-rays interact with a sample under study and return to a detector, the detector produces a diffraction pattern that contains information about the sample’s shape.

In ptychography, scientists iteratively record the diffraction pattern obtained from different angles by changing the position of the illuminating beam, allowing them to compute the phase of returning X-rays relative to each other. By repeating this process multiple times from various directions, scientists will have data about the sample that they can reverse-process to extract the phase information.
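A minimal sketch of how such an iterative reconstruction works – modelled on the well-known ePIE update rule, with hypothetical array names, and far simpler than a real microscope’s pipeline:

```python
import numpy as np

def epie_sweep(obj, probe, positions, measured_amps, alpha=1.0):
    """One pass of an ePIE-style ptychographic object update.

    obj           : complex 2D array, current estimate of the sample
    probe         : complex 2D array (n x n), the illuminating beam
    positions     : list of (row, col) scan positions
    measured_amps : measured diffraction amplitudes, one (n x n) array each
    """
    n = probe.shape[0]
    for (r, c), amp in zip(positions, measured_amps):
        patch = obj[r:r + n, c:c + n]
        exit_wave = probe * patch                   # wave leaving the sample
        far_field = np.fft.fft2(exit_wave)          # modelled diffraction pattern
        # Keep the computed phase, impose the measured amplitude:
        revised = np.fft.ifft2(amp * np.exp(1j * np.angle(far_field)))
        # Nudge the object estimate towards consistency with the measurement:
        obj[r:r + n, c:c + n] += (
            alpha * np.conj(probe) / (np.abs(probe) ** 2).max()
            * (revised - exit_wave)
        )
    return obj
```

Run over many overlapping positions and repeated sweeps, the redundancy between neighbouring patches pins down the phases that no single diffraction pattern contains.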

Ptychography couldn’t be brought to electron microscopy straightaway, however, because of a limitation inherent to the method. For it to work, the microscope has to measure the diffraction intensity values with equal precision in all the required directions. “However, as electron scattering form factors have a very strong angular dependence, the signal falls rapidly with scattering angle, requiring a detector with high dynamic range and sensitivity to exploit this information” (source).

In short, electron microscopy couldn’t work with ptychography because these detectors didn’t exist. As an interim solution, in 2004, researchers from the University of Sheffield developed an algorithm to fill in the gaps in the data.

Then, on July 18, researchers from the US reported that they had built just such a detector (preprint), which they called an “electron microscope pixel array detector” (EMPAD), and claimed that they had used it to retrieve images of a layer of molybdenum disulphide with a resolution of 0.4 Å. One image from their paper is particularly stunning: it shows the level of improvement ptychography brings to the table, leaving the previous “state of the art” resolution of 1 Å achieved by ADF in the dust.

Source: https://arxiv.org/pdf/1801.04630.pdf

The novelty here isn’t that the detector is finally among us. The same research group (+ some others) had announced that it had built the EMPAD in 2015, and claimed then that it could be used for better electron ptychography. What’s new now is that the group has demonstrated it.

a) Schematic of STEM imaging using the EMPAD. b) Schematic of the EMPAD physical structure. The pixelated sensor (blue) is bump-bonded pixel-by-pixel to the underlying signal processing chip (pink). Source: https://arxiv.org/pdf/1511.03539.pdf

According to their 2015 paper, the device

consists of a 500 µm thick silicon diode array bump-bonded pixel-by-pixel to an application-specific integrated circuit. The in-pixel circuitry provides a 1,000,000:1 dynamic range within a single frame, allowing the direct electron beam to be imaged while still maintaining single electron sensitivity. A 1.1 kHz framing rate enables rapid data collection and minimizes sample drift distortions while scanning.

For the molybdenum disulphide imaging test, the EMPAD had 128 x 128 pixels, operated in the 20-300 keV energy range, possessed a dynamic range of 1,000,000:1 and had a readout speed of 0.86 ms/frame. The scientists also modified the ptychographic reconstruction algorithm to work better with the detector.

Redshift and eclipse

I am thoroughly dispirited. I had wanted to write today about how fascinating it is that we have validated Einstein’s theory of general relativity for the first time in an extreme environment: in the neighbourhood of a black hole. The test involved the detection of an effect called the gravitational redshift, whereby light climbing out of a gravitational well appears redshifted. In other words, light seen moving from an area of stronger gravitational field to an area of weaker gravitational field appears to be redder than it was when emitted, if the observer is sufficiently far from the source of this field. The observation of this redshift is doubly fascinating because it is also an observation of time dilation in action.
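For reference, the textbook prediction – a standard general-relativity result, not anything specific to this experiment – for light emitted at a radius $r$ from a mass $M$ and received far away is

$$1 + z = \left(1 - \frac{2GM}{rc^2}\right)^{-1/2} \approx 1 + \frac{GM}{rc^2},$$

with the approximation holding when the emitter is well outside the Schwarzschild radius.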

The observation campaign, using the European Southern Observatory’s Very Large Telescope (VLT), took 26 years to make this check; it was completed and announced yesterday, July 25. The source of the gravitational potential was the black hole at the Milky Way’s centre, called Sagittarius A*, and the source of starlight, a stellar body known only as S2. Triply fascinating is the fact that the VLT observed S2 swinging by Sgr A* at a searing 25 million km/hr. Phew!
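(A quick unit check on that speed, for the curious:)

```python
# S2's reported flyby speed as a fraction of the speed of light.
v_kmh = 25_000_000           # km/h, as reported
v_kms = v_kmh / 3600         # ~6,944 km/s
c_kms = 299_792.458          # speed of light in km/s

print(v_kms / c_kms)         # ~0.023, i.e. a few percent of light speed
```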

But through all this, I am distressed because of an article I spotted a few minutes ago on NDTV’s website, about how we must not eat certain foods during a lunar eclipse – given the one set to happen tomorrow – because they could harm us. I thought we had been able to go a full day without a mainstream publication spreading pseudoscientific information about the eclipse, but here we are. I weep for many reasons; right now, I weep most of all not for the multitude of quacks we inhabit this country with but for Yash Pal. And I wish that, like S2, I could escape this nonsense at 3% of the speed of light when it becomes too much.