Crypto: Climate change means new tech has less time today to prove itself

I spent this weekend reading about permissioned and permissionless blockchain systems. If you want to get up to speed, I can’t recommend this post by David Rosenthal enough. Much of the complexity of executing transactions in the major extant cryptocurrencies, including bitcoin and ether, arises from the need for these systems to remain permissionless from start to finish, i.e. to maintain their integrity and reliability without deferring to a centralised authority.

This simple fact is more important than it seems at first because it sits in stark tension with the reality that most bitcoin and ether mining pools are highly concentrated in the hands of a very small number of people. Put another way, everything from the verbal sophistry to the speculative fundraising to the enormous power consumption that sustains the major cryptocurrencies has failed to do the one thing cryptocurrencies were invented to do: decentralise.

Two notes here. First, most other cryptocurrencies likely operate with the same problems; I say ‘major’ only to limit myself to what I’m familiar with. Second, don’t underestimate the value of simple facts in an ecosystem in which jargon and verbiage are core components of the defence against criticism. One such bit of verbiage is the oft-repeated claim that “it’s still the early days” – offered in response to questions about how much more time cryptocurrencies will need to become stable and, importantly, socially useful. Software engineer Molly White has written about how this is simply not true:

… a lot has changed in the technology world in the past six to twelve years. One only needs to look at Moore’s law to see how this is pretty much built in to the technology world, as once-impossible ideas are rapidly made possible by exponentially more processing power. And yet, we are to believe that as technology soared forward over the past decade, blockchain technologies spent that time tripping over their own feet?

Something I see missing from this already expansive discussion (though I might simply have missed it) is how climate change alters the picture.

The biggest criticism facing bitcoin and ether is that their power consumption – a consequence of ‘proof of work’, the method they use to protect against fraud in a decentralised way – is colossal. Rosenthal defers to the Cambridge Bitcoin Energy Consumption Index, according to which the annualised power consumption of the bitcoin network (at 6:47 pm on February 13, 2022) was 125.13 TWh – roughly equal to that of the Netherlands.
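To put that figure in more familiar units, here is a minimal back-of-envelope sketch – assuming nothing beyond the 125.13 TWh figure and a 365-day year – converting the annualised consumption into a continuous power draw:

```python
# Convert the annualised consumption cited above (125.13 TWh) into an
# average continuous power draw. Only assumption: a 365-day (8,760-hour) year.

ANNUAL_CONSUMPTION_TWH = 125.13  # Cambridge Bitcoin Energy Consumption Index, Feb 13, 2022
HOURS_PER_YEAR = 365 * 24        # 8,760 hours

average_power_gw = ANNUAL_CONSUMPTION_TWH * 1_000 / HOURS_PER_YEAR  # TWh -> GWh, then per hour
print(f"Average continuous draw: {average_power_gw:.1f} GW")  # ~14.3 GW, around the clock
```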

Others, like Molly White, have written about the fact that in the 13-14 years after the advent of the web, there was much more adoption and innovation than there has been in the 13-14 years since the idea of using permissionless blockchains to execute financial transactions was born. One way to interpret this is that the proponents of cryptocurrencies have spent that time expending energy – both literal and otherwise – fighting the system’s indefatigable tendency to centralise. And by failing, they have kept this energy out of reach of its “more socially valuable uses,” to use Rosenthal’s words.

I think both these arguments – the straightforward carbon footprint and the social disempowerment – are significant and legitimate but often lead people to ignore a third implication specific to technology: the time a technology has available to prove that its adoption is desirable is falling rapidly, perhaps as fast as the atmospheric concentration of carbon dioxide (CO2) is increasing.

The creation and implementation of the web – technically, web1 from the early 1990s and web2 from the mid-2000s – happened at a time when the atmospheric CO2 concentration was 354.45 ppm (1990) and then 379.98 ppm (2005). In 2021, the concentration was 416.45 ppm.

Tech folks may find this arbitrary, but for an observer at infinity – a group in which I include myself and anyone outside the cryptocurrency and IT/software spaces, located in an economically developing or ‘under-developed’ country – it seems eminently reasonable. Climate change has broken the symmetry between our past and our future vis-à-vis our ability to tolerate energy-intensive technologies, and it continues to break it.

Roughly 16 years lapsed between the advent of web1 and the birth of Twitter, but in the era of manifest climate change, the fuller statement has to be: “Roughly 16 years lapsed between the advent of web1 and the birth of Twitter, as the atmospheric CO2 concentration increased by 27.64 ppm.” Obviously there may be no generally accepted way to compare levels or even types of innovation, so asking for “something in the cryptocurrency space comparable to Twitter” doesn’t make sense. Let’s flip it into a marginally more meaningful question, one that I hope will also illustrate my point better: how much innovation did technologists achieve in the cryptocurrency space in the time it took atmospheric CO2 concentrations to increase by 27.64 ppm?
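To make the ‘ppm instead of years’ idea concrete, here is a small sketch using only the annual-mean concentrations quoted earlier (1990, 2005, 2021); the helper name is mine and purely illustrative:

```python
# Measure elapsed "time" in ppm of CO2 rather than in years, using the
# annual-mean concentrations quoted above. No other data is assumed.

co2_ppm = {1990: 354.45, 2005: 379.98, 2021: 416.45}

def ppm_elapsed(year_from: int, year_to: int) -> float:
    """Return the rise in atmospheric CO2 between two years, in ppm."""
    return co2_ppm[year_to] - co2_ppm[year_from]

print(f"web1 (1990) -> web2 (2005): +{ppm_elapsed(1990, 2005):.2f} ppm")  # +25.53 ppm
print(f"web2 (2005) -> 2021:        +{ppm_elapsed(2005, 2021):.2f} ppm")  # +36.47 ppm
```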

Note here that web3 – a web based on storing, transporting and validating information using blockchains – seeks to depart from the incumbent web2 by decentralising, and liberating, the user experience from the silos of ‘Big Tech’, a group of companies that includes Twitter. So there may be a way to compare the carbon emissions of the effort to achieve web3 with those of the effort that achieved web2. Proponents of cryptocurrencies and NFTs may contend in turn that the social consequences of web2 and web3 would be apples and oranges, but I’m comfortable ‘cancelling’ that difference against the opportunities for social welfare squandered by wasteful energy consumption.

Second note: the concentration of atmospheric CO2 is distributed like this. But in our calculations, we need to adopt the global average for reasons both obvious (it’s climate change, not weather change) and subtle. Some entities have created (permissionless) “carbon-negative” blockchains; the negativity is attained through carbon offsets, which is a stupid idea. To quote from a previous post:

Trees planted today to offset carbon emitted today will only sequester that carbon at optimum efficiencies many years later – when carbon emissions from the same project, if not the rest of the world, are likely to be higher. Second, organisations promising to offset carbon often do so in a part of the world significantly removed from where the carbon was originally released. Arguments against the ‘Miyawaki method’ suggest that you can only plant plants up to a certain density in a given ecosystem, and that planting them even closer together won’t have better or even a stagnating level of effects – but will in fact denigrate the local ecology. Scaled up to the level of countries, this means … emitting many tonnes of carbon dioxide over North America and Europe and attempting to have all of that sequestered in the rainforests of South America, Central Africa and Southeast Asia won’t work, at least not without imposing limitations on the latter countries’ room to emit carbon for their own growth as well as on how these newly created ‘green areas’ should be used.

To conclude: Global warming is accelerating, so I’m comfortable comparing two events – such as two bits of innovation – only if they occurred in a period of the same atmospheric CO2 concentration (give or take 10%). Perhaps more fundamentally, clock-time is a less useful way today to measure the passage of time than the value of this number, including vis-à-vis the tolerability of innovation.
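As a sketch of what that comparability rule might look like in practice – the 10% tolerance is the one proposed above, while the function name and structure are purely illustrative:

```python
# Two events are treated as comparable only if the atmospheric CO2
# concentrations at which they occurred are within 10% of each other.

def comparable(ppm_a: float, ppm_b: float, tolerance: float = 0.10) -> bool:
    """True if two atmospheric CO2 concentrations (ppm) are within `tolerance` of each other."""
    return abs(ppm_a - ppm_b) <= tolerance * min(ppm_a, ppm_b)

print(comparable(354.45, 379.98))  # True: ~7.2% apart, so the two eras are 'comparable'
print(comparable(354.45, 416.45))  # False: ~17.5% apart
```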

Christopher Nolan’s explosion

In May, Total Film reported that the production team of Tenet, led by director Christopher Nolan, found that using a second-hand Boeing 747 was better than recreating a scene involving an exploding plane with miniatures and CGI. I’m not clear how exactly it was better; Total Film only wrote:

“I planned to do it using miniatures and set-piece builds and a combination of visual effects and all the rest,” Nolan tells TF. However, while scouting for locations in Victorville, California, the team discovered a massive array of old planes. “We started to run the numbers… It became apparent that it would actually be more efficient to buy a real plane of the real size, and perform this sequence for real in camera, rather than build miniatures or go the CG route.”

I’m assuming that by ‘numbers’ Nolan means the finances – that is, buying and crashing a life-size airplane was more financially efficient than recreating the scene by other means. This is quite a disappointing prospect, as must be obvious, because the calculation limits itself to a narrow set of concerns, or just one in this case – more bang for the buck – and consigns everything else to being negative externalities. Foremost on my mind are the carbon emissions from transporting the vehicle, the explosion and the debris. If these costs were factored in – say, in terms of however much carbon credits are worth in the region where Nolan et al. filmed the explosion – would buying the real plane still have been the more efficient option? (I’m assuming, reasonably I think, that Nolan et al. aren’t using carbon-capture technologies.)

However, CGI itself may not be so calorifically virtuous. I’m too lazy at the moment to cast about on the internet for estimates of how much of the American film industry’s emissions CGI accounts for. But I did find this tidbit from 2018 on Columbia University’s Earth Institute blog:

For example, movies with a budget of $50 million dollars—including such flicks as Zoolander 2, Robin Hood: Prince of Thieves, and Ted—typically produce the equivalent of around 4,000 metric tons of CO2. That’s roughly the weight of a giant sequoia tree.

A ‘green production guide’ linked there leads to a page offering an emissions calculator that doesn’t seem to account for CGI specifically – only, broadly, for “electricity, natural gas & fuel oil, vehicle & equipment fuel use, commercial flights, charter flights, hotels & housing”. In any case, a close call with bitcoin-mining many years ago alerted me to how energy-intensive seemingly straightforward computational processes can get, followed by a reminder when I worked at The Hindu: the two computers used to render videos there sat in a small room fitted with its own AC fixed at 18º C, and even when they were rendering videos without any special effects, the CPUs’ fans would scream.

Today, digital artists create most CGI and special effects using graphics processing units (GPUs) – a notable exception was the black hole in Nolan’s 2014 film Interstellar, created using CPUs – and Nvidia and AMD are two of the more ‘leading’ brands, from what I know (I don’t know much). One set of tests, whose results the site Tom’s Hardware reported in May this year, found that an Nvidia GeForce RTX 2080 Ti FE GPU was among the bottom 10% of the 42 options tested in terms of wattage for a given task – in this case drawing 268.7 W to render fur. An AMD Radeon RX 5700 XT GPU consumed nearly 80% as much for the same task, placing it in the seventh decile. A bunch of users on this forum say a film like Transformers will need Nvidia Quadro and AMD FirePro GPUs; the former consumed 143 W in one fur-rendering test. (Comparability may be affected by differences in the hardware setup.) Then there’s the cooling cost.
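For a sense of what those wattage differences add up to over a long render, here is a rough sketch; the render duration and grid carbon intensity are assumptions I’ve made for illustration, not figures from the tests cited:

```python
# Rough energy and emissions for a long render, using the wattages mentioned
# above. RENDER_HOURS and KG_CO2_PER_KWH are illustrative assumptions.

GPU_WATTS = {
    "Nvidia GeForce RTX 2080 Ti FE": 268.7,  # fur-render figure from the tests cited above
    "AMD Radeon RX 5700 XT": 215.0,          # "nearly 80% as much" -- approximated
    "Nvidia Quadro (fur test)": 143.0,       # figure from the forum discussion cited above
}

RENDER_HOURS = 1_000      # assumption: a long production render queue
KG_CO2_PER_KWH = 0.4      # assumption: rough global-average grid carbon intensity

for gpu, watts in GPU_WATTS.items():
    kwh = watts * RENDER_HOURS / 1_000  # W x h -> kWh
    print(f"{gpu}: {kwh:,.0f} kWh, ~{kwh * KG_CO2_PER_KWH:,.0f} kg CO2")
```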

Again, I don’t know if Nolan considered any of these issues – but I doubt that he did – when he ‘ran the numbers’ to determine what would be better: blowing up a real plane or a make-believe one. Intuition does suggest the former would be a lot more exergonic (although here, again, we’re forced to reckon with the environmental and social cost of obtaining specific metals, typically from middle-income nations, required to manufacture advanced electronics).

Cinema is a very important part of 21st-century popular culture, and popular culture is a very important part of how we as social, political people (as opposed to biological humans) locate ourselves in the world we’ve constructed – including as good citizens, conscientious protestors and sensitive neighbours. So constraining cinema’s remit, or even imposing limits on filmmakers for the climate’s sake, would be a ridiculous course of action. This said, when there are options (and so many films have taught us there are always options), we have a responsibility to pick the more beneficial one while generating the fewest externalities.

The last bit is important: the planet is a single unit and all of its objects and occupants are wildly interconnected. So ‘negative externalities’ as such are more often than not trade practices crafted to simplify administrative and/or bureaucratic demands. In the broader ‘One Health’ sense, they vanish.