Truthenomics #6: Heads in the clouds

Whether you like it or not, much about your life that you might want to keep private is in the cloud. ‘The cloud’ being mere doublespeak for someone else’s computer.

For most of us residing in Western countries, that ‘someone else’ syphoning your life from your phone and reselling it across the cloud is typically one of the Fabulous Five US-centric digital behemoths: Amazon, Apple, Microsoft, Google and Meta.

This year, roughly 150 zettabytes of data will be sucked up and sold across the cloud. That's a 150 followed by twenty-one zeroes in bytes, or 150 trillion gigabytes. Put another way, about 2.4 trillion 64 GB smartphones' worth of data every year. A lot… so a good thing that we are not paying for it all.
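For the back-of-envelope inclined, here is a minimal sketch of that conversion in Python; the 64 GB handset capacity is my own illustrative assumption, not a figure from any source.

```python
# Back-of-envelope check of the data-volume claim (decimal units).
ZETTABYTE_IN_GB = 10**12        # 1 zettabyte = 10^21 bytes = 10^12 gigabytes
annual_cloud_data_zb = 150      # rough annual figure used in this post

total_gb = annual_cloud_data_zb * ZETTABYTE_IN_GB
phones = total_gb / 64          # assumed 64 GB per smartphone (illustrative only)

print(f"{total_gb:.1e} GB per year")          # ~1.5e+14, i.e. 150 trillion GB
print(f"about {phones:.1e} smartphones' worth")  # ~2.3e+12, a couple of trillion phones
```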

But we are paying for it all, in every possible way.

'The cloud' is just someone else's computer

No free lunch
Aside from the cloud-delivered ads we have to scythe our way through - whose cost is built into the price of every item or service we buy online - we pay far more than just dollars for cloud computing. The exhilaratingly clever, often useful, albeit addictive engagement we have with the cloud has both direct and climate-induced environmental impacts.

To begin with, is there another complex device humanity routinely manufactures in such large quantities as the computer chip? Chips are the beating heart of our cloud-connected phones and our car sat-navs, and they are crammed by the thousands into the massive computers that process and serve the cloud's data. A wide variety of minerals, sourced from across the globe and ranging from the quite common (basically very pure sand) to the quite rare (such as rare-earth metals), are needed to manufacture each and every one.




The US is so concerned about mineral inputs for US chip manufacture that it enacted the CHIPS and Science Act in 2022. This Act is intended to shore up (pun intended) the US’s tenuous chip mineral supply chains. Arsenic, cobalt, gallium and various rare-earth metals top the US’s hard-to-come-by list. The world’s supply of rare-earths is dominated by China, the US’s greatest geopolitical competitor.

Other minerals are the product of war-torn countries, eked out with forced child labour - such as the cobalt dug from the Democratic Republic of Congo by ‘artisanal’ (aka DIY) miners. Such supply lacks effective corporate or government management of local environmental degradation and pollution.

The cloud relies on cobalt - much of it mined like this. Source

Then there is the sheer energy needed to create the wonder that is a computer chip. Taiwan’s largest manufacturer, Taiwan Semiconductor Manufacturing Company, producer of about 90% of the world’s high-end chips, consumes over 5% of Taiwan’s total grid power output.

A super power drain
Cloud computers are housed together in power-hungry data centres. The ‘size’ of a cloud computing data centre isn’t described in geeky terms like ‘terabytes of storage’ or ‘megaflops of processing speed’ (gotta love ‘flops’ 🙂). Capacity is so tightly correlated with draw on grid power that a data centre’s size is described in terms of its maximum power consumption.

The NextDC Hyperscale data centre recently opened in Perth is described as a 20 megawatt ‘IT capacity’ data centre. That is, this single Perth data centre can draw up to 20 megawatts from the grid when operating at peak capacity. That is about 5% of the output of a typical large 400 megawatt coal or natural gas power station operating in Western Australia, or about the same draw on power as 30,000 households. Yes, quite a lot.
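The arithmetic behind those comparisons is simple enough to sketch; the average household draw of roughly 0.67 kW below is my own assumption, picked so the numbers line up with the 30,000-household figure.

```python
# Rough sanity check of the Perth data centre comparisons.
data_centre_mw = 20          # NextDC Perth 'IT capacity'
power_station_mw = 400       # typical large WA coal/gas power station
household_avg_kw = 0.67      # assumed average household draw (illustrative)

share_of_station = data_centre_mw / power_station_mw
households = data_centre_mw * 1000 / household_avg_kw   # MW -> kW, divide by per-house draw

print(f"{share_of_station:.0%} of a 400 MW power station")   # 5%
print(f"about {households:,.0f} households")                 # roughly 30,000
```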

About 9000 data centres are scattered across the globe. These range from China Telecom’s 150 megawatt Inner Mongolia Information Park, serving 50% of the Chinese market, to more modest offerings like NextDC’s 20 megawatt centre here in WA.

Cloud computing likely accounts for at least 2% of today’s world electricity consumption and is growing exponentially. Business as usual, which assumes no cold or hot wars disrupting energy or technology supply chains, will likely see the cloud consume as much as 8% of electricity production by the end of the decade. Even if we are lucky enough to flatten the greenhouse emissions curve by then, over a billion tons of CO2 per year will be pumped into the atmosphere to feed our digital addictions.
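To see where a figure north of a billion tonnes comes from, here is a hedged back-of-envelope calculation; both the global generation figure and the grid carbon intensity below are my own assumptions, not forecasts from this post’s sources.

```python
# Rough CO2 estimate for the 8% business-as-usual scenario.
world_electricity_twh = 30_000      # assumed global generation around 2030
cloud_share = 0.08                  # the 8% scenario above
grid_intensity_kg_per_kwh = 0.45    # assumed average grid carbon intensity

cloud_twh = world_electricity_twh * cloud_share
co2_billion_tonnes = cloud_twh * 1e9 * grid_intensity_kg_per_kwh / 1e3 / 1e9

print(f"Cloud electricity: {cloud_twh:,.0f} TWh per year")          # 2,400 TWh
print(f"CO2: about {co2_billion_tonnes:.1f} billion tonnes a year") # about 1.1
```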

Annual worldwide cloud computing data processing projected through to 2025. Source

Our cloud addiction is understandable - I want automated digital guardrails helping my clinical colleagues deliver safer patient care. I love having my ebooks read to me and I routinely use voice recognition to dictate my clinical notes. What business would want to roll back corporate email, online meetings or cloud-based supply chain management?

The explosion in artificial intelligence will be an important contributor to rising power consumption. Generative AI use is growing rapidly. Although your kid’s teacher doesn’t want your child using ChatGPT for their homework, make no mistake, teachers love using ChatGPT to grade assignments. They are also using generative AI to develop syllabuses and to transform a few dot points about your Hope of the Nation’s reading and writing performance - and problems - into a school report, complete with some excellent pointers about things Skyla or Chase should work on at home. Most of the report is generated by the AI itself. Within a decade, likely all of it will be. Just feed in the whole semester's raw assignments!

Exponential growth can't last forever
It is a simple exercise to project that, sometime over the next decade or so, the growth in cloud computing will collide with our old friend reality - the physical limits of our planet’s natural resource abundance and power generation systems. Unabated growth of cloud data processing, climbing into the yottabytes (a yottabyte is 1000 zettabytes) annually, would see cloud computing consuming all grid power generation on Earth.
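A toy compounding model shows how the crossover point depends on the growth rate you assume; every rate below is an illustrative assumption, not a forecast.

```python
# Toy projection: in which year would exponentially growing cloud
# electricity demand equal total grid output?
def crossover_year(cloud_share=0.02, cloud_growth=0.3, grid_growth=0.02, start=2025):
    year = start
    while cloud_share < 1.0:
        # cloud demand grows faster than total generation, so its share compounds
        cloud_share *= (1 + cloud_growth) / (1 + grid_growth)
        year += 1
    return year

for growth in (0.2, 0.3, 0.4):   # assumed annual growth in cloud demand
    print(f"{growth:.0%} growth a year -> the cloud swallows the grid around {crossover_year(cloud_growth=growth)}")
```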

There won't be many humans left to buy smartphones if we don’t set aside some energy for global food production. There are also homes to build, and air conditioning, healthcare and education to deliver. We will also likely want to allocate significant resources and energy to personal and public transport, to the businesses that service our lives, and to warfare materiel production (given growing geopolitical flakiness). The cloud is already competing with these energy demands.

Data centres are becoming the largest - and still growing - consumer of Ireland's power supply. Source

Cooling the cloud
A digital cloud does not make rain. Quite the opposite, in fact. All those megawatts of power consumption create megawatts of waste heat. Data centres located anywhere but the South Pole - noting we have had small ones there since 2010 - need sophisticated water-based cooling systems. So, in addition to the energy competition, our increasing adoption of AI is competing with other life-sustaining demands for scarce and depleting global fresh water supplies. Within just a few years, as we deepen our engagement with AI, about 5 billion cubic metres of fresh water - 2 million Olympic swimming pools' worth - will be diverted from other uses just to keep data centres cool.
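The swimming-pool conversion is easy to check; the only input is the standard 2,500 cubic metre volume of a 50 m Olympic pool.

```python
# Quick check of the cooling-water comparison.
olympic_pool_m3 = 2_500         # 50 m x 25 m x 2 m
cooling_water_m3 = 5e9          # ~5 billion cubic metres (the figure above)

pools = cooling_water_m3 / olympic_pool_m3
print(f"about {pools/1e6:.0f} million Olympic pools")   # about 2 million
```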

Stepping back, we need to make some hard choices about the extent to which we want to add AI to online searches just because we can - an AI-assisted search consumes 3-4 times as much energy as a conventional one. Curbing greenhouse emissions and preserving energy, fresh water and other natural resources for more meaningful uses than scrolling cat videos depends upon those choices.

However, if all this sounds environmentally unhelpful, you will be pleased to know I have saved the ‘best’ until last: the environmental lunacy of cryptocurrency mining.




Crypto mining - abject insanity
‘It’s still early’
is the mantra of crypto bros who cling to the idea that cryptocurrencies will, one day this century, find real-world uses outside of pseudonymous porn, drug and money-laundering payments. However, even these uses are minuscule compared to the main activity in the crypto-sphere: feeding humankind’s insatiable appetite for gambling.

Across the planet, day and night, the crypto casinos - euphemistically called ‘exchanges’ - allow punters to trade pairs of crypto 'coins', typically buying and selling bitcoin or any of thousands of me-too coins with ‘stablecoin’ money. Stablecoins are like casino chips denominated in something you might recognise, such as US dollars. They are called ‘stable’ because each stablecoin ‘dollar’ is supposed to be backed, one to one, by a real dollar (or equivalent asset) in a real bank. All of this digital gambling chicanery adds to cloud computing energy consumption. However, it is dwarfed by the process that gives birth to each and every bitcoin.

The creation of a bitcoin has been likened to running your car engine to solve sudoku problems you can trade for heroin.

Source

Bitcoin miners use arrays of powerful computer chips in a ‘race’ with other miners to find a special number that unlocks a fresh batch of new bitcoins for the world. This race is called 'bitcoin mining'. The race repeats roughly every ten minutes, minting new bitcoins each time. Depending on the bitcoin price and the cost of the energy needed to power the chips, winning the race fairly regularly can be quite lucrative. Miners therefore like to set up their arrays of chips near the cheapest energy they can find, irrespective of its greenhouse emissions, in order to maximise their profits.
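The ‘race’ is a brute-force guessing game known as proof of work. The toy sketch below captures the idea - keep hashing with different guesses until you stumble on one that produces a hash with enough leading zeroes - though real mining uses specialised hardware, a moving difficulty target and none of this actual code.

```python
# Toy proof-of-work 'race': guess nonces until the hash is 'lucky' enough.
import hashlib

def mine(block_data: str, difficulty: int = 5):
    target = "0" * difficulty            # more zeroes = exponentially harder
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest         # the winning ticket
        nonce += 1                       # otherwise burn more electricity and try again

nonce, digest = mine("a block of transactions")
print(f"Found a winner after about {nonce:,} guesses -> {digest[:16]}...")
```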

Best estimates are that the incessant 24-hour cycle of races to create new bitcoins consumes about 150 terawatt-hours of electricity a year, roughly the annual consumption of Argentina. That is, every second of every day, we are wasting a country's worth of energy on a 'technology' useful only for gambling and illegal payments. As I said, abject insanity.
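Expressed as a continuous draw, the arithmetic looks like this (a minimal sketch).

```python
# 150 TWh per year expressed as an around-the-clock power draw.
twh_per_year = 150
hours_per_year = 24 * 365

average_gw = twh_per_year * 1_000 / hours_per_year   # TWh -> GWh, then divide by hours
print(f"about {average_gw:.0f} GW, every second of every day")   # about 17 GW
```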

Constraining a cloud?
Apart from banning crypto mining, constraining the ecocidal footprint of our digital lives will not be easy. I hypocritically run this blog on the cloud - but I can find no cheaper, reliable alternative. I could stream less video I suppose…

Unlike with crypto, it genuinely is still early when it comes to understanding how to harness the power of AI. In a recent Bluesky post I likened our understanding of AI to our understanding of the first lasers. It took a couple of decades before lasers became genuinely useful - other than as a prop nearly slicing Sean Connery's wedding tackle in half in the movie Goldfinger.

Like lasers, the latest AI tools, such as generative AI, are solutions looking for a problem. It may be twenty years or more before we fully bed tools like OpenAI's ChatGPT into daily life and business workflows most effectively - and, above all, most energy- and environmentally efficiently.

If we are to realise AI's potential, we need realistic energy transition plans. We also need to learn how to recycle and reuse the materials and the chips themselves, and to work out where in our economies we can reduce competing, wasteful energy and resource uses.

The clip below from Goldfinger serves as a reminder of how rapidly things can change. We can wonder how folk in 60 years' time might look back on our first fumblings with the digital economy, our stumblings with generative AI, and our current mismanagement of energy supplies and natural resource extraction. We are pointing the laser at our own nether regions rather than 007's. Until next time... enjoy the video!

[Embedded video: the Goldfinger laser scene, 3:33]