Recent research suggests that AI might consume between 85 and 134 terawatt-hours (TWh) of electricity each year by 2027, potentially as much as the Netherlands.
It’s a huge amount of energy by any standard, amounting to roughly 0.5% of global electricity consumption. The author of the study, Alex De Vries – a PhD candidate at the VU Amsterdam School of Business and Economics – based the estimate on the recent growth rate of AI, the availability of the microchips that AI uses, and the capacity at which AI servers will be working. Nvidia chips, which are thought to supply around 95% of the processing power required by the AI sector, are an important part of the puzzle.
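To see how an estimate like this comes together, here is a minimal back-of-envelope sketch in Python. The inputs – roughly 1.5 million AI servers shipped per year by 2027, each drawing somewhere between 6.5 and 10.2 kW (on the order of a DGX-class machine) and running flat out around the clock – are illustrative assumptions consistent with the published range, not figures lifted directly from the paper.

```python
# Back-of-envelope reconstruction of the 85-134 TWh range (assumed inputs).
HOURS_PER_YEAR = 8_760
SERVERS = 1_500_000           # assumed annual AI server shipments by 2027

def annual_twh(power_kw: float) -> float:
    """Annual consumption in TWh for the assumed fleet at a given per-server draw."""
    kwh = SERVERS * power_kw * HOURS_PER_YEAR
    return kwh / 1e9          # 1 TWh = 1e9 kWh

low, high = annual_twh(6.5), annual_twh(10.2)
print(f"{low:.0f}-{high:.0f} TWh per year")                        # ~85-134 TWh
print(f"~{100 * high / 27_000:.1f}% of ~27,000 TWh global electricity use")
```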
De Vries – whose study has been peer-reviewed and is published in the journal Joule – recommends that AI should only be used where it is really necessary.
This is, of course, an unrealistic conclusion, for many reasons. Nonetheless, AI’s increasing energy usage is likely to be of interest to those involved in Web3, for at least two reasons.
One is that AI is a cutting-edge technology on the cusp of breaking through into mainstream use. The emergence of chatbots such as ChatGPT, based on Large Language Models (LLMs), is already changing many industries, and AI will doubtless play numerous important roles in Web3. Just like decentralised technologies, AI will become part of the fabric of online services in the coming months and years; anyone with an eye on the future of the web can’t help but be aware of them both.
The second reason is that we’ve seen this criticism of a technology’s high energy use before. AI’s forecast energy consumption happens to be roughly in line with that of Bitcoin.
Bitcoin’s Energy Use
Ever since Bitcoin became big enough to be noticed, its energy use has been highly controversial. The proof-of-work consensus mechanism that secures the Bitcoin blockchain relies on a network of many thousands of computers, running specialist hardware, competing to find the answer to a cryptographic challenge. Bitcoin’s security scales with its energy use, since anyone who wants to attack the network would need to control more than half of its total computational power.
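For readers unfamiliar with how that cryptographic challenge works, the toy sketch below captures the idea. It is not Bitcoin’s real implementation – actual mining applies double SHA-256 to a binary block header and compares it against a full 256-bit difficulty target – but it shows why a valid block can only be found by burning compute, and why rewriting history gets more expensive as difficulty rises.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Toy proof-of-work: find a nonce whose hash falls below a difficulty target.

    The only strategy is brute force - each attempt costs computation, which is
    what ties the chain's security to the energy spent securing it.
    """
    target = 1 << (256 - difficulty_bits)   # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

# Expect on the order of 2**20 (~1 million) attempts for 20 difficulty bits.
nonce, digest = mine("previous-hash|transactions", difficulty_bits=20)
print(nonce, digest)
```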
In 2022, academics estimated that the Bitcoin network uses around 150 TWh of electricity per year, roughly the same as AI is forecast to use within the next few years. AI, like Bitcoin’s PoW consensus, is computationally expensive, requiring devices that can crunch through enormous volumes of data. An AI query is significantly more energy-intensive than a regular web search.
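As a rough sanity check on the combined figure that follows, the short sketch below adds the two estimates together and compares them with approximate global electricity use; the per-query figures are widely cited rough estimates rather than measurements.

```python
# Rough sanity check using round, assumed figures.
bitcoin_twh = 150                  # ~2022 academic estimate cited above
ai_twh_2027 = 134                  # upper end of the AI forecast for 2027
global_electricity_twh = 27_000    # approximate annual global electricity use

combined = bitcoin_twh + ai_twh_2027
print(f"{combined} TWh combined ≈ {combined / global_electricity_twh:.1%} of global electricity")

# Per-request comparison (rough, widely cited estimates, not measurements):
chatbot_wh, web_search_wh = 3.0, 0.3
print(f"A chatbot query uses ~{chatbot_wh / web_search_wh:.0f}x the energy of a web search")
```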
Should we worry about the amount of energy these two technologies use – since between them, they will soon account for roughly 1% of global electricity consumption? Is their energy use a price worth paying – and is that even the right question to ask?
Driving Clean Energy
As it happens, there’s increasing evidence that Bitcoin mining is actually helping drive sustainable energy use. This has been front-and-centre for miners since Elon Musk suspended Bitcoin payments for Tesla, saying he would restart them once the network used over 50% clean energy. The figure is now around 53%, and trending upwards.
Let’s not forget that BlackRock—the world’s largest asset manager and a well-known (and sometimes controversial) advocate for ESG investment—recently filed for a spot bitcoin ETF.
In its own way, AI may also produce answers to some of the world’s hard-to-solve environmental problems, including creating more accurate climate models, improving decision-making for decarbonising processes in major industries, and helping to allocate energy use to balance grids and minimise load.
All of this is good, but it’s beside the point.
Energy Use Is Economic Activity
Increasing energy use is a function of human progress. It’s no coincidence that energy consumption has tracked technological innovation, growing exponentially in the last 200 years, and particularly in the years following the Second World War.
Taking a step back, criticisms of any technology based on energy use alone are misguided. They tend to start from the position that the technology – whatever it is – lacks value, and therefore isn’t worth the cost in terms of carbon footprint and pollution. That is hardly a scientific way of deciding which technologies to adopt.
It’s noteworthy, for example, that there is comparatively little discussion of the energy used by the Visa network, the banking sector, or gold mining. The energy consumption of the gold industry is roughly twice that of Bitcoin; a high price to pay for shiny rocks that cannot easily be moved around or divided into useful fragments. Such judgments rest on starting assumptions, not on empirical data about the benefits these technologies might provide in the future – if only they are allowed to develop freely.
The smart approach is to pursue clean energy production and maximise energy efficiency. This is a good idea for a number of reasons quite apart from climate change: reducing pollution, ensuring energy independence in an uncertain world, and acknowledging that non-renewable resources will, by definition, run out in the not-too-distant future, so it makes sense to find replacements well before that happens.
The idea that we might simply shelve new technologies, without even trying to leverage them for good, based on assumptions and information that can only be partial before those technologies have matured, is ludicrous, irresponsible, and unworkable. In practice, even if we did halt AI development, other nations with a more forward-thinking approach would not follow suit unless a global consensus were reached – and that seems unlikely.
The bottom line, though, is that these are two cutting-edge and very promising technologies that are both reaching a tipping point, with both destined to be a key part of future online services. Together, they hold out the possibility of providing solutions to the world’s problems that might otherwise be unavailable.
If so, 1% of global electricity use is easily worth it.