ChatGPT’s boss claims nuclear fusion is the answer to AI’s soaring energy needs. Not so fast, experts say

Artificial intelligence is energy-hungry and as companies race to make it bigger, smarter and more complex, its thirst for electricity will increase even further. This sets up a thorny problem for an industry pitching itself as a powerful tool to save the planet: a huge carbon footprint.

Yet according to Sam Altman, head of ChatGPT creator OpenAI, there is a clear solution to this tricky dilemma: nuclear fusion.

Altman himself has invested hundreds of millions of dollars in fusion and in recent interviews has suggested the futuristic technology, widely seen as the holy grail of clean energy, will eventually provide the enormous amounts of power demanded by next-gen AI.

“There’s no way to get there without a breakthrough, we need fusion,” Altman said in a January interview, in which he also called for scaling up other renewable energy sources. Then in March, when podcaster and computer scientist Lex Fridman asked how to solve AI’s “energy puzzle,” Altman again pointed to fusion.

Nuclear fusion — the process that powers the sun and other stars — is likely still decades away from being mastered and commercialized on Earth. For some experts, Altman’s emphasis on a future energy breakthrough is illustrative of a wider failure of the AI industry to answer the question of how it will meet AI’s soaring energy needs in the near term.

It chimes with a general tendency toward “wishful thinking” when it comes to climate action, said Alex de Vries, a data scientist and researcher at Vrije Universiteit Amsterdam. “It would be a lot more sensible to focus on what we have at the moment, and what we can do at the moment, rather than hoping for something that might happen,” he told CNN.

A spokesperson for OpenAI did not respond to specific questions sent by CNN, referring only to Altman’s comments in January and on Fridman’s podcast.

The appeal of nuclear fusion for the AI industry is clear. Fusion involves forcing light atomic nuclei together until they fuse into a heavier one, a process that releases huge amounts of energy.
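By way of illustration, the deuterium-tritium reaction that most fusion programs pursue fuses two heavy hydrogen isotopes into helium, converting a tiny fraction of their mass into energy:

\[
{}^{2}\mathrm{H} + {}^{3}\mathrm{H} \;\rightarrow\; {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + \mathrm{n}\,(14.1\ \mathrm{MeV}), \qquad E = \Delta m\,c^{2} \approx 17.6\ \mathrm{MeV}\ \text{per reaction.}
\]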

It doesn’t pump carbon pollution into the atmosphere and leaves no legacy of long-lived nuclear waste, offering a tantalizing vision of a clean, safe, abundant energy source.

But “recreating the conditions in the center of the sun on Earth is a huge challenge” and the technology is not likely to be ready until the latter half of the century, said Aneeqa Khan, a research fellow in nuclear fusion at the University of Manchester in the UK.

“Fusion is already too late to deal with the climate crisis,” Khan told CNN, adding, “in the short term we need to use existing low-carbon technologies such as fission and renewables.”

Fission, which splits heavy atoms apart rather than fusing light ones together, is the process widely used to generate nuclear energy today.

The problem is finding enough renewable energy to meet AI’s rising needs in the near term, instead of turning to planet-heating fossil fuels. It’s a particular challenge as the global push to electrify everything from cars to heating systems increases demand for clean energy.

A recent analysis by the International Energy Agency calculated that electricity consumption from data centers, cryptocurrencies and AI could double over the next two years. The sector was responsible for around 2% of global electricity demand in 2022, according to the IEA.

The analysis predicted that demand from AI will grow exponentially, increasing at least tenfold between 2023 and 2026.

On top of the energy required to make chips and other hardware, AI requires large amounts of computing power to “train” models by feeding them enormous datasets, and then more computing power each time a trained model generates a response to a user query.

As the technology develops, companies are rushing to integrate it into apps and online searches, ramping up computing power requirements. An online search using AI could require at least 10 times more energy than a standard search, de Vries calculated in a recent report on AI’s energy footprint.
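For a rough sense of what that multiplier implies, the sketch below runs the arithmetic with assumed per-query figures, roughly 0.3 watt-hours for a conventional search and 3 watt-hours for an AI-assisted one, plus an assumed global search volume; the exact numbers are illustrative and not taken from de Vries’ report.

```python
# Back-of-envelope arithmetic for the "10 times more energy" comparison.
# All figures below are illustrative assumptions, not values from the article.
STANDARD_SEARCH_WH = 0.3   # assumed energy per conventional web search (watt-hours)
AI_SEARCH_WH = 3.0         # assumed energy per AI-assisted search (watt-hours)
SEARCHES_PER_DAY = 9e9     # assumed global daily search volume

ratio = AI_SEARCH_WH / STANDARD_SEARCH_WH
extra_twh_per_year = (AI_SEARCH_WH - STANDARD_SEARCH_WH) * SEARCHES_PER_DAY * 365 / 1e12

print(f"AI-assisted search uses ~{ratio:.0f}x the energy of a standard search")
print(f"Extra demand if every search used AI: ~{extra_twh_per_year:.0f} TWh per year")
```

Under these assumptions the extra demand works out to roughly 9 terawatt-hours a year, which is why per-query efficiency matters so much at search-engine scale.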

The dynamic is one of “bigger is better when it comes to AI,” de Vries said, pushing companies toward huge, energy-hungry models. “That is the key problem with AI, because bigger is better is just fundamentally incompatible with sustainability,” he added.

The situation is particularly stark in the US, where energy demand is shooting upward for the first time in around 15 years, said Michael Khoo, climate disinformation program director at Friends of the Earth and co-author of a report on AI and climate. “We as a country are running out of energy,” he told CNN.

In part, demand is being driven by a surge in data centers. Data center electricity consumption is expected to triple by 2030, equivalent to the amount needed to power around 40 million US homes, according to a Boston Consulting Group analysis.

“We’re going to have to make hard decisions” about who gets the energy, said Khoo, whether that’s thousands of homes, or a data center powering next-gen AI. “It can’t simply be the richest people who get the energy first,” he added.

For many AI companies, concerns about their energy use overlook two important points. The first is that AI itself can help tackle the climate crisis.

“AI will be a powerful tool for advancing sustainability solutions,” said a spokesperson for Microsoft, which has a partnership with OpenAI.

The technology is already being used to predict weather, track pollution, map deforestation and monitor melting ice. A recent report published by Boston Consulting Group, commissioned by Google, claimed AI could help mitigate up to 10% of planet-heating pollution.

AI could also have a role to play in advancing nuclear fusion. In February, scientists at Princeton announced they had found a way to use the technology to forecast potential instabilities in nuclear fusion reactions — a step forward on the long road to commercialization.

The second is that AI companies say they are working hard to increase efficiency. Google says its data centers are 1.5 times more efficient than a typical enterprise data center.

A spokesperson for Microsoft said the company is “investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application.”

There has been a “tremendous” increase in AI’s efficiency, de Vries said. But, he cautioned, this doesn’t necessarily mean AI’s electricity demand will fall.

In fact, the history of technology and automation suggests it could well be the opposite, de Vries added. He pointed to cryptocurrency. “Efficiency gains have never reduced the energy consumption of cryptocurrency mining,” he said. “When we make certain goods and services more efficient, we see increases in demand.”

In the US, there is some political push to scrutinize the climate consequences of AI more closely. In February, Sen. Ed Markey introduced legislation aimed at requiring AI companies to be more transparent about their environmental impacts, including soaring data center electricity demand.

“The development of the next generation of AI tools cannot come at the expense of the health of our planet,” Markey said in a statement at the time. But few expect the bill to win the bipartisan support needed to become law.

In the meantime, the development of increasingly complex and energy-hungry AI is being treated as an inevitability, Khoo said, with companies in an “arms race to produce the next thing.” That means bigger and bigger models and higher and higher electricity use, he added.

“So I would say anytime someone says they’re solving the problem of climate change, we have to ask exactly how are you doing that today?” Khoo said. “Are you making every next day less energy intensive? Or are you using that as a smokescreen?”
