Is Nuclear Fusion Really The Ultimate Solution to AI’s Crazy Power Use? | OilPrice.com
- A Boston Consulting Group analysis has predicted that data center electricity consumption will triple by 2030.
- Past trends in technology advances suggest that, as far as power demand is concerned, AI's costs are very likely to outweigh its benefits.
- OpenAI’s Altman: nuclear fusion is the ultimate solution to the AI energy puzzle.
Two weeks ago, we reported how Artificial Intelligence (AI), cryptocurrency mining and clean energy manufacturing are powering the Fourth Industrial Revolution, or simply 4IR, and driving disruptive trends including the rise of data and connectivity, analytics, human-machine interaction, and improvements in robotics. Unfortunately, these secular megatrends are pushing the U.S. power grid to its limits.
According to Sreedhar Sistu, vice president of artificial intelligence at Schneider Electric (OTCPK:SBGSF), excluding China, AI represents 4.3 GW of global power demand, and could grow almost five-fold by 2028. Another analysis has predicted that demand from AI will grow exponentially, increasing at least 10x between 2023 and 2026.
AI tasks typically demand more powerful hardware than traditional computing tasks. Meanwhile, bitcoin mining shows no signs of slowing down, with mining rates hitting 565 exahashes per second (EH/s) currently, a five-fold increase from three years ago.
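That hashrate figure implies a steep annualized growth rate. A quick back-of-envelope sketch, assuming simple compound growth over exactly three years:

```python
# Implied annualized growth of Bitcoin's mining hashrate:
# a five-fold increase over three years, per the figures above.
growth_multiple = 5.0
years = 3
annual_rate = growth_multiple ** (1 / years) - 1  # compound growth
print(f"Implied hashrate growth: ~{annual_rate:.0%} per year")
```

Roughly 71% per year, compounding, which is why the associated power demand keeps climbing even as mining hardware gets more efficient.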
Bitcoin mining consumes 148.63 TWh of electricity per year and emits 82.90 Mt CO2 per year, comparable to the power consumption of Malaysia. And data center demand is not helping matters at all. Data center storage capacity is expected to grow from 10.1 zettabytes (ZB) in 2023 to 21.0 ZB in 2027, good for an 18.5% CAGR.
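To see why that annual total is so striking, it helps to convert it into a continuous power draw. A back-of-envelope sketch using the 148.63 TWh/year figure above:

```python
# Convert Bitcoin's annual electricity use into an average power draw.
twh_per_year = 148.63
hours_per_year = 365.25 * 24           # ~8,766 hours in a year

avg_power_gw = twh_per_year * 1_000 / hours_per_year  # TWh -> GWh, then / hours
print(f"Average continuous draw: {avg_power_gw:.1f} GW")  # about 17 GW
```

That is roughly 17 GW of round-the-clock demand, on the order of a mid-sized country's entire grid, which is why the Malaysia comparison holds.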
A Boston Consulting Group analysis has predicted that data center electricity consumption will triple by 2030, enough electricity to power 40 million U.S. homes.
The situation is already getting out of hand: U.S. power demand has started rising for the first time in 15 years. “We as a country are running out of energy,” Michael Khoo, climate disinformation program director at Friends of the Earth and co-author of a report on AI and climate, has told CNN.
To be fair, AI has been touted as one of the key technologies that will help tackle climate change. The revolutionary technology is already being used to track pollution, predict weather, monitor melting ice and map deforestation. A recent report commissioned by Google and published by the Boston Consulting Group claimed AI could help mitigate up to 10% of planet-heating pollution.
Unfortunately, past trends in technology advances suggest that, as far as power demand is concerned, AI's costs are very likely to outweigh its benefits.
“Efficiency gains have never reduced the energy consumption of cryptocurrency mining. When we make certain goods and services more efficient, we see increases in demand,” Alex de Vries, a data scientist and researcher at Vrije Universiteit Amsterdam, has pointed out. Economists know this pattern as the Jevons paradox: efficiency lowers the cost of consumption, and total consumption rises.
At this point, nearly everybody agrees that we are incapable of developing renewable energy plants fast enough to meet this skyrocketing power demand. So, what other recourse do we have, short of saying let’s just build more natural gas and fossil fuel power plants?
Enter nuclear fusion, long regarded by scientists as the Holy Grail of clean and almost limitless energy. Sam Altman, head of ChatGPT creator OpenAI, says nuclear fusion is the ultimate solution to the AI energy puzzle: “There’s no way to get there without a breakthrough; we need fusion,” Altman said in a January interview. Altman reiterated this view a few weeks ago when podcaster and computer scientist Lex Fridman asked him about the AI energy conundrum.
Blue Sky Thinking
Unfortunately, Altman’s proposal is likely another case of overly optimistic blue-sky thinking, and we might not be any closer to building a commercial nuclear fusion reactor than we are to harvesting energy from black holes.
For decades, nuclear fusion has been considered the “Holy Grail” of clean energy. If we could harness its power, we would have an endless supply of clean, sustainable energy. It’s what powers stars, and the theory is that it could be successfully applied to nuclear reactors–without the risk of a catastrophic meltdown disaster.
Scientists have been working on a viable nuclear fusion reactor since the 1950s–ever hopeful that a breakthrough is just around the corner. Unfortunately, the running joke has become that a practical nuclear fusion power plant could be decades or even centuries away, with milestone after milestone slipping time and again.
To be fair again, there have been some promising glimpses of what is possible. Last year, a nuclear fusion reactor in California produced 3.15 megajoules of energy from only 2.05 megajoules of energy input, a rare instance where a fusion experiment produced more energy than it consumed. The vast majority of fusion experiments are energy negative, taking in more energy than they generate, making them useless as a form of electricity generation. Despite growing hopes that fusion could soon play a part in climate change mitigation by providing vast amounts of clean power for energy-hungry technologies like AI, the world is “still a way off commercial fusion and it cannot help us with the climate crisis now”, Aneeqa Khan, research fellow in nuclear fusion at Manchester University, told the Guardian just after the initial December breakthrough.
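The California result (the National Ignition Facility shot) is usually summarized by the fusion gain factor Q, the ratio of fusion energy released to energy delivered to the fuel target. A minimal sketch using the figures above:

```python
# Fusion gain factor: Q > 1 means the reaction released more energy
# than was delivered to the fuel target ("scientific breakeven").
def gain_factor(energy_out_mj: float, energy_in_mj: float) -> float:
    return energy_out_mj / energy_in_mj

q = gain_factor(3.15, 2.05)
print(f"Q = {q:.2f}")  # Q = 1.54
```

The caveat is that Q counts only the energy delivered to the target; the facility drew far more electricity from the grid to fire its lasers in the first place, so even this landmark shot was a long way from net electricity generation.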
You don’t have to look very far to get a healthy reality check.
For decades, 35 countries have collaborated on one of the largest and most ambitious scientific experiments ever conceived: the International Thermonuclear Experimental Reactor (ITER), the biggest fusion machine ever built. ITER plans to generate plasma at temperatures 10x higher than that of the sun’s core, and to generate net energy for seconds at a time. As is usually the case with many nuclear power projects, ITER is already facing massive cost overruns that put its future viability in question.
When the ITER project formally commenced in 2006, its international partners agreed to fund an estimated €5 billion (then $6.3 billion), 10-year plan that would have seen the reactor come online in 2016. Charles Seife, director of the Arthur L. Carter Institute of Journalism at New York University, has sued ITER for lack of transparency on costs and incessant delays. According to him, the project’s latest official cost estimate now stands at more than €20 billion ($22 billion), with the project nowhere near achieving its key objectives. To make matters worse, none of ITER’s key players, including the U.S. Department of Energy, has been able to provide concrete answers as to whether the team can overcome the technical challenges, or estimates of the additional delays, much less the extra expense.
Source: Scientific American
Seife notes that whereas Notre Dame Cathedral took a century to complete, it was nevertheless serving its intended purpose less than a generation after construction began. The same, he concludes, can hardly be said of ITER, which looks less and less like a cathedral–and more like a mausoleum.
By Alex Kimani for Oilprice.com
Alex Kimani
Alex Kimani is a veteran finance writer, investor, engineer and researcher for Safehaven.com.