Video: Can US Power Grid Handle AI’s Growing Energy Needs?

As the use of AI technology grows, experts are watching the implications for the energy sector and the electrical grid, reports Yahoo Finance. According to a 2021 study, training a large language model like ChatGPT consumes enough energy to power 120 homes for an entire year. With the energy demands of the data centers that support AI expected to nearly double by the end of this decade, from 17 gigawatts to as much as 35 gigawatts, concerns abound about the viability of the electrical grid, energy supply, and cost. Timothy Fox, a managing director at ClearView Energy Partners, put 35 gigawatts into perspective for Yahoo Finance interviewers:

That’s “about as much power as the state of New York consumes on the hottest day of the summer.”

“This is coming at one of the most pressing times for the grid,” Fox said. “One of the most prevalent issues facing the grid today, the power sector, is trying to ensure grid reliability at the same time it transitions to cleaner but also intermittent resources. … Data centers not only facilitate AI, but they’re the backbone for our industries, for our commerce, for our transportation, for our health. These are mission-critical infrastructure, so it’s going to be important that, while we’re facilitating this new industry, it’s not lights out for other industries.”

These concerns will primarily affect places with high concentrations of data centers (e.g., Virginia, California, Texas, and Arizona) and those trying to attract them. Fox said utility companies and regulators know the demand is coming, but he did not delve into the specific implications for infrastructure, regulation, and electricity rates.
