Five Things to Know About AI's Thirst for Energy -- Journal Report


By Jennifer Hiller

The race for AI dominance launched a stampede to connect to America's vast electric grid. Plugging in, however, is likely to be easier said than done.

Tech companies and data-center providers want city-sized amounts of power as soon as possible. That's upending the business of utilities, power producers and grid planners, who had been living in a world of nearly flat electricity demand for roughly two decades.

Chinese AI company DeepSeek's recent release of an AI model raised questions about just how much computing power and electricity AI will need. The model appears to perform on par with a cutting-edge counterpart from OpenAI but required far less computing power to develop.

It isn't yet clear what that means for the AI energy outlook. Tech giants Meta Platforms and Microsoft said Wednesday that they were sticking with ambitious investments in the technology, and that advancements could make AI cheaper and more widely used. That would push energy use higher, because even if AI models can be trained more efficiently, their use requires more electricity than something like a traditional Google search.

Here's what you need to know about why AI needs so much juice and how the power industry is trying to deliver it.

1. Computing demand is increasing exponentially

Current and widely used AI models, including OpenAI's GPT-4 and Meta's Llama 3.1, were trained at data centers that use around 30 megawatts of electricity at a time, according to the nonprofit research group Epoch AI.

That's roughly as much power as 30 Walmart stores use at any given moment.

Trends suggest that by 2030, data centers for training the largest AI models will require more than 5 gigawatts of electricity, roughly the average power draw of Manhattan.
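For a rough sense of what that trajectory implies, here is a back-of-the-envelope sketch in Python. The 30-megawatt and 5-gigawatt figures are the ones cited above; the six-year window from 2024 to 2030 is an assumption for illustration.

```python
# Back-of-the-envelope check of the training-power trajectory cited above.
# The 30 MW and 5 GW figures come from the article; the six-year window
# (2024 through 2030) is an assumption for illustration.

current_mw = 30          # clusters that trained models like GPT-4 and Llama 3.1 (per Epoch AI)
projected_mw = 5_000     # more than 5 gigawatts projected for the largest training runs by 2030
years = 6                # assumed window, 2024 to 2030

# Annual growth factor needed for training power to climb from 30 MW to 5 GW in six years.
annual_growth = (projected_mw / current_mw) ** (1 / years)
print(f"Implied growth in training power: ~{annual_growth:.1f}x per year")
# -> roughly 2.3x per year, i.e. more than a doubling of power demand annually
```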

"There is an incentive to go bigger," says Jaime Sevilla, director of Epoch AI. "Since 2020, we've known that if you train a model for longer and on more data, you are able to squeeze more performance out of it." He doesn't think that DeepSeek changes the outlook for power consumption.

Companies train AI models on graphics processing units, or GPUs. To push models toward ever more complex tasks, they are doing what has worked so far: building larger clusters of GPUs, which require more electricity.

Each year, new AI models are trained on about four times as much computing power as the prior year. That far outpaces any hardware efficiency gains.
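To see why that compute growth overwhelms efficiency improvements, consider a minimal sketch. The fourfold annual increase in training compute is the figure cited above; the 1.4x-per-year gain in compute per watt is a hypothetical number chosen only for illustration, not a figure from the article.

```python
# Illustration of why 4x-per-year compute growth outpaces hardware efficiency gains.
# The 4x compute figure is from the article; the 1.4x-per-year efficiency gain
# is a hypothetical assumption for illustration only.

compute_growth_per_year = 4.0     # training compute roughly quadruples each year
efficiency_gain_per_year = 1.4    # assumed improvement in compute per watt (hypothetical)

# Net growth in electricity needed for training if both trends hold.
power_growth_per_year = compute_growth_per_year / efficiency_gain_per_year
print(f"Power needed for training grows ~{power_growth_per_year:.1f}x per year")
# -> roughly 2.9x per year: even generous efficiency gains leave power demand climbing steeply
```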

While the main energy cost of AI training is running the GPUs, the servers that store and manage data, the interconnections between GPUs, and the cooling systems also contribute to consumption.

2. Future power demand is uncertain

Forecasts for AI's power use vary widely.

Projections from several analysts before the DeepSeek release suggested that data centers could consume anywhere from 4.6% to 17% of U.S. electricity by 2030. That would be up from about 4% in 2024, according to the Electric Power Research Institute.
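Put in absolute terms, those shares span a wide range. In the sketch below, the percentage figures are the ones cited above, while total U.S. electricity consumption of roughly 4,000 terawatt-hours a year is an outside figure assumed for illustration.

```python
# Translating the projected data-center shares of U.S. electricity into absolute terms.
# The 4%, 4.6% and 17% shares are from the article; total U.S. consumption of roughly
# 4,000 terawatt-hours per year is an assumed outside figure for illustration.

us_total_twh = 4_000   # assumed annual U.S. electricity consumption, in TWh
shares = {"2024 (~4%)": 0.04, "2030 low (4.6%)": 0.046, "2030 high (17%)": 0.17}

for label, share in shares.items():
    print(f"{label}: ~{us_total_twh * share:,.0f} TWh per year for data centers")
# -> roughly 160 TWh today, versus about 180 to 680 TWh by 2030 under these forecasts
```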

Growth anywhere in that range is a major challenge for the U.S. electric grid, which is sometimes called the world's largest machine. Utilities compare the shift to the advent of air conditioning.

3. There are bottlenecks and timing mismatches

It can take 18 months to two years to build a large data center.

But building renewable-energy projects or natural-gas-fired power plants often takes three years or more. New transmission lines can take a decade or longer.

Connecting a new user that requires 1 gigawatt or more of power on a short timeline is a challenge in any location, regardless of utility size. Initial assumptions have been that the 5 gigawatts of power that could be needed to train the largest AI models would have to be delivered to one massive data-center campus.

Companies and academics are considering an idea -- not yet proven or even tested -- that a model requiring 5 gigawatts could be trained across data centers spread over 100 to 200 miles, with strong fiber connections between them, says Arshad Mansoor, chief executive of the Electric Power Research Institute. That could make it slightly easier and quicker to build the needed grid infrastructure.

4. Preventing power shortages will be challenging

The size and speed at which data centers can be built and connected create challenges for system planning, according to the North American Electric Reliability Corporation, a nonprofit that develops standards for utilities and power producers. Industrial growth and the adoption of electric vehicles and heat pumps add pressure, too.

In Oregon, where tax breaks and cheap hydroelectric power have helped make the state a major U.S. data-center market, the industry could consume as much as 24% of the state's electricity by 2030, according to the Electric Power Research Institute.

The Northwest Power and Conservation Council, the regional energy planner, forecasts potential winter power shortages by then if data-center growth continues and new power generation can't be built quickly enough.

Jennifer Light, director of power planning for the council, says the potential shortfalls showing up in models don't necessarily mean the lights go out. But the region could have to rely more on emergency measures such as diesel generators or increased wholesale-market purchases of electricity, which could be expensive for all customers.

Planning models are trying to find the lowest-cost mix of investments that can meet the needs, she says.

5. Renewables can't fill the gap fast enough

Tech companies especially want renewable or nuclear power for their data centers to avoid carbon emissions.

There's only so much nuclear power available, however, and it's difficult to quickly add more, considering the long regulatory and construction timelines.

Renewables like wind and solar are clean, but aren't available around the clock to match up with a data center's 24-7 demand.

Tech companies will continue backing new wind and solar projects, but it will take years for clean and reliable technologies such as proposed small nuclear reactors or geothermal projects to make a dent in the market.

Meanwhile, analysts expect new natural-gas-fired power plants to be added to meet rising power demand.

Jennifer Hiller is a reporter for The Wall Street Journal in Houston. She can be reached at jennifer.hiller@wsj.com.

 

(END) Dow Jones Newswires

February 01, 2025 09:00 ET (14:00 GMT)

Copyright (c) 2025 Dow Jones & Company, Inc.
