Across the U.S. and worldwide, energy demand is soaring as data centers work to support the wide and growing use of artificial intelligence. These large facilities are filled with powerful computers, called servers, that run complex algorithms to help AI systems learn from vast amounts of data.
This process requires tremendous computing power, which consumes huge quantities of electricity. Often, a single data center will use amounts comparable to the power needs of a small town. This heavy demand is stressing local power grids and forcing utilities to scramble to provide enough energy to reliably power data centers and the communities around them.
My work at the intersection of computing and electric power engineering includes research on operating and controlling power systems and making the grid more resilient. Here are some ways in which the spread of AI data centers is challenging utilities and grid managers, and how the power industry is responding.
Upsetting a delicate balance
Electricity demand from data centers can vary dramatically throughout the day, depending on how much computing the facility is doing. For example, if a data center suddenly needs to perform a lot of AI computations, it can draw a huge amount of electricity from the grid in a period as short as several seconds. Such sudden spikes can cause problems for the power grid locally.
Electric grids are designed to balance electricity supply and demand. When demand suddenly increases, it can disrupt this balance, with effects on three critical aspects of the power grid:
- Voltage can be thought of as the push that makes electricity move, like the pressure in a water hose. If too many data centers start demanding electricity at the same time, it’s like turning on too many taps in a building at once and reducing its water pressure. Abrupt shifts in demand can cause voltage fluctuations, which may damage electrical equipment.
- Frequency measures how many times the electric current reverses direction each second as it travels through the network from power sources to consumers. The U.S. and most other countries transmit electricity as alternating current, or AC, which periodically reverses direction. Power grids operate at a stable frequency, usually 50 or 60 cycles per second, known as hertz; the U.S. grid operates at 60 Hz. If demand for electricity outstrips supply, the frequency can drop, which can cause equipment to malfunction.
- Power balance is the constant real-time match between electricity supply and demand. To maintain a steady supply, power generation must match power consumption. If an AI data center suddenly demands a lot more electricity, it’s like pulling more water from a reservoir than the system can provide. This can lead to power outages or force the grid to rely on backup power sources, if available.
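The link between power imbalance and frequency can be sketched with the simplified "swing equation" used in textbook grid analysis. The inertia constant, system size, and load step below are illustrative assumptions, not figures from any real grid, and the sketch deliberately assumes generation has not yet responded:

```python
# Hedged sketch: how a sudden, unserved load increase pulls grid frequency
# down, using the simplified swing equation df/dt = f0 * dP / (2 * H * S).
# H (inertia constant) and S (system base power) are assumed values for a
# small regional grid, chosen only for illustration.

F0 = 60.0   # nominal U.S. grid frequency, Hz
H = 5.0     # system inertia constant, seconds (assumed)
S = 1000.0  # system base power, MW (assumed)

def simulate_frequency(load_step_mw, seconds=5, dt=0.1):
    """Return the frequency trajectory after a sudden load step,
    assuming generation has not yet caught up."""
    freq = F0
    trajectory = [freq]
    imbalance = -load_step_mw  # generation minus load, MW
    for _ in range(int(seconds / dt)):
        dfdt = F0 * imbalance / (2 * H * S)
        freq += dfdt * dt
        trajectory.append(freq)
    return trajectory

traj = simulate_frequency(load_step_mw=20)  # a 20 MW data center spikes on
print(f"frequency after 5 s: {traj[-1]:.2f} Hz")
```

In a real grid, governors and reserves would arrest the decline within seconds; the point of the sketch is only that frequency falls when load jumps faster than generation.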
Peaks and valleys in power use
To see how operating decisions can play out in real time, let’s consider an AI data center in a city. It needs 20 megawatts of electricity during its peak operations – the equivalent of 10,000 homes turning on their air conditioners at the same time. That’s large but not outsize for a data center: Some of the biggest facilities can consume more than 100 megawatts.
Many industrial data centers in the U.S. draw this amount of power. Examples include Microsoft data centers in Virginia that support the company’s Azure cloud platform, which powers services such as OpenAI’s ChatGPT, and Google’s data center in The Dalles, Oregon, which supports various AI workloads, including Google Gemini.
The center’s load profile, a timeline of its electricity consumption through a 24-hour cycle, can include sudden spikes in demand. For instance, if the center schedules all of its AI training tasks for nighttime, when power is cheaper, the local grid may suddenly experience an increase in demand during these hours.
Here’s a simple hypothetical load profile for an AI data center, showing electricity consumption in megawatts:
- 6 a.m.-8 a.m.: 10 MW (low demand)
- 8 a.m.-12 p.m.: 12 MW (moderate demand)
- 12 p.m.-6 p.m.: 15 MW (higher demand due to business hours)
- 6 p.m.-12 a.m.: 20 MW (peak demand due to AI training tasks)
- 12 a.m.-6 a.m.: 12 MW (moderate demand due to maintenance tasks)
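The profile above can be summarized with a few lines of arithmetic. This minimal sketch expands the hypothetical schedule into hour-by-hour values and computes the daily energy use and the peak-to-average ratio, the quantity grid planners watch when sizing supply:

```python
# Minimal sketch of the hypothetical load profile above (assumed values),
# expanded to one entry per hour of the day, starting at 6 a.m.

load_profile_mw = (
    [10] * 2 +   # 6 a.m.-8 a.m.: low demand
    [12] * 4 +   # 8 a.m.-12 p.m.: moderate demand
    [15] * 6 +   # 12 p.m.-6 p.m.: higher demand
    [20] * 6 +   # 6 p.m.-12 a.m.: peak (AI training tasks)
    [12] * 6     # 12 a.m.-6 a.m.: maintenance tasks
)

daily_energy_mwh = sum(load_profile_mw)  # each entry covers one hour
average_mw = daily_energy_mwh / len(load_profile_mw)
peak_mw = max(load_profile_mw)

print(f"daily energy: {daily_energy_mwh} MWh")
print(f"average load: {average_mw:.1f} MW, peak: {peak_mw} MW")
print(f"peak-to-average ratio: {peak_mw / average_mw:.2f}")
```

A ratio well above 1 means the grid must hold capacity in reserve that sits idle most of the day, which is exactly what the strategies below try to reduce.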
Ways to meet demand
There are several proven strategies for managing this kind of load and avoiding stress to the grid.
First, utilities can develop a pricing mechanism that gives AI data centers an incentive to schedule their most power-intensive tasks during off-peak hours, when overall electricity demand is lower. This approach, known as demand response, smooths out the load profile, avoiding sudden spikes in electricity usage.
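As a rough illustration of that incentive, the sketch below compares the cost of running a block of training load during peak hours versus overnight. The tariff rates and the 5 MW of shiftable load are illustrative assumptions, not real prices:

```python
# Hedged sketch of a time-of-use incentive. The rates and the amount of
# shiftable AI training load are assumptions chosen for illustration only.

PEAK_RATE = 0.15      # $/kWh during the evening peak (assumed)
OFF_PEAK_RATE = 0.06  # $/kWh overnight (assumed)

shiftable_mw = 5.0    # training load the center could reschedule (assumed)
hours = 6             # length of the peak window

cost_on_peak = shiftable_mw * 1000 * hours * PEAK_RATE    # MW -> kW
cost_off_peak = shiftable_mw * 1000 * hours * OFF_PEAK_RATE

print(f"training run at peak:      ${cost_on_peak:,.0f} per day")
print(f"shifted to off-peak hours: ${cost_off_peak:,.0f} per day")
print(f"daily saving:              ${cost_on_peak - cost_off_peak:,.0f}")
```

Even with made-up numbers, the shape of the incentive is clear: the cheaper the off-peak rate, the more load migrates away from the hours when the grid is most stressed.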
Second, utilities can install large energy storage devices to bank electricity when demand is low, and then discharge it when demand spikes. This can help smooth the load on the grid.
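The smoothing effect of storage can be sketched with a simple rule: charge the battery whenever load is below the daily average and discharge it whenever load is above. This toy model reuses the hypothetical profile from earlier and ignores battery capacity limits and round-trip losses, which are real constraints in practice:

```python
# Hedged sketch: a battery charges below-average hours and discharges
# above-average hours, so the grid sees a flat draw. Battery capacity
# limits and efficiency losses are ignored (simplifying assumptions).

hourly_load_mw = [10]*2 + [12]*4 + [15]*6 + [20]*6 + [12]*6  # profile above
average_mw = sum(hourly_load_mw) / len(hourly_load_mw)

battery_mwh = 50.0  # assumed starting state of charge
grid_draw_mw = []
for load in hourly_load_mw:
    delta = load - average_mw   # positive: discharge, negative: charge
    battery_mwh -= delta        # one-hour steps, so MW and MWh coincide
    grid_draw_mw.append(average_mw)  # grid supplies the flat average

print(f"peak without storage: {max(hourly_load_mw)} MW")
print(f"peak with storage:    {max(grid_draw_mw):.1f} MW")
print(f"battery ends the day at {battery_mwh:.1f} MWh")
```

Because the profile repeats daily, the battery ends where it started; the grid's peak drops from 20 MW to the roughly 14.6 MW average.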
Third, utilities can generate electricity from solar panels or wind turbines, combined with energy storage, so that they can provide power for periods when demand tends to rise. Some power companies are using this combination at a large scale to meet growing electricity demand.
Fourth, utilities can add new generating capacity near data centers. For example, Constellation plans to refurbish and restart the undamaged unit at the Three Mile Island nuclear plant near Middletown, Pennsylvania, to power Microsoft data centers in the mid-Atlantic region.
In Virginia, Dominion Energy is installing gas generators and plans to deploy small modular nuclear reactors, along with making investments in solar, wind and storage. And Google has signed an agreement with California-based Kairos Power to purchase electricity from small modular nuclear reactors.
Finally, grid managers can use advanced software to predict when AI data centers will need more electricity, and communicate with power grid resources to adjust accordingly. As companies work to modernize the national electric grid, adding new sensor data and computing power can maintain voltage, frequency and power balance.
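At its simplest, that kind of prediction is a forecasting problem. The sketch below uses a naive moving-average forecast over made-up hourly readings; production grid software uses far richer statistical and machine-learning models, so treat this only as a minimal stand-in:

```python
# Hedged sketch: a naive next-hour load forecast via a moving average.
# The history values are invented for illustration; real forecasting
# tools model weather, schedules, and much longer histories.

def forecast_next_hour(history_mw, window=3):
    """Predict next-hour load as the mean of the last `window` hours."""
    recent = history_mw[-window:]
    return sum(recent) / len(recent)

history = [12, 13, 15, 18, 20, 20]  # assumed recent hourly loads, MW
print(f"forecast for next hour: {forecast_next_hour(history):.1f} MW")
```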
Ultimately, computing experts predict that AI will become integrated into grid management, helping utilities anticipate issues such as which parts of the system need maintenance, or are at highest risk of failing during a natural disaster. AI can also learn the load profile behavior of AI data centers over time, which will be useful for proactively balancing energy and managing power resources.
The U.S. grid is far more complicated than it was a few decades ago, thanks to developments such as falling prices for solar power. Powering AI data centers is just one of many challenges that researchers are tackling to supply energy for an increasingly wired society.
(C) THE CONVERSATION
BIO:
As the Chairperson of the Computer Science and Electrical Engineering Department and Raymond J. Lane Professor at West Virginia University, Dr. Srivastava leads a team of faculty, staff, and students passionate about advancing computer science, cybersecurity, electrical engineering, computer engineering, AI, robotics, and biometrics. Over the past three years, research expenditures in the department have more than doubled, and both the quality and quantity of departmental publications have increased significantly.
With over 18 years of experience as a scientist, educator, researcher, and leader, he brings a robust background in academic leadership, power grid operation and control, electric power grid resilience against cyber and weather events, renewable energy integration, physics-aware ML, synchrophasor applications, and integrated cyber-power simulations. Currently, Dr. Srivastava holds a joint appointment as a Senior Scientist at Pacific Northwest National Laboratory, working on projects leveraging computational methods to address complex scientific and societal challenges. Additionally, he serves as an Adjunct Professor at Washington State University, mentoring graduate students in electric power engineering.
In previous roles, Dr. Srivastava has worked internationally in visiting positions at institutions such as the Réseau de Transport d'Électricité in France, RWTH Aachen University in Germany, the Indian Institute of Technology Kanpur in India, and the Asian Institute of Technology in Thailand. He also worked at the PEAK Reliability Coordinator, Idaho National Laboratory, PJM Interconnection, Schweitzer Engineering Laboratories (SEL), GE Grid Solutions, Massachusetts Institute of Technology and Mississippi State University.