Although it is accessible around the world, the Internet is largely located in Northern Virginia. In this region of the U.S., which contains 35 percent of the world’s hyperscale data centers, nondescript buildings house thousands of servers hosting (you guessed it) the data of the Internet. As previously detailed, data centers are a vital component of physical Internet infrastructure.

They also pose a formidable challenge for America’s energy needs, especially given the increasing popularity of generative AI. Data center servers require truly massive amounts of energy to run and cool down. A single data center can consume between 10 and 50 times more energy per unit area than the average commercial office building. Generative AI will only push these demands far higher. A ChatGPT query currently uses almost 10 times more energy than a Google search. 
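To make the scale of that per-query gap concrete, here is a back-of-envelope sketch. The per-query figures below are illustrative assumptions, not measurements; they are chosen only to be consistent with the roughly tenfold ratio cited above, and the daily query volume is a hypothetical example.

```python
# Illustrative per-query energy figures (assumptions, not measurements),
# consistent with the ~10x ratio between a generative-AI query and a search.
SEARCH_WH = 0.3      # assumed watt-hours per conventional search
AI_QUERY_WH = 3.0    # assumed watt-hours per generative-AI query

def annual_energy_mwh(queries_per_day: float, wh_per_query: float) -> float:
    """Annual energy for a given daily query volume, converted Wh -> MWh."""
    return queries_per_day * wh_per_query * 365 / 1_000_000

# Hypothetical example: 100 million queries per day.
daily = 100_000_000
print(f"search: {annual_energy_mwh(daily, SEARCH_WH):,.0f} MWh/yr")
print(f"AI:     {annual_energy_mwh(daily, AI_QUERY_WH):,.0f} MWh/yr")
```

Under these assumed figures, shifting the same query volume from search to generative AI multiplies annual energy use tenfold, on the order of tens of thousands of additional megawatt-hours per year.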

The energy hunger of AI applications, and of the data centers facilitating them, has put states and municipalities in a dilemma. On the one hand, data centers provide property tax windfalls, potential latency reductions, and further economic benefits from their construction and operation. On the other, their energy consumption could result in higher electricity rates for households served by the same utility company. They could also increase carbon emissions, as many utilities are bringing natural gas power plants online to meet data centers’ growing electricity demands.

So far, state and local governments have by and large forged ahead with tax exemptions and other policies to attract new data centers, and utilities have been scrambling to prepare for the upcoming surge in energy consumption. Increasing energy capacity, however, is already a challenge in the United States, where new energy projects wait four years on average to connect to the grid. The backlog of projects in this queue has reached 2,600 GW. Data centers, on the other hand, can be brought online in a little more than a year – and that time is likely to fall. If these trends continue, generation and transmission utilities could find themselves unprepared.

Bolstering energy supplies, therefore, is an immediate and essential national priority. Fortunately, there is no shortage of methods for doing so. Because data centers are designed to run with minimal downtime, constructing natural gas power plants will likely be necessary. Nuclear is another promising option for providing consistent electricity. Wind and solar projects are also likely to play a large role, but they cannot provide constant power on their own because wind and sunlight are intermittent. Of the energy projects awaiting connection to the grid, a large portion are wind, solar, and battery storage projects. Regionally dependent geothermal power may also be an option for certain centers in the future.

There is also a multitude of options for installing this capacity. Tech companies have expressed interest in helping to shoulder the energy burdens of data centers. Some AI companies have announced plans to use small modular reactors for their data centers. Microsoft announced a 250 MW solar project to accompany its planned data center in Wisconsin. These efforts should be welcomed and incentivized by policymakers, with an emphasis on streamlining permitting and approvals rather than on subsidies.

After these projects have been completed, the major bottleneck is connecting them to the grid, which currently does not have the capacity to accommodate them. Remedying this problem means building new transmission lines. The process of doing so, however, entails complex negotiations between utilities, landowners, and regulators over how to allocate the cost of lines. Permitting is also a sizable roadblock, as projects must navigate a series of local, state, and federal approval processes before proceeding. 

The Federal Energy Regulatory Commission (FERC) recently took steps to expedite transmission line construction. Its Order No. 1920 requires grid operators to undertake 20-year advance planning and establishes new cost allocation rules. This rule is a step in the right direction. Permitting reforms for the projects themselves, such as those recently passed by Minnesota, should also be advanced. 

Of course, addressing energy concerns on the supply side is only half the battle. Initiatives aimed at reducing energy consumption may also need to play a role in certain regions or contexts. Energy efficiency regulations, for example, could offset a portion of the demand new data centers will place on the grid. Innovations in semiconductor efficiency also have great potential to reduce these demands in the short term.

Encouragingly, the requisite components to address this problem already exist. Semiconductor designers are continuously chasing greater energy efficiency. Transmission infrastructure is a recognized policy issue. Even the United States’ energy backlog has a bright side: it reveals great interest in bolstering American clean energy capacity. The challenge of powering data centers in the age of generative AI may have the unintended but positive consequence of bringing U.S. energy infrastructure into the future. For now, the future is simple: we need more power, and lots of it.

 

Written by Isaac Oh, Public Policy Intern

The Alliance for Innovation and Infrastructure (Aii) is an independent, national research and educational organization. An innovative think tank, Aii explores the intersection of economics, law, and public policy in the areas of climate, damage prevention, energy, infrastructure, innovation, technology, and transportation.