Insights from the Center on Global Energy Policy
The rapid expansion of artificial intelligence (AI), especially Large Language Models (LLMs) such as GPT-3 and Gemini, on which the now well-known ChatGPT and Gemini assistant systems are based, has raised concerns about whether and how this new technology may impinge on the ability of the United States to meet its zero-carbon electricity goals. After all, AI relies on data centers that are highly energy intensive. Worldwide, demand from data centers accounts for about 0.5 percent[1] of electrical energy use. AI data centers specifically could require approximately 14 gigawatts (GW)[2] of additional new power capacity by 2030. If the United States follows a data center growth trajectory similar to that of Ireland,[3] a path setter whose data centers are projected to consume as much as 32 percent of the country’s total annual electricity generation by 2026,[4] it could face a significant increase in energy demand, strain on infrastructure, increased emissions, and a host of new regulatory challenges.
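For a sense of scale, the back-of-the-envelope sketch below converts that 14 GW capacity figure into annual energy terms, assuming, purely for illustration, that the capacity ran around the clock; actual utilization would be lower, so this is an upper bound rather than a projection.

```python
# Rough upper-bound illustration: convert 14 GW of projected AI data center
# capacity into annual energy, assuming continuous (24/7) operation.
capacity_gw = 14
hours_per_year = 8760  # 24 hours x 365 days

energy_twh = capacity_gw * hours_per_year / 1000  # GW x h = GWh; /1000 = TWh
print(f"Upper-bound annual energy: ~{energy_twh:.0f} TWh")  # ~123 TWh
```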
Some argue that despite the increasing energy footprint of data centers, it is important to keep in mind that analysts have a history of overestimating data center electricity use.[5] Advances in efficiency, particularly in terms of power usage effectiveness and computational efficiency of LLMs, have also helped keep energy use in check.[6] But new data centers tend to be clustered in particular regions of the United States rather than distributed evenly across the country, meaning that host regions may face massive and sudden demand spikes not encountered elsewhere.
For all of these reasons, a detailed understanding of the energy demand of data centers is crucial for planning the future energy system. This blog post adapts a method that has been used to determine the energy and carbon footprint of data centers in order to estimate the contribution of AI LLM data centers in particular to US power demand growth through 2030. That year represents a milestone for several US climate goals and a realistic, manageable timeframe for projecting technological and market trends from current data. The post also raises questions about the implications of that growth for the US power grid and energy transition.
Much work has been done to estimate data center demand. In a 2021 report, the US Energy Information Administration (EIA) used existing data for miscellaneous electric loads in its National Energy Modeling System (NEMS) to determine the average power draw of data center servers and, in turn, estimate the annual energy consumption of the data centers themselves going forward. The report projected that US data centers will consume about 88 terawatt-hours (TWh) annually by 2030,[7] which is about 1.6 times the electricity consumption of New York City. The EIA also conducted a pilot study focused on assessing the feasibility of estimating the energy consumption of data centers in order to include them as a separate building category in its Commercial Buildings Energy Consumption Survey. Based on a limited survey, the study found that quantifying the actual energy consumption of these centers is challenging due to inadequate sampling frames and low response rates.[8] A more recent Bank of America report drew on a McKinsey data center demand model based on the number of servers within data centers to project that data centers will require approximately 14 GW of additional power capacity by 2030.[9]
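As a minimal sketch of the server-count approach underlying these estimates (the server count, power draw, and power usage effectiveness values below are illustrative placeholders, not figures from the EIA or McKinsey models):

```python
# Illustrative bottom-up estimate in the spirit of server-count models.
# All inputs are placeholder assumptions, not figures from the cited reports.

def datacenter_energy_twh(num_servers: int, avg_server_power_w: float, pue: float = 1.5) -> float:
    """Annual energy (TWh) = servers x average power draw x hours x facility overhead (PUE)."""
    hours_per_year = 8760
    energy_wh = num_servers * avg_server_power_w * hours_per_year * pue
    return energy_wh / 1e12  # Wh -> TWh

# Hypothetical example: 10 million servers drawing 600 W each on average.
print(f"~{datacenter_energy_twh(10_000_000, 600):.0f} TWh per year")
```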
To estimate the electricity demand of AI LLM data centers in particular, this post couples data on servers with data on Graphics Processing Units (GPUs), the energy-intensive core technology (mainly supplied by the tech company Nvidia) within data centers that powers AI applications. Specifically, it analyzes the specifications and power consumption of the servers used in these data centers[10] as well as the expected number of GPU shipments over the next few years[11] and the energy required to operate these GPUs,[12] which together yield a comprehensive estimate of the overall electricity needs of AI LLM data centers.
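The sketch below illustrates this coupling of GPU shipment projections with GPU and server power specifications. All shipment counts, power ratings, utilization rates, and power usage effectiveness (PUE) values are hypothetical placeholders chosen for exposition, not the actual inputs behind the estimates that follow.

```python
# Illustrative sketch of coupling GPU shipments with GPU and server power data.
# Every numeric input here is a placeholder assumption, not a value from this analysis.

HOURS_PER_YEAR = 8760

def ai_datacenter_demand(cumulative_gpus: int,
                         gpu_power_kw: float,
                         server_overhead_kw_per_gpu: float,
                         utilization: float,
                         pue: float) -> tuple[float, float]:
    """Return (power capacity in GW, annual energy in TWh) for an installed GPU fleet."""
    # IT load: the GPUs themselves plus their host servers (CPUs, memory, networking).
    it_power_gw = cumulative_gpus * (gpu_power_kw + server_overhead_kw_per_gpu) / 1e6
    facility_power_gw = it_power_gw * pue  # add cooling and power-delivery overhead
    energy_twh = facility_power_gw * utilization * HOURS_PER_YEAR / 1000
    return facility_power_gw, energy_twh

# Hypothetical scenario: 4 million accelerators installed, ~0.7 kW per GPU,
# ~0.3 kW of server overhead per GPU, 70% average utilization, PUE of 1.3.
capacity_gw, energy_twh = ai_datacenter_demand(4_000_000, 0.7, 0.3, 0.7, 1.3)
print(f"~{capacity_gw:.1f} GW of capacity, ~{energy_twh:.0f} TWh per year")
```

Repeating such a calculation year by year with projected shipment volumes yields a demand trajectory that can be compared against system-wide projections.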
Based on this approach, and in light of data from the EIA Annual Energy Outlook 2023,[13] this analysis suggests that by 2027 GPUs will constitute about 1.7 percent of total electric capacity, or 4 percent of total projected electricity sales, in the United States. While this might seem minimal, it represents a considerable growth rate over the next six years and a significant amount of energy that will need to be supplied to data centers.
Looking to 2030, it is evident that GPU energy demand is on an upward trajectory. When this trend is analyzed alongside broader US sectoral electricity demand projections by the EIA, the significance of the surge in GPU power usage becomes more apparent. GPUs and their servers could make up as much as 27 percent of the planned new generation capacity for 2027 and 14 percent of total commercial energy needs that year.
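Expressed as code, the shares cited above are simple ratios of estimated GPU-driven demand to EIA system-wide projections; the totals below are placeholders standing in for Annual Energy Outlook values, not the figures used in this analysis.

```python
# Illustrative share calculation: estimated GPU-driven demand relative to
# system-wide totals. All values are placeholder assumptions, not AEO figures.

gpu_capacity_gw = 15           # hypothetical GPU + server capacity in the target year
gpu_energy_twh = 120           # hypothetical annual GPU + server energy use

total_capacity_gw = 1150       # placeholder for total US generating capacity
total_sales_twh = 4100         # placeholder for total US electricity sales
planned_new_capacity_gw = 60   # placeholder for planned capacity additions that year

print(f"Share of total capacity:       {100 * gpu_capacity_gw / total_capacity_gw:.1f}%")
print(f"Share of electricity sales:    {100 * gpu_energy_twh / total_sales_twh:.1f}%")
print(f"Share of planned new capacity: {100 * gpu_capacity_gw / planned_new_capacity_gw:.0f}%")
```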
These projections prompt several important questions for policymakers, energy providers, and technology leaders.
Rising power demand from AI data centers likely does not pose an immediate crisis for the US power grid. But it will clearly need to be closely tracked and accounted for if the goal of realizing a sustainable and resilient energy system that supports economic growth is to be achieved.
The ability to innovate in energy efficiency and renewable sources could be pivotal in shaping the future of AI and aligning it with environmental objectives. While states like California, Texas, and Virginia, which house most US data centers, continue to add renewable energy capacity, it is also important to assess future energy requirements and to ensure that renewable infrastructure and generating capacity can keep up with growing data center demand.
In the private sector, Microsoft recently announced that it will purchase more than 10.5 gigawatts of renewable energy capacity from Brookfield Asset Management between 2026 and 2030 to power its data centers and operations with carbon-free energy.[14] This marks one of the largest agreements of its kind for a single corporation and underscores the tech giant’s commitment to its carbon-free energy goals. But tech companies like Microsoft will also need to do more, such as increasing investment in energy efficiency technologies and further promoting sustainability and environmental responsibility.
The US power grid also faces challenges related to congestion, reliability, and the need to integrate more renewable sources of energy.[15] Current transmission capacity is insufficient to serve these loads, leading to higher energy prices and reduced service reliability. Addressing these challenges will require continued investment in and policy support for transmission finance, cost allocation, siting, and permitting reform, along with advanced technologies that streamline and enhance the development of US power transmission infrastructure; doing so can help meet the increasing data center energy demand projected in this short piece.
[1] https://www.nytimes.com/2023/10/10/climate/ai-could-soon-need-as-much-electricity-as-an-entire-country.html
[2] https://rsch.baml.com/access?q=cdtyt3g8dmw
[3] https://publications.jrc.ec.europa.eu/repository/handle/JRC135926
[4] https://www.datacenterknowledge.com/energy/electricity-demand-data-centers-could-double-three-years
[5] https://www.datacenterknowledge.com/energy-power-supply/electricity-demand-at-data-centers-could-double-in-three-years
[6] https://www2.datainnovation.org/2024-ai-energy-use.pdf
[7] https://www.eia.gov/analysis/studies/demand/miscelectric/pdf/miscelectric.pdf
[8] https://www.eia.gov/consumption/commercial/data/2018/pdf/2018_CBECS_Data_Center_Pilot_Results.pdf
[9] https://rsch.baml.com/access?q=cdtyt3g8dmw
[10] https://resources.nvidia.com/en-us-dgx-systems/ai-enterprise-dgx
[11] https://www.nextplatform.com/2024/01/11/the-datacenter-gpu-gravy-train-that-no-one-will-derail/
[12] https://resources.nvidia.com/en-us-tensor-core/nvidia-tensor-core-gpu-datasheet
[13] https://www.eia.gov/outlooks/aeo/
[14] https://www.utilitydive.com/news/brookfield-microsoft-corporate-clean-energy-ppa/714989/#:~:text=Microsoft%20will%20buy%20more%20than%2010.5%20GW%20of%20clean%20energy,time%2C%20the%20companies%20said%20Wednesday
[15] https://www.energypolicy.columbia.edu/wp-content/uploads/2024/04/USTransmissionInfrastructureDevelopment-CGEP_Report_050124-1.pdf