U.S. data-center power demand could nearly triple over the next three years, potentially consuming up to 12% of the country's electricity, as the industry undergoes a significant transformation driven by artificial intelligence. This projection comes from a Department of Energy-backed study by the Lawrence Berkeley National Laboratory, which was first reported by Reuters on Friday.
The study aims to help the U.S. power industry and government understand the potential impacts of increased data-center demand from Big Tech on electrical grids, energy costs, and the environment. By 2028, data centers' power demand could range from 74 to 132 gigawatts, representing 6.7% to 12% of total U.S. electricity consumption. Currently, data centers account for just over 4% of the nation's power load.
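The gigawatt and percentage figures can be roughly reconciled with a back-of-envelope calculation. The sketch below is illustrative, not from the report: it assumes the 74–132 GW range is installed capacity, and plugs in a hypothetical ~50% average utilization and a rough ~4,800 TWh of total annual U.S. electricity consumption, neither of which appears in the article.

```python
HOURS_PER_YEAR = 8760

# Assumed values (not from the article): treat 74-132 GW as installed
# capacity, with a hypothetical ~50% average utilization, against a rough
# ~4,800 TWh/year of total U.S. electricity consumption.
UTILIZATION = 0.5
US_TOTAL_TWH = 4800

def share_of_us_power(capacity_gw, utilization=UTILIZATION, total_twh=US_TOTAL_TWH):
    """Return (annual energy in TWh, share of total U.S. consumption)."""
    energy_twh = capacity_gw * HOURS_PER_YEAR * utilization / 1000  # GWh -> TWh
    return energy_twh, energy_twh / total_twh

for gw in (74, 132):
    twh, share = share_of_us_power(gw)
    print(f"{gw} GW -> ~{twh:.0f} TWh/yr, ~{share:.1%} of U.S. electricity")
```

Under these assumed inputs, 74 GW works out to roughly 324 TWh per year (about 6.8% of the assumed total) and 132 GW to roughly 578 TWh (about 12%), close to the study's 6.7%–12% range.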
The wide range of estimates depends partly on the availability and demand for AI chips known as GPUs, which are crucial for powering AI-driven data centers. Avi Shultz, director of the DOE's Industrial Efficiency and Decarbonization Office, noted, "This really signals to us where the frontier is in terms of growing energy demand in the U.S."
The rise in data-center electricity needs is occurring alongside increasing power consumption from the onshoring of U.S. manufacturing and the electrification of buildings and transportation. After reaching a record high in 2024, overall U.S. power demand is expected to set another record next year.
Shultz highlighted that the report underscores the rapid growth of AI-driven data centers, which are currently the leading edge of U.S. energy demand. The findings could influence DOE efforts to enhance the flexibility and resilience of the grid, including building long-duration battery storage at data-center sites and advancing technologies like small nuclear reactors and advanced geothermal systems.
The report also noted that starting in 2017, the deployment of GPU-accelerated servers more than doubled the sector's power use over six years. AI's need for increasingly powerful chips and intensive cooling systems is the primary factor behind the projected growth in data-center energy consumption. Back in 2016, AI servers accounted for just 2% of total server energy use.
Arman Shehabi, the report's lead researcher, recommended that the study be published annually or biannually to better monitor data-center trends. The estimates were based on calculations of electricity use from installed GPUs and other data-center IT equipment, utilizing publicly available information, market research, and feedback from power-sector and data-center experts.
"By showing what the energy use is and, more importantly, what's causing the growth in energy use, it helps us think about what opportunities there are for efficiencies," Shehabi explained.
The report also encourages further research and development of energy-efficiency strategies for the rapidly expanding AI data-center sector. Some new AI data centers are being constructed with power capacities as large as one gigawatt — enough to power all the homes in Philadelphia.