Data centers used about 4.4% of all U.S. electricity in 2023, according to the U.S. Department of Energy, which anticipates that figure could balloon to 12% within the next three years as AI use increases. Here’s why: a ChatGPT query can use nearly 10 times as much electricity as a regular Google search, and OpenAI says the bot receives 2.5 million prompts per day.
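A rough back-of-envelope sketch shows what those figures add up to. The 10x multiplier and the 2.5 million daily prompts come from the article; the roughly 0.3 watt-hours per Google search is an assumed baseline (a widely cited figure from Google), not a number from this article.

```python
# Back-of-envelope energy estimate from the article's figures.
# ASSUMPTION: ~0.3 Wh per regular Google search (widely cited, not from the article).
WH_PER_SEARCH = 0.3          # assumed Wh per regular Google search
CHATGPT_MULTIPLIER = 10      # article: a ChatGPT query uses ~10x a search
PROMPTS_PER_DAY = 2_500_000  # article: daily prompts ChatGPT receives

wh_per_prompt = WH_PER_SEARCH * CHATGPT_MULTIPLIER       # Wh per ChatGPT query
daily_mwh = wh_per_prompt * PROMPTS_PER_DAY / 1_000_000  # convert Wh to MWh

print(f"{wh_per_prompt:.1f} Wh per prompt, {daily_mwh:.1f} MWh per day")
```

Under that assumed baseline, prompts alone would draw several megawatt-hours a day, before counting training, idle servers, or cooling.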
That helps explain why tech giants believe they can’t build data centers fast enough. Each data center is a giant warehouse filled with racks of servers housing graphics and central processing units (GPUs and CPUs) and storage devices, all connected by high-capacity fiber-optic cable.
Electricity running through those systems generates heat. Engineers rely on chilled air and water to keep the hardware from overheating, and that cooling accounts for the vast majority of a data center’s daily operating costs.
A June report from the Environmental and Energy Study Institute estimated that large data centers can use up to 5 million gallons of water a day, as much as a town of at least 10,000 residents.
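A quick scale check, using only the two figures stated above, shows what that comparison implies per person:

```python
# Scale check using only the article's figures.
GALLONS_PER_DAY = 5_000_000  # large data center's daily water use
TOWN_RESIDENTS = 10_000      # smallest town size in the comparison

gallons_per_person = GALLONS_PER_DAY / TOWN_RESIDENTS
print(f"{gallons_per_person:.0f} gallons per resident per day")
```

That works out to hundreds of gallons per resident per day, which is why a single large facility can rival a small town’s total municipal demand.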
Ongoing research aims to reduce both electricity and water usage. During a November Florida Chamber of Commerce forum, Amazon Web Services Responsible AI Lead Diya Wynn said the company’s newer AI processors require 60% less energy and that, by 2030, AWS plans to put more water back into its communities than it uses.













