AI Data Centers Come With a Hidden Environmental Cost

January 27, 2026

Artificial intelligence has become deeply entwined with all sorts of human enterprise. It pervades business, medicine, scientific research, and, increasingly, everyday life — OpenAI, the company behind ChatGPT, said recently that the platform processes 2.5 billion queries each day, or about 29,000 per second.

All that computing, intangible though it seems, requires a lot of energy. As demand for ever more powerful AI soars, major tech companies like Amazon, Google, and Microsoft are racing to construct data centers, which house the servers that support the world’s digital activity. Those facilities consume vast amounts of electricity to run computing equipment, and also vast amounts of water to cool it, with growing consequences for nearby communities and the planet at large.

“The environmental cost of data centers, driven by AI, is increasing really fast,” says Shaolei Ren, a computer engineer at the University of California, Riverside. “It’s probably the fastest-growing sector.”




Why Does AI Use So Much Energy?

Data centers are nothing new — the first was built in 1945 to house ENIAC, the world’s first general-purpose computer, according to Enconnex. But the peculiar nature of AI computation, which involves scouring datasets to learn about underlying patterns, makes it far more energy-intensive than traditional computing.

“Extracting patterns from data is in some sense inefficient,” says Priya Donti, a machine learning researcher at the Massachusetts Institute of Technology. “To make it work well, you also want to scale up the data and scale up the model, and that ends up coming with much more computational burden.”

How Much Energy Are Data Centers Consuming?

Right now, data centers account for about 4 percent of electricity consumed in the U.S. By 2028, the Lawrence Berkeley National Laboratory estimates their share will rise to between 7 and 12 percent. On a global scale, data center electricity usage this year is projected to reach as high as 1,050 terawatt-hours, up from 460 terawatt-hours in 2022, according to the International Energy Agency. That’s roughly equivalent to Japan’s total annual electricity consumption.

Because most electrical grids in the U.S. and abroad rely on fossil fuels, the meteoric growth of AI comes with a surge in carbon emissions. Companies could, of course, choose locations with cleaner grids, but Ren says that’s usually more expensive.

Data centers also generate massive amounts of heat, and their cooling systems often draw from the same municipal water as nearby homes and businesses. In 2023, U.S. data centers consumed more than 17 billion gallons of water. The largest facilities, known as “hyperscalers,” can reportedly use up to 5 million gallons on a hot day — comparable to the needs of a large town, according to the Environmental and Energy Study Institute (EESI).

These figures must be taken with a grain of salt, according to Donti, because tech companies aren’t terribly forthcoming. “Data transparency in this area is actually a huge issue,” she says. “Hyperscalers are not providing granular information about usage patterns.”

Data Center Impacts Are Distributed Inequitably

The problem, in Ren’s view, isn’t so much that data centers are inherently unsustainable. Rather, he says, it’s that “the environmental cost is more concentrated around some local communities.”

As a global civilization, he thinks we can absorb the impact of growing AI use. The industry’s carbon output is only a small fraction of total emissions worldwide, after all, and as for water, Ren says, “there’s enough, just not everywhere and anytime.”

With that in mind, his focus is on how to spread the negative effects more equitably. As it stands, data centers are popping up in regions ill-equipped to accommodate them, like drought-prone Arizona and Chile, according to The Atlantic. And the fossil-fuel power plants that run data centers, besides contributing to global warming, also create local air pollution. In many cases, reduced air quality disproportionately harms already economically disadvantaged people, according to an article in Time.

How Can We Make AI More Sustainable?

Aside from the risk of reputational damage, tech companies have little incentive to pursue sustainability for its own sake. But, Ren notes, “they do have incentive to reduce the energy cost.”

Cost reductions can come from improved hardware, like the Graphics Processing Units (GPUs) that perform calculations for AI computation, as well as algorithm optimization, which speeds up both the training process for AI models and the time it takes those models to complete tasks. Another promising advance is geographical load balancing, which directs user traffic to whichever servers are performing most efficiently.
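The idea behind geographical load balancing can be sketched in a few lines. The toy routine below (all region names and numbers are hypothetical, not from any real provider) sends each incoming job to the region that is currently cheapest or cleanest to run, falling back to other regions as capacity fills up:

```python
# Toy sketch of geographical load balancing: route each job to the
# region with the lowest carbon intensity (grams of CO2 per kWh)
# that still has spare capacity. All values here are illustrative.

def route_job(regions: dict) -> str:
    """Return the name of the cleanest region with capacity, and book one slot there."""
    candidates = {name: info for name, info in regions.items() if info["capacity"] > 0}
    best = min(candidates, key=lambda name: candidates[name]["carbon_intensity"])
    regions[best]["capacity"] -= 1  # one job consumes one slot
    return best

regions = {
    "coal_heavy": {"carbon_intensity": 800, "capacity": 10},
    "hydro_rich": {"carbon_intensity": 50, "capacity": 2},
}

print(route_job(regions))  # routes to "hydro_rich" while it has capacity
print(route_job(regions))
print(route_job(regions))  # falls back to "coal_heavy" once "hydro_rich" is full
```

Real systems weigh many more factors (latency, electricity price, water stress), but the core trade-off is the same: shift work toward wherever it is least costly to run at that moment.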

There are some encouraging signs. In terms of water, Amazon, Google, and Microsoft have each vowed to replenish more water than they consume by 2030. As for energy, power usage effectiveness (PUE) — the ratio of a facility’s total energy draw to the energy used by its computing equipment alone — has been consistently improving over the past two decades, according to a Google report. That said, the rate of improvement has slowed in recent years. Basically, Donti explains, the demand for AI “is outpacing the extent to which hardware is getting more efficient.”
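PUE itself is simple arithmetic: total facility energy divided by the energy that actually reaches the computing equipment. A PUE of 1.0 would mean zero overhead; everything above 1.0 is cooling, lighting, and power conversion. The numbers below are illustrative, not figures from any specific facility:

```python
# Power usage effectiveness (PUE): total facility energy divided by
# IT equipment energy. Lower is better; 1.0 is the theoretical floor.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Ratio of a facility's total energy draw to its IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

older_center = pue(total_facility_kwh=2.0, it_equipment_kwh=1.0)   # 2.0
modern_center = pue(total_facility_kwh=1.1, it_equipment_kwh=1.0)  # 1.1
```

In this hypothetical comparison, a drop from 2.0 to 1.1 means overhead fell from 100 percent of the computing load to just 10 percent, which is why the metric is so closely watched.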

From a global environmental standpoint, there’s another issue: More efficient data centers — those that need less water and electricity for the same level of operation — may actually lead to more energy and water consumption, not less. This is known as the Jevons paradox: Lower costs spur greater demand, and thus greater overall resource use. Without a significant shift to renewable energy and more careful water use, greater efficiency won’t necessarily solve anything.
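The Jevons paradox is easy to see in toy numbers (all hypothetical): if efficiency doubles but cheaper computation triples demand, total resource use still goes up.

```python
# Jevons paradox in miniature: a 2x efficiency gain, swamped by a
# 3x rebound in demand. All figures are illustrative.

energy_per_query = 1.0   # arbitrary energy units per query
queries = 100

baseline_total = energy_per_query * queries  # 100.0 units

efficient_energy_per_query = energy_per_query / 2   # hardware gets 2x more efficient
queries_after_rebound = queries * 3                 # cheaper queries triple demand

rebound_total = efficient_energy_per_query * queries_after_rebound  # 150.0 units
```

Despite each query costing half as much energy, overall consumption rises by 50 percent — the pattern the paragraph above describes.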

The bottom line, in Donti’s opinion, is that we need to be “clear-eyed when we are consuming resources” related to AI. We know — if imprecisely — what the environmental costs will be. So responsible AI use demands thoughtful, democratic dialogue around which applications are worth the risk, and how to shape the field moving forward.

“It’s abundantly shapeable,” Donti says. “There’s a lot we can do to change how this picture looks.”



