Navigating the Challenges of Data Center Growth—Part I: Energy Sustainability
October 9, 2025
This is the first blog post in a series dedicated to identifying sustainable solutions to the challenges facing data center development. Stay tuned for additional posts as Foley Hoag takes a deep dive into these critical topics.
A surge in data center construction is transforming landscapes across the United States, driven by soaring demand for cloud services, artificial intelligence, streaming, and the Internet of Things. Major tech companies like Amazon, Microsoft, Google, and Meta are investing billions in new facilities, while smaller providers compete for resources to support enterprise and edge computing. The U.S. now hosts about half of global data center capacity, with double-digit growth expected through the decade.
The immense computing power needed to run cloud-based data storage and artificial intelligence applications requires large amounts of energy, cooling water, and land. These resource demands can rival the needs of small cities, and present challenges for carbon emissions reductions efforts, electricity affordability, natural resource conservation, electronic waste management, and community character.
How developers, operators, and regulators respond in the months and years ahead will determine the industry’s viability. Success will hinge in large part on the extent of the industry’s commitment to renewable energy, flexible power arrangements, efficient cooling, and genuine community engagement. In short, sustainability, innovation, and a durable social compact will help drive successful data center development.
Fresh off Climate Week NYC, where data center development took center stage at dozens of high-level events, Foley Hoag is taking a deep dive into how the industry's growth can be paired with sustainability commitments. In this first installment of a three-part series, we examine the first major pillar of data center sustainability: energy use. The second post will cover water demand, land use, electronic waste, and community engagement. The third will review emerging federal and state data center policy and look at innovations and experimentation in data center development.
First, let’s start with the basics.
The Basics: What Are Data Centers?
Data centers are the physical spaces that house the equipment that runs the digital world. These centers can be single rooms or multistory warehouses, filled with any combination of rows of servers, data storage systems, networking and cooling equipment, batteries, and backup power systems. Electricity powers this equipment and water (traditionally) cools it. The average data center's lifespan is ten to fifteen years; operators typically replace individual components during that period until a substantial portion of the equipment becomes obsolete and the facility requires retrofitting or decommissioning.
Data centers are the backbone of the internet. They support everything people do online, from streaming movies and communicating on social media to storing customer data, accessing online banking, and operating 911 call centers. Any business, university, government, or individual relying on those services also relies on data center functionality. A disruption to data center infrastructure can impact the critical communications and commercial systems that underpin modern society.
Though data centers have been around for decades, the number of data centers constructed in recent years has grown exponentially due to the proliferation of cloud computing and AI. In 2018, there were approximately 1,000 data centers nationwide. Fast forward to March 2025, and the United States alone was home to 5,426 data centers.
Hyperscale data centers have recently dominated national headlines. The term "hyperscale" typically refers to a large facility containing more than 5,000 servers and spanning millions of square feet.
Fermi America made waves in September with its announcement that it would fund an eleven-gigawatt data center campus in Texas (by contrast, Xcel Energy's Texas-New Mexico service area, which includes Amarillo, has a current peak demand of just over six gigawatts). Meta has a similar vision for a "Manhattan-sized" center in Louisiana. Three Mile Island in Pennsylvania is set to reopen to meet Microsoft's data needs. On the other end of the spectrum, thousands of smaller, "boutique-sized" centers around the world offer specialized facilities built to serve a wider range of customers, providing more personalized and flexible services.
While the scope and speed of data center growth has created an economic development and innovation boom and helped to position the U.S. at the lead in the AI race, it is not without challenges. In this post, we focus on the complexities of supporting data center growth in the face of rising energy demand, generation development delays, and emissions concerns.
The First Challenge—Energy Use.
Data centers are driving the first major increase in U.S. electricity demand since 2009. Although electrification efforts have added load in recent years, energy efficiency policies, technological innovation, and other factors kept overall demand growth flat—at least until now. Demand is expected to increase two percent annually over the next ten years. According to the International Energy Agency, AI plays a large role in this increase: "a typical AI-focused data centre consumes as much electricity as 100,000 households, but the largest ones under construction today will consume 20 times as much."
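The IEA's household comparison can be sanity-checked with rough arithmetic. The household consumption figure below is an assumption (roughly the U.S. residential average of about 10,800 kWh per year), not a number from this post:

```python
# Back-of-envelope check of the IEA comparison. The household figure is
# an assumed U.S. average, for illustration only.
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_800  # assumed average annual household use
HOURS_PER_YEAR = 8_760

# Average continuous draw of one household, in kilowatts.
household_kw = AVG_HOUSEHOLD_KWH_PER_YEAR / HOURS_PER_YEAR  # ~1.2 kW

# "As much electricity as 100,000 households," expressed as continuous megawatts.
typical_ai_dc_mw = 100_000 * household_kw / 1_000
largest_dc_mw = 20 * typical_ai_dc_mw  # "20 times as much"

print(f"Typical AI data center: ~{typical_ai_dc_mw:.0f} MW continuous")
print(f"Largest under construction: ~{largest_dc_mw / 1_000:.1f} GW continuous")
```

Under these assumptions, a "typical" AI-focused facility draws on the order of 120 MW around the clock, and the largest facilities now under construction draw several gigawatts, which is consistent with the campus-scale projects described below.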
Driven by the global race for AI leadership, data centers may represent as much as twelve percent of total U.S. electricity demand by 2028, up from 4.4 percent in 2023, according to estimates by the U.S. Department of Energy. About half of U.S. data center capacity is concentrated in regional clusters in Virginia and near cities such as Chicago, Dallas, Omaha, Phoenix, and Atlanta. The Massachusetts Institute of Technology found that the Pennsylvania-New Jersey-Maryland Interconnection (PJM), which also serves Virginia, expects to see a data center-driven increase in demand equivalent to adding a mid-sized state's demand to its system in the coming years.
This recent surge in large loads has brought with it complex energy usage issues that will require collaboration between regulators, developers, utilities, and transmission operators to maintain access to affordable, reliable, and sustainable energy for all users. Data centers are increasingly being singled out as a cause of rising electricity prices. For example, in July, PJM's costs for ensuring grid reliability spiked from $2.2 billion at the prior capacity auction to $14.7 billion. Electricity bills for customers in PJM will be correspondingly higher, with some estimates projecting a twenty percent increase in summer 2025.
Utilities and grid operators must therefore work collaboratively with the industry to meet growing demand while maintaining reliability and managing costs for all other customers. The need for new transmission lines, substations, and generation capacity can impact electricity prices and raise the risk of creating stranded assets, according to the Institute for Energy Economics and Financial Analysis. If utilities upgrade electrical or gas infrastructure now for future data centers that never materialize, ratepayers would still be on the hook for those upgrade costs. Careful coordination between utilities and developers can help ensure that investments are aligned with actual project needs.
There are many creative solutions available to help mitigate these concerns.
First, energy efficiency measures can help avoid price increases by decreasing energy use. Many data centers already implement efficiency improvements to reduce the amount of power they draw from the grid, such as deploying the most efficient server hardware available. Others use data processing units (DPUs) to offload networking and security tasks that would otherwise run on central processing units (CPUs), freeing CPU capacity for core application work, optimizing workload distribution, and reducing electricity usage. The second blog post in this series will cover innovative approaches to cooling. To further incentivize such measures, some legislators in Virginia are trying to tie the use of energy efficiency measures to eligibility for tax credits.
Second, a September 2025 report from Rewiring America, a nonprofit led by CEO Ari Matusiak, inverts the approach to energy efficiency, exploring the potential for widescale investments in residential energy efficiency to make space for data centers on the grid. The report proposes that if data center developers invested in household upgrades—think heat pumps, rooftop solar, and battery storage—they could "unlock the capacity they need . . . by decreasing residential peak demand." The report estimates that the costs of those upgrades would be competitive with building new gas generation, while reducing electricity costs for households, lowering overall greenhouse gas emissions, and improving air quality.
Third, where efficiency gains are exhausted, demand response measures can help avoid increases in ratepayer costs by flattening the net load profile and shifting energy demand from peak to off-peak hours. Google recently announced two agreements to create data centers with demand response capabilities.
Duke University’s Tyler Norris and Tim Profeta have promoted the idea of a flexible approach to electricity use. Their report Rethinking Load Growth explores the benefits of flexible loads that would allow large data centers to temporarily reduce electricity consumption during periods of grid stress by shifting workloads, using on-site generation, or adjusting operations. The report finds that if flexibility is adopted broadly, nearly 100 GW of new load could be integrated onto the grid with minimal impact while ensuring reliability and affordability for ratepayers.
Researchers from MIT’s Center for Energy and Environmental Policy Research also argue that data center temporal flexibility, the shifting of a data center’s workload across time, could be a “robust, low-cost reliability resource” that helps to minimize the risk of overbuilding new infrastructure while also keeping ratepayers’ costs from skyrocketing.
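The flexibility idea described in these reports can be illustrated with a simple hedged sketch. All figures below are invented for illustration; real curtailment decisions would come from grid operator signals, not a fixed threshold:

```python
# Illustrative sketch of load flexibility (all numbers hypothetical).
# A 500 MW data center curtails whenever background grid load is high,
# so the combined load never exceeds the system's comfortable peak.
background_mw = [620, 580, 550, 560, 640, 760, 880, 940,    # hours 0-7
                 960, 930, 900, 910, 950, 990, 1010, 1040,  # hours 8-15
                 1080, 1120, 1100, 1050, 980, 880, 760, 680]  # hours 16-23

DC_MW = 500           # hypothetical data center load
SYSTEM_CAP_MW = 1400  # hypothetical level the grid can serve comfortably

# Inflexible operation: the data center runs flat out regardless of grid stress.
inflexible_peak = max(h + DC_MW for h in background_mw)

# Flexible operation: run at full power when headroom allows; otherwise
# curtail (shift workloads, use on-site generation) down to the headroom.
flexible = [min(DC_MW, max(0, SYSTEM_CAP_MW - h)) for h in background_mw]
flexible_peak = max(h + f for h, f in zip(background_mw, flexible))

curtailed_hours = sum(1 for f in flexible if f < DC_MW)
print(f"Peak without flexibility: {inflexible_peak} MW")
print(f"Peak with flexibility:    {flexible_peak} MW")
print(f"Hours curtailed: {curtailed_hours} of 24")
```

In this toy example, curtailing for a minority of hours keeps the combined peak at the system cap rather than 220 MW above it, which is the basic mechanism behind the "nearly 100 GW of new load" finding: large loads that yield during a small number of stressed hours require far less new peak capacity.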
Notably, PJM appears to be the first regional transmission operator to advance a concept that incorporates flexible load. The proposal, which initially included both voluntary and mandatory energy usage reductions, was met with pushback from industry players and environmental groups like NRDC, and instead has recently been updated to focus on voluntary commitments only.
Fourth, modernized rate design can be an option. Environmental advocates such as Earthjustice and Energy Futures Group proposed creating a new ratepayer class for large energy users that would recover certain infrastructure costs from the large energy user, rather than distributing them among all ratepayers. This approach may include requiring developers to contribute to the upfront cost of grid upgrades needed for data center projects as another way to shift some of the cost burden from ratepayers. The Union of Concerned Scientists echoes this in a report, calling on state and federal regulators to require that costs be assigned to the rate class that is causing the costs to avoid subsidization by all other customers.
Lastly, on-site battery storage systems could also help keep energy costs in check. During peak hours, when energy is most expensive, data centers could rely on their own battery energy storage systems to provide relief to the strained grid. On-site battery energy storage solutions could reduce the need for some costly grid upgrades to handle increases in peak demand.
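The peak-shaving arithmetic behind on-site storage can be sketched as follows. The load profile, grid cap, and battery size are all hypothetical, chosen only to show the mechanism:

```python
# Hypothetical peak-shaving sketch: a data center caps its grid draw,
# covering anything above the cap from an on-site battery. All numbers
# are illustrative, not from the post.
hourly_load_mw = [70, 68, 66, 65, 70, 78, 90, 100, 105, 102, 98, 95,
                  92, 96, 104, 108, 110, 106, 98, 88, 80, 76, 72, 70]

GRID_CAP_MW = 80   # assumed maximum draw from the grid during peak hours
BATTERY_MWH = 300  # assumed usable on-site battery capacity

discharged_mwh = 0.0
for load in hourly_load_mw:
    shave = max(0, load - GRID_CAP_MW)  # MW above the cap this hour
    discharged_mwh += shave             # 1-hour steps, so MW equals MWh

peak_without = max(hourly_load_mw)
print(f"Peak grid draw without battery: {peak_without} MW")
print(f"Peak grid draw with battery:    {GRID_CAP_MW} MW")
print(f"Energy supplied by battery:     {discharged_mwh:.0f} MWh "
      f"({discharged_mwh / BATTERY_MWH:.0%} of capacity)")
```

The point of the sketch is that the grid (and any upgrades to it) only needs to be sized for the capped draw, not the facility's raw peak, so long as the battery holds enough energy to cover the shaved hours and can recharge off-peak.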
The Second Challenge—Generation Deployment Delays.
Another challenge to powering data centers is the speed at which new power sources and necessary transmission infrastructure can be brought online. While most data centers meet their energy needs through the grid, some developers are seeking to use behind-the-meter generation either as their primary energy source or to backstop their power needs to provide critical reliability. However, siting and permitting processes for new generation sources can add years to project development. Despite recent reform efforts, interconnection queues continue to take upwards of twelve months.
In PJM territory, projects face years of delays and uncertainty because of lengthy and complex interconnection queues. In the United States overall, building new transmission lines can take ten years or more. Developers must also balance years-long supply chain constraints for key project equipment against those queues: orders for new gas turbines are reportedly delayed by up to seven years. Although manufacturers plan to double gas turbine production capacity, that may not be enough to clear the order backlog. The mismatch between development demand and supply capacity will likely cause many projects to stall or be cancelled.
Notably, there are challenges facing nearly every type of generator. The Trump Administration’s senseless opposition to renewable energy is well known. The Administration has cancelled hundreds of renewable energy projects, leading to over 80,000 lost jobs and over $40 billion in delayed or terminated investment. The Administration’s July 2025 changes to the tax credits for certain renewable energy development will likely raise the costs of building on-site wind and solar facilities.
Fossil-fired generators, as noted above, may experience years of delays waiting for equipment, exposure to volatile natural gas prices, and responsibility for increases in greenhouse gas (“GHG”) emissions that may run counter to state emissions rules or corporate policy. Other data centers envisioning construction or retrofitting of nuclear reactors could face prolonged permitting processes, with consulting firm Deloitte predicting that nuclear is unlikely to play a large role in meeting AI energy demand until well into the 2030s.
Energy-related development delays can be reduced through a few different strategies.
First, renewables like wind and solar remain the fastest and least expensive new sources of energy to deploy. The market demand underpinning this reality should herald a return to renewable energy despite current federal policy. Renewable energy development could help minimize, though not entirely avoid, delays resulting from interconnection and other siting and permitting processes. The alternatives—building new off-site fossil fuel plants or nuclear facilities—can be slower, more expensive, and politically contentious.
Second, locating data centers in areas of high power and grid availability and in areas where renewables are accessible could also help to avoid delays by relying on existing low-carbon resources. For example, Oregon and Washington State have abundant and reliable hydropower that is providing clean energy to data centers and attracting them to those states.
Third, as mentioned above, some data center developers are pursuing the construction of behind-the-meter generation sources, an arrangement in which the data center load is served directly by locally-sited (or “co-located”) generation capacity built for the data center. Behind-the-meter generation gives the developer power supply certainty and can help insulate the data center from electricity market price fluctuations.
Co-location can also help alleviate interconnection delays: if the co-located generator and data center are fully “islanded” from (i.e. not interconnected to) the grid, developers will be able to avoid interconnection queues entirely. This may avoid certain regulatory burdens associated with connection to the grid, but may not avoid regulatory scrutiny of new emissions sources.
In addition, clear and uniform guidelines for co-located facilities seeking interconnection service would allow developers to sell excess power when necessary, or to access black-start capabilities that on-site generators cannot provide but that can be needed to get facilities back online after widespread power outages.
Another creative solution is increased use of fiber optic technology, which could enable companies to tap into underused areas of the electricity grid and make greater use of renewables, which are more challenging to deploy close to high-density areas. New hollow-core fiber cables, through which data travels fifty percent faster, could make it easier to site data centers farther from populated areas while maintaining data transfer speeds. This could reduce the burdens of development on those areas and expedite the development process by unlocking sites previously considered impractical for data centers and the renewable resources to serve them.
The Third Challenge—Emissions.
The last challenge explored in the first part of this blog post series is the concern about increased greenhouse gas (GHG) emissions associated with data center energy consumption. If not developed sustainably, data centers could contribute to increased emissions, particularly if they use significant amounts of power from grids served by carbon-intensive generation. While data centers already interconnected to the grid have little to no control over the generation mix serving their load, there are options for existing and new data centers to significantly reduce their carbon footprint.
Building onsite or nearby renewable energy resources concurrently with, and in support of, a data center project, is just one way that data centers can reduce their contributions to GHG emissions from power generation. Data centers can also rely on grid-connected renewable resources by entering into power purchase agreements to purchase energy and environmental attributes from those resources.
Data centers generally require a constant, around-the-clock power supply. This affects their GHG emissions profile because renewable energy sources may generate sufficient power during only certain times of the day. Meeting that demand may require a mix of renewables, battery storage, and other grid resources; still, according to the World Bank, including renewables and energy storage dramatically reduces emissions compared to relying entirely on fossil generation.
In addition, the energy efficiency measures noted above will not only help moderate electricity demand, but also reduce a data center's overall emissions profile. In the next blog post in this series, we will discuss some of the innovative cooling technologies that are allowing for dramatic reductions in energy use.
A final strategy for minimizing emissions is to make software “carbon-aware.” If software is designed to account for variations in carbon emissions throughout the day, it can move non-urgent AI workloads to run at different times or in a different place to avoid peak energy usage periods that are the most carbon intensive.
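A minimal sketch of what carbon-aware scheduling might look like, using invented forecast data. In a real deployment, the intensity forecast would come from the grid operator or a carbon-intensity data service, and the scheduler would weigh deadlines and capacity alongside emissions:

```python
# Carbon-aware scheduling sketch (forecast values are invented):
# defer a non-urgent batch job to the contiguous window with the lowest
# total forecast grid carbon intensity (gCO2 per kWh).
forecast_gco2_per_kwh = [520, 540, 560, 530, 480, 420, 350, 290,  # hours 0-7
                         240, 210, 200, 205, 215, 230, 260, 310,  # hours 8-15
                         380, 450, 510, 560, 580, 570, 550, 530]  # hours 16-23

JOB_HOURS = 4  # hypothetical runtime of the deferrable workload

def best_start_hour(forecast, duration):
    """Return the start hour whose window has the lowest total intensity."""
    candidates = range(len(forecast) - duration + 1)
    return min(candidates, key=lambda s: sum(forecast[s:s + duration]))

start = best_start_hour(forecast_gco2_per_kwh, JOB_HOURS)
window = forecast_gco2_per_kwh[start:start + JOB_HOURS]
print(f"Run job at hour {start}: avg {sum(window) / JOB_HOURS:.0f} gCO2/kWh "
      f"vs daily avg {sum(forecast_gco2_per_kwh) / 24:.0f} gCO2/kWh")
```

In this toy forecast, the midday solar-heavy window roughly halves the job's emissions intensity relative to running at an average hour, which is the basic trade carbon-aware software makes.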
____________
The tech and energy sectors are becoming ever more intertwined. While there are challenges related to the high energy needs of data centers, creative solutions that incorporate sustainability can be found, allowing projects to move forward with less impact to the climate, electricity grid, and surrounding communities.
Foley Hoag’s deep dive into the data center industry will continue. In the next blog post in this series, we will consider the potential for sustainable approaches to water use, land use, community engagement, and electronic waste.