By Jeff St. John
This is the first article in our four-part series “Boon or bane: What will data centers do to the grid?”
In January, Virginia lawmakers unveiled a raft of legislation aimed at putting some guardrails on a data center industry whose insatiable hunger for electricity threatens to overwhelm the grid.
As the home of the world’s densest data center hub, Virginia is on the vanguard of dealing with these challenges. But the state is far from alone in a country where data center investments may exceed $1 trillion by mid-2029, driven in large part by “hyperscalers” with aggressive AI goals, like Amazon, Google, Meta, and Microsoft.
“If we fail to act, the unchecked growth of the data center industry will leave Virginia’s families, will leave their businesses, footing the bill for infrastructure costs, enduring environmental degradation, and facing escalating energy rates,” state Sen. Russet Perry, a Democrat representing Loudoun County, the heart of Virginia’s “Data Center Alley,” told reporters at the state capitol in Richmond last month. “The status quo is not sustainable.”
Perry’s position is backed by data. A December report commissioned by Virginia’s legislature found that a buildout of data centers to meet “unconstrained demand” would double the state’s electricity consumption by 2033 and nearly triple it by 2040.
To meet the report’s unconstrained scenario, Virginia would need to erect twice as many solar farms per year by 2040 as it did in 2024, build more wind farms than all the state’s current offshore wind plans combined, and install three times more battery storage than Dominion Energy, the state’s biggest utility, now intends to build.
Even then, Virginia would need to double current power imports from other states. And it would still need to build new fossil-gas power plants, which would undermine a state clean energy mandate. Meeting just half the unconstrained demand would require building seven new 1.5-gigawatt gas plants by 2040. That’s nearly twice the 5.9 gigawatts’ worth of gas plants Dominion now plans to build by 2039, a proposal that is already under attack by environmental and consumer groups.
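As a sanity check on those capacity figures, the arithmetic can be run directly. All numbers come from the report and plans cited above; this is just a back-of-the-envelope verification of the "nearly twice" claim:

```python
# Gas capacity needed to meet half of Virginia's "unconstrained" data center
# demand, per the December legislative report, vs. Dominion's current plans.
half_demand_plants = 7        # new gas plants needed by 2040
plant_size_gw = 1.5           # gigawatts each
needed_gw = half_demand_plants * plant_size_gw   # 10.5 GW

dominion_planned_gw = 5.9     # gas capacity Dominion plans to build by 2039

ratio = needed_gw / dominion_planned_gw
print(f"{needed_gw} GW needed vs. {dominion_planned_gw} GW planned "
      f"-> {ratio:.2f}x Dominion's plan")
```

At roughly 1.8 times Dominion's planned buildout, "nearly twice" checks out.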
But Perry and her colleagues face an uphill battle in their bid to more closely regulate data center growth. Data centers are big business in Virginia. Gov. Glenn Youngkin, a Republican, has called for the state to “continue to be the data center capital of the world,” citing the up to 74,000 jobs, $9.1 billion in GDP, and billions more in local revenue the industry brings. Most of the proposed data center bills, which include mandates to study how new data centers could impose additional costs on other utility customers and worsen grid reliability, have failed to move forward in the state legislature as of mid-February.
Still, policymakers can’t avoid their responsibility to “make sure that residential customers aren’t necessarily bearing the burden” of data center growth, Michael Webert, a Republican in Virginia’s House of Delegates who’s sponsoring one of the data center bills, said during last month’s press conference.
From the mid-Atlantic down to Texas, tech giants and data center developers are demanding more power as soon as possible. If utilities, regulators, and policymakers move too rashly in response, they could unleash a surge in fossil-gas power-plant construction that will drive up consumer energy costs — and set back progress on shifting to carbon-free energy.
But this outcome is not inevitable. With some foresight, the data center boom can actually help — rather than hurt — the nation’s already stressed-out grid. Data center developers can make choices right now that will lower grid costs and power-system emissions.
And it just so happens that these solutions could also afford developers an advantage, allowing them to pay less for interconnection and power, win social license for their AI products, and possibly plug their data centers into the grid faster than their competitors can.
When it comes to the grid, the nation faces a computational crossroads: Down one road lie greater costs, slower interconnection, and higher emissions. Down the other lies cheaper, cleaner, faster power that could benefit everyone.
After decades with virtually no increase in U.S. electricity demand, data centers are driving tens of gigawatts of power demand growth in some parts of the country, according to a December analysis from the consultancy Grid Strategies.
Providing that much power would require “billions of dollars of capital and billions of dollars of consumer costs,” said Abe Silverman, an attorney, energy consultant, and research scholar at Johns Hopkins University who has held senior policy positions at state and federal energy regulators and was an executive at the utility NRG Energy.
Utilities, regulators, and everyday customers have good reason to ask if the costs are worth it — because that’s far from clear right now, he said.
A fair amount of this growth is coming from data centers meant to serve well-established and solidly growing commercial demands, such as data storage, cloud computing, e-commerce, streaming video, and other internet services.
But the past two years have seen an explosion of power demand from sectors with far less certain futures.
A significant, if opaque, portion is coming from cryptocurrency mining operations, notoriously unstable and fickle businesses that can quickly pick up and move to new locations in search of cheaper power. The most startling increases, however, are for AI, a technology that may hold immense promise but that doesn’t yet have a proven sustainable business model, raising questions about the durability of the industry’s power needs.
Hundreds of billions of dollars in near-term AI investments are in the works from Amazon, Google, Meta, and Microsoft as well as private equity and infrastructure investors. Some of their announcements strain the limits of belief. Late last month, the CEOs of OpenAI, Oracle, and SoftBank joined President Donald Trump to unveil plans to invest $500 billion in AI data centers over the next four years — half of what the private equity firm Blackstone estimates will be invested in U.S. AI in total by 2030.
Beyond financial viability, these plans face physical limits. At least under current rules, power plants and grid infrastructure simply can’t be built fast enough to provide what data center developers say they need.
Bold data center ambitions have already collided with reality in Virginia.
“It used to take three to four years to get power to build a new data center in Loudoun County,” in Virginia’s Data Center Alley, said Chris Gladwin, CEO of data analytics software company Ocient, which works on more efficient computing for data centers. Today it “takes six to seven years — and growing.”
Similar constraints are emerging in other data center hot spots.
The utility Arizona Public Service forecasts that data centers will account for half of new power demand through 2038. In Texas, data centers make up roughly half of the forecast new demand that's set to nearly double summer peak demand by 2030. Georgia Power, that state's biggest utility, has since 2023 tripled its load forecast over the coming decade, with nearly all of that growth set to come from the projected demands of large power customers including data centers.
These saturated conditions are pushing developers into new markets, according to analysis from the energy consultancy Wood Mackenzie.
This is the second article in our series “Boon or bane: What will data centers do to the grid?”
There’s no question that data centers are about to cause U.S. electricity demand to spike. What remains unclear is by how much.
Right now, there are few credible answers. Just a lot of uncertainty — and “a lot of hype,” according to Jonathan Koomey, an expert on the relationship between computing and energy use. (Koomey has even had a general rule about the subject named after him.) This lack of clarity around data center power requires that utilities, regulators, and policymakers take care when making choices.
Utilities in major data center markets are under pressure to spend billions of dollars on infrastructure to serve surging electricity demand. The problem, Koomey said, is that many of these utilities don’t really know which data centers will actually get built and where — or how much electricity they’ll end up needing. Rushing into these decisions without this information could be a recipe for disaster, both for utility customers and the climate.
Those worries are outlined in a recent report co-authored by Koomey along with Tanya Das, director of AI and energy technology policy at the Bipartisan Policy Center, and Zachary Schmidt, a senior researcher at Koomey Analytics. The goal, they write, “is not to dismiss concerns” about rising electricity demand. Rather, they urge utilities, regulators, policymakers, and investors to “investigate claims of rapid new electricity demand growth” using “the latest and most accurate data and models.”
Several uncertainties make it hard for utilities to plan new power plants or grid infrastructure to serve these data centers, most of which are meant to power the AI ambitions of major tech firms.
AI could, for example, become vastly more energy-efficient in the coming years. As evidence, the report points to the announcement from Chinese firm DeepSeek that it replicated the performance of leading U.S.-based AI systems at a fraction of the cost and energy consumption. The news sparked a steep sell-off in tech and energy stocks that had been buoyed throughout 2024 on expectations of AI growth.
It’s also hard to figure out whose data is trustworthy.
Companies like Amazon, Google, Meta, Microsoft, OpenAI, Oracle, and xAI each have estimates of how much their demand will balloon as they vie for AI leadership. Analysts also have forecasts, but those vary widely based on their assumptions about factors ranging from future computing efficiency to manufacturing capacity for AI chips and servers. Meanwhile, utility data is muddled by the fact that data center developers often surreptitiously apply for interconnection in several areas at once to find the best deal.
These uncertainties make it nearly impossible for utilities to gauge the reality of the situation, and yet many are rushing to expand their fleets of fossil-fuel power plants anyway. Nationwide, utilities are planning to build or extend the life of nearly 20 gigawatts’ worth of gas plants and to delay retirements of aging coal plants.
If utilities build new power plants to serve proposed data centers that never materialize, other utility customers, from small businesses to households, will be left paying for that infrastructure. And utilities will have spent billions in ratepayer funds to construct those unnecessary power plants, which will emit planet-warming greenhouse gases for years to come, undermining climate goals.
“People make consequential mistakes when they don’t understand what’s going on,” Koomey said.
Some utilities and states are moving to improve the predictability of data center demand where they can. The more reliable the demand data, the more likely that utilities will build only the infrastructure that’s needed.
In recent years, the country’s data center hot spots have become a “wild west,” said Allison Clements, who served on the Federal Energy Regulatory Commission from 2020 to 2024. “There’s no kind of source of truth in any one of these clusters on how much power is ultimately going to be needed,” she said during a November webinar on U.S. transmission grid challenges, hosted by trade group Americans for a Clean Energy Grid. “The utilities are kind of blown away by the numbers.”
A December report from consultancy Grid Strategies tracked enormous load-forecast growth in data center hot spots, from northern Virginia’s “Data Center Alley,” the world’s densest data center hub, to newer boom markets in Georgia and Texas.
This is the third article in our four-part series “Boon or bane: What will data centers do to the grid?”
The world’s wealthiest tech companies want to build giant data centers across the United States to feed their AI ambitions, and they want to do it fast. Each data center can use as much electricity as a small city and cost more than $1 billion to construct.
If built, these data centers would unleash a torrent of demand for electricity on the country’s power grids. Utilities, regulators, and policymakers are scrambling to keep pace. If they mismanage their response, it could lead to higher utility bills for customers and far more carbon emissions. But this mad dash for power could also push the U.S. toward a cleaner and cheaper grid — if tech giants and other data center developers decide to treat the looming power crunch as a clean-power opportunity.
Utilities from Virginia to Texas are planning to build large numbers of new fossil-gas-fired power plants and to extend the life of coal plants. To justify this, they point to staggering — but dubious — forecasts of how much electricity data centers will gobble up in the coming years, mostly to power the AI efforts of the world’s largest tech companies.
Most of the tech giants in question have set ambitious clean energy goals. They’ve also built and procured more clean power than any other corporations in the country, and they’re active investors in or partners of startups working on next-generation carbon-free energy sources like advanced geothermal.
But some climate activists and energy analysts believe that given the current frenzy to build AI data centers, these firms have been too passive — too willing to accept the carbon-intensive plans that utilities have laid out on their behalf.
It’s time, these critics say, for everyone involved — tech giants, utilities, regulators, and policymakers — to “demand better.” That’s how the Sierra Club put it in a recent report urging action from Amazon, Google, Meta, Microsoft, and other tech firms driving data center growth across the country.
“I’m concerned the gold rush — to the extent there’s a true gold rush around AI — is trumping climate commitments,” said Laurie Williams, director of the Sierra Club’s Beyond Coal campaign and one of the report’s authors.
Williams isn’t alone. Climate activists, energy analysts, and policymakers in states with fast-growing data center markets fear that data center developers are prioritizing expediency over solving cost and climate challenges.
“I think what we’re seeing is a culture clash,” she said. “You have the tech industry, which is used to moving fast and making deals, and a highly regulated utility space.”
Some tech firms intend to rely on unproven technologies like small modular nuclear reactors to build emissions-free data centers, an approach that analysts warn is risky given that the technology is not yet commercially available. Others want to divert electricity from existing nuclear plants — as Amazon hopes to do in Pennsylvania — which simply shifts clean power from utility grids to tech companies. Yet others are simply embracing new gas construction as the best path forward for now, albeit with promises to use cleaner energy down the road, as Meta is doing in Louisiana.
Meanwhile, several fossil fuel companies are hoping to convince tech firms and data center developers to largely avoid the power grid by building fossil-gas-fired plants that solely serve data centers — an idea that’s both antithetical to climate goals and, according to industry analysts, impractical.
But a number of tech firms and independent data center developers are pursuing more realistic strategies that are both affordable and clean in order to meet their climate goals.
These projects should be the model, clean power advocates say, if we want to ensure the predicted AI-fueled boom in energy demand doesn’t hurt utility customers or climate goals.
And ideally, the companies involved would go even further, Williams said, by engaging in utility proceedings to demand a clean energy transition, by bringing their own grid-friendly “demand management” and clean power and batteries to the table, and by looking beyond the country’s crowded data center hubs to places with space to build more solar and wind farms.
The basic mandate of utilities is to provide reliable and affordable energy to all customers. Many utilities also have mandates — issued by either their own executives or state policymakers — to build clean energy and cut carbon emissions.
But the scale and urgency of the data center boom has put these priorities on a collision course.
As the primary drivers of that conflict, data centers have a responsibility to help out. That’s Brian Janous’ philosophy. He’s the cofounder of Cloverleaf Infrastructure, a developer of sites for large power users, including data centers. Cloverleaf is planning a flagship data center project in Port Washington, a city about 25 miles north of Milwaukee.
Cloverleaf aims to build a data center campus that will draw up to 3.5 gigawatts of power from the grid when it reaches full capacity by the end of 2030, “which we think could be one of the biggest data center projects in the country,” Janous said. That’s equivalent to the power used by more than 2.5 million homes and a major increase in load for the region’s utility, We Energies, to try to serve.
Together with We Energies and its parent company, WEC Energy, Cloverleaf is working on a plan that, the companies hope, will avoid exposing utility customers to increased cost and climate risks.
“The utility has done a great job of building a very sustainable path,” Janous said. WEC Energy and Cloverleaf are in discussions to build enough solar, wind, and battery storage to meet more than half the site’s estimated energy needs. The campus may also be able to tap into zero-carbon electricity from the Point Beach nuclear power plant, which is now undergoing a federal relicensing process, he said.
The key mechanism of the deal is what Janous called a “ring-fenced, bespoke tariff.” That structure is meant to shield other utility customers from paying more than their fair share for infrastructure built to meet data centers’ demand.
“This tariff puts it completely in the hands of the buyer what energy mix they’re going to rely on,” he said. That allows Cloverleaf — and whatever customer or customers end up at the site it’s developing — to tap into the wind, solar, and battery storage capacity WEC Energy plans to build to meet its clean energy goals.
To be clear, this tariff structure is still being finalized and hasn’t yet been submitted to state utility regulators, said Dan Krueger, WEC Energy’s executive vice president of infrastructure and generation planning. But its fundamental structure is based on what he called a “simple, just not easy” premise: “If you come here and you say you’ll pay your own way” — covering the cost of the energy and the transmission grid you’ll use — “we invest in power plants” to provide firm and reliable power.
“We make sure we can get power to the site, we make sure we have enough capacity to give you firm power, and then we start lining up the resources that can help make you green,” he said.
WEC Energy’s broader plans to serve its customers’ growing demand for power haven’t won the backing of environmental advocates. The Sierra Club is protesting the utility’s proposal to build or repower 3 GW of gas-fired power plants in the next several years, and has pressed Microsoft, which is planning its own $3.3 billion data center in We Energies territory, to engage in the state-regulated planning process to demand cleaner options.
Krueger said that the gas buildout is part of a larger $28 billion five-year capital plan that includes about $9.1 billion to add 4.3 GW of wind, solar, and battery capacity through 2029. That plan encompasses meeting new demand from a host of large customers including Microsoft, but it doesn’t include the resources being developed for Cloverleaf.
Janous said he agreed with the Sierra Club’s proposition that “the biggest customers should be using their influence to affect policy.” At the same time, Cloverleaf is building its data center for an eventual customer, and “our customers are looking for speed, scale, and sustainability,” in that order. Cementing a tariff with a host utility is a more direct path to achieving this objective, he said.
Similar partnerships between utilities and data center developers are popping up nationwide.
In Georgia, the Clean Energy Buyers Association and utility Georgia Power are negotiating to give tech companies more freedom to contract for clean energy supplies. In North Carolina, Duke Energy is working with Amazon, Google, Microsoft, and steelmaker Nucor to create tariffs for long-duration energy storage, modular nuclear reactors, and other “clean firm” resources. In Nevada, utility NV Energy and Google have proposed a “clean transition tariff,” which would commit both companies to securing power from an advanced geothermal plant that Fervo Energy is planning.
“In the near term, to get things going quickly, we’re looking at solar and wind and storage solutions,” said Amanda Peterson Corio, Google’s global head of data center energy. “As we see new firming and clean technologies develop, we’ve planted a lot of seeds there,” like the clean transition tariff.
Insulating utility customers from bearing the costs of data center power demand is a core feature of these tariffs. Broader concerns over the costs that unchecked data center growth could impose have triggered pushback from communities, politicians, and regulators in data center hot spots. But Janous highlighted that Cloverleaf’s Wisconsin project will have “no impact on existing ratepayers — 100% of the cost associated with the site will flow through this tariff.”
That’s also good utility-ratemaking policy, said Chris Nagle, a principal in the energy practice at consultancy Charles River Associates, who worked on a recent report on the challenge of building what he described as “adaptable and scalable” tariffs that can apply to data centers across multiple utilities. “In the instances where one-off contracts or schedules are done, they should be replicable,” he said.
At the same time, Nagle continued, “each situation is different. Some operators may place more value on sourcing from carbon-free resources. Others may value cost-effectiveness more. Utilities may have sufficient excess generation capacity, or they may have none at all.”
Right now, top-tier tech companies appear willing to pay extra for clean power, said Alex Kania, managing director of equity research at Marathon Capital, an investment banking firm focused on clean infrastructure. He pointed to reports that Microsoft is promising to pay Constellation Energy roughly twice the going market price under a long-term contract for the zero-carbon power it expects to secure from restarting a unit of the former Three Mile Island nuclear plant.
Given that willingness to pay, “I think these hyperscalers could go further,” Kania said, using the common term for the tech giants like Amazon, Google, Meta, and Microsoft that have the largest data centers. With their scale, these companies can “go to regulators and say, ‘We’re going to find a way for utilities to grow and make these investments but also hold rates down for customers.’”
But cost is not the primary barrier to building clean power today.
In fact, portfolios of new solar, wind, and batteries are cheaper than new gas-fired power plants in most of the country. Instead, the core barrier to getting clean power online — be it for data centers or other large-scale power buyers, or even just for utilities — is the limited capacity of the power grid itself.
Across much of the U.S., hundreds of gigawatts of solar, wind, and battery projects are held up in yearslong waitlists to get interconnected to congested transmission grids. Facing this situation, some data center developers are targeting parts of the country where they can build their own clean power and avoid as much of the grid logjam as possible.
In November, for example, Google, infrastructure investor TPG Rise Climate, and clean power developer Intersect Power unveiled a plan to invest $20 billion by 2030 into clusters of wind, solar, and batteries that are largely dedicated to powering newly built data centers.
With the right balance of wind, solar, and batteries, topped off by power from the connecting grid or from on-site fossil-gas generators, Intersect Power CEO Sheldon Kimber says this approach can be “cleaner than any part of the grid. You’re talking 80% clean energy.”
And importantly, that clean energy is “all new and additional,” Google’s Peterson Corio said. That’s important for her employer, which wants to “make sure any new load we’re building, we’re building new generation to match it.”
Think tank Energy Innovation has cited this “energy park” concept as a neat solution to the twin problems of grid congestion and ballooning power demand. Combining generation and a big customer behind a single interconnection point can “speed up development, share costly onsite infrastructure, and directly connect complementary resources,” policy adviser Eric Gimon wrote in a December report.
And while many existing data center hubs aren’t well suited to energy parks, plenty of other places around the country are, said Gary Dorris, CEO and cofounder of energy analysis firm Ascend Analytics. Swaths of the Great Plains states, “roughly from Texas to the Dakotas,” offer “the combination of wind and solar, and then storage, to get to close to 100%” of a major power customer’s electricity needs, he said.
That’s not to say that building these energy parks will be simple. First, there’s the sheer amount of land required. A gigawatt-scale data center may occupy a “couple hundred acres,” Kimber said, but powering it will take about 5,000 acres of solar and another 10,000 for wind turbines.
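Taken together, Kimber’s figures imply a striking land ratio. This quick tally uses only the acreages quoted above, plus the standard conversion of 640 acres per square mile:

```python
# Rough footprint tally for a gigawatt-scale data center, using the
# acreage figures Kimber cites. Conversions are approximate.
data_center_acres = 200       # "a couple hundred acres"
solar_acres = 5_000
wind_acres = 10_000

generation_acres = solar_acres + wind_acres
print(f"Generation needs ~{generation_acres / data_center_acres:.0f}x "
      f"the data center's own footprint")

sq_miles = generation_acres / 640   # 640 acres per square mile
print(f"That's roughly {sq_miles:.0f} square miles of wind and solar")
```

The generation footprint comes out to about 75 times the data center’s own, which is why the concept favors sparsely populated regions like the Great Plains.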
And then there are the regulatory hurdles involved. Almost all U.S. utilities hold exclusive rights to provide power and build power-delivery infrastructure within the territories they serve. The exception is Texas, which has a uniquely competitive energy regulatory regime. Intersect Power plans to build its first energy park with Google and TPG Rise Climate in Texas, and the partners haven’t disclosed if they’re working on projects in any other states.
Cloverleaf’s Janous highlighted this and other constraints to the energy-park concept.
Getting the workforce to build large-scale projects in remote areas is another challenge, he added, as is accessing the fiber-optic data pipelines needed by data centers serving time- and bandwidth-sensitive tasks.
“We think the market for those sorts of deals is relatively small,” he said.
On the other hand, the task of training AI systems, which many hyperscalers are planning to dedicate billions of dollars to over the next few years, doesn’t require the same bandwidth or latency and can be “batched” to run at times when power is available.
“Historically a lot of data centers have landed close to each other to make communications faster, but it isn’t clear that the data centers being built today have those same constraints,” said Jeremy Fisher, principal adviser for climate and energy at the Sierra Club’s Environmental Law Program and co-author of Sierra Club’s recent report. “To the extent that the AI demand is real, those data centers should be closer to clean energy and contracting with new, local renewable energy and storage to ensure their load isn’t met with coal and gas.”
The Sierra Club and other climate advocates would prefer that data center demand be met with no new fossil-fuel power at all. But few, if any, industry analysts think that’s realistic. So the question becomes how much gas will be necessary.
A growing number of companies are targeting data centers as potential new customers for gas-fired power, including oil and gas majors. ExxonMobil announced plans to enter the power-generation business in December, proposing to build a massive gas-fired power plant dedicated to powering data centers. A partnership between Chevron, investment firm Engine No. 1, and GE Vernova launched last month with a promise to build the country’s “first multi-gigawatt-scale co-located power plant and data center.”
President Donald Trump has also backed this idea of building fossil-fuel power plants for data centers. “I’m going to give emergency declarations so that they can start building them almost immediately,” he told attendees of the World Economic Forum in Davos, Switzerland, last month. “You don’t have to hook into the grid, which is old and, you know, could be taken out. … They can fuel it with anything they want. And they may have coal as a backup — good clean coal.”
For the most part, however, it is states, not the federal government, that regulate utilities and power plants. And even if that weren’t the case, Janous and Intersect Power’s Kimber agreed that building utility-scale gas power plants solely for data centers is a nonstarter. “We’ve been pitched so many projects on building behind-the-meter combined-cycle gas plants,” Janous said. “We think that’s absolutely the wrong approach.”
Kimber said that Intersect Power’s energy-parks concept does include gas-fueled generators. But they’ll be relatively cheap, allowing them to earn their keep even if run infrequently to fill the gaps at times when solar, wind, and batteries can’t supply power. Eventually they can be replaced with next-generation storage technologies.
That’s quite different from building a utility-scale power plant that must run most of the time for decades to pay back its cost, he said. “Our solution is more dynamic: It exhibits less lock-in, and it’s faster and more practical.”
Nor can large-scale gas power plants be built quickly enough to match the pace that developers have set for themselves to get data centers up and running. Multiple energy analysts have repeated that point over the past year, as have the companies that make and deploy the power plant technologies, like GE Vernova, which is reporting a three-year, $3 billion backlog for its gas turbines.
Utility holding company and major clean energy developer NextEra Energy announced last month that it was moving into building gas power plants with GE Vernova. But CEO John Ketchum noted in an earnings call that gas-fired generation “won’t be available at scale until 2030 and then, only in certain pockets of the U.S.,” and that costs of those power plants have “more than doubled over the last five years due to the limited supply of gas turbines.” Renewables and batteries, by contrast, “are ready now to meet that demand and will help lower power prices.”
Marathon Capital’s Kania agreed with this assessment. “Time-to-power is the true bottleneck,” he said. “But if you can figure out how to pull a rabbit out of a hat and get power resources up in the next few years, that’s going to be very valuable — because that’s very scarce.”
In the fourth and final part of this series, Canary Media reports on how flexibility can help the grid better handle data centers.
This is the fourth and final article in our series “Boon or bane: What will data centers do to the grid?”
Tom Wilson is aware that the explosive growth of data centers could make electricity costlier and dirtier. As a principal technical executive at the Electric Power Research Institute, the premier U.S. utility research organization, he’s studied the risks himself.
But he also thinks conversations about the problem tend to miss a key point. Data centers could also make the grid cleaner and cheaper by embracing a simple concept: flexibility.
“Data centers are not just load — they can also be grid assets,” he said. Turning that proposition into reality is the goal of his most recent project, DCFlex, a collaborative effort to get data centers to “support the electric grid, enable better asset utilization, and support the clean energy transition.”
DCFlex is short for “data center flexibility,” a term that encompasses all the ways that these sprawling campuses and buildings full of servers, cooling equipment, power-control systems, backup generators, and batteries can reduce or shift their power use.
Since its October launch, DCFlex has grown from 15 to 37 funding participants. On the data-center side are tech “hyperscalers” like Google, Meta, and Microsoft; major data center developers like Compass and QTS; and AI computing and power equipment suppliers like Nvidia and Schneider Electric.
On the grid side are utilities such as Duke Energy, Pacific Gas & Electric, Portland General Electric, and Southern Company; power plant owners like Constellation Energy, NRG Energy, and Vistra; and five of the continent’s seven grid operators, which manage energy markets serving electricity to two-thirds of the U.S. population.
The range of participants reflects the broad interest in solving the pressing challenge of powering the data centers being proposed around the country without driving up grid costs and emissions.
That won’t be easy. From Virginia’s “Data Center Alley” to emerging hot spots in Arizona, Georgia, Indiana, Ohio, and beyond, utilities are being inundated with demands for round-the-clock power from data center projects that can add the equivalent of a small city’s electricity consumption within a few years. Meanwhile, it usually takes four or five years to connect new power plants to the grid.
Flexibility could make a big difference, however, said Tom Wilson, who’s worked on climate and energy issues for more than four decades, including advising projects at the Massachusetts Institute of Technology and Stanford University and serving at the White House Office of Science and Technology Policy during the Biden administration.
That’s because the impacts of massive new utility customers like data centers are tied not just to how much power they need but specifically to when they need it.
Utilities live and die by the few hours per year when demand for electricity peaks — usually during the hottest and coldest days. By refraining from using grid power during those peak hours, new data centers could significantly reduce their impact on utility costs and carbon emissions.
If data centers and other big electricity customers committed to curtailing their power use during peak hours, it could unlock tens of gigawatts of “spare” capacity on U.S. grids, according to a recent analysis from Duke University’s Nicholas Institute for Energy, Environment & Sustainability.
Realizing that spare capacity would be challenging, though. For starters, every new large power customer would have to agree to forgo grid power during key hours of the year, which is far from a realistic expectation today. What's more, before letting those big customers connect, utilities would need solid proof that they can actually follow through on promises to curtail, because broken promises could lead to overloaded grids or forced blackouts.
And flexibility can’t solve all the power problems that massive data center expansion could cause.
In a December report, consultancy Grid Strategies found that key data center markets are driving an unprecedented fivefold increase in the amount of new power demand that U.S. utilities and grid operators forecast over the next half decade or so. While that analysis “really focused on the peak demand forecast,” the sheer amount of power needed over the course of a year is “potentially just as big of a story,” said John Wilson, Grid Strategies’ vice president.
Still, for utilities struggling to plan and build the generation and grid infrastructure needed to support data centers, flexibility is worth exploring. That’s because data centers could make their operations flexible a lot faster than utilities can expand power grids and build power plants.
It typically takes seven to 10 years to build high-voltage transmission lines and four to five years to build a gas-fired power plant — “even in Texas,” Tom Wilson pointed out. The concept of relying on big customers to avoid using power instead of building all that infrastructure is just starting to take hold in utility planning, but it could play a major role in managing the surge in power demand.
Flexible data centers may also be able to secure space on capacity-constrained grids more quickly than inflexible competitors, Tom Wilson said. A U.S. Department of Energy report released last year included interviews with dozens of utilities, and one key takeaway was that “electricity providers often can accommodate the energy and capacity requests of a data center for (say) 350 days but need to find a win-win solution for the remaining 15 days.”
“If you have two projects in the queue, and one says they can be flexible and the other says they can’t be flexible, and they’re about the same size, then the one that can be flexible is more likely to be successful,” Tom Wilson said.
That’s not lost on Brian Janous, cofounder of Cloverleaf Infrastructure, which develops what the company describes as “clean-powered, ready-to-build sites for the largest electric loads,” mainly data centers.
“You need to understand, when a utility says, ‘I can’t get you power,’ what they mean is, ‘There are certain hours of the day I can’t get you power,’” he said. The data center industry “lacks visibility into this, which is kind of shocking,” given that data center flexibility is nothing new.
In fact, back in 2016, when he worked as energy strategy director at Microsoft, Janous helped structure a deal for a data center in Cheyenne, Wyoming, to use fossil-gas-fired backup generators to reduce peak grid stress for utility Black Hills Energy. That promise, combined with Microsoft’s agreement to purchase nearly 240 megawatts of wind power, got that deal over the line.
Janous thinks many utilities are eager for similar propositions today. One unnamed utility executive told him recently that the backlog for connecting large data centers to its grid is now at least five years. “I asked, ‘What if the data center could be dispatchable?’ And he said, ‘Oh, we could connect them tomorrow. But nobody’s asking me that.’”
Getting utilities and data centers together to ask those kinds of questions is what DCFlex is all about. Project partners are now developing five to 10 demonstration projects, Tom Wilson said, none of which have been announced. But he described the scope of work as ranging from the development stage to “existing sites that are ready to roll.”
As for how these projects will help the grid, he laid out two broad methods: They’ll use on-site power generation or storage to replace what they’d otherwise pull from the grid, or they’ll use less electricity during peak hours.
Janous thinks on-site generation is the simpler approach. To some extent, it’s already happening today, though typically with dirty diesel generators. Janous, Tom Wilson, and other experts say diesel generators are not a viable option for hyperscalers, however: They’re simply too dirty and too expensive to rely on, except during grid outages or other dire situations.
Biodiesel and renewable diesel could work for some smaller data centers, Tom Wilson said. But it’s not yet clear whether air-quality rules would permit generators burning those fuels to run during nonemergencies. Nor are the economics viable for larger-scale data centers, he said.
Fossil-gas-fired backup generators like those Microsoft used in Cheyenne are another option — albeit one that still pollutes the local air and warms the planet. Still, a growing number of data center developers are looking to use them as a workaround to grid constraints. “We’re in the process of developing sites in many parts of the country,” Janous said. “Every one of them has access to natural gas.”
It would be a problem for the planet — and for meeting the climate goals major tech companies have committed to — if data centers planned to use fossil gas for a majority of their power. But if relied on sparingly and strategically, this choice might be less harmful than the alternatives: If a data center burns fossil gas just to power itself during grid peaks, that might reduce pressure on utilities to keep old coal-fired power plants open or to build much larger gas-fired plants that would lock in emissions for decades.
Other gas-fueled options for on-site power might be less damaging to the climate — although this remains a hotly debated topic. Microgrid developer Enchanted Rock plans to install gas generators at a Microsoft data center in San Jose, California, which will burn regular fossil gas but will offset that usage by purchasing an equivalent amount of “renewable natural gas” — methane captured from rotting food waste. Enchanted Rock claims this will make the project emissions-free.
And utility American Electric Power has signed an agreement to buy 1 gigawatt of fuel cells from Bloom Energy, which it plans to install at data centers. Bloom’s fuel cells still emit carbon dioxide, but they avoid the harmful nitrogen oxides produced by burning gas.
Batteries are another option — and the one that has the greatest potential to be clean. Most data centers have some batteries on-site to help computers ride through grid disruptions until backup generators can turn on, but relying on batteries for backup power and to provide grid support is a far more complex and costly endeavor.
“There are all kinds of trade-offs in terms of reliability, in terms of emissions, in terms of cost, in terms of the physical footprint,” Tom Wilson said. “In a storm situation that brought the grid down, you’d want something that can be dependable.”
A small but growing number of hyperscalers are looking to batteries for both backup power and grid flexibility. Google’s battery-backed data center in Belgium is one example.
“We built out battery storage as a way to displace part of our diesel gensets, to provide grid services, and to provide relief during times of extreme grid stress when we needed backup,” said Amanda Peterson Corio, Google’s global head of data center energy.
In the U.S., a Department of Energy grant is supporting a battery installation at an Iron Mountain data center in Virginia that’s meant to test the potential to store clean power for backup and grid-support uses.
It’s one of a number of DOE programs launched under the Biden administration whose purpose is to explore ways that “battery energy storage systems can provide similar levels of reliability, and without a lot of the challenges that diesel gensets or other backup power sources have,” Avi Shultz, director of the DOE’s Industrial Efficiency and Decarbonization Office, said in an October interview.
The other big idea for making data centers flexible focuses not on the power they can generate and store but on the power they use, Shultz said.
“Demand response” is the utility industry term for the practice of throttling power use during times of peak grid stress, or of shifting that power use to other times when the grid can handle it better, in exchange for payments from utilities or revenues in energy markets.
Historically, data centers haven’t been interested in standard demand-response programs and markets. The value of what they do with that electricity is just too high compared with the potential rewards.
But if a data center’s participation in a demand-response program is the difference between it getting a grid connection or not, the programs become a lot more appealing.
Shultz highlighted two key data center tasks that are particularly ripe for load flexibility.
The first is “cooling loads and facility energy demand,” he said. Data centers use enormous amounts of electricity to keep their servers and computing equipment from overheating, and quite a bit of that energy is lost in the process of converting from high-voltage grid power to the low-voltage direct current that computing equipment uses.
Data center operators have invested heavily to make this cooling and power conversion more efficient in recent decades. Further advances in efficiency — and technologies that can store power for cooling for later use — could become “part of the routine best practices of data center operations,” Shultz said.
The second big target for load flexibility is “the core of the computational operation itself,” he said. Not all data centers need to run their computing equipment 24 hours a day, and “they may not be being operated in a way that’s optimized from an energy point of view. I think there’s an opportunity there to develop more innovative and flexible operational processes.”
Again, this isn’t a new idea. Cryptocurrency mining operations in Texas have been earning millions of dollars for not using electricity during grid emergencies, even as lawmakers, regulators, and neighbors of those operations have been raising alarms over the industry’s rising hunger for power. Google has also carried out some version of this through its “carbon-intelligent computing” program, which shifts certain data center operations to use cleaner power. In recent years, Google has touted how it can also shift computing load to relieve grid stress.
“Our carbon-aware computing platform started as how we can shift nonurgent compute loads to times when the grid is more clean,” Google’s Peterson Corio said. “We’ve also done that to support utility partners in times of extreme weather events.”
But last year’s DOE report on data center power use stated that aside from Google’s activities, contributors to its research “identified no examples of grid-aware flexible operation at data centers today” in the U.S. That absence of evidence “may result from the fact that electricity providers only recently started having to say no to data center interconnection requests.”
Plus, not every data center conducts tasks that can be easily postponed, Tom Wilson emphasized. “Data centers are all different. If you are Visa, in Virginia, their data center is transacting, I think, 80,000 credit approvals per second. You don’t want to say, ‘I’ll approve your credit when the sun is shining.’ That’s the kind of thing where you don’t have much flexibility. You need to deliver.”
Other tasks are better suited, he said, including, critically, much of the work of training the AI models that constitute the largest single source of increasing power demand from data centers. “There is potentially flexibility there — and a fair amount of it.”
Taking advantage of that flexibility is the idea behind Verrus, a company launched last year by Sidewalk Infrastructure Partners, a firm spun out of Alphabet. Traditional data centers separate their computing capacity into individual “halls,” each of which has its own power conversion, cooling, and backup generation. Verrus, by contrast, is planning data center complexes with “a centralized battery in the center of all the data halls, with a sophisticated microgrid controller that allows it to think about all the data halls as interruptible and schedulable,” Jonathan Winer, a Sidewalk Infrastructure Partners cofounder and the former co-CEO, said in an interview last year.
Not every “hall” will be dedicated to tasks that can be interrupted, he explained. But some of them will be — and “with AI training, I can press pause on it. These are multi-day, sometimes multi-week training runs, and you can press pause and resume.”
Verrus hasn’t built one of these data centers yet, but it is targeting Arizona, California, and Massachusetts as its first markets. In a white paper last year, Sidewalk Infrastructure Partners explained why this kind of flexibility is a must for new data centers. As Winer said, “The data center industry has realized how much it needs to be a power-led model.”
It’s not clear how many data center developers are building flexibility into their grid interconnection requests to utilities. But that doesn’t mean it’s not happening.
Data center developers tend to be secretive about their hunt for sites and the nature of the discussions they have with utilities. Developers may be talking about flexibility with one utility while also shopping around for a more traditional interconnection that gives them access to the power they want at all hours of the day.
Flexibility deals are most likely to emerge in data center markets where unimpeded interconnections simply aren’t possible anymore due to grid constraints, said Aaron Bilyeu, Cloverleaf’s chief development officer. What’s changed, he said, is that “we’ve quickly run out of those opportunities.”
Utilities in grid-constrained parts of the country are beginning to take on the complexities of creating “flexible interconnection” policies and tariffs — the rules and rates for customers — that could provide data center developers some clarity on what a commitment to flexible operations could be worth to them. That’s a big part of the work underway at DCFlex, Tom Wilson said.
“We have three main workstreams. The first is aimed at defining the flexibilities that are possible and creating a taxonomy — each utility and RTO [regional transmission organization] has different words for the same things,” he said. “The second piece is aimed at incentives, rate structures, and regulatory issues on the utility side, looking at how they could effectively orchestrate flexibility.”
“The third piece is how to build the planning and operational tools that incorporate flexibility,” he added. That means figuring out how to calculate the impact of flexible versus non-flexible large customers on long-term planning for power plants, grids, and other infrastructure investments, which traditional planning processes don’t do today. “You need new tools, or to evolve tools, to think about and utilize the new opportunities that flexibility can provide.”
That work will be critical to giving regulators the tools they need to challenge utilities’ claims that they must build new fossil-gas power plants and keep coal plants open to serve peak loads, the Sierra Club wrote in a September report. Big customers “can have an outsized impact in avoiding new fossil fuel investment (or enabling coal retirements) by participating in demand management programs that allow utilities to subtract some or all of the customer’s load from its peak obligation,” the report notes.
That aligns with the needs of tech companies like Amazon, Google, Meta, and Microsoft that have aggressive clean energy goals. These companies are negotiating with utilities in hot data center markets like Georgia, arguing that letting data centers build clean energy, on-site power, and load flexibility into load forecasts could obviate the need for some of the new fossil-fueled generation that utility Georgia Power wants to build.
Putting the question of clean versus dirty power aside, data center developers can’t expect utilities and regulators to allow the costs of supplying them with round-the-clock power to fall on regular customers’ shoulders, Bilyeu said. “We believe data centers should pay their own way.”