Is AI Slowing Climate Progress? It’s Complicated
Two BU experts studying grid solutions and sustainable energy transitions say that tech companies racing to build more AI could make data centers more energy efficient
As more data centers—like this one in Sterling, Va.—are being constructed across the United States, BU experts say that creating sustainable solutions for powering them is paramount. Photo via iStock/Gerville
For over a decade, Ayse Coskun has studied the relationship between electric grids and data centers, the sprawling warehouses that house the equipment keeping the internet and computing infrastructure running. For years, grid operators were able to plan for and meet the energy demands of data centers. Then artificial intelligence (AI) boomed.
“AI consumes a lot of energy,” says Coskun, a Boston University College of Engineering professor of electrical and computer engineering and systems engineering. It’s estimated that asking AI a question requires about 10 times the electricity of a traditional Google search. “But that’s just querying,” Coskun says. “People have now seen the success of these large language models that get trained with huge amounts of data. As a result, businesses want to build and train more of these models for specific applications.”
Building more data centers to operate AI is already increasing electricity usage: training large language models (LLMs) like OpenAI’s ChatGPT and Google’s Gemini requires thousands of computers in data centers running at full speed. According to a US Department of Energy report, data centers consumed 4 percent of total US electricity in 2023 (more than a large US city consumes), a share expected to more than double by 2028.
With demand expected to surge, utilities across the country are actively trying to figure out where all of that power will come from; for the most part, that has meant relying on fossil fuel sources, like natural gas and coal plants. And this is happening at a time when big tech companies acknowledge that they need to lower carbon emissions to help curb climate change.
“When you talk to utilities and power balancing authorities, they are concerned because there’s a demand for building new data centers, and the total amount of power [the centers] are demanding is unprecedented,” says Coskun, who also directs BU’s Center for Information & Systems Engineering and has published dozens of papers on energy-efficient and high-performance computing. “Suddenly, data centers may need 80 gigawatts of power in the next five years—when 1 gigawatt is roughly the amount that can be generated by a large nuclear power plant.”
These two colliding pressures, powering data centers on one hand and slowing climate change on the other, raise questions about whether achieving both goals is even possible. Yet Coskun and BU energy transitions expert Cutler Cleveland believe it can be done, and they are working on ways to make their ideas a reality. Coskun is studying how data centers can regulate their energy consumption to align with power grid needs while still running quickly and reliably for users. While she brings her engineering expertise to the problem, Cleveland sees solutions in the way energy is regulated. A BU College of Arts & Sciences professor of Earth and environment, he helped set Boston’s carbon neutrality and energy transition goals as the principal investigator for Carbon Free Boston.
“The intense demand for electricity by data centers will hopefully create an incentive to improve the efficiency of energy use at data centers,” Cleveland says. “There’s already a lot of work being put in to reduce their demand for electricity.”
That is especially important in regions where data center construction is concentrated, like Virginia’s “Data Center Alley,” as well as Phoenix, Ariz., Atlanta, Ga., and Reno, Nev. If more data centers connect directly to electricity distribution networks, electricity costs could rise, putting a heavy burden on low-income households in the US. (Data centers also consume other precious resources, like the water that is essential for cooling servers.)
“There are millions of households who, especially during the winter, face decisions about whether to pay their utility bill or provide lunches for their kids,” Cleveland says. So, he says, making data centers more energy-efficient would benefit not just the climate, but also everyday consumers.
Making Data Centers an Energy Asset
As more data centers are built, Coskun believes there is potential for them to support the power grid rather than drain it. Alongside her role at BU, Coskun serves as the chief scientist at Emerald AI, a commercial start-up that aims to optimize the relationship between power grids and data centers. Her research has found that new data centers can come online faster through a concept called “energy flexibility.”

At a baseline, power grids use about 60 to 70 percent of their capacity, so that when there are peaks in demand—like when everyone cranks their air conditioners during a heat wave—there’s energy in reserve. (Peak times often result in grids operating “peaker plants,” which tend to use the dirtiest and most expensive fuels.) Coskun says that if data centers agree to curtail their usage by just a small percentage during energy peaks, more data centers can be integrated faster into the power grid to meet the growing AI demand—without building new, massive fossil fuel–generation infrastructure.
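The arithmetic behind that flexibility argument can be sketched in a few lines of Python. The function and every number below are illustrative assumptions, not figures from Coskun’s research; the only input drawn from the article is the idea that grids hold capacity in reserve for peaks:

```python
# Back-of-envelope model of "energy flexibility": a data center that agrees
# to curtail part of its draw during grid peaks only counts at its curtailed
# level, so more such load can connect without new generation.
# All numbers here are hypothetical illustrations.

def connectable_load_gw(grid_capacity_gw, peak_utilization, curtail_fraction):
    """Estimate how much new data center load (in GW) a grid can absorb.

    headroom  = capacity left over at the moment of peak demand
    A load that sheds `curtail_fraction` of itself during peaks draws only
    (1 - curtail_fraction) of its nameplate power at the critical moment,
    so the connectable amount is headroom / (1 - curtail_fraction).
    """
    headroom = grid_capacity_gw * (1.0 - peak_utilization)
    return headroom / (1.0 - curtail_fraction)

# Hypothetical 100 GW regional grid running at 95% utilization during a peak:
rigid = connectable_load_gw(100, 0.95, 0.0)      # inflexible load
flexible = connectable_load_gw(100, 0.95, 0.25)  # sheds 25% at peak
print(f"inflexible: {rigid:.2f} GW, flexible: {flexible:.2f} GW")
```

Under these made-up numbers, a grid that could host 5 GW of inflexible load could host roughly 6.7 GW of load willing to shed a quarter of its draw during peaks, which is the intuition behind connecting flexible data centers without building new peaker plants.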
“My research over the last decade has shown that, if data centers provide flexibility in their power consumption, they can make the grid more resilient and reduce their electricity costs,” she says. Much of her research has paved the way for her work with Emerald AI, which aims to act as a mediator between the grid and a data center, helping the latter become flexible with its energy demands. The company recently tested its platform in Phoenix and found that its software reduced the power consumption of AI workloads by 25 percent over three hours during a peak time, without disruptions to the data center’s computing performance.
Moving the Needle on Renewable Energy
Complicating the situation further, more tech companies are moving to procure their own power: Google is purchasing $3 billion of hydropower; Microsoft plans to reopen the Three Mile Island nuclear plant to power data centers, while also building a new nuclear reactor in Wyoming. Though nuclear energy doesn’t release carbon, it’s not considered renewable, since it depends on a finite fuel, uranium. In the short term, there aren’t enough renewable energy sources to serve both data centers and individual consumers.

“Some new energy capacity is going to have to be built to accommodate this demand. However, there’s a lot that could be done without building a new power plant,” says Cleveland, who studies global energy markets and shares insights on BU’s Visualizing Energy, a project that links trustworthy information on sustainable energy and human well-being to decision-makers.
Reforming the interregional movement of electricity, for example, could accommodate some of the new demand without building new fossil fuel plants, but it requires a lengthy regulatory process and bipartisan support that is difficult to achieve. Grid operators, who assign market values to different electricity generators, could also value renewable energy more highly. Currently, many grid operators assign renewable sources lower values than fossil fuel generation because power from the sun and wind is variable.
“Right now, solar, wind, and battery storage are at a disadvantage in the way they’re valued, because our system is used to building large coal and gas plants. There is a bias against how we value renewable sources. And that’s a policy decision,” Cleveland says. “We need to think about this in a holistic way. This includes quantifying the benefits of clean energy and energy efficiency, such as improved health outcomes, job creation, and the social cost of carbon.”
Now that our digital world is infused with AI, Cleveland also believes we need to account for the indirect effects of AI use. “AI is going to dramatically improve the energy efficiency of some aspects of human existence,” Cleveland says. “So, it cuts both ways. Data centers produce the tools, but the tools are going to improve efficiency by making different processes faster.” But, there’s also a rebound effect—meaning, if something becomes easier or more efficient, we’re likely to do more of it. “So the net effect of the expansion of AI into our personal lives and performing economic tasks is quite complicated,” he says. “How that all washes out, no one knows.”
That makes matching the expansion of AI with the expansion of the grid and sustainable energy solutions all the more important, since there is so much uncertainty about how AI will be deployed. Decisions made over the next couple of years about the challenges and opportunities will drastically shape the future of AI—and our climate.