The explosive growth of AI-powered computing centers is creating an unprecedented surge in electricity demand that threatens to overwhelm power grids and derail climate goals. At the same time, artificial intelligence technologies could revolutionize energy systems, accelerating the transition to clean power.
“We’re at a cusp of potentially gigantic change throughout the economy,” said William H. Green, director of the MIT Energy Initiative (MITEI) and Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering, at MITEI’s Spring Symposium, “AI and energy: Peril and promise,” held on May 13, 2025. The event brought together experts from industry, academia, and government to explore solutions to what Green described as both “local problems with electric supply and meeting our clean energy targets” while seeking to “reap the benefits of AI without some of the harms.” Both the challenge of data center energy demand and the potential benefits of AI for the energy transition are research priorities for MITEI.
AI’s startling energy demands
From the start, the symposium highlighted sobering statistics about AI’s appetite for electricity. After decades of flat electricity demand in the United States, computing centers now consume approximately 4 percent of the nation’s electricity. Although there is great uncertainty, some projections suggest this demand could rise to 12 to 15 percent by 2030, largely driven by artificial intelligence applications.
Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, emphasized the scale of AI’s consumption: “The power required for sustaining some of these large models is doubling almost every three months,” he noted. “A single ChatGPT conversation uses as much electricity as charging your phone, and generating an image consumes about a bottle of water for cooling.”
Facilities requiring 50 to 100 megawatts of power are emerging rapidly across the United States and globally, driven by both casual use and institutional research relying on large language models such as ChatGPT and Gemini. Gadepally cited congressional testimony by Sam Altman, CEO of OpenAI, highlighting how fundamental this relationship has become: “The cost of intelligence, the cost of AI, will converge to the cost of energy.”
“The energy demands of AI are a significant challenge, but we also have an opportunity to harness these vast computational capabilities to contribute to climate change solutions,” said Evelyn Wang, MIT’s vice president for energy and climate and former director of the Advanced Research Projects Agency-Energy (ARPA-E) at the U.S. Department of Energy.
Wang also noted that innovations developed for AI and data centers, such as efficiency improvements, cooling technologies, and clean-power solutions, could have broad applications beyond the computing facilities themselves.
Strategies for clean energy solutions
The symposium explored multiple pathways to address the AI-energy challenge. Some panelists presented models suggesting that while artificial intelligence may increase emissions in the short term, its optimization capabilities could enable substantial emissions reductions after 2030 through more efficient power systems and accelerated clean technology development.
Research shows regional variations in the cost of powering computing centers with clean electricity, according to Emre Gençer, co-founder and CEO of Sesame Sustainability and former MITEI principal research scientist. Gençer’s analysis revealed that the central United States offers considerably lower costs due to complementary solar and wind resources. However, achieving zero-emission power would require massive battery deployments (five to 10 times more than under moderate carbon scenarios), driving costs two to three times higher.
“If we want to do zero emissions with reliable power, we need technologies other than renewables and batteries, which will be too expensive,” Gençer said. He pointed to “long-duration storage technologies, small modular reactors, geothermal or hybrid approaches” as necessary complements.
Because of data center energy demand, there is renewed interest in nuclear power, noted Kathryn Biegel, manager of R&D and corporate strategy at Constellation Energy, adding that her company is restarting the reactor at the former Three Mile Island site, now called the “Crane Clean Energy Center,” to meet this demand. “The data center space has become a major, major priority for Constellation,” she said, emphasizing how data centers’ needs for both reliability and carbon-free electricity are reshaping the power industry.
Kathryn Biegel (center), manager of R&D and corporate strategy at Constellation Energy, discussed strategies for meeting clean energy goals for data centers with panelists Uday Varadarajan of RMI and Emre Gençer of Sesame Sustainability, and moderator Ruaridh Macdonald of MITEI. Credit: Jake Belcher
Can AI accelerate the energy transition?
Artificial intelligence could dramatically improve power systems, according to Priya Donti, assistant professor and the Silverman (1968) Family Career Development Professor in the MIT Department of Electrical Engineering and Computer Science and the MIT Laboratory for Information and Decision Systems. She showcased how AI can accelerate power grid optimization by embedding physics-based constraints into neural networks, potentially solving complex power flow problems at “ten times, or even greater, speed compared to your traditional models.”
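To give a rough sense of the general idea (a minimal sketch, not Donti’s actual method), a physics-informed approach can fold power system constraints into a neural network’s training loss, so the model learns outputs that respect the physics while still running fast at inference time. The Python sketch below uses PyTorch with entirely synthetic data and a simplified supply-demand balance penalty standing in for full power flow equations; all names and numbers are hypothetical.

```python
# Hypothetical sketch: a physics-informed training loss for a power system surrogate.
# Synthetic data; a simple supply-demand balance term stands in for real power flow.
import torch
import torch.nn as nn

class DispatchNet(nn.Module):
    """Maps a demand profile to a proposed generation dispatch per bus."""
    def __init__(self, n_buses: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_buses, 64),
            nn.ReLU(),
            nn.Linear(64, n_buses),
        )

    def forward(self, demand: torch.Tensor) -> torch.Tensor:
        return self.net(demand)

def physics_informed_loss(generation, demand, balance_weight=10.0):
    cost = generation.pow(2).sum(dim=1).mean()                   # proxy for generation cost
    imbalance = (generation - demand).pow(2).sum(dim=1).mean()   # penalty for violating balance
    return cost + balance_weight * imbalance

model = DispatchNet(n_buses=30)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
demand = torch.rand(128, 30)  # synthetic demand profiles for 128 scenarios

for _ in range(200):  # toy training loop
    optimizer.zero_grad()
    loss = physics_informed_loss(model(demand), demand)
    loss.backward()
    optimizer.step()
```

In a realistic setting, the penalty term would encode actual network power flow equations and operating limits rather than this toy balance condition.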
AI is already reducing carbon emissions, according to examples shared by Antonia Gawel, global director of sustainability and partnerships at Google. Google Maps’ fuel-efficient routing feature has “helped to prevent more than 2.9 million metric tons of GHG emissions since launch, which is the equivalent of taking 650,000 fuel-based cars off the road for a year.” Another Google research project uses artificial intelligence to help pilots avoid creating contrails, which account for about 1 percent of global warming impact.
AI’s potential to speed materials discovery for power applications was highlighted by Rafael Gómez-Bombarelli, Paul M. Cook Career Development Associate Professor in the MIT Department of Materials Science and Engineering. “AI-supervised models can be trained to go from structure to property,” he noted, enabling the development of materials crucial for both computing and efficiency.
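As a rough, hypothetical illustration of that structure-to-property idea (not Gómez-Bombarelli’s models), a supervised surrogate can be trained to map numerical descriptors of a material’s structure to a target property, letting researchers screen many candidates before running costly simulations or experiments. The Python sketch below uses scikit-learn with synthetic descriptors and property values as placeholders for a real materials dataset.

```python
# Hypothetical sketch: a supervised "structure to property" surrogate model.
# Descriptors and property values here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Pretend each material is described by 8 structural descriptors
# (e.g., composition fractions, lattice parameters) with one target property.
X = rng.random((500, 8))
y = X @ rng.random(8) + 0.1 * rng.standard_normal(500)  # synthetic property values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("MAE on held-out materials:", mean_absolute_error(y_test, model.predict(X_test)))
```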
Securing growth with sustainability
Throughout the symposium, participants grappled with balancing rapid AI deployment against environmental impacts. While AI training receives most of the attention, Dustin Demetriou, senior technical staff member in sustainability and data center innovation at IBM, quoted a World Economic Forum article suggesting that “80% of the environmental footprint is estimated to be due to inferencing.” Demetriou emphasized the need for efficiency across all artificial intelligence applications.
Jevons’ paradox, in which “efficiency gains tend to increase overall resource consumption rather than decrease it,” is another factor to consider, cautioned Emma Strubell, Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. Strubell advocated for viewing computing center electricity as a limited resource requiring thoughtful allocation across different applications.
Several presenters discussed novel approaches for integrating renewable sources with existing grid infrastructure, including potential hybrid solutions that combine clean installations with existing natural gas plants that have valuable grid connections already in place. These approaches could provide substantial clean capacity across the United States at reasonable costs while minimizing reliability impacts.
(From left) Moderator Elsa Olivetti of MIT explores opportunities to reduce data center demand with panelists Dustin Demetriou of IBM, Emma Strubell of Carnegie Mellon University, and Vijay Gadepally of MIT Lincoln Laboratory Supercomputing Center. Credit: Jake Belcher
Navigating the AI-energy paradox
The symposium highlighted MIT’s central role in developing solutions to the AI-electricity challenge.
Green spoke of a new MITEI program on data centers, power, and computation that will operate alongside the wide-ranging research of the MIT Climate Project. “We’re going to try to tackle a very complicated problem all the way from the power sources through the actual algorithms that deliver value to the customers—in a way that’s going to be acceptable to all the stakeholders and really meet all the needs,” Green said.
Randall Field, MITEI’s director of research, polled symposium participants about priorities for MIT’s research. The real-time results ranked “data center and grid integration issues” as the top priority, followed by “AI for accelerated discovery of advanced materials for energy.”
In addition, the poll revealed that most attendees view AI’s potential impact on power as a “promise” rather than a “peril,” though a considerable portion remain uncertain about the ultimate outcome. When asked about priorities in power supply for computing facilities, half of the respondents selected carbon intensity as their top concern, with reliability and cost following.