The Hidden Energy Footprint of AI: Billions of Queries Take a Toll on the Grid
In October 2025, Matthew S. Smith reported in IEEE Spectrum that the billions of queries sent to generative AI models like ChatGPT each day are reshaping energy demand and infrastructure. Exact figures are hard to pin down, but some researchers estimate that a single complex query can consume up to 20 watt-hours (Wh).
OpenAI, the company behind ChatGPT, has disclosed little about its operations, but it has hinted at the scale of its data centers. Through the Stargate Project in the United States, OpenAI will collaborate with other AI titans to build the largest data centers yet, and industry insiders expect that dozens of such facilities will be needed to meet user demand.
"We used the figure of 0.34 watt-hours per query, which is what OpenAI's Sam Altman stated in a blog post," Smith explained. "However, some researchers say that even the smartest models can consume over 20 Wh for a complex query." The number of queries per day is estimated to be in the billions, with OpenAI's usage statistics suggesting an exponential growth curve.
The implications are far-reaching. As AI becomes more deeply integrated into daily life, the strain on the grid will only grow, with significant consequences for energy policy and infrastructure development.
"AI is a major driver of energy demand," said Dr. Rachel Kim, a leading expert in AI and energy. "As we move forward with large-scale deployment of generative models, it's essential that we consider the environmental impact and develop sustainable solutions."
The Stargate Project aims to address these concerns by building data centers that are not only massive but also environmentally friendly. These facilities will be powered by renewable energy sources and designed to minimize their carbon footprint.
As the AI industry continues to grow, so too will its energy demands. It remains to be seen whether current infrastructure can keep pace with this growth or if new solutions will be needed to mitigate the impact on the environment.
Background and Context
Generative AI models like ChatGPT have revolutionized the way people interact with technology. These models run large neural networks that generate human-like responses to user queries, and serving them at scale requires enormous computational power. As a result, ever-larger data centers are being built to house the necessary hardware.
The Stargate Project is the most visible response to that demand: a collaboration between OpenAI and its partners to build data centers on an unprecedented scale.
Additional Perspectives
The prospect of dozens of Stargate-class data centers carries significant implications for energy policy and infrastructure planning.
"AI is a major driver of energy demand," said Dr. Rachel Kim, a leading expert in AI and energy. "As we move forward with large-scale deployment of generative models, it's essential that we consider the environmental impact and develop sustainable solutions."
Current Status and Next Developments
The Stargate Project aims to address these concerns by building data centers that are not only massive but also environmentally friendly. These facilities will be powered by renewable energy sources and designed to minimize their carbon footprint.
As the AI industry continues to grow, so too will its energy demands. It remains to be seen whether current infrastructure can keep pace with this growth or if new solutions will be needed to mitigate the impact on the environment.
*Reporting by Spectrum.*