Is AI a climate problem?

Henrik Kniberg, co-founder and chief scientist at Ymnig AI, shares his thoughts on AI and its climate impact with us at GoClimate.
Last updated: 2025-02-25


Are you one of the hundreds of millions of people using AI assistants like ChatGPT, Claude, or similar? Are you concerned about the energy you consume when asking them for help? You're not alone. When ChatGPT made its breakthrough in 2022, and subsequent models became increasingly powerful, concerns about the climate impact of AI services surged—rightfully so. But much has happened since then.

A ChatGPT query today uses about as much energy as a standard Google search—around 0.3 watt-hours (an approximate estimate, not an exact figure). That's equivalent to keeping an LED light on for a few minutes, and a far cry from early estimates that suggested an AI conversation would consume ten times more energy than a search. This improvement is due to rapid technological advancements. For example, OpenAI's GPT-4o model uses only one-third of the energy of its predecessor. At the same time, new AI chips from companies like Nvidia have made it possible to deliver the same performance with dramatically lower power consumption.
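The LED comparison is easy to check for yourself. Here is a minimal back-of-the-envelope sketch in Python; the 0.3 watt-hour figure is the approximate estimate cited above, and the 5-watt LED bulb is an assumption (the article says "a few minutes" without specifying the bulb):

```python
# Back-of-the-envelope check of the per-query figure above.
QUERY_ENERGY_WH = 0.3  # approximate energy per ChatGPT query (Wh), per the article
LED_POWER_W = 5        # assumed: a typical small LED bulb

# Energy (Wh) divided by power (W) gives hours; multiply by 60 for minutes.
minutes = QUERY_ENERGY_WH / LED_POWER_W * 60
print(f"One query is roughly an LED bulb running for {minutes:.1f} minutes")
```

With these assumptions, one query works out to about 3.6 minutes of LED light, which matches the "few minutes" claim. A dimmer or brighter bulb would shift the number, but not the order of magnitude.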

Increased usage drives up total energy consumption

But—there’s a downside. While energy consumption per query has decreased, the number of users and queries has skyrocketed. By the end of 2024, ChatGPT was handling around a billion conversations—per day. As a result, the total energy consumption of AI services continues to rise. This mirrors the development of fuel-efficient cars: even if each car uses less gasoline, total fuel consumption can increase as more people drive and cover longer distances. AI faces a similar challenge.
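The scale effect is simple multiplication. This sketch uses the article's two figures (0.3 Wh per query, a billion conversations per day) and assumes, simplifying, that one conversation costs about one query's worth of energy, which likely understates the true total:

```python
# Scaling the per-query estimate to ChatGPT's reported daily volume.
QUERY_ENERGY_WH = 0.3            # approximate energy per query (Wh)
CONVERSATIONS_PER_DAY = 1_000_000_000  # roughly a billion, per the article

# Simplifying assumption: one conversation ~ one query's worth of energy.
total_wh_per_day = QUERY_ENERGY_WH * CONVERSATIONS_PER_DAY
total_mwh_per_day = total_wh_per_day / 1_000_000  # Wh -> MWh

print(f"Roughly {total_mwh_per_day:.0f} MWh per day")
```

Even with each query costing no more than a Google search, the total comes to roughly 300 MWh per day, which is why efficiency gains per query alone don't settle the system-level question.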

Breakthroughs in energy efficiency

However, there is hope. In early 2025, a surprising breakthrough came when the Chinese company DeepSeek introduced a new AI model. It performs at the same level as ChatGPT but operates with only a tenth of the energy consumption. It’s as if someone suddenly invented a car that drives just as fast as any other but consumes a fraction of the fuel. This led analysts to quickly revise their pessimistic forecasts for AI’s future energy demands, showing that powerful AI systems can be built far more efficiently than previously thought.

AI as part of the solution

Interestingly, AI is becoming part of the solution to its own climate challenges. The latest AI models, such as OpenAI's DeepResearch, are approaching human research capabilities and have already helped develop more efficient computer chips and software. The same technology could be used to accelerate research into cleaner and more efficient energy production. Tech giants like Microsoft and Google are also making strong environmental commitments. They have pledged to run their data centers on fossil-free energy, with Google aiming to be entirely carbon-free by 2030.

What does this mean for you as a user?

As an individual user, there’s no longer a reason to feel climate anxiety over using modern AI assistants. The energy consumption per query is now on par with other everyday digital activities. The real climate challenge lies at the system level. The future will be determined by the race between two opposing forces: on one side, the exponential increase in AI usage; on the other, the rapid development of more efficient technology. Which will prevail remains to be seen. But one thing is certain—AI will play a key role in solving its own climate challenges.

Sources and data

I used GPT DeepResearch to compile the factual basis for this article. The result was a lengthy, detailed report with numerous sources. If you're curious, you can view both my prompt and the full report here: https://chatgpt.com/share/67bb4ebc-1598-800c-bb48-c33b6f546340

Henrik Kniberg has written this article on behalf of GoClimate.