
Dear Cherubs, the energy bill behind AI is no pocket change, but the internet does love a dramatic number with a side of panic. The truth is less “ChatGPT is secretly powering a nation” and more “the machines behind your chatbot are starting to look very, very hungry.”
THE BIG NUMBER
According to the International Energy Agency, data centres used around 415 terawatt-hours (TWh) of electricity in 2024, about 1.5% of global electricity demand. The IEA also says that data-centre electricity use has been growing by roughly 12% a year since 2017, with the United States accounting for the largest share, followed by China and Europe.
That is already a sizable chunk of the grid, and the IEA says it is not stopping anytime soon. The IEA's Electricity 2024 outlook projects that electricity demand from data centres could pass 1,000 TWh in 2026, which is roughly the same as Japan's total electricity consumption. So yes, the "some countries" line is not pure internet nonsense. It is just pointing at the whole data-centre ecosystem, not one lonely chatbot reply.
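As a quick sanity check, the growth arithmetic can be sketched in a few lines of Python, assuming the 12% rate compounds smoothly year on year (a simplification; real-world growth is lumpier):

```python
# Back-of-envelope check on the IEA figures, assuming smooth
# compound growth at ~12% per year (a simplification).
USE_2024_TWH = 415.0   # IEA estimate for data-centre electricity use in 2024
GROWTH_RATE = 0.12     # ~12% per year since 2017, per the IEA

# Implied 2017 baseline if growth really compounded at 12% for 7 years.
implied_2017 = USE_2024_TWH / (1 + GROWTH_RATE) ** 7
print(f"Implied 2017 use: ~{implied_2017:.0f} TWh")

# Projecting the same flat rate forward from 2024.
for year in (2026, 2030):
    projected = USE_2024_TWH * (1 + GROWTH_RATE) ** (year - 2024)
    print(f"Projected {year}: ~{projected:.0f} TWh")
```

Note that a flat 12% extrapolation lands around 520 TWh in 2026, well below the 1,000 TWh scenario; the two figures come from different IEA reports with different scopes and assumptions, which is a reminder that these projections span a wide range.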
THE SMALL PRINT
Here is where the nuance matters, because nuance is the part of the internet that usually gets left in the taxi. A 2025 arXiv preprint estimated that a short GPT-4o query uses about 0.42 Wh of electricity, compared with about 0.30 Wh for a Google search. In other words, the cost of one prompt is modest; the cost of billions of prompts, multiplied across huge data centres, is where the real bill starts to look rude.
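To see how a modest per-prompt cost adds up, here is a rough sketch; the one-billion-prompts-per-day figure is purely an illustrative assumption, not a reported number:

```python
# Rough scaling of per-prompt energy to fleet-wide totals.
WH_PER_PROMPT = 0.42     # preprint's estimate for a short GPT-4o query
PROMPTS_PER_DAY = 1e9    # assumed purely for illustration

wh_per_day = WH_PER_PROMPT * PROMPTS_PER_DAY   # watt-hours per day
gwh_per_year = wh_per_day * 365 / 1e9          # Wh -> GWh
twh_per_year = gwh_per_year / 1000             # GWh -> TWh

print(f"~{gwh_per_year:.0f} GWh/year, i.e. ~{twh_per_year:.2f} TWh")
```

On these assumptions, chat inference alone comes to well under 1 TWh a year, a sliver of the 415 TWh data-centre total, which is exactly the point: the real bill comes from the whole stack, not the individual reply.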
So is ChatGPT using more energy than some countries? If you mean one prompt, no. If you mean the wider AI and data-centre stack powering it, then the claim lands much closer to reality than the headline-grabbing version suggests. That is the bit people miss while they are busy spilling the tea.
The bigger takeaway is not that AI must be cancelled by lunchtime. It is that efficiency matters. The IEA says renewables are expected to meet nearly half of the additional data-centre electricity demand to 2030, but natural gas and coal still play a major role in the near term. So the future of AI is not just about smarter models; it is also about cleaner power, better cooling, and chips that do more with less.
SOURCES
International Energy Agency — https://www.iea.org/reports/electricity-2024/executive-summary
International Energy Agency — https://www.iea.org/reports/energy-and-ai/executive-summary
International Energy Agency — https://www.iea.org/reports/energy-and-ai/energy-supply-for-ai
arXiv — https://arxiv.org/html/2505.09598v1