OpenAI CEO Sam Altman has pulled back the curtain on a closely guarded technological detail: how much water and energy ChatGPT uses with each question you ask it. The numbers may surprise you.
According to Altman’s latest blog post on the future of AI, every ChatGPT question uses approximately 0.000085 gallons of water – about one-fifteenth of a teaspoon. At that rate, you’d have to ask ChatGPT roughly 46 questions to use the amount of water in a tablespoon.
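The figure is easy to sanity-check. The conversion below starts from the stated 0.000085 gallons per query and uses standard US volume units (768 teaspoons per gallon, 3 teaspoons per tablespoon):

```python
# Sanity-check the per-query water figure from Altman's post.
GALLONS_PER_QUERY = 0.000085
TSP_PER_GALLON = 768   # 1 US gallon = 768 US teaspoons
TSP_PER_TBSP = 3       # 1 US tablespoon = 3 US teaspoons

tsp_per_query = GALLONS_PER_QUERY * TSP_PER_GALLON
print(f"Teaspoons per query:    {tsp_per_query:.4f}")                  # ~0.0653, about 1/15 tsp
print(f"Queries per teaspoon:   {1 / tsp_per_query:.0f}")              # ~15
print(f"Queries per tablespoon: {TSP_PER_TBSP / tsp_per_query:.0f}")   # ~46
```

So one query really does come out to about a fifteenth of a teaspoon, and a full tablespoon covers roughly 46 questions.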
The statement comes as there is growing concern over the environmental effects of the development and application of artificial intelligence across the globe.
Breaking Down the Energy Numbers of ChatGPT
Altman didn’t leave it at that. He also provided concrete energy figures to put AI’s power requirements into real-world perspective. Processing a single ChatGPT question takes around 0.34 watt-hours of power – roughly what an oven uses in a little over one second, or what an energy-efficient LED lightbulb uses in a couple of minutes.
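Those household comparisons follow directly from the 0.34 Wh figure. The wattages below are assumed typical values (a 10 W LED bulb and a 1,000 W oven element), not numbers from Altman’s post:

```python
# Translate 0.34 Wh per query into appliance runtimes.
WH_PER_QUERY = 0.34
LED_WATTS = 10      # assumed typical energy-efficient LED bulb
OVEN_WATTS = 1000   # assumed oven power draw, for illustration

led_minutes = WH_PER_QUERY / LED_WATTS * 60
oven_seconds = WH_PER_QUERY / OVEN_WATTS * 3600
print(f"LED bulb runtime per query: {led_minutes:.1f} minutes")   # ~2.0
print(f"Oven runtime per query:     {oven_seconds:.2f} seconds")  # ~1.22
```

With these assumptions, one query powers an LED bulb for about two minutes or an oven for just over a second, which lines up with the comparisons in the post.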

These comparisons create a picture that’s much less sensational than might be anticipated. Although AI systems do need enormous computational brawn, individual questions actually seem to have a relatively small environmental footprint when put into perspective with daily domestic routines.
“The price of intelligence will at last equal the price of electricity,” Altman wrote in the blog post, predicting that as AI technology matures, its main cost will converge toward the electricity required to power it.
Increasing Worries about AI’s Environmental Footprint
Though these per-query numbers might appear small, the broader debate over AI’s environmental footprint remains heated. At global scale, even tiny per-user consumption levels can add up to very large totals.
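To see how quickly small per-query figures compound, here is a rough scaling sketch. The daily query volume is a hypothetical round number chosen for illustration, not an OpenAI figure:

```python
# Illustrative scaling of per-query consumption to global volume.
WH_PER_QUERY = 0.34
GALLONS_PER_QUERY = 0.000085
QUERIES_PER_DAY = 1_000_000_000   # hypothetical volume, for illustration only

mwh_per_day = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000
gallons_per_day = GALLONS_PER_QUERY * QUERIES_PER_DAY
print(f"Energy: {mwh_per_day:.0f} MWh/day")          # 340
print(f"Water:  {gallons_per_day:,.0f} gallons/day") # 85,000
```

At a billion queries a day, a fifteenth of a teaspoon each becomes tens of thousands of gallons – which is why aggregate figures dominate the debate even when the per-query numbers look trivial.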
In February, researchers made a striking forecast: artificial intelligence systems could use more electricity than the entire Bitcoin mining industry by the end of 2025. That comparison carries special weight, as Bitcoin mining has long been criticized for its heavy energy use.
The Washington Post has also independently investigated AI water usage and found that it takes “a little more than one bottle” of water to generate a 100-word email with GPT-4 – suggesting that longer, more complex AI tasks consume proportionally more water.
Why Does ChatGPT Need Water?
You may wonder why AI technologies require water at all. The answer lies in the enormous data centers where these systems run. Data centers house thousands of high-performance servers that generate tremendous amounts of heat while operating, and water is used to cool them and prevent overheating.
As more AI finds its way into everyday life – virtual assistants and chatbots, automated writing and complex problem-solving – the total water and energy requirements are growing rapidly.
The Bigger Picture
Altman’s openness regarding ChatGPT’s energy usage is a meaningful step forward in the ongoing debate around AI environmental responsibility. By presenting tangible figures, OpenAI is enabling scientists, policymakers, and the general public to make better-informed choices about the use and regulation of AI.
But these single-query statistics only tell half the story. The true environmental footprint of AI goes beyond individual transactions to encompass the energy needed to train AI models, power data centers, and repeatedly update and fine-tune these systems.
As artificial intelligence becomes increasingly integral to business operations, education, healthcare, and personal productivity software, understanding and managing its environmental impact will be critical to sustainable technological development.
The debate about AI’s effect on the environment is just getting started, but Altman’s disclosure provides solid data points for anyone trying to figure out the actual cost of our growing reliance on artificial intelligence.