OpenAI chief executive Sam Altman says that a ChatGPT query consumes roughly 0.34 watt-hours of electricity, which is “about what an oven would use in a little over one second.”
What Happened: Altman, in a blog post he published on Tuesday, also mentioned that each ChatGPT query consumes “about 0.000085 gallons of water,” which is roughly one-fifteenth of a teaspoon.
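Those two comparisons can be checked with simple unit conversions. The sketch below assumes a roughly 1-kilowatt oven draw (not a figure from Altman's post) and the standard 768 US teaspoons per US gallon:

```python
# Sanity-check the per-query figures quoted in Altman's post.
WH_PER_QUERY = 0.34           # watt-hours per ChatGPT query (Altman)
GALLONS_PER_QUERY = 0.000085  # gallons of water per query (Altman)

OVEN_WATTS = 1000             # assumed oven power draw (~1 kW)
TSP_PER_GALLON = 768          # US teaspoons per US gallon

# Seconds an oven at that draw needs to consume 0.34 Wh:
oven_seconds = WH_PER_QUERY / OVEN_WATTS * 3600

# Fraction of a teaspoon of water per query:
teaspoons = GALLONS_PER_QUERY * TSP_PER_GALLON

print(f"{oven_seconds:.2f} s of oven time")        # about 1.2 s
print(f"1/{round(1 / teaspoons)} of a teaspoon")   # about 1/15
```

At a 1 kW draw the oven figure comes out to roughly 1.2 seconds, and the water figure to roughly one-fifteenth of a teaspoon, consistent with both of Altman's comparisons.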
Altman argues that "the cost of intelligence should eventually converge to near the cost of electricity," positioning the chatbot as comparatively frugal even as environmental watchdogs scrutinize AI's hidden climate bill.
OpenAI has not publicly shared details of its methodology, but the disclosure lands as researchers warn that data-center power demand could outstrip bitcoin mining by next year. One study from Vrije Universiteit Amsterdam projects AI could soon swallow nearly half the electricity flowing into global server farms.
Water use complicates the equation. A Washington Post investigation found that drafting a 100-word email with GPT-4 required "a little more than one bottle" of water, with consumption varying widely by data-center location. MIT researchers likewise warned in January that generative models are driving up both energy and water footprints as companies race to scale.
Why It Matters: The Department of Energy projects that U.S. data centers may consume up to 12% of the nation’s electricity by 2028, a share that could climb as AI adoption accelerates.
Altman's efficiency pitch follows corporate pledges to curb resource demand, yet Yale's Environment 360 notes hyperscale facilities still guzzle millions of gallons annually. Planet Detroit adds that generative systems may use 33 times more energy than conventional software for the same task.
Whether or not the teaspoon math holds up, Altman insists smarter algorithms and automated chip fabrication will keep driving the price and resource cost of "intelligence" lower.
© 2025 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.