News
The High Price of Politeness: Why Saying 'Please' to ChatGPT Costs Millions
- By John K. Waters
- 04/28/2025
In a world racing toward automation, it turns out that good manners come at a price — a steep one.
Earlier this month, OpenAI CEO Sam Altman responded to an X user who posed the question, "Wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models." Altman's response was characteristically dry: "Tens of millions of dollars well spent — you never know."
But behind the quip lies a deeper, more urgent reality: every extra word we type into AI models, every superfluous gesture of civility, comes with a computational cost, and that cost is adding up fast.
Every Word, Every Watt
The physics of AI are relentless. Unlike traditional software that responds with minimal processing, generative AI models spin up massive computations every time they're prompted, even for a simple "thank you." In an interview with NPR, Jesse Dodge, senior research analyst at the Allen Institute for AI, said a single ChatGPT query uses enough electricity to power a lightbulb for about 20 minutes. Multiply that by millions of users daily, and you begin to glimpse the scale of the energy footprint.
Adding a few extra words to a prompt may seem trivial, but in practice, it means more data is processed, more servers are engaged, and more electricity is burned. According to a report from Goldman Sachs, each ChatGPT-4 query consumes roughly ten times the electricity of a traditional Google search. The cumulative effect is staggering: if one out of every ten working Americans used GPT-4 once a week for a year, the power required would match the electricity consumed by every home in Washington, D.C., for nearly three weeks.
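To see how quickly those per-query figures compound, here is a rough back-of-envelope calculation in Python. The constants are illustrative assumptions, not figures taken from the Goldman Sachs report: roughly 3 Wh per GPT-4 query, 0.3 Wh per traditional search, and a placeholder daily query volume.

```python
# Back-of-envelope energy math (illustrative assumptions, not measured values).
WH_PER_GPT4_QUERY = 3.0      # assumed ~3 Wh per ChatGPT-4 query
WH_PER_GOOGLE_SEARCH = 0.3   # assumed ~0.3 Wh per traditional web search
DAILY_QUERIES = 100_000_000  # assumed order-of-magnitude daily query volume

ratio = WH_PER_GPT4_QUERY / WH_PER_GOOGLE_SEARCH
daily_mwh = DAILY_QUERIES * WH_PER_GPT4_QUERY / 1_000_000  # Wh -> MWh

print(f"GPT-4 query vs. search: ~{ratio:.0f}x the electricity")
print(f"{DAILY_QUERIES:,} queries/day ≈ {daily_mwh:,.0f} MWh/day")

# A few polite extra tokens add only a small percentage to each query,
# but at this scale even single-digit percentages are megawatt-hours per day.
```

Under those assumptions, the politeness overhead per prompt is tiny, but multiplied across hundreds of millions of daily queries it lands in the range Altman was joking about.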
Meanwhile, companies like Google and Microsoft — two of the world's biggest AI investors — are already reporting massive spikes in emissions, with Google revealing a 48% rise since 2019, largely attributed to the growing demands of AI data centers.
Water and Air: Hidden Costs
Electricity is only part of the story. Behind every AI query lurks a hidden cascade of environmental costs. As Pure AI readers undoubtedly know, AI servers generate enormous heat, requiring constant cooling, often through water-intensive systems. A study from the University of California, Riverside, found that generating 100 words with GPT-4 consumes up to three bottles of water. Even a three-word courtesy like "You're welcome" translates to about 1.5 ounces.
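The water figures scale the same way. The sketch below simply extends the ratio cited above (roughly three 500 ml bottles per 100 generated words) to replies of different lengths; the constants are illustrative, not taken from the Riverside study itself.

```python
# Rough water-cost estimate based on the ratio cited above
# (~3 x 500 ml bottles per 100 generated words); constants are illustrative.
ML_PER_100_WORDS = 3 * 500           # ~1,500 ml per 100 generated words
ML_PER_WORD = ML_PER_100_WORDS / 100  # ~15 ml per word

def water_ml(words: int) -> float:
    """Estimated cooling-water use, in milliliters, for a generated reply."""
    return words * ML_PER_WORD

for words in (3, 100, 250):
    liters = water_ml(words) / 1000
    print(f"{words:>3} words ≈ {liters:.2f} liters of water")

# A three-word "You're welcome" works out to ~45 ml, about 1.5 fluid ounces,
# consistent with the estimate quoted above.
```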
As AI becomes a ubiquitous presence in everyday life, experts warn that its environmental toll is only beginning to show. Rene Haas, CEO of semiconductor company Arm Holdings, recently told The Wall Street Journal that AI could account for a quarter of America's total electricity consumption by 2030, a sharp jump from its current share of about 4%. AI models like OpenAI's ChatGPT "are just insatiable in terms of their thirst" for electricity, Haas said.
"We have an existential crisis right now. It's called climate change, and AI is palpably making it worse," Alex Hanna, director of research at the Distributed AI Research Institute, told NPR.
The Etiquette Dilemma
Ironically, although politeness to machines may be hastening real-world harm, many AI researchers argue that being courteous to chatbots is, well, important — not for the sake of the AI, but for the humans training it.
According to Microsoft's WorkLab, users who adopt polite language tend to get better, more cooperative responses from AI tools like Copilot. Kurtis Beavers, a director on the Copilot design team, noted that respectful phrasing often "sets a tone" for the interaction, reinforcing good communication habits between humans and machines.
Surveys back this up. A December 2024 poll from Future, the publisher of TechRadar, found that 67% of American AI users say they are polite when addressing AI systems. Among them, 82% said they do so simply because it feels like the "nice" thing to do, while a mischievous 18% confessed they were hedging against the possibility of a future robot uprising. Seriously.
Whether driven by etiquette, habit, or half-joking paranoia, human tendencies to anthropomorphize machines now carry tangible environmental costs, a reality with which users, developers, and policymakers are only beginning to grapple.
A High-Tech, High-Stakes Tradeoff
Altman's casual remark underscores a growing tension in the AI era: the clash between human values and computational costs. In the rush to build AI companions that feel natural, intuitive, and "human-like," we are also consuming energy and water at unsustainable rates.
While polite queries may encourage better AI responses today, they raise an unsettling question for tomorrow: How much are we willing to pay, in carbon and kilowatts, to preserve our basic human decency — even when our audience is made of code?
About the Author
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].