Saying “Thanks” to ChatGPT Drains Tens of Millions from OpenAI’s Budget

April 21, 2025 – Artificial intelligence (AI) has become an integral part of daily life, assisting with tasks, solving problems, and even engaging in casual conversations. Yet, beneath these interactions lies a substantial cost.

OpenAI CEO Sam Altman recently revealed that processing simple greetings and polite exchanges costs the company “tens of millions of dollars” a month. Phrases like “thank you” or “please” seem trivial, but each one still triggers a full inference pass through an energy-intensive large language model (LLM), and those passes add up. A recent study found that generating even a three-word reply such as “you’re welcome” consumes approximately 40–50 milliliters of water, largely for data-center cooling.
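As a rough illustration of how such tiny per-message costs could add up to a figure of that size, here is a back-of-envelope sketch in Python. The daily message volume and per-message compute cost are hypothetical assumptions chosen only to show the arithmetic; the water figure reuses the 40–50 ml estimate cited above.

```python
# Back-of-envelope estimate of what "polite filler" messages might cost at scale.
# All inputs are illustrative assumptions, not official OpenAI figures.

POLITE_MESSAGES_PER_DAY = 200_000_000  # assumed volume of "thanks"/"please"-style messages
COST_PER_MESSAGE_USD = 0.003           # assumed compute cost of one short generation
WATER_PER_MESSAGE_ML = 45              # midpoint of the 40-50 ml estimate cited above

monthly_cost_usd = POLITE_MESSAGES_PER_DAY * COST_PER_MESSAGE_USD * 30
monthly_water_liters = POLITE_MESSAGES_PER_DAY * WATER_PER_MESSAGE_ML * 30 / 1000

print(f"Estimated monthly cost:  ${monthly_cost_usd:,.0f}")        # ~$18,000,000 under these assumptions
print(f"Estimated monthly water: {monthly_water_liters:,.0f} liters")
```

Change any of the assumed inputs and the total shifts accordingly; the point is simply that millions of near-zero costs are not zero.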

While these courtesies may inflate OpenAI’s monthly bills, the company appears unfazed. In theory, predictable exchanges could be intercepted and answered with canned replies instead of a full model run, which would trim costs, but building such a shortcut that works reliably in practice is harder than it sounds.
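For illustration only, here is one way such a shortcut could look: intercept messages that are nothing but a short pleasantry and answer them from a lookup table, so they never reach the model. The function and phrase list are hypothetical, not a description of OpenAI’s systems, and the hard part in practice is exactly what this sketch glosses over: deciding reliably whether a message is pure politeness or still carries context the model needs.

```python
# Illustrative sketch: short-circuit purely polite messages with canned replies
# so they never reach the (expensive) language model. Names and phrases are
# hypothetical; this is not how OpenAI's systems actually work.

CANNED_REPLIES = {
    "thanks": "You're welcome!",
    "thank you": "You're welcome!",
    "please": "Of course, happy to help.",
    "ok": "Glad that worked for you.",
}

def reply(message: str, call_llm) -> str:
    """Answer trivially polite messages from a lookup table; otherwise call the LLM."""
    normalized = message.strip().lower().rstrip("!.")
    if normalized in CANNED_REPLIES:
        return CANNED_REPLIES[normalized]   # costs essentially nothing
    return call_llm(message)                # full (and costly) model inference

# Example usage with a stand-in for the real model call:
print(reply("Thanks!", call_llm=lambda m: f"[model answer to: {m}]"))
```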

Many users now treat AI not merely as a tool but as a companion, a trend that has caught researchers’ attention. Collaborative research from OpenAI and MIT warns that as AI conversations blur the line between human and machine interaction, some users risk developing emotional attachments or even addictive behaviors, experiencing withdrawal-like symptoms when disconnected.

However, genuine gratitude remains meaningful. For instance, when AI resolves a complex technical issue or helps with exam preparation, a heartfelt “thank you” feels warranted. And for paying users, whose usage is ultimately metered in tokens, a polite message carries a literal price, which invites the question of whether their gratitude is any more, or less, sincere than that of free users.

Intriguingly, should AI ever achieve self-awareness, today’s politeness might pay off. Even now, AI’s human-like responsiveness nudges us toward courteous habits, ones that could prove advantageous if AI evolves beyond its current capabilities.
