Saying 'please' and 'thank you' to ChatGPT costs OpenAI millions, Sam Altman says

  • snitzoid
  • 1 day ago
  • 2 min read

First of all, being polite to anyone, animate or not, is a waste of your time.


Second, you can be green and use AI. How? Er... I asked it? What a load of crap.




Saying 'please' and 'thank you' to ChatGPT costs OpenAI millions, Sam Altman says

Being nice to your AI chatbot requires computational power that raises electricity and water costs

By Shannon Carroll, Quartz Media

Published Yesterday


Being polite to your AI assistant could cost millions of dollars.


OpenAI CEO Sam Altman revealed that showing good manners to a ChatGPT model — such as saying “please” and “thank you” — adds up to millions of dollars in operational expenses.


Altman responded to a user on X (formerly Twitter) who asked how much the company has lost in electricity costs from people being polite to their models.


“Tens of millions of dollars well spent — you never know,” the CEO wrote.


Sounds like someone saw what HAL, the onboard computer in “2001: A Space Odyssey,” did and is going to be nice to their AI assistant just in case. Experts have also found that being polite to a chatbot makes the AI more likely to respond in kind.


Judging from Altman’s cheeky tone, that “tens of millions” figure likely isn’t a precise number. But any message to ChatGPT, no matter how trivial or inane, requires the AI to initiate a full response in real time, relying on high-powered computing systems and increasing the computational load — thereby using massive amounts of electricity.


AI models rely heavily on energy-hungry global data centers, which already account for about 2% of global electricity consumption. According to Goldman Sachs, each ChatGPT-4 query uses about 10 times more electricity than a standard Google search.


Data from the Washington Post suggests that if one out of every 10 working Americans uses GPT-4 once a week for a year (meaning 52 queries by 17 million people), the power needed would be comparable to the electricity consumed by every household in Washington, D.C. — for 20 days.
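
For a rough sense of scale, here is a back-of-the-envelope sketch of that scenario in Python. The user and query counts come straight from the Post's figures; the per-query energy value is a hypothetical placeholder to show how the math scales, not a number from the Post's analysis.

```python
# Back-of-envelope arithmetic for the Washington Post scenario above.
# The user and query counts come from the article; the per-query energy
# figure is a hypothetical placeholder, NOT a number from the Post.

USERS = 17_000_000            # roughly one in ten working Americans
QUERIES_PER_YEAR = 52         # one GPT-4 query per week for a year

total_queries = USERS * QUERIES_PER_YEAR
print(f"Total queries: {total_queries:,}")           # 884,000,000

# Multiply by an assumed per-query energy cost to get an annual total.
assumed_wh_per_query = 100.0                          # placeholder, in watt-hours
annual_gwh = total_queries * assumed_wh_per_query / 1e9
print(f"~{annual_gwh:.0f} GWh per year at {assumed_wh_per_query:.0f} Wh per query")
```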


Rene Haas, CEO of semiconductor company Arm Holdings, recently warned that AI could account for a quarter of America’s total power consumption by 2030. That figure currently stands at 4%.


Polite responses also add to OpenAI’s water bill. AI uses water to cool the servers that generate its responses. A study from the University of California, Riverside, found that using GPT-4 to generate 100 words consumes up to three bottles of water — and even a three-word response such as “You are welcome” uses about 1.5 ounces of water.
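
Those two water figures line up with each other if a “bottle” means a standard 500 mL (16.9 fl oz) water bottle, which is an assumption on my part rather than something stated in the study. A quick sketch of the proportion:

```python
# Consistency check on the UC Riverside figures quoted above, assuming a
# "bottle" is a standard 16.9 fl oz (500 mL) water bottle (an assumption).

OZ_PER_BOTTLE = 16.9
oz_per_100_words = 3 * OZ_PER_BOTTLE      # "up to three bottles" per 100 words
oz_per_word = oz_per_100_words / 100

three_word_reply_oz = 3 * oz_per_word
print(f"~{oz_per_word:.2f} oz per word; a 3-word reply uses ~{three_word_reply_oz:.1f} oz")
# prints: ~0.51 oz per word; a 3-word reply uses ~1.5 oz
```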

 
 
 
