Being polite to ChatGPT costs OpenAI tens of millions
Briefly

The article discusses the hidden energy cost of being polite to AI chatbots like ChatGPT. Asked about the electricity consumed by polite interactions, OpenAI CEO Sam Altman said they cost the company tens of millions of dollars, an expenditure he considers worthwhile. With AI driving a surge in data center energy consumption, projections suggest electricity used by US data centers could rise from 4.4% to as much as 12% of the nation's total by 2028. Amid these concerns, the push for more server capacity continues to escalate.
Conventional wisdom holds that being polite to AI chatbots makes them respond better, but few stop to consider how much energy that politeness burns.
OpenAI CEO Sam Altman admitted it costs the super lab tens of millions of dollars in operational expenses - money he nonetheless believes is well spent.
As of late last year, US datacenters ate up about 4.4 percent of the country's electricity, and the Department of Energy expects that share could reach as much as 12 percent by 2028.
The International Energy Agency expects global datacenter electricity consumption to more than double between now and 2030, with the world's DCs consuming as much leccy as the country of Japan.
Read at The Register