
"Every AI interaction, whether it's a chatbot response, an image generated from a text prompt, or a search query, is processed in a data centre. These are enormous facilities, sometimes the size of several football pitches, packed with servers that run continuously to keep the technology behind them operational."
"In 2023, data centres consumed 4.4% of all US electricity, a figure that could triple by 2028. A 2021 paper from Google and UC Berkeley estimated that training GPT-3 alone consumed around 1,287 megawatt hours of electricity, enough to power roughly 120 average American homes for a year."
"Deploying models in real-world applications and fine-tuning them for better performance draws large amounts of energy long after the original development phase is complete. A ChatGPT text search was estimated to use nearly ten times as much electricity."
AI interactions, while appearing seamless, rely on energy-intensive data centers that consume a substantial amount of electricity. In 2023, these centers accounted for 4.4% of US electricity usage, with projections suggesting this could triple by 2028. The training of AI models, such as GPT-3, requires immense energy, generating significant carbon emissions. The environmental impact of AI is becoming a mainstream concern as the demand for AI capabilities continues to grow, highlighting the hidden costs of technological advancements.
Read at Medium