We're all going to be paying AI's Godzilla-sized power bills
Briefly

"It appears that AI-ready datacenters will be growing at a compound annual growth rate (CAGR) of 33 percent per year between today and 2030. That's a heck of a lot more datacenters, which, in turn, means a hell of a lot more power consumption. Why? Well, AI sucks so much power down because training and operating modern models, especially generative AI, requires extremely intensive computational resources and vast amounts of data processed in parallel across large clusters of high-performance GPUs and TPUs."
"For example, the training phase of a state-of-the-art AI model requires repeated adjustment of billions to trillions of parameters. That process alone requires thousands of GPUs running simultaneously for weeks or months at a time. Adding insult to injury, each of those special AI chips draws far more juice than your run-of-the-mill CPU. But once the training is done, it doesn't take that much power, does it? It does."
AI datacenters already consume enormous amounts of electricity: a large AI-dedicated facility typically draws about 100 megawatts, roughly enough to power 100,000 homes. Hundreds of AI datacenters exist today, and AI-ready facilities are projected to grow at a 33 percent compound annual growth rate through 2030, sharply increasing power demand. Training a state-of-the-art model means repeatedly adjusting billions to trillions of parameters and can require thousands of GPUs running for weeks or months, and specialized AI chips draw significantly more power than typical CPUs. Inference and model serving also consume substantial energy, even though operators disclose little about it.
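The two headline numbers in the summary can be sanity-checked with simple arithmetic. The sketch below assumes 2025 as the base year for the 2030 projection and an average household draw of about 1 kW; both are assumptions, not figures stated in the article.

```python
# Compound growth in AI-ready datacenter capacity, per the 33% CAGR claim.
CAGR = 0.33             # 33 percent compound annual growth rate
YEARS = 2030 - 2025     # assumed horizon: 2025 base year through 2030

growth_factor = (1 + CAGR) ** YEARS
print(f"Capacity multiple by 2030: {growth_factor:.1f}x")  # ~4.2x

# The 100 MW facility vs. 100,000 homes comparison.
FACILITY_MW = 100       # typical large AI-dedicated facility, per the summary
AVG_HOME_KW = 1.0       # assumed average household draw

homes_equivalent = FACILITY_MW * 1_000 / AVG_HOME_KW
print(f"Homes powered by one {FACILITY_MW} MW facility: {homes_equivalent:,.0f}")  # ~100,000
```

A 33 percent compound rate works out to roughly a fourfold increase in AI-ready capacity over five years, which is what drives the power-demand concern.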
Read at The Register