The article discusses the substantial energy consumption associated with various AI models, noting that a single query to an 8B-parameter model such as Llama 3.1 uses around 57 joules, an estimate that roughly doubles to 114 joules once broader energy demands are accounted for. Larger models, such as GPT-4, likely consume far more energy per query, with a correspondingly greater environmental impact. AI video generation is singled out as especially energy-intensive, with a five-second video requiring about 3.4 million joules. These findings underscore the urgent need for transparency and efficiency in AI development.
Although its true size is a mystery, OpenAI's GPT-4 is estimated to have well over a trillion parameters, meaning its per-query energy footprint is likely far higher than that of the Llama queries tested.
AI video generation, on the other hand, is an energy sinkhole. To generate a five-second video at 16 frames per second, the CogVideoX AI video generation model consumes a whopping 3.4 million joules of energy.
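For a rough sense of scale, the sketch below converts the article's two headline figures into kilowatt-hours and compares them directly. The per-query and per-clip values are the estimates cited above, and the only other assumption is the standard conversion of 1 kWh = 3.6 million joules.

```python
# Back-of-the-envelope conversions of the energy figures cited in the article.
# Assumes ~114 J per Llama 3.1 8B text query (the doubled estimate) and
# ~3.4 MJ for one five-second, 16 fps CogVideoX clip.

JOULES_PER_KWH = 3.6e6  # 1 kilowatt-hour = 3.6 million joules

llama_query_joules = 114      # one 8B-parameter text query
video_clip_joules = 3.4e6     # one five-second generated video

# Express both figures in kilowatt-hours for easier comparison
print(f"Text query: {llama_query_joules / JOULES_PER_KWH:.6f} kWh")
print(f"Video clip: {video_clip_joules / JOULES_PER_KWH:.3f} kWh")

# How many text queries fit in the energy budget of a single video clip?
print(f"Queries per video: {video_clip_joules / llama_query_joules:,.0f}")
```

By this arithmetic, one five-second clip lands at roughly 0.94 kWh, the energy equivalent of nearly 30,000 of the text queries tested.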