AI consumption is a blind spot, but do organizations care?
Briefly

Without a proactive approach to more efficient AI hardware and energy use, particularly in the face of growing demand from AI workflows, we risk undermining the very progress AI promises to deliver. By 2027, I expect more than 90 percent of leaders to be concerned about the power demands of AI and to monitor consumption as a KPI that corporate boards track closely.
Corporate leaders overwhelmingly (70 percent) realize that training AI models takes an enormous amount of energy. Fewer respondents (60 percent) know that the same is true, though to a lesser extent, of AI inferencing. Yet only 13 percent of organizations that have started working with AI measure how much energy they consume.
Many organizations rely on an API pricing model to access LLMs from OpenAI, Anthropic or others. As a result, their main concern is cost. Those costs are currently still heavily subsidized: AI companies typically run large losses as they continue to spend on R&D while offering their products at competitive prices.
Organizations evidently do care about the energy costs involved. Especially as agentic AI is on the rise, 56 percent of respondents acknowledge the impact AI workloads have on energy consumption and are becoming more aware of the need to track it.
Read at Techzine Global