AI integration across devices and services is increasing computing and storage demand, driving up electricity consumption. Large AI models require massive datasets and heavy compute, making training more energy intensive than traditional computing tasks. The data centers that store and process AI workloads consume significant power and can strain local grids as investment and user adoption grow. AI's energy impact should be weighed against overall technology energy use, from office lights and laptops to social media. Efficiency improvements, per-query and per-task energy measurement, and operational choices can reduce AI's emissions, and both businesses and individuals can manage and mitigate their AI footprints.
AI feels inescapable. It's everywhere: your smartphone, Google, even your work tools. AI features promise to make life easier and more productive, but what does all this new tech mean for the environment? As AI investment and user adoption grow, so do the technology's energy costs. AI is built on high-compute systems and requires vast amounts of data, which must be stored and processed in large facilities known as data centers.
Just like your personal computer, those gigantic centers need electricity, as does the process of training an AI model, which relies on far more compute than traditional computing tasks.

Also: Google reveals how much energy a Gemini query uses - in industry first

But in the context of the energy we already use every day, from office lights and laptops to social media, how does that consumption actually compare?
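For a rough sense of scale, here is a back-of-envelope sketch of that comparison. The 0.24 Wh figure is Google's published median for a Gemini text prompt; the laptop and light-bulb wattages are typical assumed values, not figures from this article:

```python
# Back-of-envelope comparison: per-query AI energy vs. everyday devices.
# Assumptions (not from this article): ~0.24 Wh per prompt, Google's
# reported median for a Gemini text prompt; a 50 W laptop; a 10 W LED light.

WH_PER_PROMPT = 0.24   # Google's published median for a Gemini text prompt
LAPTOP_WATTS = 50      # assumed typical laptop power draw
LED_LIGHT_WATTS = 10   # assumed typical LED office light

def prompts_per_hour_of(device_watts: float) -> float:
    """Number of median prompts with the same energy as one hour of the device."""
    return device_watts / WH_PER_PROMPT

print(f"1 hour of laptop use ~= {prompts_per_hour_of(LAPTOP_WATTS):.0f} prompts")
print(f"1 hour of LED light  ~= {prompts_per_hour_of(LED_LIGHT_WATTS):.0f} prompts")
```

Under these assumptions, an hour of laptop use equals roughly 200 median prompts, and an hour of a single LED light roughly 40, which is why per-query measurement matters for putting AI's footprint in context.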