AI's drinking habit has been grossly overstated, according to a newly published report from Google, which claims that software advancements have cut Gemini's water consumption per prompt to roughly five drops of water - substantially less than earlier estimates of roughly 45-47.5 ml per page for medium-sized models. Touting a comprehensive new test methodology, Google estimates [PDF] its Gemini apps consume 0.24 watt-hours of electricity and 0.26 milliliters (ml) of water to generate a median-length text prompt.
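To put those per-prompt figures in context, here's a minimal back-of-envelope sketch in Python. The daily prompt volume is a made-up illustrative number, not anything Google has disclosed; only the per-prompt figures come from the report.

```python
# Back-of-envelope scaling of Google's per-prompt figures.
# The prompt volume below is a hypothetical illustration, not a disclosed number.

ENERGY_WH_PER_PROMPT = 0.24   # watt-hours per median-length prompt (Google's figure)
WATER_ML_PER_PROMPT = 0.26    # milliliters per median-length prompt (Google's figure)

prompts_per_day = 1_000_000_000  # assumed: one billion prompts/day, purely illustrative

energy_mwh = prompts_per_day * ENERGY_WH_PER_PROMPT / 1e6   # Wh -> MWh
water_m3 = prompts_per_day * WATER_ML_PER_PROMPT / 1e6      # ml -> cubic meters

print(f"Energy: {energy_mwh:,.0f} MWh/day")   # 240 MWh/day
print(f"Water:  {water_m3:,.0f} m^3/day")     # 260 m^3/day
```

Even at that assumed scale, the totals are modest next to municipal water use - which is the report's central point.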
The power-hungry facilities often employ cooling towers, which evaporate water that's pumped into them. This process chills the air entering the facility, keeping the CPUs and GPUs within from overheating. Evaporative cooling draws considerably less power than refrigerant-based chillers, which is why operators favor it. By Google's own estimates, about 80 percent of all water removed from watersheds near its datacenters is consumed by evaporative cooling for those datacenters.
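The physics behind that trade-off is easy to sketch. A rough estimate of evaporative water use, assuming all rejected heat leaves via the latent heat of vaporization (real towers also lose water to drift and blowdown, which this ignores):

```python
# Rough physics sketch: water evaporated per kWh of heat rejected.
# Assumes all heat leaves via evaporation; ignores drift and blowdown losses.

LATENT_HEAT_KJ_PER_KG = 2450.0   # latent heat of vaporization of water near ambient temp
KJ_PER_KWH = 3600.0              # 1 kWh = 3,600 kJ

def liters_evaporated_per_kwh(heat_kwh: float) -> float:
    """Liters of water evaporated to reject `heat_kwh` of heat (1 kg water ~ 1 L)."""
    return heat_kwh * KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG

print(f"{liters_evaporated_per_kwh(1.0):.2f} L per kWh of heat")  # ~1.47 L/kWh
```

In other words, every kilowatt-hour a server dissipates can carry off well over a liter of water - the heat has to go somewhere, and evaporation is the cheap way to move it.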
But water is also consumed in the process of generating the energy needed to keep all those servers humming along. Just like datacenters, gas, coal, and nuclear plants also employ cooling towers, which - spoiler alert - also use a lot of water. Because of this, datacenters that don't consume water directly can still have a major impact on the local watershed.
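How big is that indirect draw per prompt? A hedged sketch: published estimates of thermoelectric water consumption vary widely by plant type, cooling technology, and region, so the intensity used below is an illustrative placeholder, not a measured figure.

```python
# Hedged sketch of off-site (electricity-generation) water use per prompt.
# The water intensity is an illustrative placeholder; real values vary
# widely by generation mix, plant type, and cooling technology.

ENERGY_WH_PER_PROMPT = 0.24   # Google's per-prompt energy figure
ASSUMED_L_PER_KWH = 1.0       # assumed grid water-consumption intensity, illustrative

offsite_ml = ENERGY_WH_PER_PROMPT / 1000 * ASSUMED_L_PER_KWH * 1000  # Wh->kWh, L->ml
print(f"~{offsite_ml:.2f} ml of off-site water per prompt")  # ~0.24 ml at 1 L/kWh
```

At that assumed intensity, the off-site draw is roughly the same order as the 0.26 ml consumed on-site - which is why accounting that stops at the datacenter fence tells only half the story.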