The rapid expansion of artificial intelligence is driving a sharp increase in demand for data centers, and with it their water consumption, most of which goes to cooling the servers housed inside. As reported by the Financial Times (FT), Virginia, home to the world's largest concentration of data centers, saw its water use climb by nearly two-thirds between 2019 and 2023, from 1.13 billion gallons to 1.85 billion gallons.
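The "nearly two-thirds" figure follows directly from the reported numbers, as a quick calculation confirms:

```python
# Virginia data-center water use reported by the FT:
# 1.13 billion gallons (2019) -> 1.85 billion gallons (2023).
use_2019 = 1.13  # billion gallons
use_2023 = 1.85  # billion gallons

increase = (use_2023 - use_2019) / use_2019
print(f"Increase: {increase:.1%}")  # about 63.7%, i.e. nearly two-thirds
```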
Observers worldwide view this trend as unsustainable. Microsoft, which operates numerous data centers, reported that in 2023, 42% of its water use occurred in "areas with water stress." Similarly, Google, which runs one of the most extensive data center networks, disclosed this year that 15% of its freshwater came from regions facing "high water scarcity."
This raises an obvious question: Why don't data centers use a closed-loop system to reuse water? Many already do, but a significant share of their water goes to maintaining humidity levels and is lost to evaporation. The issue is particularly acute in arid regions, where insufficiently humidified air allows static electricity to build up, posing serious risks to electronic equipment.
Compiled by TechArena.au.


