
The Hidden Pollution of AI | Why “Clean Algorithms” Are a Myth

Artificial Intelligence dazzles us with its promise: faster insights, smarter automation, creative tools at our fingertips. But behind the scenes, it’s leaving a darker trail: massive energy use, water depletion, e-waste, and even public health costs. If we don’t confront AI’s environmental toll, we risk undermining one of the few tech revolutions that could help us tackle climate change. Here are the deeper reasons why AI is bad for the environment — with hard data, expert warnings, and some paths forward.


Gigantic Energy Consumption & Carbon Emissions

AI is a tool with enormous potential to help us tackle climate change, disease, scientific discovery, and more. But we must stop greenwashing it as “green by default.” There is real pollution, resource strain, and inequity baked into its infrastructure.

Training large AI models is not cheap in energy terms. In Harvard Business Review, Shaolei Ren and Adam Wierman report that training a single large AI model can consume thousands of megawatt-hours of electricity and emit hundreds of tons of CO₂. According to How Companies Can Mitigate AI’s Growing Environmental Footprint, some estimates suggest that running a large AI model over its lifetime emits more carbon than a standard car.

On the other hand, the MIT News piece Explained: Generative AI’s environmental impact highlights that generative models (like ChatGPT) require huge computational power — and that inference (serving user queries) also leads to continuous energy consumption.

In the U.S., an analysis of 2,132 data centers indicates they consumed over 4% of national electricity, produced more than 105 million tons of CO₂-equivalent, and had carbon intensities 48% higher than the U.S. average. As more AI models are trained, retrained, and used heavily, that power demand grows faster than many grids and renewable systems can keep up.

AI needs new norms: environmental impact audits, regulation, open standards, and engineering that values sustainability as much as performance.

Water Usage & Cooling Losses

It’s not just about electricity: AI infrastructure often leans heavily on water to cool down servers and keep them from melting.

The Uneven Distribution of AI’s Environmental Impacts paper notes that data centers rely on evaporating vast amounts of freshwater to manage heat. In Environmental Impact of Artificial Intelligence, some projections estimate that by 2027, AI operations could withdraw 4.2–6.6 billion cubic meters of water — more than half of the UK’s annual water usage.

A landmark study titled Holistically Evaluating the Environmental Impact of Creating Language Models found that beyond just training, the model development phase (data collection, architecture work, testing) contributed roughly 50% of the total carbon emissions and consumed millions of liters of water.

In drought-prone regions, that water demand can be catastrophic: local ecosystems and residents lose out. Cooling innovations — like liquid cooling instead of air — can help: research suggests liquid cooling may cut emissions by up to 50%, enable near-zero water usage, and shrink building footprints.

Google’s own claim that a typical AI text prompt uses just “5 drops of water” (0.26 ml) and 0.24 watt-hours sparked criticism: experts argue it understates broader system use, ignores indirect water use, and may mask scale effects.
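To see why critics worry about scale effects, here is a rough back-of-envelope sketch that multiplies Google’s per-prompt figures by a purely hypothetical query volume (one billion prompts per day is an assumption for illustration, not a reported number):

```python
# Back-of-envelope: tiny per-prompt figures still add up at scale.
PROMPTS_PER_DAY = 1_000_000_000  # hypothetical assumption, not a reported figure

WATER_ML_PER_PROMPT = 0.26   # Google's claimed "5 drops" per prompt
ENERGY_WH_PER_PROMPT = 0.24  # Google's claimed energy per prompt

daily_water_litres = PROMPTS_PER_DAY * WATER_ML_PER_PROMPT / 1000   # 260,000 L/day
daily_energy_mwh = PROMPTS_PER_DAY * ENERGY_WH_PER_PROMPT / 1_000_000  # 240 MWh/day

print(f"Water per day:   {daily_water_litres:,.0f} litres")
print(f"Energy per day:  {daily_energy_mwh:,.0f} MWh")
print(f"Energy per year: {daily_energy_mwh * 365 / 1000:,.1f} GWh")
```

Even under the company’s own optimistic per-query numbers, that hypothetical volume implies hundreds of thousands of litres of water and hundreds of megawatt-hours every day — and these figures exclude the indirect and upstream use that critics say the claim leaves out.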

The Pollution of AI — deep reportage by Luxiders Magazine
© Pawel Czerwinski via Unsplash

E-waste and Other Hidden Costs of AI

AI demands more than compute cycles

It demands cutting-edge hardware, and that creates upstream environmental costs: GPUs (graphics processing units), ASICs, and specialized chips are built with rare metals, heavy mining, and toxic chemical processes. As models evolve, companies frequently upgrade hardware, discarding older equipment and adding to global electronic waste (e-waste). Artificial Intelligence and the Environment from the University of Delaware calls out e-waste as a serious impact: components contain mercury, lead, and cadmium — hazardous to soil and water. The environmental burden is not limited to runtime; it’s baked in from mining to disposal.

 

AI’s footprint isn’t evenly felt

Some communities suffer more, especially those already under stress. A study titled “The Unpaid Toll: Quantifying the Public Health Impact of AI” argues that emissions from AI (across manufacturing, operation, and disposal) degrade air quality via fine particulate matter, leading to health burdens. The authors estimate that U.S. data centers could impose more than $20 billion per year in public health costs by 2030.

Reuters reported that between 2020 and 2023, indirect emissions of major tech firms rose 150%, driven largely by energy-hungry AI data centers.

The Smithsonian Magazine warns that with more AI usage, we’ll see more sprawling data centers, more fossil-fuel-based grids, and increased water use and emissions.

Of course, environmental harm and health costs are often “externalities”: costs not paid by AI companies directly, but borne by local residents, often in lower-income or marginalized areas.

 

Rebound Effect & Jevons Paradox

Improving efficiency doesn’t always help the environment if usage skyrockets. The Jevons Paradox holds that as a technology becomes more efficient (cheaper per use), its usage often expands enough to cancel out the gains. AP News notes that a single AI search can consume 23 times more energy than a typical Google search, and as AI becomes more embedded, these energy costs amplify.

Efficiency alone won’t solve this. If AI becomes more pervasive, aggregate environmental damage may still rise.

“A single AI search can consume 23 times more energy than a typical Google search, and as AI becomes more embedded, these energy costs amplify”

© Nahrizul Kadri via Unsplash

Don’t say it’s sustainable if you cannot prove it

AI isn’t inherently evil, but we must adopt rigorous guardrails, transparency, and sustainable designs. Some suggested pathways:

The UK’s National Engineering Policy Centre has called for mandatory reporting of energy and water use in data centers. Companies should embrace green architectures and “green AI” principles, designing models to be smaller and more efficient, and validating them against environmental cost limits. And developers should use techniques that share or reuse computations instead of retraining from scratch.

Leverage ideas like SHIELD, a framework that co-optimizes carbon, water, and energy across geographically distributed data centers, which shows reductions of up to 3.7× in carbon and 1.8× in water usage versus the status quo.

We should shift from air cooling to liquid cooling, which can drastically reduce emissions and water needs. We should also integrate waste-heat recapture (for heating buildings, universities, etc.), as in the Aquasar project.

We should choose data center locations where renewable energy is abundant and grid infrastructure is clean, and also encourage direct use of solar, wind, hydro, or even nuclear power. Upgrading grid resilience is also a must — AI demand spikes often stress grids, pushing reliance onto fossil-fuel backups. Finally, we should track not just training but also hardware manufacturing, transportation, disposal, and public health impacts.

Highlight Image:
© Unsplash
