The Hidden Environmental Cost of Training Large Language Models

Wired · By Will Knight · February 10, 2026
Summary

New research quantifies the water and energy consumption of frontier AI model training runs.

A comprehensive study published this week attempts to quantify the full environmental footprint of training frontier large language models. It accounts for factors often overlooked in previous analyses, including water consumption for data center cooling, the embodied carbon of specialized hardware, and the energy cost of extensive evaluation and red-teaming.

The research estimates that training a single frontier model consumes approximately 25 million liters of water and produces carbon emissions equivalent to about 300 transatlantic flights. The authors note, however, that these costs should be weighed against the potential efficiency gains AI applications may enable across industries.

