Environmental Impact of Large Language Models: Green or Polluting?

[Featured image: Wind turbines on a mountain horizon with a glowing digital network overlay, representing the integration of renewable energy and smart technology, in blue tones at sunset.]

Large language models have low per-use emissions but high training and hardware production costs. Though each interaction is energy-efficient, large-scale usage and upstream impacts raise environmental concerns. Sustainable AI requires lifecycle assessments, transparency, and cleaner energy sources.

Introduction

In recent years, large language models (LLMs) such as ChatGPT, Copilot, Claude, and Gemini have seen explosive growth in adoption across industries and everyday life. From customer service chatbots and content generation tools to coding assistants and educational platforms, LLMs are rapidly becoming embedded in digital infrastructure. This widespread integration has brought unprecedented convenience and productivity — but also raised growing concerns about the environmental costs associated with developing and operating these powerful AI systems. While many assume that LLMs are inherently carbon-intensive, particularly given the massive computational power they require, recent research suggests a more nuanced picture. In particular, the emissions generated per interaction may be far lower than commonly feared, prompting a closer examination of where the true environmental impact lies.

CO₂ emissions from training and using LLMs

The environmental footprint of large language models can be broadly divided into two categories: training emissions and usage emissions. Training emissions stem from the one-time but highly energy-intensive process of developing the model, which involves running vast computations over weeks or even months on powerful hardware in large-scale data centers. For example, training OpenAI’s GPT-3 consumed an estimated 1,287 MWh of electricity and produced approximately 552 metric tons of CO₂ equivalent (tCO₂e), while projections for GPT-4 suggest emissions as high as 21,660 tCO₂e. In contrast, usage emissions, the energy required to generate a response to a prompt once the model is deployed, are significantly lower and are typically measured in grams of CO₂ per prompt. Research by Tomlinson et al. reports approximately 1.9 gCO₂e per generated image [1], even though image generation is among the most resource-intensive types of prompt. Even conservative estimates place usage emissions at around 4.3 gCO₂e per interaction [2], comparable to or lower than many everyday digital activities. While training emissions are front-loaded and substantial, usage emissions are lightweight and accumulate gradually with scale. Understanding this distinction is key to designing both sustainable infrastructure and informed policy for AI deployment.
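As a back-of-the-envelope check, the two GPT-3 training figures cited above imply an average carbon intensity for the electricity used. The sketch below derives it; both input numbers come from the text, and the derived intensity is an illustration, not a figure from the cited sources.

```python
# Sanity check of the GPT-3 training figures cited in the text.
training_energy_mwh = 1_287   # estimated electricity for GPT-3 training (MWh)
training_emissions_t = 552    # resulting emissions (metric tons CO2e)

# Implied average carbon intensity of the electricity used, in gCO2e per kWh:
# convert tons -> grams and MWh -> kWh, then divide.
intensity_g_per_kwh = (training_emissions_t * 1_000_000) / (training_energy_mwh * 1_000)
print(f"Implied grid intensity: {intensity_g_per_kwh:.0f} gCO2e/kWh")
```

The result, roughly 429 gCO₂e/kWh, is in the range of a fossil-heavy grid mix, which illustrates why the energy source of the data center matters as much as the energy quantity.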

Comparing to tangible products

Even adopting the higher estimate (4.3 gCO₂e), the environmental impact per prompt remains minimal compared to daily consumer activities. For instance, importing asparagus from Peru or Mexico by air freight produces roughly 14 kgCO₂e per kilogram [3]. Since one kilogram of asparagus provides about seven servings, each serving accounts for around 2 kgCO₂e, so one would need to generate roughly 465 prompts to match the carbon footprint of a single serving. Everyday transportation tells a similar story: a car journey from Bern to Biel (41 km) emits around 11 kgCO₂e [3], corresponding to about 2,560 prompts, while the same journey by train emits roughly 0.6 kgCO₂e [3], equivalent to about 140 prompts.

Discussion

While the per-prompt emissions of LLMs appear low, especially when compared to daily activities, a more comprehensive assessment requires attention to large-scale and systemic effects. As the usage of AI-powered tools scales rapidly, even modest emissions per interaction can aggregate into a significant environmental footprint. With billions of prompts processed each day globally, the cumulative impact could rival that of traditional industries.

Moreover, the environmental cost extends beyond energy consumption. The manufacturing of AI-specific hardware, such as high-performance GPUs, involves the extraction of rare earth metals, high water usage, and complex global supply chains. These upstream impacts are often overlooked in standard CO₂ assessments but contribute substantially to the overall ecological footprint of AI technologies.

Another key consideration is the energy source used for model training and deployment. While some AI providers utilize renewable energy, many data centers continue to rely on fossil fuels, particularly in regions where clean energy infrastructure is limited. This variation introduces significant disparities in the true environmental cost of AI depending on geographic and economic context.

Finally, there is a growing concern about “greenwashing” in the AI sector. Companies may highlight low operational emissions or isolated sustainability efforts while neglecting transparency about total lifecycle impacts. For genuinely sustainable AI development, comprehensive lifecycle assessments and standardized environmental reporting are essential.

Conclusion

In summary, while AI—particularly large language models (LLMs)—is often perceived as environmentally detrimental, a closer examination reveals a more complex and balanced reality. The emissions generated per interaction are relatively low, especially when compared to common activities such as food consumption. However, the substantial energy required for training, the environmental cost of hardware production, and the scaling of global AI use raise legitimate concerns. Encouraging developments like DeepSeek’s energy-efficient architecture and BLOOM AI’s use of low-carbon energy sources show that the environmental footprint of LLMs can be significantly reduced through thoughtful design and infrastructure choices. Still, ensuring the sustainable future of AI technologies requires more than incremental improvements. Future research must focus on comprehensive lifecycle assessments, standardized carbon accounting methods, and energy-efficient training techniques. It should also consider behavioral and rebound effects from mass adoption and the development of policy frameworks that support transparency, accountability, and environmental responsibility. Only through such a holistic, multi-dimensional approach can AI align with long-term climate goals while continuing to drive innovation.


Bibliography

[1] B. Tomlinson, R. W. Black, D. J. Patterson, and A. W. Torrance, ‘The carbon emissions of writing and illustrating are lower for AI than for humans’, Sci. Rep., vol. 14, no. 1, p. 3732, Feb. 2024, doi: 10.1038/s41598-024-54271-x.

[2] V. Wong, ‘Gen AI’s Environmental Ledger: A Closer Look at the Carbon Footprint of ChatGPT’, Piktochart. Accessed: Mar. 26, 2025. [Online]. Available: https://piktochart.com/blog/carbon-footprint-of-chatgpt/

[3] N. Affolter, L. Hänni, and C. Klopfenstein, ‘Know thy impact: Developing a Comprehensive Digital Twin for Estimating Environmental Impact of Individual Behaviour’, Bachelor’s Thesis, Bern University of Applied Sciences, 2024.

[4] D. Patterson et al., ‘Carbon Emissions and Large Neural Network Training’.

[5] Y. Yu et al., ‘Revisit the environmental impact of artificial intelligence: the overlooked carbon emission source?’, Front. Environ. Sci. Eng., vol. 18, no. 12, p. 158, Oct. 2024, doi: 10.1007/s11783-024-1918-y.


Creative Commons Licence

AUTHOR: Stefan Grösser

Stefan Grösser is Professor of Decision Sciences and Policy and heads the Management Science, Innovation and Sustainability research group at BFH Technology & Informatics. He lectures in the Master of Engineering (MSE) program and works on several research projects in the fields of simulation methodology (system dynamics, agent-based modeling, machine learning), decision-making using artificial intelligence (decision-making and management science), and circular economy (circular economy, circular business models). His industries of focus are the solar, energy, and healthcare sectors. He also contributes to modern learning technologies.

AUTHOR: Luis Felipe Olivares Pfeifer
