AI: Friend or Foe to the Environment?
By Hugues Foltz
Lately, there’s been growing discussion about the environmental impact of artificial intelligence (AI) — and rightly so.
The meteoric rise of generative AI — think ChatGPT, Midjourney, or Gemini — has sparked legitimate questions about the massive amounts of energy needed to power these digital marvels. But let’s not lump everything together: there’s AI… and then there’s AI.
In this article, we’ll talk about energy, productivity, greenhouse gases (GHG), and, most importantly, shed light on the real issue: is AI harming our planet, or could it actually help protect it?
The Elephant in the Room: Generative AI and Its Energy Appetite
Let’s start with the hot topic: generative AI. Since 2022, these models have exploded in popularity, transforming many aspects of our daily lives. But behind their convenience and wonder lies a more down-to-earth — and energy-hungry — reality.
Large language models (LLMs) like GPT-4 or Claude require massive infrastructure to operate. Training them can take weeks on thousands of graphics processing units (GPUs), consuming megawatt-hours of electricity. According to some estimates, training GPT-3 required about 1,287 MWh of power — roughly equivalent to the annual electricity consumption of 120 U.S. households.
And that’s just the training phase. Every time you type a prompt and get a response, power-hungry servers go to work in data centers that must be cooled constantly. One study estimated that ChatGPT consumes about three bottles of water to generate just 100 words.
So yes, generative AI is resource-hungry — but it’s far from representative of all AI.
Traditional AI: Lean, Efficient, and Already Everywhere
When people think of AI, they often picture talking robots or text that writes itself. Yet much of the AI that has powered businesses for years is far more discreet — and far more efficient.
Traditional AI models such as decision trees, linear regressions, or simpler neural networks are used for:
- Stock demand forecasting
- Truck route optimization
- Fraud detection
- Customer data analysis
These algorithms are typically lightweight and low-energy, often running on a laptop or a small server without the need for giant GPU clusters or massive data centers. Their carbon footprint is minimal compared to next-generation generative models.
In other words: traditional AI quietly does most of the heavy lifting — without overheating the planet.
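As a rough illustration of how lightweight these models are, here is a minimal sketch of the demand-forecasting case, using hypothetical sales data and a plain least-squares fit. It runs in milliseconds on an ordinary laptop — no GPU cluster involved.

```python
import numpy as np

# Hypothetical example: forecast next week's stock demand from 12 weeks of
# past sales using ordinary least squares -- a "traditional AI" model.
weeks = np.arange(12)                        # 12 weeks of history
sales = 100 + 5 * weeks + np.array(
    [3, -2, 4, -1, 0, 2, -3, 1, -2, 3, 0, -1], dtype=float
)                                            # upward trend plus small noise

# Fit a straight line through the history and extrapolate one week ahead.
slope, intercept = np.polyfit(weeks, sales, deg=1)
forecast = slope * 12 + intercept            # predicted demand for week 13
print(round(forecast, 1))                    # close to 160 units
```

A model like this is obviously simplistic, but it captures the point: many business predictions need only a trend line or a small tree model, not a billion-parameter network.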
The Debate: Energy Costs vs. Net Benefits
The real question isn’t whether AI consumes energy — it’s whether that consumption is worth the benefits it delivers.
Nobody denies that generative AI, especially when used excessively or for trivial purposes (like generating endless unicorn-cat images), can have a heavy footprint. Some experts even warn that AI could soon become a major contributor to global electricity use — on par with aviation or the digital sector as a whole.
There’s also the issue of raw materials for chip manufacturing (such as GPUs) and the significant water usage required for data center cooling.
But the story doesn’t end there. When used responsibly, AI can dramatically reduce energy use in other sectors:
- Optimizing renewable energy production through better weather prediction
- Managing electrical grids to reduce losses
- Minimizing transport routes and fuel consumption
- Helping farmers optimize fertilizer use and predict crop disease
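To make the route-minimization case concrete, here is a hedged sketch — hypothetical delivery coordinates, and a simple nearest-neighbor heuristic standing in for the far more sophisticated solvers used in practice. Even this naive pass shortens ad-hoc routes, and it runs instantly on ordinary hardware.

```python
import math

def order_stops(stops, depot=(0.0, 0.0)):
    """Order delivery stops with a nearest-neighbor heuristic:
    from the current position, always drive to the closest unvisited stop."""
    remaining = list(stops)
    path, here = [], depot
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(here, p))
        remaining.remove(nearest)
        path.append(nearest)
        here = nearest
    return path

# Hypothetical stop coordinates (e.g., kilometers from the warehouse).
stops = [(4, 4), (1, 0), (2, 2), (5, 1)]
print(order_stops(stops))  # [(1, 0), (2, 2), (4, 4), (5, 1)]
```

Production route optimizers layer traffic data, time windows, and vehicle constraints on top, but the energy logic is the same: fewer kilometers driven means less fuel burned.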
And across industries, AI — of all types — drives huge productivity gains: writing reports, automating workflows, enhancing customer service, and analyzing complex data in seconds.
In short, we must weigh the emissions we generate against the ones we prevent.
Striking the Right Balance: Toward Responsible AI
The takeaway is clear: we shouldn’t demonize AI as a whole. Yes, some generative models are energy-hungry, but AI itself is just a tool — and like any tool, its impact depends on how we use it.
Here are a few ways to move toward more sustainable AI:
1. Optimize models: Many companies are developing smaller, more efficient generative systems that deliver comparable results with far fewer resources.
2. Use traditional AI when possible: It’s often sufficient for 80% of analytical or predictive tasks.
3. Support green data centers: Powered by renewable energy, efficiently cooled, and strategically located.
4. Measure avoided emissions: Include GHG savings when assessing AI projects.
A Debate Worth Continuing — With Nuance
AI won’t save the planet on its own, but it can certainly help. And it won’t destroy it either — unless we use it recklessly.
The debate on AI’s environmental impact deserves nuance and evidence-based thinking. Rather than taking extreme positions, we should pursue a responsible approach where benefits outweigh costs.
Because in the end, AI doesn’t make the choices — we do.