Green AI: Can intelligence survive its own energy hunger?

The rise of generative artificial intelligence has sparked a global wave of innovation and optimism, but also a growing sense of concern. As language models, image generators, and other intelligent systems become increasingly integrated into everyday tools and workflows, a fundamental question arises: how much energy does AI actually consume, and is it sustainable?

Once confined to research labs and supercomputers, high-performance computing is now mainstream. Millions of people interact daily with models like GPT. Businesses deploy AI to automate, generate, and optimize their workflows. And all of this requires one thing in growing amounts: power.

But what is the true scale of this energy demand? Can AI align with global decarbonization targets? And what’s the trade-off between AI’s benefits and its environmental footprint?

These questions are no longer abstract; energy and infrastructure providers are already feeling the pressure, as AI reshapes how electricity is consumed, delivered, and even priced.

Data centers: AI’s industrial engine room

Every AI prompt, image, translation, or code snippet is processed within a data center: a highly engineered environment that houses servers, storage arrays, high-speed networks, and sophisticated cooling systems.

These digital factories are where inference occurs (when AI models respond to user requests) and, occasionally, where training takes place (the more computationally intensive process of building new models).

A typical data center’s power consumption is split into three main components:

  • IT equipment (40–50%) – servers, GPUs/TPUs, networking, storage
  • Cooling and thermal management (30–40%) – fans, HVAC, liquid cooling systems
  • Auxiliary systems (10–30%) – lighting, security, power conversion, backup systems

Source: Artificial Intelligence’s Energy Paradox: Balancing Challenges and Opportunities, World Economic Forum and Accenture, January 2025
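To make these shares concrete, the sketch below converts them into facility-level figures, including power usage effectiveness (PUE): the standard ratio of total facility power to IT power. The 100 MW campus and the midpoint shares are illustrative assumptions, not measured values.

```python
# Back-of-envelope split of a data center's power draw, using illustrative
# midpoints of the component shares listed above (not measured values).

def facility_breakdown(total_mw: float,
                       it_share: float = 0.45,
                       cooling_share: float = 0.35,
                       auxiliary_share: float = 0.20) -> dict:
    """Split total facility power into the three main components."""
    assert abs(it_share + cooling_share + auxiliary_share - 1.0) < 1e-9
    return {
        "IT equipment (MW)": total_mw * it_share,
        "Cooling (MW)": total_mw * cooling_share,
        "Auxiliary (MW)": total_mw * auxiliary_share,
        # PUE = total facility power / IT power; 1.0 would be a perfect facility.
        "PUE": 1.0 / it_share,
    }

# Example: a hypothetical 100 MW hyperscale campus with a mid-range split.
for key, value in facility_breakdown(100.0).items():
    print(f"{key}: {value:.2f}")
```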

Generative AI is intensifying these loads. Compared to traditional cloud applications, AI workloads are denser, more continuous, and more unpredictable, making them harder to scale efficiently within existing energy systems.

How much energy do data centers actually use?

In 2022, according to the International Energy Agency (IEA), data centers consumed between 240 and 340 terawatt-hours (TWh), roughly 1–1.3% of global electricity demand. When factoring in cryptocurrency mining and data transmission networks, that figure reaches around 2%.

As AI adoption accelerates, this share is expected to grow. The IEA projects that:

  • Data center electricity use could double by 2030
  • AI-specialized data center capacity is expanding 30% annually, compared to 9% for traditional infrastructure
  • AI-specific server deployments could drive a 165% increase in power demand by 2030, according to Goldman Sachs
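Growth rates like these compound quickly. As a minimal sketch, here is what the quoted rates imply if sustained (the six-year horizon, roughly 2024 to 2030, is an assumption for illustration):

```python
# What the growth rates quoted above imply if sustained: capacity scales by
# (1 + rate) ** years. Pure compounding arithmetic, not an independent forecast.

def capacity_multiplier(rate: float, years: int) -> float:
    """Multiplier on today's capacity after `years` of growth at `rate`."""
    return (1.0 + rate) ** years

YEARS = 6  # roughly 2024 -> 2030, an illustrative horizon
print(f"AI-specialized capacity: x{capacity_multiplier(0.30, YEARS):.1f}")  # ~x4.8
print(f"Traditional capacity:    x{capacity_multiplier(0.09, YEARS):.1f}")  # ~x1.7
```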

And yet, today, AI accounts for only a fraction of total data center energy use, estimated to be between 10% and 15%, depending on the workload and region.

The real issue isn’t just volume, but concentration and growth rate. AI-driven loads are highly power-dense, harder to predict, and place substantial strain on local grids. In some U.S. states, utility companies have begun rationing power or delaying new service connections to hyperscale data centers.

Beyond speculation: How do we measure AI’s energy footprint?

Measuring the energy consumption of AI systems is surprisingly difficult, for both technical and methodological reasons.

Two main approaches have emerged:

1. The Top-Down Approach: Market-Based Estimations

This method begins with the hardware (typically GPUs) and scales up energy estimates based on the number of servers expected to be deployed for a given task.

One of the most cited examples comes from Alex de Vries, who in 2023 estimated the energy required if Google were to integrate generative AI into its search engine.

Using NVIDIA A100 servers as a reference, and based on third-party projections that this would require 400,000–500,000 servers, de Vries calculated a total energy footprint of 23–29 TWh per year, or about 7–9 watt-hours per search: 23–30 times higher than a conventional Google search, based on figures from a 2009 Google report.
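The arithmetic behind this estimate is straightforward to reconstruct. In the sketch below, the ~6.5 kW draw per A100 server and the ~9 billion searches per day are assumptions added here for illustration; they are consistent with the published range but should not be read as de Vries' exact inputs.

```python
# Reconstructing the top-down estimate from its stated assumptions.
# The per-server power (~6.5 kW) and search volume (~9 billion/day) are
# illustrative assumptions, not measured data.

HOURS_PER_YEAR = 24 * 365
SEARCHES_PER_DAY = 9e9

def fleet_twh(servers: int, kw_per_server: float = 6.5) -> float:
    """Annual consumption in TWh, assuming constant draw across the fleet."""
    return servers * kw_per_server * HOURS_PER_YEAR / 1e9

for servers in (400_000, 500_000):
    twh = fleet_twh(servers)
    wh_per_search = twh * 1e12 / (SEARCHES_PER_DAY * 365)
    print(f"{servers:,} servers -> {twh:.1f} TWh/yr, ~{wh_per_search:.1f} Wh/search")
```

Plugging in the low and high server counts roughly reproduces both ends of the 23–29 TWh range.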

Even de Vries acknowledged that these figures were speculative: the assumptions are not easily verifiable, the server count could be outdated within months, and newer AI models are achieving similar accuracy at a fraction of the computational cost.

Still, top-down methods offer one key advantage: they help flag macro risks early, especially when infrastructure is planned years in advance.

2. The Bottom-Up Approach: Measured Energy per Task

In this approach, researchers run AI models (often open-source) in real environments, measure the actual energy usage per inference or training cycle, and extrapolate from there.

The results vary significantly:

  • Generating an image (e.g., via Stable Diffusion): ~0.5 Wh
  • Generating text (e.g., via GPT-like models): slightly less
  • Training a GPT-3 scale model: on the order of 1 GWh

However, inference (every time someone asks the AI a question and gets an answer) dominates the long-term energy footprint, not training (the teaching phase of an AI model). If models are queried billions of times daily (inference phase), even low per-request consumption leads to terawatt-hour-scale annual loads.
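For readers who want to try the bottom-up method themselves, below is a minimal sketch that reads NVML's cumulative energy counter before and after a batch of requests. It assumes an NVIDIA GPU (Volta or newer) with the nvidia-ml-py bindings installed; `run_inference` is a hypothetical stand-in for an actual model call.

```python
# Bottom-up measurement sketch: derive Wh per request from the GPU's
# cumulative energy counter. Requires an NVIDIA GPU (Volta or newer)
# and the nvidia-ml-py package; `run_inference` is a hypothetical stand-in.
import pynvml

def measure_wh_per_request(run_inference, n_requests: int) -> float:
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # NVML reports cumulative energy in millijoules since driver load.
    start_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    for _ in range(n_requests):
        run_inference()
    end_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    pynvml.nvmlShutdown()
    joules = (end_mj - start_mj) / 1000.0
    return joules / 3600.0 / n_requests  # J -> Wh, averaged per request
```

Note that this captures GPU energy only: CPU, networking, and cooling overheads are excluded, which is one reason bottom-up figures tend to come in lower than facility-level estimates.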

Recent benchmarks have added much-needed nuance. For example:

  • A 2024 study estimates that GPT-scale inference consumes ~0.34 Wh per query, with a range of 0.18–0.67 Wh depending on task complexity.
  • Another study finds that simple prompts consume <0.5 Wh, while long or multi-turn prompts can reach 4–5 Wh.

Notably, some public estimates may overstate AI energy use by 4× to 20×, because they ignore server optimization, caching, or shared inference loads.
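Scaling these per-query figures up shows why the range matters. A quick sketch, assuming a hypothetical one billion queries per day (the volume is an illustrative assumption, not a published figure):

```python
# Sensitivity of the annual total to the per-query figure, assuming a
# hypothetical 1 billion queries per day (an illustrative volume).
QUERIES_PER_DAY = 1e9

for wh_per_query in (0.18, 0.34, 0.67, 4.0):
    twh_per_year = wh_per_query * QUERIES_PER_DAY * 365 / 1e12
    print(f"{wh_per_query:>4.2f} Wh/query -> {twh_per_year:.2f} TWh/yr")
```

Even at the low end, a single service adds up to a meaningful fraction of a terawatt-hour per year; at the multi-watt-hour end, a handful of popular services can reach grid-scale loads.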

So, do we have an answer?

Sasha Luccioni and other researchers launched the AI Energy Score project, a public initiative that compares the energy efficiency of AI models across different tasks and gives each model a star rating.

Their measurements confirm that consumption varies significantly with the type of task: creating an image from a text prompt typically uses around 0.5 watt-hours of electricity, whereas text generation tends to consume slightly less.

So… Is AI Sustainable or Not?

The short answer: it depends.

AI, especially generative models, is undeniably energy-intensive, particularly at scale. But it also has the potential to become a key enabler of energy efficiency across sectors.

Ways AI can reduce energy consumption system-wide:

  • Smart grids: Optimize load balancing, predict outages, and enable better renewable integration
  • Predictive maintenance: In factories, AI reduces downtime and waste by forecasting failures
  • Building automation: HVAC systems powered by AI adapt in real time to occupancy and preferences
  • Transport & logistics: Route optimization, fuel savings, reduced empty miles
  • EV charging management: Align charging with grid capacity and dynamic energy pricing
  • Energy forecasting: Improve the accuracy of solar/wind predictions, aiding better dispatch planning

In other words, AI is both a consumer and a controller of energy. The challenge is ensuring that its value exceeds its cost — not only in business terms, but in ecological and infrastructural terms.

Paths Toward a More Sustainable AI Ecosystem

Here are some key levers to align AI innovation with long-term energy sustainability:

1. Hardware innovation

  • Energy-efficient chips (e.g., neuromorphic, optical, custom ASICs)
  • Hardware-aware model design
  • Smarter cooling systems (immersion, phase change, etc.)

2. Green data center design

  • Modular and liquid-cooled architectures
  • On-site renewable energy and energy storage
  • Dynamic load balancing and demand response

3. Model efficiency

  • Use of quantization, pruning, and distillation (see the sketch after this list)
  • Smaller, specialized models (e.g., fine-tuned LLMs)
  • Active reduction of hallucinations and retries
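As an example of the first lever, here is a minimal post-training dynamic quantization sketch in PyTorch, which stores the weights of Linear layers as int8 and can cut inference cost. The tiny two-layer model is a hypothetical stand-in for a real network, and production use would require accuracy validation.

```python
# Minimal dynamic quantization sketch: Linear weights stored as int8.
# The toy model is a stand-in; validate accuracy before production use.
import torch
import torch.nn as nn

model = nn.Sequential(  # hypothetical stand-in for a real transformer block
    nn.Linear(768, 3072), nn.ReLU(), nn.Linear(3072, 768),
)

quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 768)
print(quantized(x).shape)  # same interface, smaller and cheaper weights
```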

4. Energy benchmarking

  • Initiatives like the AI Energy Score and ML.ENERGY are critical for establishing transparent metrics
  • Policymakers may push for energy labeling or emissions disclosure for foundation models

5. Cross-sector collaboration

  • AI developers, cloud providers, utilities, and regulators must co-design scalable infrastructure
  • Avoid local grid overloads or inefficient energy routing

6. Geographic strategy

  • Align data center location with renewable generation and cooling potential
  • Incentivize the use of low-carbon grids, or on-site nuclear and geothermal energy

Final Thoughts: Responsibility, Not Just Optimization

Generative AI is not just a breakthrough technology; it’s a paradigm shift in how information, computation, and intelligence are distributed.

But with great power comes great… electricity bills.

If we want AI to become not just smart, but responsible, energy must become a first-class design principle, not an afterthought. The future of AI is not just about larger models or faster chips. It’s about building systems that are aligned with human, environmental, and infrastructural limits.
