Artificial intelligence is changing every aspect of life today, from searching the internet and managing household services to running business operations. As these rapid advances take hold, an important question emerges: how much energy does AI actually use, and what does this mean for our future? The answer is both astonishing and complicated: the convenience afforded by AI-powered devices and services comes with a hidden environmental cost that most users are unaware of.
The Scope of AI Energy Use
Let’s talk numbers, and fair warning: the numbers can be hard to comprehend. When researchers trained GPT-3, that one training run consumed about 1,287 megawatt-hours of electricity. To put that into perspective, that is enough electricity to run 120 typical American homes for a year. And this is just the training of GPT-3; we will also look at the additional energy consumed when people actually use AI and by the infrastructure that serves it.
Once these AI systems are made available to the public, they become a constant energy drain: every question we ask requires energy. Take ChatGPT as an example. With hundreds of millions of users globally, the backend system that keeps ChatGPT online day and night, week after week, consumes a substantial amount of power. Each question you ask ChatGPT uses approximately 0.3 watt-hours of energy, around ten times more than a typical Google search. When you tally the billions of questions asked on AI platforms every day, you get a sense of the massive energy demand of AI.
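To make the scale concrete, here is a rough back-of-envelope sketch. The one-billion-queries-per-day volume is an illustrative assumption for the exercise, not a measured figure; only the ~0.3 Wh per query and ~1,287 MWh training numbers come from this article.

```python
# Back-of-envelope estimate of daily AI inference energy, using the
# article's ~0.3 Wh/query figure and an ASSUMED global query volume.

WH_PER_QUERY = 0.3        # approximate energy per ChatGPT query (article figure)
QUERIES_PER_DAY = 1e9     # illustrative assumption: one billion queries/day

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh

# Compare against the one-time ~1,287 MWh GPT-3 training estimate.
TRAINING_MWH = 1287
days_to_match = TRAINING_MWH / daily_mwh

print(f"Daily inference energy: {daily_mwh:.0f} MWh")
print(f"Days of queries to equal one GPT-3 training run: {days_to_match:.2f}")
```

Under these assumptions, only a few days of global queries rival the energy of the entire training run, which is why inference is expected to dominate over time.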
Examining Where The Energy Goes
To get an accurate picture of AI's energy requirements, you have to consider the full system lifecycle, from the moment creators begin training a model to the point where users interact with it.
| AI Stage | Energy Intensity | Carbon Footprint | Duration |
|---|---|---|---|
| Training | Extremely High | 500+ tons CO2 per large model | Weeks to months |
| Inference (Usage) | Moderate per query | Accumulates with billions of queries | Continuous |
| Data Centers | Very High | Massive ongoing consumption | 24/7/365 |
| Cooling Systems | High | 40% of data center energy | Continuous |
| Model Updates | High | Regular retraining cycles | Periodic |
Training vs. Inference: Two Different Energy Problems
At this point, things get very interesting. Training is extraordinarily energy-intensive: when AI models train on massive datasets, they run thousands of powerful GPUs continuously for days, weeks, or months on end. Training GPT-3, for example, generated carbon emissions equal to the lifetime emissions of five cars, including the emissions from manufacturing those cars. But here is what seems lost on a lot of people: the continual use of these models, which we call inference, is becoming the bigger problem. Sure, each query may not consume much energy, but when billions of people ask AI questions every single day, those small increments quickly add up to a significant amount of energy. Some experts believe that, in time, this continual use will consume more energy than the training ever did.
Data Centres: The Invisible Energy Monsters
Do you ever wonder where all of this AI actually resides? In data centres: massive complexes located around the world. Currently, these data centres use somewhere between 1-2% of the world's total electricity. That number is projected to rise to 3-4% by 2030, driven by the growing popularity of AI. A single data centre can use as much energy as a small city, drawing between 20 and 50 megawatts of continuous power.
And here's something that most people do not appreciate: cooling the data centres is another massive consumer of electricity. Powerful AI chips run extremely hot, so approximately 30-40% of a data centre's total power goes to cooling. And if the data centre is located in a warm climate, the cooling bill is even greater.
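A minimal sketch of what those figures imply over a year, taking mid-points of the ranges above (a hypothetical 30 MW facility with a 35% cooling share; both are illustrative assumptions, not data about any real site):

```python
# Annual energy of a HYPOTHETICAL data centre, using mid-range figures
# quoted in this article; all inputs are illustrative assumptions.

continuous_mw = 30          # mid-point of the 20-50 MW range
hours_per_year = 24 * 365
annual_mwh = continuous_mw * hours_per_year

cooling_share = 0.35        # mid-point of the 30-40% cooling estimate
cooling_mwh = annual_mwh * cooling_share

# ~10.7 MWh per home-year, implied by the 1,287 MWh / 120 homes comparison
homes_equivalent = annual_mwh / 10.7

print(f"Annual consumption:  {annual_mwh:,} MWh")
print(f"Of which cooling:    {cooling_mwh:,.0f} MWh")
print(f"Equivalent US homes: {homes_equivalent:,.0f}")
```

Even a single mid-sized facility, on these assumptions, consumes as much electricity in a year as tens of thousands of homes.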
The Under-Acknowledged Water Problem
It is not only electricity. Data centres consume billions of gallons of water per year simply for cooling. As a benchmark, training GPT-3 used approximately 700,000 litres of clean freshwater. To put that in perspective, that’s enough water for hundreds of thousands of people’s daily needs. As the industry expands globally, that water consumption becomes more concerning, especially for communities with limited water resources.
Putting AI Energy Usage Into Context
To make sense of all these big numbers, here are a few comparisons:
- One ChatGPT conversation = 10 Google searches worth of energy
- One training of GPT-3 = Powering 120 homes for a whole year
- All global data centres combined = Argentina’s total national electricity consumption
- One AI query = Powering an LED lightbulb for approximately one hour
- The AI industry by 2030 = More electricity than some entire countries, such as the Netherlands
The Source of Energy Makes a Significant Difference
Here's the deal with the carbon footprint of AI: it is heavily dependent on the energy source. Train an AI model in Iceland, which runs on renewable geothermal and hydropower, and the carbon footprint is comparatively low. Train that same model in a region that burns coal for electricity, and you are talking about emissions that are many times greater. This spatial component matters when it comes to responsibility: companies that train AI models on grids fed by carbon-based sources should bear greater responsibility than companies that train where the energy comes from renewables. The technology industry is aware of this sustainability problem and is embracing efficiency and innovation.
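As an illustrative sketch of how grid carbon intensity scales a training run's footprint, here is the same ~1,287 MWh run priced against three grids. The per-grid intensity values are rough round-number assumptions for the exercise, not measured figures.

```python
# How grid carbon intensity scales one training run's footprint.
# Intensity values (g CO2 per kWh) are rough illustrative assumptions.

TRAINING_MWH = 1287  # GPT-3 training estimate cited in this article

grid_g_per_kwh = {
    "Iceland (geothermal/hydro)": 30,
    "EU average grid": 250,
    "Coal-heavy grid": 900,
}

for grid, intensity in grid_g_per_kwh.items():
    kwh = TRAINING_MWH * 1000          # MWh -> kWh
    tons_co2 = kwh * intensity / 1e6   # grams -> metric tons
    print(f"{grid:28s} ~{tons_co2:,.0f} t CO2")
```

Under these assumptions, the same computation emits roughly thirty times more carbon on a coal-heavy grid than on a renewable one.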
How will AI's energy use tomorrow compare to today's?
Researchers are working on more energy-efficient models, including:
- Sparse models that activate only the necessary neural pathways
- Quantization, which lowers a model's computational demands
- Dedicated AI chips designed with energy efficiency as a priority
- Edge computing, which processes data locally instead of in a distant data centre
- Model distillation, which creates a smaller version of a large model

These innovations show promise: newer language models may deliver performance similar to their predecessors with up to 90% less training energy. However, those gains may be for nothing if total AI use continues to outpace the efficiency improvements.
The Rebound Effect: More Efficiency, More Use
Here is the paradox: energy savings from more efficient AI will presumably drive significantly more usage, so net energy consumption could still rise. For example, a 50% efficiency improvement that leads to 200% more use results in more energy consumed overall. Forecasting AI's total future energy demand is therefore difficult: adoption is accelerating rapidly, but every technology has a point at which the costs of using it outweigh the benefits, which eventually dampens use.
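The rebound arithmetic above can be sketched in a few lines; the baseline of 100 tasks is an arbitrary unit chosen purely for illustration.

```python
# The rebound (Jevons) effect with the article's illustrative numbers:
# a 50% per-task efficiency gain paired with 200% more usage.

energy_per_task = 1.0     # arbitrary baseline units
tasks = 100

before = energy_per_task * tasks                 # baseline consumption
after = (energy_per_task * 0.5) * (tasks * 3)    # 50% cheaper, 3x the use

print(f"Baseline: {before:.0f} units; after rebound: {after:.0f} units "
      f"({(after / before - 1):+.0%} net change)")
```

Halving the per-task cost while tripling usage yields a 50% net increase in total consumption, which is the rebound effect in miniature.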
Corporate Responsibility and Transparency
Many prominent AI companies are beginning to publish sustainability reports, but disclosure and transparency remain inconsistent. How much energy does AI consume at these companies? Most don't report such numbers at all. The large players building AI infrastructure (Google, Microsoft, and Amazon) have pledged carbon neutrality or committed to renewable energy sources, but each on significantly different timeframes and with different scopes. Critics argue that without standardized reporting requirements, a company can pick metrics that make it look environmentally responsible while masking deeper environmental consequences.
The Users’ Role: Conscious AI Usage
Individual users rarely consider how much energy their AI interactions consume. A few small behavioral changes can reduce that contribution:
1. Use AI tools with intention and consideration rather than casually.
2. Prefer text-based AI over video/image generation (which uses 10-100x more energy).
3. Support companies that have commitments toward renewable energy.
4. Ask whether AI is actually required for your task.
5. Advocate for energy reporting from your AI providers.
Regulatory and Policy Framework
Governments are beginning to recognize and regulate AI's energy consumption, which so far has faced little environmental accountability. The EU's AI Act includes sustainability provisions, California is considering regulating the energy consumption of data centres, and international climate agreements increasingly call out the carbon footprint of AI. Even so, regulatory scrutiny is not keeping pace with the rapid development of this technology, leaving room for exponential growth of AI with minimal counterbalance or oversight of its environmental impact.
The Future: Opportunities for Sustainable AI or Environments at Risk?
What will AI's energy consumption look like in 2030 or 2040 compared to 2025? Current trends suggest unchecked growth that is unsustainable. Without massive gains in energy efficiency and clean energy deployment, AI could become a leading contributor to climate change, ironically the very problem AI is often invited to help solve. What we need is a mindful strategy: weighing AI's costs and benefits to society against the environmental impacts of its use, mandating transparency reporting on energy consumption and carbon footprints, aggressively investing in and deploying renewable energy infrastructure, advancing research into and implementation of AI efficiency, and establishing international norms for sustainable AI development.
Common Questions
Q: How does the energy usage of AI compare to cryptocurrency mining?
A: Bitcoin mining consumes roughly 0.5% of total global electricity, while data centres as a whole (AI included) use about 1-2%. AI's specific share is currently smaller, but it is growing faster than crypto, so it could eventually surpass crypto's energy usage.
Q: Is it realistic for all AI energy use to come from renewable energy?
A: In theory yes, but in practice, it is difficult to achieve. While data centers can obtain renewable energy, there are logistical barriers to having full 24/7 energy from intermittent sources. This would require either battery storage or the geographic distribution of large-scale renewable energy generation.
Q: How much energy does ChatGPT use for one user query?
A: ChatGPT uses approximately 0.3 watt-hours of energy for one user query. This is about 10 times more than a Google search. It is dependent on the complexity of the query and the model. For example, longer conversations and more complex requests will increase the energy usage.
Q: Does image-generating AI consume even more energy than text-based AI?
A: Yes, significantly more. Generating an AI image consumes around 10-100 times more energy than a text query, depending on complexity and resolution. Video generation consumes the most energy of all.
Q: Are companies hiding energy consumption by AI?
A: Not very many businesses really disclose AI-specific energy consumption, although some do provide data on overall data center energy use and sustainability commitments. More transparency is needed in the whole sector.
Q: Will AI energy use prevent climate goals from being reached?
A: Not necessarily, but unchecked growth could certainly make climate goals harder to reach. AI can also help optimize energy systems and accelerate the development of renewable energy, so the net impact depends on how we develop and deploy the technology.
Disclaimer
This article is an analysis of AI energy consumption, drawing on recent research, industry reports, and expert estimates as of 2025. Exact energy consumption varies with model architecture, hardware efficiency, data centre location, and electricity sources. Most companies do not publish AI-specific energy data, making accurate calculations difficult. The figures provided here are approximations and may change over time as the technology evolves. This article is intended as a contribution to the public discourse on AI sustainability, not as technical or policy advice. For specific data, readers should consult primary sources and current research. AI technology and its environmental impact are evolving rapidly, so this information may soon be outdated. This content is for educational and informational purposes only.