What your ChatGPT use is doing to the environment
May 19, 2025
Many of us are aware that artificial intelligence (AI) is not the most planet-friendly technology out there, due to its reliance on energy-intensive processors to answer every query.
But even if you do know that creating an image of an action figure of yourself might have an ecological impact, just how much energy do AI chatbots like ChatGPT actually use?
Data centres – buildings full of powerful computer equipment that are used for everything from sending emails to booking plane tickets – consumed about 1.5% of the world’s electricity in 2024, according to the International Energy Agency (IEA).
But the rapid adoption of AI means that data centres are now driving growth in electricity demand, after years in which that demand was steady or declining.
AI is now forecast to consume three times as much electricity as the UK’s total demand by 2030, according to the IEA.
In the wake of the ChatGPT ‘Barbie’ trend, which saw users giving the bot detailed instructions to create an action figure resembling them, environmental experts warned that using the AI chatbot had serious environmental effects.
Professor Gina Neff from Queen Mary University London said that generative AI (GenAI) tools are having outsized environmental consequences, and that ChatGPT is burning through energy: the data centres used to power it consume more electricity in a year than 117 countries.
The reason AI is more power-hungry than other technologies is that, rather than using traditional computer processors (CPUs), it relies on energy-intensive graphics processing units (GPUs).
This means that data centres dealing with AI tools are filled with racks of hundreds of GPUs, often made by graphics-card giant Nvidia.
Before models such as GPT-4 are used, they have to be trained in a lengthy process which consumes thousands of megawatt-hours of electricity and emits hundreds of tonnes of carbon dioxide (CO2), according to Harvard Business Review.
Each AI query (technically known as ‘inference’) then has its own electricity impact.
As a result, a trend of declining or static electricity demand from data centres has reversed, with Goldman Sachs predicting a 160% increase in electricity demand driven by AI.
By one comparison, however, ChatGPT is relatively efficient.
Research published in the journal Nature found that AI systems emit between 130 and 1,500 times less CO2 per page of copy generated than human writers do.
Per illustration produced, AI systems emit between 310 and 2,900 times less CO2 than a human artist.
The report by Goldman Sachs suggested that each ChatGPT query consumes 10 times as much power as a Google search, based on IEA research.
The Nature study suggests that each ChatGPT query accounts for 1.84g of CO2 from model training (assuming the model is re-trained once a month) and emits 2.2g of CO2 in total.
That’s relatively hefty for technology: Google searches are believed to release 0.2g of CO2 each, and emails have been estimated at releasing 0.3g.
But compared to other activities, ChatGPT’s carbon emissions are relatively small.
Every human releases more than 700g of CO2 every day simply by breathing.
Shipping, cooking and storing food can also lead to the release of 4.5kg of CO2 per person, per day, according to estimates by the Food Foundation.
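The figures quoted above can be turned into a rough back-of-envelope comparison. A minimal sketch, using only the article's own estimates (which are themselves approximations, not measurements):

```python
# Estimates quoted in the article, in grams of CO2.
CHATGPT_G_PER_QUERY = 2.2     # per query, including a share of training (Nature study)
BREATHING_G_PER_DAY = 700     # per person per day, lower bound
FOOD_G_PER_DAY = 4500         # shipping, cooking and storing food (Food Foundation)

# How many ChatGPT queries match one day of each activity?
queries_vs_breathing = BREATHING_G_PER_DAY / CHATGPT_G_PER_QUERY
queries_vs_food = FOOD_G_PER_DAY / CHATGPT_G_PER_QUERY

print(f"~{queries_vs_breathing:.0f} queries = one day of breathing")   # ~318
print(f"~{queries_vs_food:.0f} queries = one day of food emissions")   # ~2045
```

In other words, on these estimates a person would need to run several hundred ChatGPT queries a day before the chatbot outweighed their own breathing.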