More standards are needed to regulate the relationship between AI performance and energy consumption.
On June 23, 2025, in the article "Your AI prompts may have a negative impact on the environment," author Kameryn Griesser exposed a little-known but increasingly worrying aspect of the rapid development of artificial intelligence (AI): its potential environmental costs.
AI creates convenience but impacts the environment
Generative AI models are increasingly being used by individuals and organizations as a tool for solving everyday problems. But behind that convenience lies a harsh reality: every problem AI solves comes with hidden and mounting environmental costs, and the root of the issue lies in how AI works.
Each word in a user's prompt is broken down into strings of numbers called "token IDs," which are then sent to giant data centers, some larger than a football field. There, powerful computers perform dozens of rapid calculations to generate a response.
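The first step, turning words into numbers, can be made concrete with a short sketch. The snippet below uses the open-source tiktoken tokenizer purely as an illustration; the "cl100k_base" encoding and the sample prompt are assumptions, since each model family ships its own tokenizer.

```python
# Minimal sketch of the tokenization step: text becomes a list of token IDs.
# The tokenizer choice here is illustrative; real services use their own.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
prompt = "Your AI prompts may have a negative impact on the environment"

token_ids = enc.encode(prompt)   # split the text into sub-words, each mapped to an integer ID
print(token_ids)                 # a list of integers, one per sub-word piece
print(enc.decode(token_ids))     # decoding the IDs recovers the original text
```

It is these integer sequences, not the raw text, that the data center's hardware actually processes.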
These centers are often powered by coal- or natural gas-fired power plants, fossil-fuel sources that cause serious environmental pollution.
The entire computing process, according to an often-cited estimate from the Electric Power Research Institute, can consume about 10 times as much energy as a typical Google search. That is an alarming figure, especially given the frequency and scale of AI use today.
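To see where a factor of 10 comes from, the back-of-the-envelope comparison below uses approximate per-request figures commonly cited alongside the EPRI estimate, roughly 0.3 watt-hours for a conventional web search versus roughly 2.9 watt-hours for a generative-AI query; the exact values vary by model and data center and are treated here as assumptions, not measurements.

```python
# Back-of-the-envelope comparison behind the "10 times" figure.
# The per-request watt-hour values are approximate, commonly cited numbers,
# not measurements of any specific service.
google_search_wh = 0.3   # ~0.3 Wh per conventional web search
ai_query_wh = 2.9        # ~2.9 Wh per generative-AI query

ratio = ai_query_wh / google_search_wh
print(f"An AI query uses roughly {ratio:.0f}x the energy of a web search")
```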
Standards are needed to measure the extent of the damage
To quantify the “damage” from each AI prompt, researchers in Germany conducted an extensive study.
They tested 14 large language model (LLM) systems by asking both free-response and multiple-choice questions. The results of the study, published in the journal Frontiers in Communication, revealed several key findings:
According to Tuoi Tre Online, complex questions generate six times more carbon dioxide emissions than questions with concise answers. This implies that using AI thoughtfully, with clear and to-the-point questions, can help reduce environmental impact.
Typically, these "smarter," more energy-intensive LLMs have tens of billions more parameters – the weights used to process the token IDs – than smaller, more concise models.
Study author Maximilian Dauner likens this to the neural network in the brain: "The more neural connections you have, the more thinking you can do to answer a question."
While their reasoning capabilities and performance are highly desirable, these larger models consume significantly more energy, posing a challenge for sustainable AI development.
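A rough sketch shows why parameter count translates so directly into energy use. The rule of thumb below, that generating one token costs on the order of two floating-point operations per model parameter, and the two model sizes are illustrative assumptions, not figures from the study.

```python
# Rough sketch: compute per generated token scales with parameter count,
# so a 10x larger model does roughly 10x more work (and uses more energy)
# for every token it produces. All numbers here are illustrative.
def flops_per_token(n_params: float) -> float:
    """Approximate floating-point operations needed to generate one token."""
    return 2 * n_params

small_model = 7e9    # hypothetical 7-billion-parameter model
large_model = 70e9   # hypothetical 70-billion-parameter model

ratio = flops_per_token(large_model) / flops_per_token(small_model)
print(f"The larger model does roughly {ratio:.0f}x more work per generated token")
```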
Call to action and future solutions
According to Tuoi Tre Online, the growing integration of AI into daily work poses a thorny issue amid an increasingly severe climate crisis.
It is also a reminder of our environmental responsibility when using technology: users need to be informed about the potential environmental costs of using AI.
The AI industry needs to prioritize research and development of more energy-efficient models and architectures. This could include optimizing algorithms, using greener data centers, or exploring less energy-intensive computing methods.
Additionally, AI developers and service providers should be more transparent about the carbon footprint associated with the use of their products. This would allow users and businesses to make more informed choices.
Further research is needed to better understand the relationship between AI performance and energy consumption, as well as to find ways to reduce environmental impacts. Collaboration between scientists, engineers, and policymakers is essential.
It is time to rethink how we interact with AI technology and ask important questions about its future sustainability. As AI continues to develop and become more powerful, addressing its energy use and carbon footprint will no longer be optional, but an urgent requirement to ensure that technological advancement does not come at the expense of the environment.
Source: https://tuoitre.vn/moi-cau-lenh-cho-ai-deu-gay-o-nhiem-moi-truong-20250625114142376.htm