Researchers Discover Simple Way to Drastically Cut AI Energy Use
A team of researchers at Université Côte d'Azur in France has found that switching from the highest-performing to the most energy-efficient artificial intelligence (AI) models could save an estimated 31.9 terawatt-hours of energy this year, roughly the annual output of five nuclear reactors.
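The reactor comparison can be sanity-checked with back-of-the-envelope arithmetic. The reactor capacity and capacity factor below are illustrative assumptions, not figures from the study:

```python
# Back-of-the-envelope check of the "five nuclear reactors" comparison.
# Reactor capacity and capacity factor are assumptions, not from the study.

HOURS_PER_YEAR = 8760
REACTOR_CAPACITY_GW = 1.0   # typical large reactor (assumption)
CAPACITY_FACTOR = 0.75      # fraction of the year at full output (assumption)

savings_twh = 31.9
output_per_reactor_twh = REACTOR_CAPACITY_GW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1000
reactors_needed = savings_twh / output_per_reactor_twh
print(f"One reactor produces about {output_per_reactor_twh:.1f} TWh/yr; "
      f"{savings_twh} TWh is roughly {reactors_needed:.1f} reactors")
```

Under these assumptions one reactor produces about 6.6 TWh per year, so 31.9 TWh works out to roughly five reactors, consistent with the researchers' framing.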
According to a study published by Tiago da Silva Barros and his colleagues, using more judicious AI models for tasks such as text generation, speech recognition, and image classification could make a significant dent in the massive energy consumption of data centers that power these technologies. The researchers examined 14 different tasks and analyzed public leaderboards hosted by Hugging Face, a machine learning hub, to determine which models were most efficient.
"We estimated the energy consumption based on the size of the model," da Silva Barros explained. "By switching from the best-performing to the most energy-efficient models for each task, we can try to do our estimations."
The study used a tool called CarbonTracker to measure the energy efficiency of AI models during inference, the stage at which a model produces an answer. The researchers also used download counts to estimate each model's total energy use.
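The estimation approach described above amounts to simple multiplication: per-inference energy (as a tool like CarbonTracker might report) scaled by download counts and usage. The function and all numbers below are hypothetical illustrations, not the study's data:

```python
# Illustrative sketch of the estimation approach described above.
# All numbers are hypothetical assumptions, not figures from the study.

def total_energy_kwh(energy_per_inference_kwh: float,
                     downloads: int,
                     inferences_per_download: int) -> float:
    """Estimate a model's total inference energy use."""
    return energy_per_inference_kwh * downloads * inferences_per_download

# Hypothetical comparison: a large vs. a small model for the same task.
large = total_energy_kwh(0.002, 1_000_000, 100)   # best-performing model
small = total_energy_kwh(0.0004, 1_000_000, 100)  # energy-efficient model
savings_pct = 100 * (large - small) / large
print(f"Switching saves {savings_pct:.0f}% of inference energy")
```

Aggregating such per-model estimates across the 14 tasks is what lets the researchers put a single terawatt-hour figure on the potential savings.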
Background and context:
AI systems rely on data centers that use vast amounts of energy to process and store information. As AI adoption continues to grow, so does its carbon footprint: by some estimates, the global AI industry's energy consumption is expected to increase by 14% annually through 2025.
Implications for society:
The findings of this study have significant implications for the development and deployment of AI technologies. By prioritizing energy efficiency when selecting AI models, organizations can reduce their environmental impact while also cutting energy costs.
Additional perspectives:
Dr. Rachel Haot, a leading expert in AI ethics, notes that "this study highlights the importance of considering the environmental consequences of our technological choices." She adds that "by adopting more energy-efficient AI models, we can not only reduce carbon emissions but also promote sustainable development."
Current status and next developments:
The researchers plan to continue their work on developing more efficient AI models and exploring new methods for estimating energy consumption. As AI adoption continues to grow, it is essential to address the environmental concerns surrounding its use.
In conclusion, this study demonstrates that simple changes in AI model selection can have a significant impact on reducing energy consumption. By prioritizing energy efficiency, we can create a more sustainable future for AI technologies and mitigate their environmental consequences.
*Reporting by New Scientist.*