What It Will Take to Make AI Sustainable

Building AI sustainably may seem like a pipe dream, as tech giants that previously promised to reduce their emissions have rushed to build massive data centers powered by fossil fuels.
The rush to develop AI at all costs has been reinforced by the Trump administration, which is also rolling back environmental protections.
Despite these headwinds, AI sustainability researcher Sasha Luccioni believes that demand for greater transparency about AI, from businesses and individuals alike, is higher than ever.
Luccioni became a leading voice for transparency about AI’s emissions and environmental impacts during her four years at the AI company Hugging Face, where she pioneered a ranking documenting the energy efficiency of open source AI models. She has also been an outspoken critic of big AI companies that, she says, deliberately hide energy and sustainability information from the public.
She is now launching Sustainable AI Group, a new venture with former Salesforce sustainability chief Boris Gamazaychikov. They will help companies answer, among other questions, “What are the levers we can play with to make agents a little less bad?” Luccioni also wants to assess the energy requirements of different types of AI tools, such as speech-to-text or photo-video translation, an area she says has so far been understudied.
Luccioni spoke exclusively with WIRED about the demand for sustainable AI and what, exactly, she wants from Big Tech.
This interview has been edited for length and clarity.
WIRED: I hear a lot of people worried about AI’s environmental impact, but I hear fewer companies thinking about it. What have you heard from people working with AI inside companies, and what are they worried about?
Sasha Luccioni: First of all, they have a lot of pressure from employees, and pressure from the board, pressure from directors, like “you have to quantify this.” Their employees say: “You force us to use Copilot: how does this affect our ESG objectives?”
For most businesses, AI has become an essential part of their offering. In that case, they need to understand the risks. They need to understand where the models run. They can’t keep using models without even knowing the location of the data centers or the grid they’re connected to. They need to know the supply chain emissions, the transportation emissions, all these different things.
It’s not about not using AI; I think we’re past that. It’s about choosing the right models, for example, or sending the signal that the energy source matters, so that customers are willing to pay a little more for data centers powered by renewable energy. There are ways to do this, and it’s about finding believers in the right places.
I also imagine that for global companies the sustainability situation is very different than in the United States, right? The US government may not care, but other governments certainly do.
In Europe, there’s the EU AI Act. Sustainability has played an important role from the beginning. They added a bunch of clauses, and now the first reporting initiatives are being published.
Even Asia is trying to be more transparent. The International Energy Agency produced these reports [on AI and energy use]. I was talking to them, and they told me that other countries are realizing the IEA gets its numbers from countries, and that countries don’t have these numbers specifically for data centers. They can’t make forward-looking choices, because they need numbers to know, “OK, that means we need X amount of capacity in the next five years,” or whatever. [Some countries] have started to push back against data center builders.