Your AI-generated image of a cat riding a banana exists because of children clawing through the dirt for toxic elements. Is it really worth it?

Behind the release of large language models like ChatGPT lies a journey with complex environmental and social impacts: from child mineral mining in the Democratic Republic of the Congo, to data-labeling work that exposes people to violent and degrading images in countries like Nigeria, to vast, resource-intensive data centers in regions where energy, water and access to transmission infrastructure are cheap. The AI boom thus has the potential to create new economies of resource production and consumption – likely in communities that are already marginalized or have been subject to previous resource booms and busts.
Yet these costs are rarely recognized, and they raise profound questions about sustainability – not only from a mineral-resource perspective, but also in a broader moral sense: do we want to build a society that benefits from the suffering of the world's most marginalized? Will this end up fracturing societies and leading to a politics of resentment?

Akhil Bhardwaj
Akhil Bhardwaj is Associate Professor of Strategy and Organization at the University of Bath, UK. He studies extreme events, which range from organizational disasters to radical innovation.

Grete Gansauer
Dr. Grete Gansauer is an assistant professor in the Haub School of Environment and Natural Resources at the University of Wyoming. She is an economic geographer and interdisciplinary public policy researcher focused on regional policy and the effects of sustainability transitions in rural and natural resource-producing contexts.
The journey that powers AI-generated text and images begins with the rare earth elements used in computer chips. Rare earths are "rare" because they occur in small, isolated pockets of the Earth's crust and are difficult to extract through physical and chemical processes.
China currently dominates global rare earth production in both mining and processing; the United States is second in mining, but it lacks the infrastructure to process rare earths once they are extracted from the ground.
Many critical minerals, such as lithium and cobalt, are also important for AI processing and storage. Unlike rare earths, which are classified by their chemical properties, "critical mineral" is a political designation given to minerals of key strategic, geopolitical, or national importance.
Many of these minerals are found in regions currently ravaged by war (e.g., Ukraine has some of the largest lithium reserves in Europe, and Russia is the world's largest producer of uranium). Others, like cobalt, are found in regions like the Democratic Republic of the Congo, where many mines are controlled by Chinese interests.
Beyond geopolitical concerns, which are certainly very important, concerns also arise regarding labor practices. Many of these mines rely on artisanal mining, which is often a euphemism for child labor – it can involve children digging minerals with their hands. These minerals are then mixed with those extracted from industrial mining, making traceability impossible. Working conditions can be horrible, with high mortality rates, often due to exposure to air and water pollutants that cause fatal illnesses.
Therefore, intensified production of resources driven by the demands of AI and a highly digital economy could produce a new “resource curse” in the peripheries of the North and South. The wealth produced by local labor is extracted and used to support some of the most lucrative digital service industries in the world. The trap, then, is that communities whose material contributions are embedded in the global AI value chain will again be vulnerable to the same boom-and-bust dynamics that hit economies based on the production or extraction of other resources, such as oil or diamonds.
Beyond extracting mineral wealth, many AI models require considerable training – and humans need to do that training. LLMs are trained on ever-larger corpora of "labeled" data, which can include violent and pornographic content. Aside from the precariousness of on-demand work, the content itself can be deeply disturbing and traumatize workers. Much of this work is done in countries like Nigeria and India, where labor costs are low and workers have little protection.
Once these models are trained, running them relies on huge data centers whose servers must be constantly cooled. These data centers consume enormous resources, both energy and water, and they represent an emerging development frontier with major implications for land-use change and resource impacts.

Private companies are rapidly seeking resource frontiers that offer the most affordable combination of cheap land, cheap water, cheap energy, cheap access to transportation infrastructure, proximity to densely populated centers, and cheap but skilled labor. Such a geographic unicorn, however, is difficult to find.
Numerous data centers have been established, or are being explored, in water-poor regions like Nevada and Arizona, where labor and land are cheap. This trend holds true on a global scale. In addition to cheap land, deserts have low humidity, reducing the risk of metal corrosion. These centers also strain the capacity of local electricity grids, and because energy is often purchased in bulk or "pre-market," they can drive up prices for the average consumer. Researchers have estimated that using AI to compose a single email consumes half a liter of water.
Although there is strong pressure to adopt LLMs worldwide – particularly given the lure of economic and labor efficiencies and other potential benefits, including their use to search for information, to write, and to automate repetitive tasks – we must be fully aware of the material and social costs they impose. Do you need ChatGPT to write this email? Do you really need to generate an image of a cat riding a banana?
However we might answer these questions, it seems we need to fundamentally reassess what it means to be sustainable – claiming to be sustainable while adopting and promoting LLMs is suspect, to say the least.
And do we really want the progress that LLMs can bring if they build on the suffering of others? This is a question that we, as a society, urgently need to answer.
Opinion on Live Science gives you insight into the most important science questions affecting you and the world around you today, written by experts and leading scientists in their fields.


