
101: Ecosystem of AI


The massive ecosystem of AI relies on many kinds of extraction: from harvesting the data of our daily activities and expressions, to depleting natural resources, to exploiting labor around the globe so that this vast planetary network can be built and maintained.

This guide gives you insights, numbers, and examples of art projects to help you, as a maker, navigate this field.

Cartography of Generative AI, by Estampa, shows the set of extractions, agencies and resources that allow us to converse online with a text-generating tool or to obtain images in a matter of seconds.

“Cutting-edge technology doesn’t have to harm the planet”

Experimenting with AI

As a maker it can be daunting to experiment in this field. Some tips (this is a growing list):

Using large generative models to create outputs is far more energy intensive than using smaller AI models tailored for specific tasks. For example, using a generative model to classify movie reviews according to whether they are positive or negative consumes around 30 times more energy than using a fine-tuned model created specifically for that task.

The reason generative AI models use much more energy is that they are trying to do many things at once, such as generate, classify, and summarize text, instead of just one task, such as classification.

Be choosier about when you use generative AI, and opt for more specialized, less carbon-intensive models where possible.
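The "around 30 times" comparison above can be turned into a rough back-of-envelope calculation. The per-query energy figure below is an illustrative assumption, not a measured value; only the 30x multiplier comes from the comparison above:

```python
# Back-of-envelope: task-specific vs. generative model for classifying
# movie reviews. FINETUNED_WH_PER_QUERY is an illustrative assumption.

FINETUNED_WH_PER_QUERY = 0.002   # assumed energy per classification, Wh
GENERATIVE_MULTIPLIER = 30       # "around 30 times more energy" (see above)

def energy_wh(n_queries: int, generative: bool) -> float:
    """Estimated energy in watt-hours for n classification queries."""
    per_query = FINETUNED_WH_PER_QUERY
    if generative:
        per_query *= GENERATIVE_MULTIPLIER
    return n_queries * per_query

n = 1_000_000  # a million movie reviews
print(f"fine-tuned model: {energy_wh(n, False):.0f} Wh")
print(f"generative model: {energy_wh(n, True):.0f} Wh")
```

Whatever the absolute numbers turn out to be on your hardware, the multiplier is the point: the gap compounds with every query you route to the larger model.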

CodeCarbon makes these calculations by looking at the energy the computer consumes while running the model: https://codecarbon.io/

Inspiration project: Solar Server is a solar-powered web server set up on the apartment balcony of Kara Stone to host low-carbon videogames. https://www.solarserver.games/ 

At what costs …

Artificial intelligence may invoke ideas of algorithms, data and cloud architectures, but none of that can function without the minerals and resources that build computing's core components. The mining that makes AI is both literal and metaphorical. The new extractivism of data mining also encompasses and propels the old extractivism of traditional mining. The full-stack supply chain of AI reaches into capital, labor, and Earth's resources - and from each demands enormous amounts.

Each time you use AI to generate an image, write an email, or ask a chatbot a question, it comes at a cost to the planet. The processing demands of training AI models are still an emerging area of investigation. The exact energy consumption is unknown: the figures are closely guarded corporate secrets, and we also lack standardized ways of measuring the emissions AI is responsible for. We do know that most of a model's carbon footprint comes from its actual use, not its training. What we know:
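One widely reported scale comparison (see the Technology Review piece in the sources below) is that generating a single image can use about as much energy as charging a smartphone. A rough sketch of that comparison, where both figures are illustrative assumptions rather than measurements:

```python
# Rough scale comparison; both constants are illustrative assumptions.
# A typical smartphone battery holds roughly 0.012-0.019 kWh; published
# estimates put one diffusion-model image generation in that ballpark.
PHONE_BATTERY_KWH = 0.015   # assumed full charge of an average phone
IMAGE_GEN_KWH = 0.012       # assumed energy per generated image

images_per_day = 1000
daily_kwh = images_per_day * IMAGE_GEN_KWH
print(f"{daily_kwh:.1f} kWh/day, "
      f"~{daily_kwh / PHONE_BATTERY_KWH:.0f} phone charges")
```

The exact per-image figure varies enormously with model size and hardware; the point of the comparison is that image generation sits orders of magnitude above text classification, per request.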

The AI Index tracks the generative AI boom, model costs, and responsible AI use. See "15 Graphs That Explain the State of AI in 2024": https://spectrum.ieee.org/ai-index-2024


In-depth breakdown

The dirty work is far removed from the companies and city dwellers who profit most: mining operations and data centers sit far from major population hubs. This contributes to our sense of the cloud as out of sight and abstracted away, when in fact it is material, affecting the environment and climate in ways that are far from fully recognized and accounted for.

Compute Maximalism 

In the AI field it is standard to maximize computational cycles to improve performance, in accordance with the belief that bigger is better. The brute-force approach to AI training runs, systematically gathering more data and spending more computational cycles until a better result is achieved, has driven a steep increase in energy consumption.

This is possible because developers keep finding ways to use more chips in parallel, and are willing to pay the price of doing so. The tendency toward compute maximalism has profound ecological impacts.

The [Uncertain] Four Seasons

The [Uncertain] Four Seasons is a global project that recomposed Vivaldi’s ‘The Four Seasons’ using climate data for every orchestra in the world. https://the-uncertain-four-seasons.info/project

Consequences

Within years, large AI systems are likely to need as much energy as entire nations.

Some corporations are responding to growing alarm about the energy consumption of large-scale computation, with Apple and Google claiming to be carbon neutral (meaning they offset their carbon emissions by purchasing credits) and Microsoft promising to become carbon negative by 2030.

In 2023, Montevideo residents suffering from water shortages staged a series of protests against plans to build a Google data center. Facing controversy over this high consumption, Microsoft, Meta, Amazon and Google have committed to becoming water positive by 2030: a commitment based partly on investments in closed-loop systems, and partly on recovering water elsewhere to compensate for the inevitable consumption and evaporation in cooling systems.

Deep Down Tidal is a video essay by Tabita Rezaire weaving together cosmological, spiritual, political and technological narratives about water and its role in communication, then and now. 

More about water

Reflecting on media, technology and geological processes enables us to consider the radical depletion of nonrenewable resources required to drive the technologies of the present moment. Each object in the extended network of an AI system, from network routers to batteries to data centers, is built using elements that require billions of years to form inside the earth.

Water tells a story of computation's true cost. The geopolitics of water are deeply entangled with the mechanisms and politics of data centers, computation, and power - in every sense.

The digital industry cannot function without generating heat. Processing digital content raises the temperature of the rooms that house server racks in data centers. To control this thermodynamic threat, data centers rely on air-conditioning equipment that consumes more than 40% of a center's electricity (Weng et al., 2021). But this is not enough: as the additional power consumption required for AI generates more heat, data centers also need alternative cooling methods, such as liquid cooling systems. Servers are connected to pipes carrying cold water, which is pumped from large neighboring stations and fed back to cooling towers, where large fans dissipate the heat and fresh water is drawn in to replace what evaporates.
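The 40% figure above implies a lower bound on a standard efficiency metric: power usage effectiveness (PUE), defined as total facility energy divided by the energy that reaches the IT equipment. A small sketch of that arithmetic (the simplification, ignoring lighting and power-conversion losses, is my assumption, not a claim from the cited study):

```python
def min_pue(cooling_fraction: float) -> float:
    """Lower bound on PUE when cooling alone takes this fraction of
    total facility electricity. Simplification: assumes everything
    that is not cooling reaches the IT equipment."""
    it_fraction = 1.0 - cooling_fraction  # at most this share runs servers
    return 1.0 / it_fraction

# If cooling consumes 40% of a center's electricity (Weng et al., 2021),
# the facility draws at least ~1.67 kWh per kWh of actual computing:
print(round(min_pue(0.40), 2))
```

For comparison, hyperscale operators advertise PUE figures near 1.1, which is one reason the cited 40% share makes older or hotter-climate facilities so much thirstier for both electricity and cooling water.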

The construction of new data centers puts pressure on local water resources and adds to the problems of water scarcity caused by climate change. Droughts affect groundwater levels in particularly water-stressed areas, and conflicts between local communities and the interests of the platforms are beginning to emerge.

Curious about more? Reading tips

Atlas of AI - Kate Crawford

A Geology of Media - Jussi Parikka

Hyperobjects - Timothy Morton

Sources:

Atlas of AI - Kate Crawford

Podcast: Kunstmatig #28 - Tussen zeespiegel en smartphone

Datacenters and water: KUER

Technology Review: Making an image with generative AI uses as much energy as charging your phone

https://www.washingtonpost.com/technology/2023/06/05/chatgpt-hidden-cost-gpu-compute/

https://arxiv.org/pdf/2206.05229 

Technology Review: Getting a better idea of gen AI's footprint

https://spectrum.ieee.org/ai-index-2024

https://aiindex.stanford.edu/report/