Exploring the Environmental Costs of Artificial Intelligence (AI)
by S.L. Chan
Introduction
We are a generation living in the midst of AI development, and many of us have likely dreamed of how AI could make our lives easier by carrying out certain tasks and freeing up time for more meaningful pursuits. Sometimes we're surprised by just how far AI has already advanced, such as ChatGPT's ability to write a 900-word article in just 20 seconds. It can make us wonder why we bother to write at all, fretting over typos and grammatical errors ourselves.
Take Amazon's digital voice assistant, Alexa, as an example. It has brought many households real convenience, from turning on lights and setting alarms to playing games with users. But what stands behind the Alexa speaker box?
At a book review session for "Atlas of AI" by Kate Crawford, organized by TechHuman, concerns were raised about the environmental costs of AI. This short article aims to provide some brief insights into this topic.
To begin with, how does an AI system work?
Behind the Alexa speaker box there is a complex system that supports Alexa's capabilities. This system runs continuously and is updated regularly to ensure that it meets users' needs, provides convenience, and strives for accuracy. Figure 1 illustrates the key components of an AI system, including the resources it consumes and the impact it can have. Understanding this cycle is important for exploring the environmental costs of AI.
The inner circle of Figure 1 shows the four key elements of an AI system: Data Centers, Training and Running Programs, Hardware, and Cooling Systems. Data Centers store massive amounts of data, while Training and Running Programs operate non-stop. Hardware supports the storage and processing of data, while Cooling Systems ensure that the system does not overheat and malfunction.
The white boxes adjacent to the inner circle describe the resources that each element of the AI model consumes, including energy, limited natural resources, and large amounts of water. As shown in Figure 1, the AI model operates like a cycle: the more data is processed, the more computing power is required, the more microchips are consumed and eventually discarded, and the more maintenance, including cooling, is needed. As AI systems continue to expand, their costs continue to rise, not only in terms of money and labour but also in environmental terms.
Consequently, what are the environmental impacts?
The outer circle of Figure 1 lists some of the general environmental effects caused by AI models. Because of their sheer scale, AI systems consume significant amounts of energy. ChatGPT, for example, was trained on 300 billion words and has around 175 billion parameters, while the English language has only around 60,000 words in active use today. Undeniably, the AI industry is responsible for significant greenhouse gas emissions and the release of toxic chemicals, contributing to climate change and global warming.
Thus, how significant are the energy consumption and carbon emissions we are referring to?
According to Kate Crawford, AI is responsible for 2% of global electricity consumption and greenhouse gas emissions, and that share is expected to rise further. To put this into perspective, the following comparisons illustrate the industry's significant energy consumption and carbon emissions. It is clear that the negative environmental impact of the AI industry cannot be ignored or downplayed.
Standing behind the Alexa speaker box is therefore not only a complex system, but also an energy consumption equivalent to that of a small power plant.
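To make the 2% figure slightly more concrete, here is a rough back-of-the-envelope sketch. The global electricity figure below is our own rounded assumption, not a number taken from Crawford's book, so the result should be read as an order-of-magnitude illustration only.

    # Rough, illustrative estimate only; both inputs are rounded assumptions.
    global_electricity_twh = 25_000   # assumed annual global electricity generation, in TWh
    ai_share = 0.02                   # the 2% share cited by Kate Crawford

    ai_electricity_twh = global_electricity_twh * ai_share
    print(f"Implied consumption: roughly {ai_electricity_twh:,.0f} TWh per year")
    # Prints roughly 500 TWh per year, which is on the order of the annual
    # electricity consumption of a large European country.

Even as a crude estimate, the order of magnitude underlines why the 2% figure is hard to dismiss.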
Most importantly, is there any good news?
While such comparisons can be disheartening, Kate Crawford also called for solutions, such as the development of more sustainable and responsible AI systems. This involves designing algorithms to be more energy-efficient, reducing the use of single-use hardware, and prioritizing renewable energy sources. Some examples of good news follow.
Currently, many of the world's largest technology companies have committed to achieving carbon neutrality. For example, Microsoft has set a goal of becoming carbon negative by 2030 and plans to use 100% renewable energy by 2025[1]. Google reports that it has matched 100% of its electricity consumption with renewable energy purchases since 2017[2]. DeepMind, an AI research firm owned by Alphabet, has used machine learning algorithms to predict and manage cooling in Google's data centres, cutting the energy used for cooling by up to 40%[3] (a toy sketch of this idea follows this paragraph).

Some data centres have successfully transitioned to clean energy. Iceland's data centres, for instance, are largely powered by the country's abundant hydroelectric and geothermal resources, and Iceland's cold climate reduces the need for energy-intensive cooling, such as water-based cooling[4]. Companies are also exploring alternatives to lithium-ion batteries built from more commonly available materials such as concrete[5]. In November 2022, the OECD (Organisation for Economic Co-operation and Development) published a paper[6] on measuring the environmental impacts of artificial intelligence, pointing out current measurement gaps and their policy implications.
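As a loose illustration of the predictive-cooling idea mentioned above, here is a minimal sketch in Python. It is not DeepMind's actual system, and all of the numbers and variable names are invented for illustration; it only shows the general pattern of fitting a model to historical sensor data and using it to forecast cooling demand.

    # Toy sketch only: predicting data-centre cooling energy from past sensor data.
    # All data here is synthetic; a real system would use live telemetry and a far
    # richer model, but the basic predict-then-plan pattern is the same.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical hourly readings: server load (%) and outside temperature (deg C).
    server_load = rng.uniform(30, 90, size=500)
    outside_temp = rng.uniform(5, 35, size=500)
    # Hypothetical cooling energy (kWh) that depends on both, plus measurement noise.
    cooling_kwh = 0.8 * server_load + 1.5 * outside_temp + rng.normal(0, 5, size=500)

    # Fit a simple least-squares model: cooling ~ intercept + load + temperature.
    X = np.column_stack([np.ones_like(server_load), server_load, outside_temp])
    coeffs, *_ = np.linalg.lstsq(X, cooling_kwh, rcond=None)

    # Forecast cooling demand for an expected load and weather forecast, so cooling
    # capacity can be scheduled ahead of time rather than run at full tilt by default.
    tomorrow = np.array([1.0, 75.0, 28.0])  # intercept term, 75% load, 28 deg C
    print(f"Predicted cooling energy: {tomorrow @ coeffs:.1f} kWh")

Even this crude setup hints at why prediction matters: cooling that is planned against expected demand wastes less energy than cooling that simply runs continuously.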
Clearly, further advancements in reducing the environmental costs of large-scale IT systems will require the involvement of AI systems themselves. Increasingly, with cloud computing, the IoT (Internet of Things), and mobile participation, we cannot measure environmental impacts, engage in constructive discussions, or effectively address the most important issues at stake without relying on the very AI systems we may be debating or decrying. Indeed, without AI, meaningful comparisons such as those presented above would not be possible.
[1] https://www.microsoft.com/en-us/corporate-responsibility/sustainability
[2] https://sustainability.google/reports/
[3] https://deepmind.com/blog/article/optimising-for-sustainability-in-our-data-centres
[4] https://www.cfr.org/blog/artificial-intelligences-environmental-costs-and-promise
[5] https://www.cfr.org/blog/artificial-intelligences-environmental-costs-and-promise
[6] https://www.oecd-ilibrary.org/docserver/7babf571-en.pdf?expires=1676921048&id=id&accname=guest&checksum=5BD9EA2C5FC58B8D5DDA17C370EFEBCA
Last but not least, some reverse thinking.
The AI industry, like any other industry, contributes to negative environmental impacts. The core issue may not be AI itself, but rather prevailing practices and thought patterns. In our society, particularly in business and consumption, we tend to consume first and think later. This is essentially careless behaviour. As Christians, we are called to cultivate creation and steward its resources, rather than harm creation first and try to restore it later. It would be more beneficial to create a dynamic in which AI firms could only secure funding and make profits by presenting a carbon-neutral plan from the outset. AI can make that possible, can't it?
We recognise these are important, complex and contested issues and would welcome comments and feedback. Our goal at TechHuman is to provide a forum in which these topics can be debated.
Shuk-Ling Chan
Shuk-Ling has studied sociology and intercultural competencies. She was born and raised in Hong Kong, has been living in Austria for 15 years, and considers Austria her second home. She worked in the forestry industry for a decade and has seen first-hand how that industry engages with environmental topics. Her passion, however, lies in the diverse, embodied sense of belonging found in mixed cultural communities, and she enjoys putting chaos into good order. Shuk-Ling participated in the IFES Graduate Impact program, where she learned and continues to learn how to live out her faith in the workplace. She worships at the International Bible Fellowship in Buchs, Switzerland, and serves on the worship team.