Review: Introduction to Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence by Kate Crawford

Kate Crawford, Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press, 2021. Pp. 1, 327. ISBN 9780300209570, 9780300264630.

Chapter reviewed: Introduction

Reviewed by: Hannah Mendro, hsmendro@uw.edu

In an increasingly technological world, and particularly with the machine learning advances that have become so prevalent in the last few years, it can be easy to be swept up in the excitement and apprehension surrounding artificial intelligence. AI seems to be entering every space I frequent: academic conversations in humanities classes about ChatGPT’s insufficiency in responding to critical questions; serious questions in librarian meetings about how to grade student work generated by AI; informal conversations with writers and artists who fear having their work stolen and their labor replaced by AI tools. As a scholar and library worker more engaged with cultural and information studies than with technology and design, I find AI coming up more in conversations about ethics and bias than about its computing components, but it is often portrayed with an air of mystery born of our own uncertainty about what it is—in academic and informal discussions alike. AI tools are quickly becoming part of the processes and lexicon of everyday life, and too many of us don’t even know how they work.

Amidst all of this questioning and uncertainty, Kate Crawford’s book Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence steps back even further than questions of what AI will do and what its future implications might be to address the question of what AI even is. In a powerful and thoughtful critique, Crawford works to bring these conversations about AI back to the politics that enable it, making visible the costs, hidden algorithms, and patterns of discrimination that are rarely mentioned in these casual conversations. In her introduction to the book, Crawford compellingly deconstructs the notion of AI, breaking down the baseline assumptions that underlie the very words “artificial intelligence” and seeking to make visible the power structures—of land, capital, and discrimination—that bring it about.

Crawford opens her introduction with an extensive anecdote about “Clever Hans,” a horse who gained fame for ostensibly being able to solve math problems but was in fact responding to unintentional nonverbal cues from his owner that allowed him to give the right answer. Crawford uses this example to point out the inherent fallacy of ascribing human intelligence to nonhuman beings or constructs. Without relying on assumptions about technology—or on overly technical explanations that can intimidate laypeople such as myself—Crawford illustrates both the temptation to believe that artificial intelligence constructs can demonstrate human intelligence and the danger in expecting them to do so. This example provides both a touchstone for the rest of the chapter and an opportunity to demonstrate the assumptions at play in the words “artificial” and “intelligence,” which Crawford proceeds to deconstruct. In her characterization and explanation of AI, Crawford does not rely on the exaggerated or jargon-laden terminology of enthusiasts; rather, she grounds her account in the words that make up the name. She points out that the very concept of intelligence is a construct of cultural systems among humans—and as such, any “artificial” construct of such intelligence is bound up in all the assumptions, injustices, and stereotypes built into those systems. All machine learning systems are built by humans, with explicitly coded structures of “thought,” and the cultural constructs of intelligence are built into those systems, creating processes that reproduce existing values and concepts. This explanation is both a central tenet of the book and the foundation for the further deconstruction that takes place in each subsequent chapter: Crawford uses this book to make visible all that is generally concealed or assumed in the creation of artificial intelligence, from the products to the algorithms to the purposes it is put to. In her own words, “To understand how AI is fundamentally political, we need to go beyond neural nets and statistical pattern recognition to instead ask what is being optimized, and for whom, and who gets to decide. Then we can trace the implications of those choices” (9).

In the introduction, Crawford also lays out the “atlas” concept that opens the title and underlies her approach to the book. An atlas is a book of maps, each representing a different focus or perspective on a geographical landscape and each guided by a particular view of what is important. Each chapter of the book similarly addresses a different force at play in the question of AI. In the first chapter, Crawford goes to the roots of extraction in the industry, providing an in-depth look at the “lithium mines of Nevada, one of the many sites of mineral extraction needed to power contemporary computation” (15). She addresses the environmental impact of this industry, one of the elements of AI rarely brought up in casual conversation. Chapter 2 turns to the exploitative use of human labor in creating these tools, focusing specifically on the invisible work that is elided by the apparent ease of the machinery. Chapter 3 focuses on the data that these machines use, where they get it, and the ethical questions of surveillance that arise in such harvesting practices—and chapter 4 makes visible the role of classification and bias in creating the “intelligence” that these tools rely on. Chapter 5 looks at the complexity of emotion detection and the danger of relying on AI recognition tools, and chapter 6 points out the ways that these tools are used in service of state and military power. In this introduction, Crawford provides a brief overview of all these chapters and lays out the threads that will pull them together throughout the book.

Crawford points out that the word “atlas” brings up questions of geography and power; the act of mapping is also an act of determining importance according to power structures. Each of these chapters’ “maps,” along with the conclusion that pulls them together, provides a new angle on the book’s overall focus on power and extraction in AI as a technology and an industry. The introduction gives an overview of each chapter and each angle, contextualizing them all within that focus on power and land. This contextualizing chapter also provides an opportunity for her to name her own position: “[emerging] from a specific lived experience that imposes limitations . . . skews toward the AI industry in Western centers of power” (13). This explicit acknowledgement both provides context for understanding Crawford’s approach to her questions and allows readers to identify where we might critique her argument.

The questions Crawford interrogates—Where did the materials come from? Who worked on them, and how? What information do these systems use, and how? Who benefits from the use of the technology, and who is exploited?—do not come up nearly often enough in the conversations I have experienced; they are not deeply embedded in our casual awareness of artificial intelligence systems. And yet, as Crawford points out, “artificial intelligence is now a player in the shaping of knowledge, communication, and power” (19). Without making visible the power dynamics at play, from the land where mineral extraction takes place to the discrimination built into the algorithms that construct “intelligence,” it will be too easy to give artificial intelligence tools an uncritiqued position in industry, society, and growth. Crawford’s book provides a crucial intervention in the trajectory of these systems: it is accessible to those who are not familiar with AI technologies, while insisting that this deeper examination is the responsibility of everyone who works with or engages with AI. These are the questions that must be leveled at artificial intelligence tools before they can be used to further damage the earth, exploit people for labor and data, and consolidate power in the hands of those who already have too much—and in her “atlas,” Crawford provides a series of maps for how to ask them.