Kate Crawford maps a world of extraction and exploitation in ‘Atlas of AI’

After 15 years of studying artificial intelligence, Kate Crawford knew that it had many hidden human and environmental costs. But she didn’t realize the extent of those costs until she set out to trace the full life cycle of a single AI consumer product: the Amazon Echo.

Working with fellow researcher Vladan Joler, she co-authored the project Anatomy of an AI System (2018), which traced the Echo from the mining of the minerals required to build it, through what happens to the voices it collects, to the end of its life in e-waste dumps in Ghana or Pakistan. The project is now part of the permanent collections of the Museum of Modern Art in New York City and the Victoria and Albert Museum in London.

“That process, for me, was one of really opening my own eyes to see how many of these choices we make are actually feeding these much bigger systems of extraction,” said Crawford, who joined USC Annenberg as a research professor in March. “That was the moment when I realized I wanted to see that at a bigger scale, to see it across the entire AI industry.”

This industrywide examination of how AI extracts raw materials — both literal (lithium for batteries in AI-driven devices, including electric cars) and virtual (the personal data of billions of people) — became the subject of her new book Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. In an April 6 virtual event celebrating the release of the book through Yale University Press, Crawford spoke with Josh Kun, Chair in Cross-Cultural Communication, about the conception and research for the book.

Here, in her own words, are some of the most stirring, chilling, infuriating and thought-provoking observations Crawford shared.

On the book’s journalistic structure and balancing scholarship with on-the-ground reporting.

If you look at the way artificial intelligence is represented, it’s so commonly understood in this highly abstract mathematical way. If you do an image search right now in Google for “AI,” what you get back is just reams and reams of pictures of blue numbers and tunnels of code and white robots. This is the way that we see AI represented: as immaterial, as sort of floating in the cloud.

It was incredibly important to me to look at that in a much more grounded way. What are the material consequences of these systems? To do that, you really have to go there.

Jeff Bezos has now created a space company, which is called Blue Origin. And it was one of the locations that I visited — to photograph this reusable rocket base in the middle of West Texas. And for me to see that the billions of dollars generated by AI companies and by tech billionaires is now being redirected into a commercialized space race, this idea that now we’ve made this money, we can abandon the planet as sort of a discarded and useless object … those trajectories and those ideologies run deep.

On how AI doesn’t just build on existing histories of bias and inequality but actively constructs them.

When you have these huge ambitions to capture and contain the entire world, it is driven by this extractive impulse to get as much data as possible, at any cost. We’ve seen the downside of that. All of those collections of data come with histories, they can’t be separated from them. And some of those histories contain very problematic structural forms of inequality — racism, sexism, classism — and that then creates the systems we use in the future.

One of the really horrifying things for me as a researcher was looking at the ways in which these systems that were primarily designed for intelligence agencies — that were extralegal by design — have filtered down to the municipal level, where they are literally being used by local police departments that connect to people’s Amazon Ring cameras on the fronts of their houses. These sites of data ingestion are then being fed into the engines of deportation.

Those stories need to be told far more often: As we subscribe to new forms of tracking and monitoring our own homes and our own bodies, we could actually be providing more data to precisely those incredibly unjust systems. We’re slowly sleepwalking into significant social and political change without having those stakes laid out for us — in many ways because they are intentionally hidden.

On the role of universities in fueling the growth of AI technology that has outpaced AI ethics.

Universities are incredibly important here, because that’s where we train people to build technical systems. But there’s a problem, which is that traditionally computer science and engineering have been seen as disciplines that don’t really engage with human subjects. They don’t go through ethical review. They don’t do training on the larger sociological implications of these systems. … But now, things that were previously very much in theory and built in labs are touching the lives of billions of people every day. And we haven’t caught up.

In many ways, universities are still working with these older, siloed approaches that see the computer sciences as a mathematical discipline at arm’s length from human bodies, when the reverse is actually the case.

On the problem with large data sets collected from the internet without our knowledge.

If there’s an original sin of the field, it’s this moment when the field embraced the idea of just harvesting the entire internet — taking people’s photos, taking people’s texts, taking their responses to each other — and seeing it as an aggregate infrastructure that had no specific histories or stories or intimacies or vulnerabilities contained within it. To strip it of all that, and say, this is just “raw material” — with very much scare quotes around that — to drive large-scale systems of prediction and optimization. That has brought us to this point, where I think we should be asking much harder questions of those data sets, not only because of their origins, but because of the way in which they smuggle in a worldview that is so rarely questioned — and is producing some very serious harms.