
James Muldoon, Mark Graham and Callum Cant: “AI feeds off the work of humans”


James Muldoon is Professor of Management at the University of Essex, Mark Graham is Professor at the Oxford Internet Institute and Callum Cant is Senior Lecturer at the University of Essex Business School. They work together on Fairwork, a project that evaluates working conditions in digital workplaces, and are co-authors of Feeding the Machine: The Hidden Human Work That Drives AI.

Why did you write the book?
James Muldoon: The idea for the book came from fieldwork we did in Kenya and Uganda on the data annotation industry. We spoke to a number of data annotation workers and the working conditions were just horrendous. And we thought this was a story that everyone needs to hear. People working for less than $2 an hour on insecure contracts, doing work that is predominantly outsourced to the global south because of how difficult and dangerous it can be.

Why East Africa?
Mark Graham: I started researching East Africa in 2009, focusing on the first of the many undersea fibre optic cables that would connect the region to the rest of the world. The research looked at what this new connectivity meant for the lives of workers in East Africa.

How did you get access to these jobs?
Mark Graham: At Fairwork, the basic idea is that we set out decent work principles and then we assess companies against them. We give them a score out of 10. And that’s how companies in Nairobi and Uganda opened up to us, because we were going to give them a score and they wanted a better one. We went to them with a zero out of 10 and said, “Look, there’s work to be done to improve.”

And do companies respond? Do they discuss their low scores?
Mark Graham: There’s a whole range of responses. Some argue that the things we’re asking them to do are simply not possible. They say things like, “It’s not our responsibility to do these things.” The beauty of ratings is that we can point to other companies that are doing them. We can say, “Look, this company does that. What’s wrong with you? Why can’t you have this condition for your workers?”

Can you tell us about the echoes of colonialism that you found in this data work?
Mark Graham: The old East African Railway connected Uganda to the port of Mombasa. It was funded by the British government and was basically used to extract resources from East Africa. The interesting thing about fibre connectivity in East Africa is that it follows a very similar route to the old railway and it is also an extractive technology.

Could you explain your concept of “extraction machine”?
Callum Cant: When we look at an AI product, we tend to think that it was created relatively spontaneously and we don’t think about the human work, resource requirements and everything else that goes on behind the scenes.

For us, the extraction machine is a metaphor that allows us to think much more about the labour, resources, energy and time that have been invested in that process. The book is an attempt to move from the superficial appearance of an elegant web page or images of neural networks to the material reality of when this reaches your workplace: what does AI look like and how does it interact with people?

James Muldoon: I think a lot of people would be surprised to learn that 80% of the work behind AI products is actually data annotation, not machine learning engineering. And if you take the example of an autonomous vehicle, one hour of video data requires 800 human hours of data annotation. So it’s an incredibly labor-intensive form of work.

How does this concept differ from Shoshana Zuboff’s idea of surveillance capitalism?
James Muldoon: Surveillance capitalism is the best description of companies like Google and Facebook, which make money primarily through targeted advertising. It’s an apt description of a process from data to advertising, but it doesn’t really capture the broader infrastructural role that big tech companies now play. The extraction machine is an idea we developed to talk more broadly about how big tech companies feed off the physical and intellectual labor of human beings, whether they’re Amazon workers, creatives, data annotators, or content moderators. It’s actually a much more visceral, political, and global concept to show the ways in which these companies exploit and extract all of our labor.

Many of the concerns about AI have focused on existential risks, or how the technology can reinforce inequalities and biases that exist in the data it is trained on. But do you argue that the mere introduction of AI into the economy creates a whole host of other inequalities?
Callum Cant: We can see this very clearly in a workplace like Amazon. Amazon’s AI system, its supply chain orchestration technology, has automated the thinking, and what is left for humans in an Amazon warehouse is a brutal, repetitive, high-stress work process. We end up with a technology that purports to automate menial work and create freedom and time, but in fact what we have is people being forced to do more routine, boring, less skilled work because of the inclusion of algorithmic management systems in their workplace.

Amazon’s systems have created a “repetitive and stressful” work process for its employees, says Fairwork’s Callum Cant. Photo: Beata Zawrzel/NurPhoto via Getty Images

In one chapter of the book, you write about Chloe, an Irish actress who discovered that someone was using an AI-generated copy of her voice. This bears some similarity to the recent dispute between Scarlett Johansson and OpenAI. But Johansson has a platform and the financial resources to challenge this, which most people don’t.
Callum Cant: A lot of the solutions are not individual, they rely on collective power. Because, just like anyone else, we don’t have the ability to tell OpenAI what to do. They don’t care if some authors think they’re running a data mining regime. These companies are funded with billions and billions of pounds of capital and they don’t really need to care what we think of them.

But collectively, we identified a number of ways in which we could fight back and start trying to transform the way this technology is being deployed. Because I think we all recognize that there is emancipatory potential here, but to get to it, it’s going to require an enormous amount of collective work and conflict in a lot of places, because there are people who are getting enormously rich off of this stuff and there are decisions that are being made by a very, very small handful of people in Silicon Valley that are making our lives worse. And until we force them to change the way they do it, I don’t think we’re going to get a better form of technology out of this.

What would you say to readers? What steps can they take?
Callum Cant: People are in such different situations that it is difficult to give universal advice. If someone works in an Amazon warehouse, then organize your coworkers and use your leverage against your boss. If someone works as a voice actor, then you need to organize with other voice actors. But everyone will have to respond to this on their own terms, and it is impossible to give a one-size-fits-all prescription.

We are all customers of Big Tech. Should we, for example, boycott Amazon?
Callum Cant: I think organizing at work is more powerful, but organizing as consumers also has a role to play. Where there are clear choices and opportunities to use consumer power, then by all means do so, especially if the workers involved call for it. If Amazon workers call for a boycott, for example, on Black Friday, then we encourage people to listen to that. Of course. But there has to be a set of principles guiding any action that people take anywhere, and the most important of these is that collective action is the main way forward.
