The book argues that the extraction and exploitation of personal data by Big Tech companies is a form of modern colonialism, where data is treated as a resource to be extracted for profit and social control, mirroring historical colonial practices.
Companies collect a vast array of personal data, including emails, online shopping habits, location data, health information, and even data from smart devices like cars, which track driving habits and other activities.
The gig economy operates on data territories, where platforms like Uber and Lyft exploit workers by algorithmically reducing wages and increasing prices for users, while maintaining absolute control over the labor process through surveillance and data collection.
The four X's—explore, expand, exploit, and exterminate—originate from a strategy game but reflect historical colonial practices. In data colonialism, these stages translate to exploring and expanding data territories, exploiting data for profit, and erasing previous ways of living through technological control.
Data territory refers to the spaces created by platforms and software that capture and control data, giving the owners of these territories absolute power over the data and the activities conducted within them, much like colonial land ownership.
AI, such as ChatGPT, is redefining knowledge and expertise in education, often without consulting educators or students. It imposes a new standard of learning that erases previous methods and replaces them with algorithmic assessments, leading to a loss of critical thinking and knowledge retention.
Early warning signs include the environmental costs of data centers, the biases built into AI systems, and the exploitation of low-wage workers in the global south who train AI algorithms, often without their consent or fair compensation.
The data colonial class includes big tech companies like Google and Facebook, data brokers, manufacturers of smart devices, and individuals like Elon Musk and Ton Tatt, who exploit data for profit and control, often in ways that align with colonial practices.
Resistance involves imagining a different future, working within the system to push for regulations, working against the system through protests and activism, and working beyond the system to create alternative models that decolonize data and reclaim control over personal information.
They are continuing their work on AI, social media, and education, focusing on decolonizing data and building networks for resistance, such as Tierra Común, a trilingual platform for activists in Latin America and beyond.
Welcome to the New Books Network. I'm your host, Michael LaMagna. Today, I'm joined by the authors of Data Grab: The New Colonialism of Big Tech and How to Fight Back, published in 2024 by the University of Chicago Press. Large technology companies have access to an immense amount of personal data on all of us, through our emails, online shopping, locations and movements, our health, and our work.
While this may seem like a byproduct of the convenience of using technology, it is actually intentional in a growing industry. Joining me to discuss Data Grab are the authors Ulysses Mejias, Professor of Communication Studies at the State University of New York, Oswego, and Nick Kuldry, Professor of Media, Communications, and Social Theory at the London School of Economics and Political Science.
and a faculty associate at Harvard University's Berkman Klein Center for Internet and Society. Welcome to the podcast, Ulysses and Nick. Great to be here. Thank you very much. So before we discuss your book, Data Grab, I was hoping you could tell us a little bit about yourselves, your backgrounds, and your career paths. Do you want to go first, Ulysses? I think you should. Sure.
Yes, I'm a professor at the State University of New York at Oswego. I teach mostly in the communication studies and mass media area. I'm originally from Mexico, so issues having to do with colonialism have always felt very close to me, just from growing up in a former colony, in all sorts of interesting ways. So I guess at some point it became evident how issues of colonialism and technology intersected. And I also have the fortune of working out these ideas with Nick for, by now, what seems like almost a decade. Yeah, and I teach at the London School of Economics. I'm a media theorist. I look at questions of media power, and have been for 25 years or more. But around about 12 years ago, I started to get interested in issues of data.
And then I had the good fortune to meet Ulysses at a conference. And of course, I come from the colonial center of London, right? So I can't absolve myself from responsibility for any of that. But I've always taken a critical view of colonialism and Britain's failure to address its colonial past. So it wasn't in any way strange for me to start thinking in a decolonial way about data. And that's what we've been doing for eight and a half years now, pretty well a decade, working on this together. That's excellent. We can see what sparked your interest in this topic. So as we dive into this idea of data colonialism, I was wondering if you could talk a little bit about, for those who don't really think about data privacy, what type of information companies are collecting on us.
Oh, well, how long have you got? It's almost infinite. If we start with the easy stuff, we know Facebook and all those things. We know they're tracking us as we play around on their platforms. We take that for granted, almost. But in the book, we point out that this is actually just an example of a much wider pattern. We'll come back to examples later: education, health, the Internet of Things, whatever. They're all data territories that are there to capture us. And to give you perhaps one simple example that's really hit home to people recently in the States: your car
is gathering data about you every time you speed up, every time you slow down, maybe even what you're listening to on the radio or via your iPod or whatever it is. That's all being fed, as data, to the machine. And it's not just the car company who keeps that data. They probably sell it to a data broker, who for sure wants to make money by selling it on to an insurer, for example. People have suddenly found they've lost their car insurance. So they end up committing a criminal offense just because some insurer's algorithm doesn't like the way they speed up or slow down, and no one told them. That's an example of how deep this goes, and it's pretty shocking in its consequences.
And it's not so much where data is being collected from us, because as Nick says, it's pretty much everywhere. Not just what we do through social media platforms, but anything connected to the Internet of Things, any smart device, which basically means it's connected to the Internet and it's capturing data. But it's not just this process of capture that concerns us.
Many times it's how this data is used in other contexts. So Nick mentioned how data collected by our cars might be used by insurance companies. Similarly,
Data from one of the apps on our phones might be used in a health context by various companies, to sell products or to determine certain things about our health. It might be how this data is used in the educational context. It might be how the data is used even to train systems for war, for weaponry and target allocation. So that's what we try to do in our book: to trace how the collection of data from our personal lives is feeding this much larger operation,
which touches on every aspect of our lives, access to public services, whether we can get a loan through a bank or not.
And in many cases, even situations having to do with life and death, when it relates to health or to war. So to provide some context for our discussion today about data colonialism, I was hoping you could talk about the four X's of colonialism. Sure. Well, basically, by now you've heard us refer to this concept of data colonialism, which is really at the center of our critique.
What we're saying is that we need to look at what's happening with data, which again encompasses everything, not just social media but artificial intelligence, crypto, whatever. We need to look at it not just in the context of the last 50 years, let's say the history of the internet and of mobile devices, but in a much larger context: a context of 500 years of colonial extraction and colonial dispossession, and also 500 years of a very interesting history of not just colonialism but its relationship with capitalism. So we tried to look at how those two things co-evolved and developed. And maybe Nick can talk about the four X's.
Yes. Well, they come from a strategy video game invented by Sid Meier, part of the Civilization series, which sounds like harmless fun, but there is actually one called Colonization, the strategy video game, and you can play various sides. You can play, obviously, the Spanish or Portuguese, the English or the Dutch, I think it is. And the goal is, of course, to conquer, but more specifically, Sid Meier designed four stages to that game. The first is explore; we can maybe come back to these in detail. Once you've explored, you've got to expand, because it's not enough just to find something; you have to get more. Then exploit, to make money, and then exterminate. And the goal, of course, is to reach the highest level of the game, which is to exterminate, to literally erase other ways of living so that your way of living conquers, just as it did in historical colonialism.
It might sound an odd way in, but it's actually a very good summary of what colonialism historically did. And in our book, we show that this actually fits pretty well with what big tech and data companies are doing today.
And this gives us a good understanding of data colonialism; I think the four X's are an excellent way to really understand it and how it applies to data. So as we're thinking about this, what are the data colonies that currently exist? Well, the way we approach it in the book is through a new idea that we hit on as we were trying to get across the ideas we had developed in a previous book, for Stanford Press about five years ago, to a much wider audience. And we hit on the idea of the data territory, because obviously data is not land. It's not like land. It's not anything like land at all. It happens when people write code to capture data. They put it in a database. It's highly technical. We know all that. On the face of it, it's nothing like land. And yet...
It's possible through writing code and building software to create a space where you have absolute control
over what goes on in that space. We call these spaces platforms. We think we know they're useful, friendly things that we use to do nice stuff with the people we want to do nice stuff with, you know, send nice pictures of meals to family. But they are only possible because we're doing it on a data territory that gives just as much control, if not more, to the owner of that data territory as physical land does.
You trespass on someone's land, they might get a gun out to shoot you. It's not quite like that in a data territory, but everything you do there is captured. There's nothing you can do that isn't captured. And once you see that, then you get a sense of how explore, and then the expanding of territory, grows. And it goes across all these areas we've already talked about. They work because they're data territories, and that gives absolute control. And that control affects things like workers. I mean, management power has always been very, very intense under capitalism. But now, in this new phase of capitalism that's associated with the new data grab, the building of data territories, pretty well every manager wants absolute control of their workers. They want to know everything they're doing, every time they're going for a toilet break, how efficiently they're performing after the toilet break, before the toilet break, and so on, infinitely. They can do that on a data territory, and that's where people are spending most of their working hours now. So it's not just about the fun stuff. It's about the everyday conditions of everyday workers who are being trapped in these data territories under what is effectively pretty well absolute management power today. Yeah. I mean, so the question is, what are these new colonies? And I think to answer that, first we have to clarify a couple of things. Firstly, we're not saying that data is bad. We're not against data. In our work, Nick and I have come up with a very specific definition of data colonialism, which is an emergent social order
for the continuous extraction of information, of data from our social lives for two specific purposes, which are to generate wealth and to create new forms of social control. So that's the first thing that, you know, when we talk about data colonialism, we're not saying that we are against data.
Maybe later we can talk about some of the positive and ways that we can use data to resist, in fact.
But that's the first thing, that we're not saying data is bad. The second thing is that, yes, when we say that there are new colonies, like Nick has been describing, that might seem like we're using the word colonialism metaphorically, but we're not. We're actually saying this is a new stage in the development of colonialism. So we are very intentional about saying that these are, in fact,
new colonies. They don't look like territorial colonies like we had in historic colonialism, but the new colony is essentially our social lives. The new areas of extraction and of exploitation are our lives. And so, like we said in our definition, data is being continuously extracted from it to generate profit, to create new ways of social control.
But we do want to be very careful in acknowledging the differences because we're not making a one-to-one comparison. We're not saying just as there were colonies in the past, look, we have these colonies now that function and look exactly the same. We're not saying that.
We're trying to look at the continuities, and we're trying to look at points of correspondence. But there are lots of differences. Data colonialism doesn't look or behave exactly the way colonialism did 200, 300, 400 years ago, so we want to be respectful of those differences. Colonialism in Mexico, where I'm from, and colonialism in India looked very different from each other, and both look very different from colonialism today in, let's say, Puerto Rico or Palestine. We want to be mindful of these differences, but we're also saying that there is one very important similarity across all forms of colonialism, and that is the historical function, which is to extract
and to dispossess. Colonialism is about that. For the first two of the four X's, it's about exploration and it is about expansion. And so when we look at colonialism throughout the ages, including data colonialism, that's what we're seeing again and again.
What an excellent point. Now, Nick, when you were talking about workers, I was thinking about the gig economy, which is driven specifically by a lot of this technology. So how does that fit into our discussion here? Should I pick up on this one, Ulysses? Yeah. Well, often the gig economy is talked about as an opportunity. And again, as a matter of respect, we wouldn't for any reason or any purpose just dismiss that. If you can't get a job in the traditional economy, you're going to take any means you can to get some cash. And the gig economy appears to provide that cash in the short term.
But the price is, you have to inhabit a data territory, which is the platform, the Uber platform, the Lyft platform. I was just in a Lyft yesterday when the train broke down. I had to get the Lyft between Wilmington, in Delaware, and down to D.C. That was the only way to get there. A Lyft driver met us, and he made it very clear to the four of us that it would be better if we did the journey off Lyft, because otherwise Lyft was going to take 42% of the price, which was driven by surge pricing, because of all these people trapped on a train, that Lyft was able to extract from that situation, extract from us and extract from him. And it became pretty clear that if he went off the platform and we paid him in cash, less than we were going to pay Lyft, he was going to make more.
So on the face of this, it seems a good deal. But because Lyft and Uber and all these others, Airbnb, you name it, are data territories, they give absolute control.
to the controller of the territory, which is not the same as the old-style manager of the taxi or the cab company, of course. This is people sitting in Silicon Valley or in Beijing or Shanghai who have absolutely no knowledge of the territories, the worlds that they're intervening in when they get that platform going. So the gig economy is really ambiguous. We're not saying it doesn't have benefits in conditions of huge poverty. And
maybe Ulysses will want to pick up on this. There are many examples where, on the face of it, there's desperate cash needed in the global south, but the actual price you have to pay to get that cash is super high. And we need to be ready to address this, because the gig platforms will not tell you about the hidden costs of their economy. Yes, I think, you know, the gig economy is a very clear example of the third X in the four X model, which is exploitation. It is a platform designed to exploit algorithmically. It can continuously decrease the wages of the people who work for the gig economy, and it can increase the prices for the users.
That's just one aspect. Another aspect is that with all forms of labor and these new forms of data colonialism, there is an incredible amount of surveillance happening
Again, these data territories are engineered so that managers have a god's-eye view of their workers and can see and track every single interaction and action that the workers perform.
We can also see very clearly, for instance, in the way that AI is being introduced at all levels of these labor relationships. So these days, for instance, AI systems can interview applicants for jobs.
And of course, they present it as something good and positive, because it will allow HR departments to conduct many more interviews than they could possibly conduct with human beings alone. But of course, as we know by now, AI has certain biases, and so we have to wonder about the way those biases are introduced into the interview process. From there, once the worker is hired,
Again, AI might be involved in monitoring and tracking workers in ways that we couldn't imagine even just 10 years ago. So that by now we have AI systems that can
categorize workers, label them in different ways, and then suggest to managers that this category of workers is not really performing up to par. And guess what? If you use the services of this AI company, you can actually easily replace these workers with AI. So the gig economy, and indeed the economy at large, is becoming datafied in a way that should really concern us, because these are all ways in which the power of workers to exercise their rights is being diminished and their wages are continuously being driven down. So there's a lot of concern about how technology is being applied in ways that don't really benefit workers.
And this is such a great example. The gig economy is one of those easy areas to focus in on as a data territory and ask what's being datafied, right? And I was wondering if you could talk about some other areas, maybe education. That's one area that we may not think about as a data territory or something that's being datafied, both for the good and for the bad. Yes. Well, I mean, the obvious entry point is that we all have horror stories, if we're in the teaching profession, about the way in which ChatGPT has completely disrupted the process of learning, what we think about academic integrity, and just the amount of time and energy we have to spend in trying to manage how we deal with this new tool.
Of course, we are told that we should learn how to integrate this wonderful new tool into our curriculums because it's here to stay. At the same time, I am concerned that students feel that they can just go to ChatGPT and generate an answer to a question or generate an essay and maybe tweak it a little bit.
But studies being done now show, surprisingly, that when students use ChatGPT to generate answers, they are not really gaining any new knowledge. These studies compare students who use ChatGPT with students who don't and have to do the work themselves. The latter might struggle more, but at the end of the day, they are able to recall the knowledge the following week, whereas the ChatGPT group is not.
So that's just one area in which education is being datafied, but I'm sure Nick can mention others as well. Well, those are really valid points about ChatGPT, because what's really going on, you know, we think about AI as a technology. We're told this is the great tech, this is the future we have to hold on to, because tech is always good. But it's also a redefinition of what knowledge is and what expertise is. And that's a much more subtle thing. That's what teachers care about. That's what parents care about. But it's literally being redefined without asking them what they want. And that's where we get back to exterminate. Often when we do our talks, exterminate is the hardest level of the data colonialism game for people to understand and hold on to, because, as Ulysses stressed, we are never going to say in our books that what's going on here is as violent as the unimaginable levels of violence of historical colonists. Of course, our relations to our devices are more peaceful than that.
Sometimes there are things which are pretty close to violence, and maybe we'll come back to an example of it later on. We've been hinting at them already with the gig economy. But if we just stretch our definition of exterminate in a realistic way just a little bit more to say erasure of previous ways of living,
So you can't go back to the way you used to do things. That is pretty close to what's going on with generative AI. A lot of teachers are up in arms at the moment, because OpenAI is offering them a package and saying, this is how you use ChatGPT in the classroom. But the very clear implication is: if you don't use ChatGPT in the classroom, your competitor will, and they'll be cooler, more advanced, more modern, and so on. And teachers are being cut out of that discussion. But it's important to remember that this edtech thing is actually at least 10 years old and is dominated by the same big tech companies, Google, Microsoft, Apple, and many others funded by big tech,
which have been setting up platforms, so-called edtech platforms, where every school assignment, every class register, the performance of every kid, and of course the performance of teachers too, is being monitored by that platform. And it's a data territory, on terms that the students and the teachers, let alone the parents, absolutely do not control. This has been going on for at least 10 years. Gen AI is, if you like, the icing on the cake for the companies. This is where their power really makes a difference, so they can just impose this stuff.
They couldn't have done that without the previous 10 years of getting us used to doing stuff on edtech platforms. So it underlines the point that I think we're making all the way through this new book: what seems super new and exciting about generative AI, and what we're told is profoundly new, is actually just the latest opportunity that data colonialism started to open up already 10 or 20 years ago. But it is only an opportunity because they have acquired so much power over us, over our habits, over our mindsets, and so on. And that gives us a sense of the scale of what we need to be resisting here.
And that's such a great point, especially in the education space: the predictive analytics that these systems produce will basically tell you whether a student is going to be successful in a class or a major, right? And they then try to steer that student in a different direction. And that gets to that idea of the bias that's built into the system, right? Predictive analytics may predict an outcome, but it may not know how that student can succeed. So it's very fascinating to think about that as a data territory, especially for those who are in education. But as we think about historical colonialism, there's always a civilizing mission. How does that connect to your theory of data colonialism? Yeah.
Well, just like in the past with historic colonialism, when it was being deployed, of course there was always a narrative that accompanied it because it was such a brutal system and it was so clear that the system was meant to benefit only one group of people.
that, of course, it had to be packaged in terminology that made it seem okay to introduce this new social system to the world and not have people resist it very vigorously. So those narratives were always very important. And you're right, Michael, to point out that they were always civilizational. They were presented as: this is what you have to do if you want to become modern, if you want to experience progress. Of course, the narrative in colonialism was that certain parts of the world were uncivilized, were inhabited by barbarians.
And so white Europeans were actually doing a good thing in colonizing them, right, in bringing modernity and progress and universal ideals that were going to revolutionize and improve everybody's lives. So this idea of progress, this idea of, in spiritual terms, also salvation.
that if you accepted colonialism, your soul would be saved because it became packaged, again, with a very specific religious worldview.
Today, we don't have those exact same narratives, but we have very similar ones. And they function in the same way. They fulfill the same function, which is to make this system, which again only benefits a very small group of people, and which comes with a lot of physical and symbolic violence, acceptable, so that we actually feel that, okay,
firstly, there's no other way, no way to avoid it; and secondly, it's going to be for our own good. So those narratives include things like convenience. Why should we accept this? Because it's going to make our lives easier.
It comes with other narratives such as communication. Why should we accept this? Because it enables new ways of democratic communication that bring the whole world together.
Of course, if you're listening to this, you might already be coming up with examples, many examples of ways in which this actually hasn't worked out as advertised. Misinformation and disinformation have basically played havoc with the ways that we used to communicate.
But we're also told, as part of these narratives, that we should do this because it's innovation. Again, it's progress, it's modernity. Look at AI: if you embrace AI, AI is going to be smarter than us. It's going to be able to solve all of our problems. So why wouldn't we want to surrender our data to AI and participate in the construction of this machine? Because, again, it's presented as something that belongs to all of us, that we all participate in. But we know, just by looking at where the power is accruing and where the wealth is accruing at the end of the day, that this is not a system that benefits all of us equally.
And that's where we get to a very interesting hidden continuity, because, just extending what Ulysses said a little, one of the key civilizing narratives common to the current age and the older colonialism is science. I mean, the idea of European civilization from the beginning was: we've got the knowledge, we have better weapons than you, we have a better understanding of how you make profit from agriculture because we have a better botany, and so on and so forth. But that's very much a retrospective myth. As historians have found out, botany actually very often emerged by depending on indigenous peoples' knowledge about plants and their medicinal properties. That knowledge was literally taken from them, repackaged as science, and then imposed back on them as the science they had to comply with, literally erasing their previous knowledge. So there's a pattern there of a sort of civilization that imposes science, but at a very high price for those who had other ways of doing science. Now, just to stress, Ulysses and I are obviously not against science as such.
But there's an endless amount of historical authority to say that science has always been entangled with that history of colonial power. How could it be separate? And the links are quite precise. That leads to a very interesting parallel with today. As Ulysses said, we think of AI as science, as scientific. These are the best brains on the planet; we've just learned, last month, that winners of Nobel Prizes are behind the stuff that we pick up when we use ChatGPT with our kids, right? So there's somehow this deep underpinning in science. But let's just think about the phrase artificial intelligence, AI. One of the hidden secrets of AI is that artificial doesn't mean artificial. An AI, a mathematical program for predicting results across trillions of data points, is not human. So it doesn't know what's a car and what is an orange.
Actually, human beings have to train that algorithm to say this is a car, that is not an orange, and many more subtle things. And this really makes a difference when what the AI is being taught is that this is an obscene image of a man abusing a woman, and this isn't. They have to be trained to know that. How are they trained? By human beings, real human beings, spending every day of their working life
looking at this stuff, the stuff perhaps that you or I would never want to see and never want our kids ever to see. But someone, as it were, has to clean the AI so it doesn't reproduce that stuff on our social media. And the same problems are happening with ChatGPT. We don't want obscene stuff coming in when we ask an apparently interesting, innocent question of ChatGPT. That stuff has to be taken out of the data set. Who does it? Human beings. And this is where we get back to the deep link with historical colonialism. Those people, generally, when the work gets really nasty, are not in the global north. They're not in the US. They're not in the UK. They're in the global south, because those people can be accessed through a data territory that spans the planet, and it can pay people the lowest possible wage to do the worst possible work
that trains the AI, so that we can comfortably sit back in our apartments and play with ChatGPT. So there's a deep parallel between that colonial side of botany and this hidden secret of today's AI, which is not in any sense clean and is not in any sense artificial or free from human labor. It's just as entangled and, in a way, deceptive a story as the historical civilizational story was. But the problem is we are now reproducing it. So unless we have that colonial frame, we can't see those parallels. And that's why the colonial frame makes such a difference to the way we're living our lives today.
So what were some of the early warning signs about, say, AI or other technology as it was coming up through history that we should be aware of? Well, I think one of the key things was some amazing work, which is well known if you're working in the data industry but not so widely known outside it: the work around the hidden costs of generative AI technology. There is a researcher who was born in Ethiopia and trained in data science in the US, Timnit Gebru,
who was a brilliant data scientist, so brilliant that she became Google's chief ethicist. She became the lead ethics advisor on programs that at the time none of the rest of us knew about, but which became Google's version of Gen AI. And around 2020, she had the temerity, with other academics, to publish a paper
called Stochastic Parrots, with a lovely little parrot emoji in the title. Stochastic just means based on probability. The idea of her paper was that AI just reproduces what's in the data set. That's the other side of what we were just saying, that humans have to get the data set right: AI only reproduces what's in the data set. So if something's not in there, it can't, as it were, know it. In other words, AI isn't knowledge.
She wrote an incredible paper which was really clear about this, and also clear about the huge environmental costs, the data centers and the electricity they use, as well as all the biases, as Ulises was saying, built into the AI that are endemic to this thing we now know as Gen AI, and that we take in the sugary form of ChatGPT. She got sacked by Google for saying that.
And she's a really important voice that we should be listening to. We were delighted to have her endorse our book. She was really one of the first people to see where this was going, and how unpalatable that was to senior management in big tech, so much so that they had to sack one of their top advisors.
But, you know, if we really think about it, these warning signs have been there for a very long time. So as Nick pointed out, you know, if we think about the history of science and technology during the colonial period,
Again, we're not saying that science is bad altogether, but we do have to be mindful of this history: in order for colonialism to work, it needed to develop certain technologies and certain views of science whose purpose, at the end of the day, was to manage and control the colonies. There's this problem of management at a distance: how do you manage half of the world from London without a new set of technologies and a new set of scientific assumptions? So these warning signs have been there from the beginning of colonial science and technology. And one way in which we talk about it frequently is by looking at the history
of what we call the concept of cheap. So let me explain a little what we mean by that. In order to colonize the world, colonizers had to view nature as something cheap: cheap nature. Land was cheap, territory was cheap; it was just there for the taking. Colonizers would arrive in remote parts of the world, like Australia, Latin America, Africa, and say, look at all of this land, nobody's living on it. Of course, they knew that some people were living on it, but those people were not considered fully civilized. So then it was okay to think of all of this nature, all of this land, as something cheap that the colonizer could appropriate. Now, the next step was to have not just cheap nature but cheap labor. Cheap labor was essential to transform nature into wealth, into something profitable.
And of course, in colonialism, which was a racialized system of labor, the labor of certain groups of people, Black people, people of color, women, was deemed cheap. But again, it was just there for the taking. So let's follow these warning signs: we have cheap nature, cheap labor, and now, of course, we have cheap data.
And if you think about it, cheap data shares many of the same characteristics as cheap nature and cheap labor. It is said to be abundant: there's lots of data, right? We're generating trillions of gigabytes of data. It's said to be free, because this data supposedly doesn't really belong to anybody. Yes, you and I produce it individually, but then we need big tech to aggregate it, to bring it all together and do something useful with it, which requires a lot of advanced technological investment and a lot of energy, which obviously has an impact on the environment. But again, we're told that our data is cheap, that it doesn't have a value, so we should surrender it to this system so that big tech can do something useful with it. So we've had these warning signs for a while, if we think about this progression from cheap nature, to cheap labor, to now cheap data.
And so you've hit on some important points here: workers in the global north and global south, and who is managing these colonies. So who are the members of this new data colonial class?
Yes. Well, in the book we wanted to come up with an umbrella term for all of that, because it's easy to just talk about big tech. They are the ones we can all identify. But there are lots of other players in this ecosystem who are participating. So we call the group the social quantification sector.
So the industry sector that's involved in quantifying our social lives. So yes, we have big tech, Amazon, Google, Facebook, Microsoft.
But then we also have lots of other players, Palantir, the manufacturers of all of the smart devices, whether they are phones or refrigerators or cars. So we try to sort of encompass, you know, all of them under this umbrella term of the social quantification sector.
There are players in the sector that don't really manufacture anything; all they do is take data, repackage it, and sell it to somebody else. So data brokers are, of course, a big part of this system. Yes, certain people might play more prominent roles in this sector than others, but we were trying to come up with a descriptive term to encompass all of it.
And in a sense, we're all a little bit involved, because we're all generating that data. And it's becoming the norm. Suppose you're setting up a new website at the moment, or some platform, because you just want to sell sandwiches, right? Or you want to sell yoga teaching, or whatever it might be. The pressure to set up a platform that extracts data from your customers is huge, because that's the norm. So a lot of people are getting into that as just the way you do business. It's becoming normalized, just as colonial methods of rule and extraction became normalized in the colonies. They were the way the economy ran.
That leads to sectors like insurance, which we were already mentioning, doing data extraction as their basic means of making money. Insurers don't actually take any risks anymore; they just surveil the hell out of us so that the risk is reduced to the minimum. And yet they still call it insurance. There are individuals as well, though, just as there were colonizers who went out on their own, taking higher risks, to grab the land and do the original exploring. There are parallels to that in the contemporary era; just to give you two. First, there's the individual, a very smart computer engineer called Hoan Ton-That, who founded Clearview AI. Clearview AI was exposed by the New York Times a few years ago.
Because he basically took all the pictures of all of us on Facebook, put them into a massive database, added some metadata and so on, and sold the analysis of facial patterns linked to all the other personal data. He sold it to police forces right across the US. He did it entirely without permission. In some states that was regarded as illegal, but he's also been treated as a hero in various countries. He's received a lot of investment from not very friendly states around the world, because this is a colonial tool that is massively useful to certain repressive governments. But I think it's not too much of a stretch to see Elon Musk as one of these lone entrepreneurs. On the face of it, he represents the official face of big business. He's the richest man in the world. He owns a number of vast companies. By all accounts, he is rather good at engineering.
But at the same time, he's doing some things which fit extremely well into the colonial pattern. Through his Starlink system, he is providing satellite coverage to parts of the world that don't otherwise have much coverage, such as the Amazon. So the loggers who are cutting down parts of the Amazon, with huge environmental costs, often get their internet signal courtesy of Elon Musk.
And I could go on. There are many things he's doing that fit very clearly into the idea of a colonial class. He's, if you like, an apex predator within that colonial class. Well, with that, how can we as workers, citizens, and activists resist? I could kick off perhaps with a simple point, and Ulises will add a lot more detail here: the first thing we need to do is imagine.
No major resistance in history, including resistance to colonialism, has happened without people holding on, however difficult it gets, to the possibility of imagining the world could be different from the world that's on offer to them.
And that's the first thing we have to do: hold on to the power of imagining. In a way, that's what Timnit Gebru was doing when, even though she was employed by Google and they were investing hundreds of billions of dollars in what became Gen AI, she insisted, no, we could imagine a world without this. The costs are high. Let's hold on to the idea that the world could be different.
And in an interview she did with Time magazine just a year after she got sacked by Google, she said a really powerful thing that maybe we could hold on to as we think about imagining. She said, why not let the people who are being directly harmed by tech imagine the future that they want?
Why would we silence them? Why would we say you haven't got a voice in this? Well, we know why, because there are trillions of dollars of profit to be made from silencing those very voices. And she herself paid the cost of that. But those are exactly the sorts of voices we need to listen to. That hasn't been happening until now, but we really hope it will. We need to listen to different voices, different organizations, different communities that do hold on to the idea of a different future.
Yes, and I think it is crucial that we resist. I mean, we have been enumerating some of the social economic impacts of data colonialism. We haven't talked too much about the environmental impacts, but in a way those are the ones that are very clear to see. And there are people doing amazing work already in trying to expose what happens when you build data centers.
in areas that already have problems with water supplies and energy supplies. There are already people pointing out that in order to satisfy the energy needs of artificial intelligence, we will need an unimaginable amount of energy that we currently don't have.
So this means instead of retiring carbon-based energy plants, we need to prolong their life, build more of them and perhaps start building nuclear reactors to provide the energy that we will need. So if we don't resist, very clearly this is part of a trajectory that is doing a lot of damage to the environment and it will get worse.
Now, how do we resist? That's the question that Nick and I have obviously tried to think about a lot. If we are right that these problems are part of a 500-year-old history of colonialism, what are the chances that we're going to be able to solve them very quickly and effectively? Obviously, these are great challenges that we face. And so, as Nick said, imagination is going to be a big part of this resistance movement. We actually look at activists from the colonized world, from former colonies, activists from the global south who have a lot of experience resisting colonialism, and we've tried to learn from them how to approach these big challenges. And what they tell us is that we must work simultaneously at three different levels. We must work within the system,
We must work against the system and we must also work beyond the system. Just focusing on one area is not going to be enough, so we have to do all three. What do we mean by these different approaches? Well, of course, within the system means working in mainstream politics. We need to put pressure on our governments to pass the kinds of regulations that are going to benefit us, not big corporations.
Now, the prospects for doing that in the next four years in the United States look very grim, right? Because we're going to have an administration that couldn't care less about people and that wants to, you know, reward big corporations. But at the state level, there's still a lot that we can do, and that we must do, to resist. So that's working within the system. But of course the system's not going to fix itself, so we must also work against it, through all sorts of protests. In the book we cover lots of examples, and there are many we don't cover, because fortunately what we're seeing is an emerging movement, a global movement of resistance. And here again we can learn a lot from past decolonial movements.
And then, to go back to Nick's original point about imagination, the last level of resistance is working beyond the system. How do we imagine alternatives to all of this? How do we decolonize the time colonized by extractivist technologies? How do we decolonize the space?
occupied by cameras, by smart agents that have interfered in our most personal intimate spaces. How might we even decolonize love?
Even Eric Schmidt, the former CEO of Google, is worried about the impact that AI girlfriends might have on teenagers. So we will have to just imagine all ways of decolonizing our data, because at the end of the day, this is a movement for also undoing the injustices of colonialism at large.
Well, Ulises and Nick, I've taken up a lot of your time today. As we wrap up our conversation, I'm wondering, now that you have the book out, what are you working on now? Are you continuing to explore data issues, or are you heading in new directions with your research? Okay. Well, I'm working on AI. I've actually just published a solo book on the future of social media, which is in a way just a local problem within data colonialism, so I think you can guess where I'm coming from there. I think we need to dismantle and rebuild social media. That book comes out in the US on December 10, called The Space of the World: Can Human Solidarity Survive Social Media and What If It Can't?
But for the longer term, I'm also very much thinking about AI and its impact in the education sphere and other spheres because I really believe there's a heist going on over all the key elements of the social world. Above all, the definition of knowledge.
You don't get much more fundamental than that in terms of how a society holds together. And just as with colonialism, it's being redefined, and it's being redefined without anyone asking what we think about it. And for me, I don't know what Ulises feels, but we've worked for nearly a decade now on colonialism. And although at first it started like a voyage with some headwinds and some risks attached, and it took some courage to go down this path, because it wasn't what most people were saying, ten years on, I'm ready to double down. I think it's ever more clear that the colonial framework is the only one big enough to make sense of what's going on, both with AI and with everything else we've already talked about. So I'll be continuing down that path, whether solo or with Ulises.
And why don't you mention Tierra Común, Nick, and what we're doing there? Yeah, together we formed Tierra Común, a network started during the pandemic, for activists in Latin America especially. Spanish is obviously Ulises's native language, and I've learned Spanish too. It's a trilingual website in Spanish, Portuguese, and English, English being the third language, for those who want to resist data colonialism. It's called Tierra Común, it's easy to find online, and both of us are still involved in keeping it going. So it's not just about writing books. Books can never be enough. It has to be about practical networks of resistance.
Yeah, we're very fortunate that we have been able to form these networks along with some wonderful colleagues like Paola Ricaurte in Mexico. By now Tierra Común has over 100 members. Myself, I am working on a handbook on critical data studies with two amazing colleagues, Jasmine Magnilli and Milagros Miceli. That should be coming out in 2026, I believe.
And more broadly, I'm interested in the issue of education as well, because I do think that part of the solution to these problems is to rethink how we educate the scientists and the engineers, but also how we educate the policymakers and the artists to think about these problems. Of course, I'm an educator professionally, so I have something at stake in these questions, but I would like to keep thinking about how decolonizing data will require us to change our curriculums, to change the way even education is structured.
Well, these sound like really interesting projects, both the academic projects as well as the activist projects. I have to tell you, I really enjoyed our conversation. And as I shared before we started, I've already recommended this book to a number of different colleagues. I really want to thank you so much for joining us today. Thank you so much, Michael. Great to have the opportunity. I'm your host, Michael Amagna, and thank you for listening to The New Books Network.