Welcome to the New Books Network. We have the pleasure of welcoming Professor Matthew Mahmoudi to present his new book, Migrants in the Digital Periphery: New Urban Frontiers of Control, published in 2025 by the University of California Press. Matt Mahmoudi is an assistant professor at Cambridge University in the UK.
My name is Dr. Hannah Poole, senior researcher at the Max Planck Institute for the Study of Societies. Welcome, Professor Mahmoudi. Thank you. Thank you for having me. Matt, you just published a book on a topic that could not be more current. But when was the first time that you ever thought about writing this book about migrants in the digital periphery? Oh, man. Yeah.
I suppose I was starting to look at this moment back in 2016, 2017 onwards. Well, really, I guess towards the end of 2015, when the response to the quote-unquote Syrian refugee crisis was starting to roll around, and we started to see a slew of responses across the European Union in particular. And it was quite interesting to me at the time to see the seemingly unbiased, highly, let's say, invested response from the tech sector, and in particular the response we were starting to see from startups founded either by former humanitarians or by people who were themselves quote-unquote startup bros, and the ways in which they were channeling their energies towards developing quote-unquote solutions for the refugee problem, whether it was for integration or for work matching or for identification or for housing matching. Inevitably, a lot of them ended up being quite focused on the high-tech sector. So it would be things like blockchain-based identity systems, biometrics, and so on.
And it was interesting because a lot of it was galvanizing attention from the big players and the venture capitalists, the Googles and the Facebooks of the world. And this was happening at the same time as the first Trump administration came into power, in which a slightly more subtle but fairly present relationship was forming, in which security tech developers, namely your Palantirs of the world, and others were getting involved with Immigration and Customs Enforcement in the United States, but usually under quite liberal guises. So it would often come under this veil of helping to find missing children, helping to find smugglers, and helping to do good, to do humanitarian things. And those relationships came to shape the relationships that cities and local authorities, cities in particular because that's where refugees and migrant communities go, were entering into with tech companies that were, again, ostensibly purporting to serve migrant communities, but often doing so to the effect, as we later learned, of surveillance, detention, deportation, and family separations. And what I was interested in at the time, and became interested in more and more, were the ways in which
a lot of these technologies were not at all interested in or concerned with the ways in which refugees, undocumented communities, precarious communities on the move, or even missing children were engaging with or being identified by these technologies, but were far more interested in the kind of speculative capital and opportunities that came from being involved in government efforts to find and detain and deport and separate refugees and undocumented communities, and also in making a buck on the back of
the symbolism of them existing in the first place. But yeah, that's roughly what got me to this at a very different time. And before we really delve deeper into your book, maybe to clarify some terms, what is this digital periphery and how does it act? Yeah. So I started to think more broadly and I should say, I have an interest in understanding the
contemporary manifestations of racial capitalism. And what I mean by racial capitalism is really steeped in the Black radical tradition, drawing on folks like Cedric Robinson and his definition of capitalism as being necessarily racial. I have to back it up to this point in order for this to make sense. But in a nutshell, Cedric Robinson looks at how modes of subjugation, including the modes of subjugation that enable racialism to perpetuate and continue from feudalism into the emergence of capitalism, continue to exist and endure: they're not undone and, as it were, revolted into a different state past feudalism and into capitalism; in fact, they're innovated. And so he thinks of capitalism, you know, as opposed to a lot of Marxian thinkers, as an evolution of feudalism. And in so saying, he is really saying: well, look, capitalism is going to continue to find creative and new ways of exploiting race to position people in different economic hierarchies for their exploitation. And in the same way, then, with the digital periphery, I'm interested in understanding the ways in which
conventionally geographically situated forms of extraction, of colonialism, of racial exploitation play out in ways that both correspond to those geographic distributions and also transcend them quite radically. So instead of being concerned now with, say, the periphery, quote unquote, or the Global South or the underdeveloped world or the lesser developed world, or the colonial subjects, as it were, tech and tech actors, including governments that over-stamped them, are interested in the ways in which the Black, refugee, marginalized, minoritized, quote-unquote terrorist, Palestinian, and other communities are situated within this sort of spectrum of threats: either as needy humanitarian subjects looking to have their needs met through necessarily technological interventions, or as direct security threats that need to be disciplined. And so the digital periphery is really no longer concerned with, again, their geographic boundedness, but exists on terms that have been defined as coalescing around
both their identities as well as the general milieus that they walk in. So I talk about the digital periphery as operating in essentially three ways. First, I talk about how the digital periphery is concerned with techno-governmental ties: the ways in which tech companies and governments work together symbiotically to reinforce one another in the exchange of symbolic capital and racial capital. Second, I think about techno-space and the ways in which spaces in particular are utilized towards the instrumentalization of racial tropes and racial environments, for example thinking about the urban outcasts, the places in which poor communities live, and how those spaces are usually in some way colonized through technology. And third, thinking also about techno-development as a starting point, which is the way in which, again, the image of the humanitarian other, or of the threat, is utilized by tech companies together with governmental actors to create the rationale for this interjection and impartment of technological development or exploitation, in whatever way.
Thank you so much also for already guiding us into the theory of your book. Maybe before we dive into the actual sections of the book: how would you describe your book's essence? What precisely did you want to reveal to your audience? I was concerned, at the time, with the ways in which a largely liberal veneer, under the guise of the quote-unquote refugees welcome movement, was being situated as an almost mainstream progressive undertaking, but still a mainstream undertaking that created the pathway towards the insertion of tech actors into the process of migration governance, and therefore, by extension, into the process of borderization.
So it's no longer about, you know, governments drawing the lines between where the nation state's borders are. It's now also about where a tech company thinks the border should be. Should it be at the borders that are conventionally associated with a nation state? Should it be literally around our bodies or around particular communities? Or, in the case of refugees and precarious migrant communities, should it be tied as an elastic band that tightens subject to how precarious your immigration status is, ready to snap you back to wherever they demand you should be? And so to me, very, very specifically, again, it was the insertion of tech actors into the business of migration governance that drew me to write and think with
a lot of the activist communities, a lot of the communities that were subjected to these circumstances themselves, and some of the more critical folks around these issues, and to get to grips with it in the book. Thank you. And can you describe this actual process that you observed, in which migrants become experimental sites in the pursuit of racial capital? What is this process? So we've seen this before, right? We've seen it before in the Naupos colony and, for example, under British colonial rule in India, whether it was through census experiments. We've seen it before at Ellis Island, when
vagrants and migrants would arrive at the port and have their bodies stamped on with ink by people who were assessing their various quote-unquote defects, largely drawing on eugenicist lines of thinking there. We've had it happen through vaccination trials and forced attempts at medical experimentation with Black communities in the United States. And we've seen it done in refugee camps in
Jordan, in Azraq, in Zaatari, where in particular biometric experiments have been deployed to try and make sense of whether blockchain and biometric tools can be used together to lower inefficiencies involved with
refugees trying to access food and services and shelter. Now, what's interesting to me are the ways in which no longer are we bound to those particular areas and to those particular sites, the ways in which
technological experimentation on the back of a precarious status alone is enough to make the refugee camp as an iconography endure beyond the camp itself. So increasingly, we see people who are subjected to camp-like conditions by virtue of the tools they're told to access services through, the tools they're told to use for identification, the tools that are increasingly being iterated in their names, even if those tools never end up being useful to them, or if they sometimes end up exploiting them, or if, as is mostly the case, money ends up being made off the back of using their names. So to me, becoming an experimental site in this way, disaggregated from a particular geographical perspective,
is a new development, made possible by the fact that the digital periphery can be defined by tech actors and governmental actors, and can be defined without the involvement or intervention of those whom the tools otherwise pertain to. After setting the scene through the theory, but also the context in which you wrote your book, tell us about your research, what you did. You focused on two sites, New York and Berlin. How come, and
what were the environments in which your research took place? Yeah, good question. So in the context of Berlin, I was looking at this movement, this refugees welcome, and I say movement very loosely; I really mean this narrative that was emerging across the European Union and that was particularly strong in Germany, which had taken in at the time, you know, nearly 900,000 refugees, or so the claim was at the time. And under the banner of Flüchtlinge Willkommen (Refugees Welcome), there was a lot of movement from small startups and organizations that were claiming to do things for refugees and make things easier, et cetera, that others, you know, like governments, were allegedly failing to do. This included things that I mentioned previously: attempts at blockchain-based identity to enable access to banking, which is one of the many initiatives that ultimately never really got used by a refugee, despite winning funding for it, and despite, you know, the founders then ending up getting consultancy gigs with big organizations, including MasterCard.
It included tools that were set up to help match refugees with work, with housing, et cetera. Again, many of these relied on metrics that were profoundly flawed, that never indicated any real usage, or at least hid the real usage by refugees themselves, but they were winning money from Salesforce, from Google, and from Facebook at the time. And so to me, Berlin became interesting because, on the one hand, you had a huge movement from the tech sector to mobilize on this moment. And on the other hand, you also had real mobilization from local communities and anchor communities who were taking in people and were working with them on trying to contest the otherwise dominant policies of the time, and trying to assert what the needs and desires and the sort of futures that newcomers saw themselves in looked like, where otherwise tech actors, as well as governmental agencies, were quite sweeping in defining what those futures could look like. And whilst all of this came from a quite otherwise humanitarian angle,
what I saw in New York City was the way in which smart city interventions that purported to make it easier for the city to assert its status as a sanctuary city, by making it easier for refugees and undocumented communities to access city services, were actually steeped in a security paradigm that involved surveillance of these very communities, and in which there was a growing fear, amongst immigrant communities in particular, that those being asked to use these tools were also the very communities at highest risk of potentially being caught and identified by Immigration and Customs Enforcement, which as a federal agency could undercut the city's sanctuary status. And so tech convenience starts to play a role in weaponizing urban space for immigration enforcement against New Yorkers. And so the difference between the two became really interesting to me. On the one hand, you had the state reinforcing the market,
And that would be in the case of Berlin, in which a lot of the sort of integration initiatives and refugee startups were given a lot of free rein by the government and were also situated as, you know, being cutting edge, and they won several awards. There was one particular initiative that was also visited by the British monarch and given an award. And on the other hand, in New York City, we were sort of watching the market reinforce the state, in which the tools being provided by Silicon Valley were enabling the federal government to exact its immigration regime on some of the most marginalized communities. So that's really the inception point. And that's really why these two cities became quite important: because of those differences, and yet the sort of similar outcome of
weaponization and commodification of refugee subjectivities in the case of Berlin, and of refugee environments in the case of New York City. And what was your role within your own ethnography? What was your positionality like? And did it differ between New York and Berlin?
So another really great question. So I myself have some reflection on the ways in which my personal story ties together with the experiences of some of the communities that I was talking to, working with, thinking with in Berlin and New York City.
And that in turn has a bearing, of course, on my chosen methods as well. I come from a background of parents who were refugees. I grew up in what, in the place where I was born and raised in Denmark, was referred to, and is still referred to, as a ghetto. And we were largely having to navigate governmental attempts that were overtly about integration, but which more, sort of, I guess, innocuously or subtly were, I think, an attempt at segregation, at weeding particular communities out from other communities. And often that came either in the form of, you know, technocratic, bureaucratic barriers, or in the form of what seemed like convenient fixes but which then landed people in profound states of immobility and of not being seen or heard appropriately: red-flagging, and systems that made decisions about us, about who we were and what we could have access to, on the basis of, again, a website, rigid tools, and rigid understandings of culture.
And so to me, some of those legacies played out in the ways in which, you know, newcomer communities in New York City were experiencing coming to the United States or in the ways in which newcomer communities were experiencing not being given much of a voice or say in their own destinies in Berlin.
And so, to me, deploying a semi-ethnographic approach, at least to engaging with newcomer communities directly through their everyday environments, whether that in the case of New York involved being directly engaged in a resettlement organization and working with some of the caseworkers there, who themselves also came from the communities concerned, or whether it was through the sharing house that I was a part of volunteering with in Berlin, became important as a way to situate myself in close community with the communities whose experience of the sharpest edge of these technological interventions I was trying to understand. Because part of my methodology is thinking about the technology not just as the lines of code written behind a computer screen, but about the technology itself phenomenologically, as how it is experienced. The technology itself doesn't make sense outside of a socio-technical analysis. And so to me, it became really important to trace the contours of every intervention by speaking directly with the communities and practitioners that were being forced to use these tools, or that were being forced to promote them, or that found them deeply problematic in different ways.
Are there people from the communities, the caseworkers, who really made you rethink everything you had thought until that moment? Is there someone you would like to introduce to us, someone who really helped you to reconsider what you had known until then? There were a couple of different incidents. One particular example stands out to me from Berlin, which is actually the way in which communities were organizing themselves using technologies, in ways that I thought might be beyond what was possible or useful,
especially in light of some of the risks associated with surveillance and big tech platforms. In particular, a group called Syrian Women in Berlin and Germany were involved with categorizing forms of knowledge that were directly applicable and useful for newcomer communities, by hashtags and by organizing posts within these closed groups. And what was interesting was that these groups were only, again, only for Syrian women. And what started happening, as a result of just the high quality and veracity of the information in these groups, was that a lot of the Syrian men, who were also looking for this information and getting false or inaccurate information from elsewhere and from a lot of the other tools out in the media ecosystem, were coming to the women that were a part of these groups and asking them for this information. And the women would then either consult those groups or would already have the information to hand, because of the incredibly rigorous, laborious work involved in categorizing and verifying the information in these groups.
And in a way, it became both a way of taking matters related to misinformation into their own hands as Syrian newcomers, and also, in other ways, it became about mitigating a lot of the gender inequities that exist within these communities and that exist, obviously, at large, but in ways, again, that typically any tech initiative I had come across in that particular ecosystem didn't necessarily reflect: these kinds of othered and differently configured epistemologies. And so I found that quite interesting: on the one hand, the ways in which spaces typically associated with surveillance and big tech were being appropriated in these useful ways, but were not receiving funding, were not, you know, novel enough, were not sexy enough. It's much more palatable to venture capitalists and to local authorities to be involved in producing an app that just sits there but has, you know, 80,000 page likes, than it is to invest in information organizing in a Facebook group, even if that Facebook group is attended by, you know, over 20,000 Syrian women.
So that's, on the one hand, Berlin. The other case is New York City. At the time, we'd started to hear a lot of discourse around an area-based approach to identity systems being one that could potentially yield inclusive outcomes, when it came to creating ways in which, say, community members with precarious immigration status could access government services with an alternative form of ID, without revealing information about who they were.
And so in particular, I was speaking with a caseworker in the resettlement organization that I was based at.
He had started to take active notes and to hear from his own clients about this one initiative, which was called IDNYC. It was really aimed at not being refugee-specific and not being specifically concerned with providing immigrants of a particular status with an identity, but it still acted as a pathway towards having something that proved who you were. At the time, all it really gave you was free access to a few museums. And the idea was that you could use it to interface with government, that they wouldn't automatically know about your immigration status, and that it could let you, as it were, just slide through the system, or through being stopped and frisked if you were approached. But it became quite apparent, through my engagements with both the caseworker as well as
through looking at the comments, the feedback, and the ways in which people were interacting with the IDNYC system more broadly, and through the news reports that started to come through at the time, that the people who were most likely to have an IDNYC card were also the people most likely to have precarious immigration status, and that Immigration and Customs Enforcement had clocked on to this. And so if they asked someone to produce an ID and that person produced an IDNYC card, they were often frisked and held back and subjected to further inquiry and interrogation, and in many cases detained. And so IDNYC itself became a proxy for identifying immigration status, which is a huge problem, right? Because it was supposed to be this great egalitarian project under sanctuary protection. And so that, to me, was another incident in which I thought, well, this isn't quite what I expected, because frankly I had, I guess, some amount of faith in the sanctuary city, both as a policy framework, as an ethical framework by which urban planning and policymaking took place, and also as a movement more broadly. And so I guess the IDNYC card was exciting to me in some ways. And to see the fallout from it on the other side was profoundly devastating.
Beyond that, can you give us a concrete example of how urban infrastructure serves a double function in the digital periphery? For instance, maybe the LinkNYC kiosks. Yeah, yeah. So I guess this was really my first foray, when I entered New York, into looking at the ways in which particularly innocuous-looking systems of urban infrastructure intersect with deportation raids. And if you go to New York City, and even in London and many other places increasingly, you might find yourself looking at these kiosks, these big screens that tell you all sorts of local information. They might contain information about an archive related to, say, Black New Yorkers in the particular area within which the kiosk is situated. They might tell you about the immigration histories of particular
areas within which the kiosk is located. I should describe the kiosk itself: again, it's a massive, massive screen. On the front of the screen, under a black panel, there are several cameras. On the side of the screen, there are keypads as well as a smaller screen and charging ports, so that you can plug in your phone and charge it. It was built originally with the idea that the whole range of people, from someone who is homeless and might need to recharge a device to a person working in the corporate sector who might also be on the go and need to recharge, could be served by these devices, and that, equally, these screens would emit public Wi-Fi so that anyone in the area could access the internet, even if they had a low data plan or didn't have one whatsoever.
These are plastered all over the city. At the time, there were thousands of them, and they were speaking of expanding to about 8,000. This was in 2018. And what became a source of worry for me initially, through my interactions with the makers behind the technology, through doing a little bit of corporate sleuthing into the setup process of the Wi-Fi kiosks, and through speaking to immigration resettlement organizations, was a big concern about what essentially a daughter company of Alphabet, known as Sidewalk Labs, was doing in developing these kiosks in the context of New York City, and what protections were in place to ensure that large-scale sniffing of people's data wouldn't be taking place. And lo and behold, there was no real such protection. And in fact, shortly after I arrived, there was an incident
in which a homeless man who had smashed the cameras on the kiosks was accosted by LinkNYC and referred to the NYPD. And rather than seeking community consultations on how these kiosks should be built differently, and on whether they should be engaged in community surveillance to begin with, they framed it as a mental health issue and continued their expansion of the project.
Now, what became evident through, at the time, college students' interrogation of the code behind the LinkNYC kiosks was just the sheer amount of intimate information being gathered, not just from people's connection to the kiosk via Wi-Fi, but also from any nearby device that had its Bluetooth emitter on. That included personally identifiable information related not just to the phone and the device type, but also to the language, information coming through your phone, whether it was text messages or browsing history, information that could actually be used to generate demographic information and even more targeted information about the people in that particular community. Information that would be important to someone posing as an advertising company and trying to deliver targeted ads, but also to an agency trying to figure out where undocumented community members might be.
And so if you start to overlay the spatial dimension of where these kiosks are with the over 300 deportation raids that started to occur during 2018 into 2019, you start to see that there is actually a fairly strong correlation between where the kiosks themselves are situated and where the deportation raids were taking place.
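To make the kind of spatial overlay described here a little more concrete, here is a minimal, purely illustrative Python sketch. It is not the analysis from the book or from Mahmoudi's research: the coordinates, counts, grid size, and the simple density correlation are all invented for demonstration. It just bins hypothetical kiosk and raid locations onto a coarse grid and correlates the two densities.

```python
# Illustrative sketch only: toy kiosk/raid locations, not real data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical point locations (longitude, latitude) inside a rough city bounding box.
kiosks = rng.uniform([-74.05, 40.60], [-73.85, 40.85], size=(2000, 2))
raids = rng.uniform([-74.05, 40.60], [-73.85, 40.85], size=(300, 2))

def grid_counts(points, bins=20):
    """Count how many points fall into each cell of a regular grid over the box."""
    counts, _, _ = np.histogram2d(
        points[:, 0], points[:, 1], bins=bins,
        range=[[-74.05, -73.85], [40.60, 40.85]],
    )
    return counts.ravel()

kiosk_density = grid_counts(kiosks)
raid_density = grid_counts(raids)

# A naive density correlation; a real analysis would control for population,
# policing intensity, and other confounders before drawing any conclusion.
r, p = pearsonr(kiosk_density, raid_density)
print(f"Correlation between kiosk density and raid density: r={r:.2f} (p={p:.3f})")
```

On real data, a positive and robust correlation of this kind is what would correspond to the overlap described in the interview.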
And again, I say this is in 2018, 2019, because further research that I've done since, in my researcher role within Amnesty, also shows how not just these surveillance kiosks but also facial recognition devices are concentrated around neighborhoods that have more Black and brown community members and non-white members. And so surveillance, we know, has in New York always followed, and continues to form walls around, in particular those communities that are marginalized and precarious. But what was particularly problematic about these kiosks was the way in which they situated themselves as being of the community, by the community. In fact,
the Mayor's Office of Immigrant Affairs was asking immigrant communities to use the 311 immigration services on these kiosks, directly available within their communities. But then these kiosks themselves were potentially leaking data and snitching on those very communities to Immigration and Customs Enforcement. So double-edged in the sense that, you know, you can purport to be a community Wi-Fi, a public Wi-Fi project, but also, on the other hand, you're expanding the surveillance dragnet that makes people available to some of the worst actors. Let's go a step further, from the urban infrastructure to the actual device.
You write about how apps themselves can snitch on you, activating, for instance, the process of deportation or the denial of health care. Can you describe this process of snitching that apps might be able to do? Yeah. At the time, shortly after the fieldwork actually, there were a number of investigative reports, in particular out of The Intercept and other media organizations, that started to show the ways in which data brokers played a huge role in the surveillance dragnet of Immigration and Customs Enforcement and of the Department of Homeland Security. Not least, for example, the ways in which data from even Muslim prayer apps and various faith-based apps was being channeled through data brokers, which basically process the data that is sold to them through apps and then sell it onwards, usually to advertising agencies that serve advertising on the apps or use that data to create more targeted advertising. But that same data was being sold off to federal governmental agencies that were looking to establish threat models for who and where might be a threat to national security in the United States. So those are, you know, again, very everyday, faith-based app systems that were effectively snitching on you.
We were also finding that even the web-based systems that precarious immigrant communities were asked to engage with, for example to book court appointments, were effectively telling Immigration and Customs Enforcement when their court date was taking place. And rather than treating that as a sign that a claim was being processed and seen to through the proper judicial channels, ICE was effectively using it as a signpost to show up at the particular court where people were mandated to have their hearing and to detain them for deportation, sometimes before a hearing could take place, sometimes after, but regardless of what outcome had been decided and of the status of the case at hand.
So I was being told at the time by lawyers that, because of the ways in which these things are algorithmically processed, because of the ways in which the courthouse has to put these dates into a particular system that is then triangulated with other information unavailable to the courts themselves, increasingly asylum seekers, and even people who were potentially DACA recipients, were stopping themselves from showing up, or even going to court at all to get their asylum case or their DACA case processed, for fear that they might be deported anyway. And this is, again, mind you, in the first Trump administration. So part of what we've seen in the last few weeks, with the detention of Mahmoud Khalil, amongst others, and the increasing threats to academics across the United States who have shown any solidarity with Palestine, irrespective of their fields of expertise, irrespective of the permanence of their visa, is just those same logics and those same functions taken to their most extreme.
And maybe this comes across as an obvious question, but I do wonder what surprised you the most when you were actually in the field, and then later in the moment of writing up your field data? Oh, there are so many things that I was surprised about. To be honest with you, I went into the field thinking that I was going to find at least a few examples of technology interventions that were used in ways that were slightly more positive and constructive than what I came out finding. And I guess at the time of carrying out the research, I didn't quite realize the lengths to which tech companies would go to weaponize and instrumentalize, in particular, the image and the symbol of the precarious, undocumented, marginalized other.
So to me, it's not that I went in thinking, I want to understand racial capitalism better. Let me look at these things and make my framework fit. It was a grounded approach, right? So to me, the framework of racial capitalism emerges only after seeing that at every turn,
there was no shying back from weaponizing the environments within which migrant communities walked, even if it was through subtle, well-meaning language. Right? Like when urban planners or tech developers talk about, you know, segregated communities or outcast communities or migrant communities, and they do it in a well-meaning, top-down, humanitarian way. And that somehow rationalizes and justifies their intervention into those communities in ways that I don't think they realize they're doing, with a particular way of using neocolonial language that situates those communities as subject to those interventions, again, without involving them. And that means that they think of those environments as being subject to the free rein and goodwill intervention of them, right? Of the very tech developers themselves. In the same way, I didn't expect to see the sort of brazen
jumping from a failed blockchain-based identity system, one that gathered funding specifically to take advantage of this moment of a refugee quote-unquote crisis, to those very developers going, well, you know, we're valorizing our failure, and we didn't manage to do anything with this app, but we can still advise MasterCard, who will undoubtedly then go on to what? Develop tools that are similarly afflicted with well-intentioned neocolonialism? That unapologetic lack of introspection from the tech sector, from the tech-cum-humanitarian sector, to me was just another liberal expression of the ways in which we've internalized modes of bordering. Because they're always about us looking either to serve the very particular needs that keep the marginalized other at arm's length, or to keep them at arm's length by virtue of them being a security threat. And so I suppose, yeah, really, the major surprise for me is just that
there are no bounds to how far techno-racial capital will go. I know that doesn't sound particularly specific nor particularly satisfying, but yeah, I'd say that's probably my best response to that question. Thank you for that reflection. You end your book with neo-Luddism. What does it mean, and what does it really mean in your context?
So people who know me know that I can't end a single conversation related to my subject area without bringing up the neo-Luddites. And I think about neo-Luddites in the way that Chellis Glendinning did in the 90s, when she wrote her notes towards a neo-Luddite manifesto. Basically, what she does in this work is reflect on the Manchester Luddites, who were a movement of textile workers increasingly subject to automation in the workplace. And so they picked up arms and smashed the window fronts of their factories and tried to stop the encroachment of automation into their lives, not because of an inherent ideological antagonism towards tech development and evolution, but because of the ways in which the wealth and profits and sort of benefits overall from these automation developments were not in any way reaching them. These developments were leaving them jobless. They didn't have any benefits. They didn't have a safety net. And their call was effectively to reconsider and redistribute the kinds of gains that come from this level of mechanization. And in the same way, Chellis Glendinning in the 90s reflects on the ways in which the increasingly computerized workplace of the time, at the early dawn of the internet and with the extensive use of word processors, had also led to a similar feeling and experience of tech not distributing the benefits that come from the system equally, as well as subjecting those that were traditionally subject to forms of oppression to more of that oppression. And so I draw on neo-Luddite
epistemology and praxis to say that there are ways in which we can start to undo how we've come to think of tech as inevitable. Rather than thinking of it as a natural extension of the way quote-unquote development is going, we can start to think of the ways in which technologies inhabit particular forms of sorting, particular forms of ideology, particular forms of exploitation, that need to be undone. We can't think, I think, of technology production in its contemporary manifestation without also thinking about bordering, because built within these systems is a form of categorizing and containing bodies, whether it's through categorizing who they are, often in reductive identity terms, or whether it's about situating them on a particular hierarchy. Within these systems of tech development in general, and in the ways in which computer code works, there is a necessary boundary-making; bordering is a necessity. And in the same way, within bordering, tech is a necessity, right? The way we've come to think of the border and the border industrial complex is wound up with the use of biometrics, with the use of predictive analytics that makes a decision as to whether you can pass through particular borders or not.
We were talking about Germany before; I'm thinking about the immigration ministry's use of an algorithm at the border for asylum seekers that determined whether they were lying about where they were from on the basis of the language of their phone. So the high-tech nature of Silicon Valley itself has crept into the world of bordering, and these things become inseparable. And so therefore my intervention, my encouragement, I guess, is for us to think of a world in which tech and bordering aren't tied together, in which, you know, by undoing forms of technologization and mechanization, we're also starting to chip away at the border technology.
And I draw on this and think about this because I think that the border is no longer, as we discussed at the very start of this conversation, just where the nation state has drawn its geographical borders. The borders are written in ones and zeros. The borders are increasingly on our bodies, without our awareness. So when communities go in and smash cameras on Wi-Fi kiosks, or they stop, um, letting, for example, Palantir workers into their headquarters, or they form, you know, chain fences around Amazon headquarters, they are also engaged in resisting the border in different ways. And I think that's neo-Luddite praxis, and I think we should do more of that. Thank you, Matt. And we are so happy that your book has been published this year, in 2025. However,
if you could go back and rewrite parts of it, is there something that you would adjust, add, or rephrase now? Yeah, this is a very cursed year to be publishing anything on tech and immigration. I think, if I had to reflect back, I would probably make sure that I was framing the book as a very historically contained piece of work, and I would look at today's developments: the ways in which we're hearing about Immigration and Customs Enforcement using artificial intelligence to find where student protesters are based in order to detain them and deport them; or Palantir no longer being engaged in finding missing children and smugglers, but now very overtly saying that they're engaged in the longevity of Western civilization and that it's okay if they kill people. I would reflect on those developments as being a direct consequence of
the more liberal side of technology development that we were all too happy to accept into our lives and into our politics back in the 2016, 2017, 2018 moment. Because it is under that Western liberal tech space, which even at the time thought of itself as more progressive, it's under that veneer of liberalism, that these tools and these developments are allowed to fester and incubate. It's not a surprise that these tools are available to this Trump administration when the reality is that, you know, under the Biden administration, those previous Trump tools were being perfected for smart border systems, right? These things are not ideologically so different across different administrations. And it's when these things grow while we pretend to be democratic, liberal societies, under the veneer of convenience and inclusion and making things safer, and when we don't stop to ask, you know, who are they being made safer for, and on the backs of whom are they being made safer, that we end up in the situations we're in today. So I suppose, yeah,
my reflection and my intervention would have been different insofar as I would have said, well, here's what happens in the light of day under liberalism when we're not looking, and these are the kinds of circumstances that we then land ourselves in. Unfortunately, I didn't have the foresight to necessarily be able to see that we would get to this very unapologetic moment.
But I can see how the trends that I observe in the book carry on from then and into this particular moment. Thank you, Matt. And I think that's something that also the readers will really be able to follow in your book, Migrants in the Digital Periphery, that has been published by University of California Press this year.
As a final question to you: what are you currently thinking about? I am thinking about quite a few things, as we all are. I have two strands of thought at the moment. One is concerned with the ways in which aspects of criminality are increasingly being framed around people's willingness, or not, to accept tech into their lives. And so I'm concerned with the idea of techno-criminality: how the ways in which communities actively resist technology interventions that are framed as public property themselves become a way of criminalizing them and justifying the expansion of those very tools. So we've seen, time and time again, communities who smash security cameras and who then get subjected to further policing and to increased amounts of social policies that are supposed to keep them under surveillance, because they're engaged in the criminal act of exercising their directly democratic right to
define what public looks like in public property. And so to me, the idea of techno-criminality is really interesting because it plays a role both in how things are justified in places like the United States, and also in how places like the supposedly utopian Scandinavian states, like Denmark and elsewhere, increasingly justify the rollout of various high-tech solutions onto urban infrastructures. But I'm also interested in another domain and dimension of that, which is the ways in which
we start to see forms of, in particular, segregation that are unapologetically built into the very systems that I looked at as being otherwise liberal in their veneer back in 2016, right? Things like the emergence of NEOM and The Line in Saudi Arabia, in which, you know, four or five indigenous communities were wholly displaced, ethnically cleansed, in order for this smart futuristic fantasy, which in my assessment will never actually come to light, to become at least a space for a speculative technological project, right, one that could bring in capital and profit. And so those two lines, both the techno-criminality as well as the forms of segregation and ethnic cleansing that underlie forms of tech production today, are interesting areas that I'm increasingly looking into. Thank you so much, Matt, both for your book and also for the outlook. And we are looking forward to seeing what papers, but also upcoming books, you'll work on. Thank you so much. Thank you so much for having me.