Welcome to Intelligence Squared, where great minds meet. I'm head of programming, Conor Boyle.
Today's episode features world-renowned expert on the social and cultural impacts of social media, Kaitlyn Regehr. Regehr was joined in conversation by podcaster, research director and author Carl Miller to discuss the themes of her new book, Smartphone Nation: Why We're All Addicted to Screens and What You Can Do About It. Now, let's join Carl Miller with part one of the conversation. All right, well, good evening, everyone. How are you all doing?
Good. Well, a very warm welcome this evening to this Intelligence Squared event. So I'm Carl Miller, and I am delighted to welcome here this evening Kaitlyn Regehr. She is an associate professor of digital humanities at UCL and also the author of a book that's coming out. When is it, Kaitlyn? On Thursday? Smartphone Nation: Why We're All Addicted to Screens and What You Can Do About It.
So Kaitlyn and I are going to talk about some of the themes of the book for about an hour or so, and then it's going to go over to you. We've got half an hour for questions, so sharpen your pencils, get thinking. Kaitlyn, you begin the book by acknowledging that there's already a surfeit of advice and frameworks and literature about digital literacy, and yet here is a book about all of that. Why? That's right.
Thank you. Well, I think it's worth first asking ourselves: how many times did you look at your phone today? Or rather, how many times did you look at your phone to do one thing, only to find yourself 10 minutes later doing something completely different? Or if you're a parent, do you know how much time your kids spent on a screen, what information they were fed, and why that content was different from what someone else's kids saw?
So what my work does is it looks at how these algorithmic processes work and then it seeks to give people tools to take control over them. And in saying this, I want to be very clear that ideally this book shouldn't be necessary. Ideally, we should have regulation in place to protect us. Ideally, we should be holding tech companies accountable.
But we don't and we aren't. So in the meantime, I wanted people to know that if you are going to exist in this space, if you're going to climb this mountain, you need to know you're doing so without a harness. And I just wanted people to have a little bit of a harness.
Well, that sounds very ominous, Kaitlyn: the harness, climbing the mountain. I know. But you also said existing in that space. And let's begin with that space. So tell us a bit about what the terrain looks like. Every time we turn that phone on, every time we step into the digital world, what kind of world are we actually stepping into? So...
Most everything else we consume is regulated, right? The food we eat, the medication we take, the cars we drive. There are consumer protections around these things. We don't have those same consumer protections in the digital space. And I say this as someone who fed into the Online Safety Act. I don't think it's very good, but I did work on it. We don't have the same consumer protections
because we are not the consumers of tech, we're the product. Or rather, our time and attention is the product which is being sold to advertisers. And this construct is something that researchers call the attention economy. And what my research at UCL does is it looks at the way in which through the attention economy,
hate, harm, and disinformation are often algorithmically prioritized, because they are more attention-grabbing. Because disinformation is often more interesting than truth. And harm, or things that hook into our insecurities, things that elicit emotive responses, will often hold us there a little bit longer. And it's that extra engagement
that advertisers are paying for. And so this whole financial corporate construct on which we've based social media, the financial model, has given way to a plethora of issues. Everyone in this audience will have different reference points for those issues. You might think of the fact that we exist in information silos and you're not having spontaneous cultural inputs anymore.
You might think about the marketization of self and the fact that we were never meant to look at our faces this much. You might think about the way it's changed sex and relationships. But the overarching thing that has linked all of these issues together is that social media changed the way we think. It changed the way we process and we access information. And that makes it a much bigger change.
I think it's a monumental change, because I would argue that to change the way you think is to change us. It's to change the way we formulate our beliefs and our tastes. But the underlying structure that is driving all of this forward, what I would argue is an unethical structure, is an attention economy: an economy that wants to hook eyeballs at any cost, and our children's eyeballs at any cost.
And so that is the landscape. Anything that we discuss tonight, anything that we might talk about in this conversation, always comes back to what I would argue is an unethical corporate structure.
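To make the ranking logic Regehr is describing concrete, here is a minimal sketch, in Python, of an engagement-optimized feed ranker. Every name, weight, and number in it is a hypothetical illustration, not any platform's actual code; the point is simply that when the objective is attention, dwell time and outrage decide the ordering, and accuracy never enters the formula at all.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_dwell_seconds: float  # how long the model expects a user to linger
    predicted_outrage: float        # 0..1, likelihood of an emotive reaction
    accuracy_checked: bool          # never consulted by the ranker below

def engagement_score(post: Post) -> float:
    """Rank purely by expected attention. Truthfulness and user
    wellbeing are absent from the objective by construction."""
    return post.predicted_dwell_seconds * (1.0 + post.predicted_outrage)

feed = [
    Post("Calm, accurate explainer", 8.0, 0.1, True),
    Post("Enraging half-truth", 11.0, 0.9, False),
]

# The half-truth sorts first: 11 * 1.9 = 20.9 beats 8 * 1.1 = 8.8.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(round(engagement_score(post), 1), post.text)
```

Nothing in the scoring is malicious in itself; it is just an objective function that happens to pay for exactly the content that holds us longest.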
Whose anxiety has increased since they came into the room? Well, it's about to increase a little bit more, perhaps, because I have a quotation to read you all from Kaitlyn's book, from a TikTok executive, unnamed. We don't know who this is, do we? But it was in one of the policy depositions, wasn't it, that this was identified? So this is a TikTok executive writing in an internal memo: the reason kids watch TikTok is because the algorithm is really good.
But I think we need to be cognizant of what it might mean for other opportunities. I literally mean sleep and eating and moving around the room and looking someone in the eyes. Yeah. So TikTok in particular has a very potent algorithm. So it's hyper, hypersensitive. And it makes Facebook look like a dinosaur in that it is addictive very quickly.
It looks at how long you linger on something, down to the second. And so arguably it gets to know us, and really what we're talking about here is young people, because those tend to be the consumers of TikTok. My team and I worked on a project with ASCL just last year, and we launched a report that specifically looked at the ways in which TikTok fed misogyny to young boys. And we set up
fake TikTok profiles of hypothetical young people, archetypes based on real boys. And we found that over a week-long period, the amount of online misogyny that was fed to our archetypes increased fourfold. And what that's about is that, algorithmically, because TikTok assumes it to be a teenage boy, it believes
that feeding the hypothetical boy something that will elicit anger or an emotive response is more addictive, because it is in our vulnerable moments that we are more prone to addiction. In this book I also talk, and this is another project that I was doing with ASCL, the Association of School and College Leaders, about being in a school where there was a 13-year-old girl who was a refugee from Ukraine.
And she said that when her city, which is no longer in Ukraine, it's now held by Russia, was being bombed, TikTok started to feed her self-harm content because she was feeling anxious. And because she was feeling anxious, TikTok seemed to know, and it thought that this is what she would want to see. And so what we need to be very concerned about is the way in which
emotional manipulation is being used algorithmically to capture us, particularly young people, who are especially vulnerable to this: to capture us at our most vulnerable and then grow that vulnerability.
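For readers wondering how an audit like the ASCL project can put a number like "fourfold" on algorithmic escalation, here is a minimal sketch of the bookkeeping, assuming researchers log and hand-label every video shown to an archetype account. The log, the labels, and the numbers below are invented purely for illustration; the published report's methodology and figures are its own.

```python
from collections import Counter

# Hypothetical audit log: (day, label) for each video recommended to one
# archetype account, with labels assigned by human coders.
audit_log = [
    (1, "sport"), (1, "gaming"), (1, "sport"), (1, "gaming"), (1, "misogyny"),
    (7, "misogyny"), (7, "misogyny"), (7, "misogyny"), (7, "misogyny"), (7, "gaming"),
]

def misogyny_share(log: list[tuple[int, str]], day: int) -> float:
    """Fraction of that day's recommendations coded as misogynistic."""
    labels = [label for d, label in log if d == day]
    return Counter(labels)["misogyny"] / len(labels)

start, end = misogyny_share(audit_log, 1), misogyny_share(audit_log, 7)
print(f"day 1: {start:.0%}  day 7: {end:.0%}  escalation: {end / start:.1f}x")
# day 1: 20%  day 7: 80%  escalation: 4.0x  (illustrative numbers only)
```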
Okay, well, I promise there are lots of solutions to come. But we are decomposing the problem, and it seems to me, Kaitlyn, that the allegation is twofold right now. On the one hand, the attention economy, the fundamental logic of that, means these platforms are built primarily to keep us on the platforms themselves: addiction-promoting, habit-forming. And then secondly, the material which they happen to be serving us to do so is harmful, or can often be harmful. So let's take them one after the other. Let's look at addiction first. Who feels...
I'm kind of anticipating that people coming to this may well feel that they have unhealthy digital habits. Who feels like they touch their phone too much? Who feels they spend too much time on their phone, on social media? Gaming?
Social media. Okay. All right. So what can we do? Let's talk about technical things that we can do. So how do we begin to push back a little bit against the very, very powerful ways in which these platforms have managed to hook us to them? Okay.
The first thing I want to touch upon is that almost everyone in this audience felt that they were addicted to their phone, but not everyone felt they were addicted to social media. I think that's quite interesting. So let's just start with the device itself. The device which we stroke and through that stroking develop feelings of love and dependence. That device which seamlessly moves us off of one application onto another.
So there are technical affordances within the device which, even before we start talking about platforms, are addictive. I mean, one is just that it's very beautiful to look at. So one thing that I talk about is that you can actually just turn the color off. In fact, we could all do this right now, if you want to turn your phone to grayscale. Do people want to turn their phone? Who's up for turning their phone to grayscale? Okay, great. Should we grayscale together? Yeah.
So, go into Settings. This is the one time, I think, we'll be asking you to get your phone out during this. Now go to Accessibility, go to Display and Text Size, then Color Filters, and switch that toggle to Grayscale. And Kaitlyn, as people are doing this, explain why taking away color makes things less addictive. This is just one affordance of your phone
which makes you want to look at it. Has anyone managed to do it? Yeah. Well done, everyone. Doesn't it look terrible? Yeah, it's not as sexy. I'm not saying you have to do this all the time, but it's quite good to sometimes turn off your colors, to remind ourselves that this is a very seductive little rectangle that we wear in our back pocket. The other thing is notifications. And a lot of us will have turned our notifications off, although I have to say...
I was at a literary festival last week with a very prominent BBC journalist and her notifications were going off while we were having this conversation. I think it's really worth thinking about how much you want to be notified and by whom. The other thing, before we even get into social media, that I think is worth thinking about is how you're getting your news. So if you are using news applications,
Even if you're using Apple News and you're following very reputable news sources, those applications are feeding you a very narrow version of the content that those newspapers are publishing. And so although you may think, I'm consuming reputable news sources, you're shrinking your worldview, and that shrinking is being algorithmically driven.
So even if you say, "I'm just on LinkedIn, I'm not a social media person," I think it's worth thinking about what the parameters of social media are. What do we count as social media and what do we not? Look at the Australians, who have recently banned social media for under-16s: they do not count YouTube. So they've banned social media, but YouTube is not banned. In this country, 89% of children
consume YouTube. And I'm talking about toddlers. I'm talking about from the age of two. It is the most used social media platform, with by far the highest usage for every age group of our kids. And it shares a lot of the same content as TikTok.
So when we decide to count ourselves out of social media, I'm not using it, my kids aren't using it, I'm a member of Smartphone Free Childhood, but my kid has a tablet and they do watch YouTube, I really, really challenge us to start to broaden our idea of where we draw the lines around technical addiction and social media usage, and what we're choosing to opt out of.
Do you want me to continue now? Well, um...
What I do want to talk about, I think, is something you're touching on a lot, but maybe it's best to make it explicit. And this is like a kind of change in mindset as well. So moving from this kind of passive, almost like prey-like state, where algorithms are serving stuff up and they're being shaped really well by the behavior that we're showing online, and instead moving towards a much more proactive, mindful, deliberate state.
So maybe you could talk a bit about that, because that might be helpful to people regardless of what they use and where they go. So the default setting is that we are a passive product and we are fed a very, very narrow picture of the content that is out there through our feeds. We are fed it. That is the default setting.
And so when you have social media executives like Elon Musk say, it's a digital town square, everyone has a voice, this free speech argument does not work, because what we actually see travels incredibly narrow pathways built by algorithms.
And so what we want people to do, what we want ourselves as adults to do, and what we want our kids to do is move away from accepting the role of passive product and into being an active participant. So that you're actually deciding this is what I want this device to do for me. This is the content I want to be consuming. And this is the content I don't. This sits outside of what is good for me. This doesn't make me feel good.
which I think we'll get to in a moment, how you actually actualize that. But ultimately, we're trying to move people into being active participants. And this is something that is different from what you had to do as a parent 25 years ago.
So I'm not saying that I myself was not plopped down in front of screens, with probably terrible TV dinners, to watch children's viewing hours. But there weren't nearly the same number of choices available to my parents. These were children's viewing hours programmed by children's commissioners with diverse content, and there were only one or two of them.
And so, I hate heaping this personal responsibility on us all, but there are a lot more choices to be made now. And unfortunately, you're going to have to make them.
Okay, so we're more mindful, we're more proactive, we've all grayscaled. Presumably we've also decided there are going to be times and places, in our house and in our lives, where we won't be using phones. Right. And so hopefully, you know, we're pushing back a little bit against this addictive pull that these phones and devices exert on us.
The algorithm itself, you know, it's so central, isn't it, in your narrative and in your view of why these spaces are so harmful. Is there anything we can actually do about those? Because, you know, sometimes people are going to go on to TikTok. Yeah. The first thing I suggest is that we open up these information silos. So these are hyper, hyper-personalized forms of viewing.
It's not like watching TV with your family. It's not like going to the cinema with your community. These are hyper-personalized forms of viewing, and I think we should break away from that. So the first thing I encourage people to do before they decide what they want their digital diet to look like is to do a walkthrough method. And this is when you open your most frequently used social media app with your partner, with a friend, and you scroll through a normative period of usage.
And this might feel super cringy, but if it makes you feel uncomfortable, it's worth considering why that is. I can hear some like nervous laughter in the audience. Like if you're scrolling through your Instagram feed and your partner says, you're getting a lot of ads for Botox. Is that the best thing for you?
You know, it might be worth considering that. If you're seeing so much design porn that everyone's kitchen looks amazing and it just makes you feel like your house is terrible, it's worth considering whether that's a good thing for you.
And so once you do that, once you look at what you're actually consuming, and I also talk about ways to look at the quantity of what you're consuming, you can make a decision about what you want your digital diet to be. You can decide what you want more of and what you want less of, and then you can game your algorithm. So if you're going to game your algorithm, you need to be very clear about what you want your algorithm to do for you.
What do you want to see? And then you actively search for that. And you very quickly move past things you don't. Do not linger on it. Do not comment on it. Don't report it. Because reporting is engagement. Move past it very quickly and then actively look for things. I know parents who go into their kids' TikTok at night and search for science videos to change their kids' algorithms. I mean, there is a form of algorithmic resistance, isn't there? Yeah, yeah.
I mean, you can decide whether that's ethical. But what you're looking to do is train the machine learning. And you can do that: you can train your machine learning. And I think that we should all be doing that. And that is how you become more active within these processes. Okay, if anyone has tried to train their child's TikTok algorithm to serve up more science, please come back to us in the Q&A at the end. That's algorithmic resistance.
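If it helps to see why "do not linger, do not comment, do not report" is coherent advice, here is a toy sketch of the feedback loop being described: one preference weight per topic, nudged by exactly the signals Regehr lists. The signal names and weights are assumptions invented for illustration, and real recommenders are vastly more complex, but the direction of the update is the point; note that in this toy, as she warns, even reporting counts as engagement.

```python
# Toy preference model: one weight per topic, updated by engagement signals.
preferences = {"science": 1.0, "ragebait": 1.0}

# Assumed signal strengths: any interaction, even reporting, reads as
# interest; only a fast scroll-past counts against a topic.
SIGNALS = {"search": 0.5, "watch_full": 0.4, "comment": 0.3,
           "report": 0.2, "scroll_past": -0.3}

def register(topic: str, signal: str) -> None:
    preferences[topic] += SIGNALS[signal]

# Her advice, rendered as actions: actively seek out what you want...
register("science", "search")
register("science", "watch_full")
# ...and starve what you don't. Reporting would *raise* the score here;
# only moving past quickly lowers it.
register("ragebait", "scroll_past")
register("ragebait", "scroll_past")

print({k: round(v, 2) for k, v in preferences.items()})
# {'science': 1.9, 'ragebait': 0.4}
```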
Actually, just before we move on, we're about to talk about some of the material that the algorithms can serve up. But just before we do, just on the walkthrough method, Kaitlyn: you as a researcher are professionally able to bring participants into those kinds of environments. But, I mean, I just placed myself for a moment there as a teenage boy again. And I can tell you, the last thing that I would do with my parents would be to go through a typical...
internet browsing experience. I mean, you just can't. So what would you do? As a parent, how would you bring people into that? Okay, so first of all, if you do have young kids, I think it's worth beginning from an early age to decide that you talk about what they see on screens. So I have really young kids. They're five and three. And if they see something that upsets them on a screen, we talk about it.
They know that they have to watch terrestrial TV. Sometimes it's boring; that's what's on. It's not being algorithmically fed to them. And we start seeding from a very early age that we talk about what they see on screen. So that's one thing: normalizing this idea that we talk about what we see on our feeds is important.
If you have a teenager who is already on social media and that feels like too much of an invasion of their privacy,
you can take a lighter touch to this. You can say you have weekly check-ins: bring in something that you saw on your feed that made you feel good, something that made you feel bad, and something that really made you question, that you want to look into. And the book talks about how you spot disinformation and how you can fact-check things. But I think also bringing these exercises, right,
into our normative lives, bringing in fact-checking exercises. There are also exercises you can do with younger kids about images, seeing if something is an AI image, so that you start introducing these ideas from an early age: we talk about images, not everything we see is real, and this is how we can actually follow up on it.
Thank you. That's really helpful. So let's move actually to the kind of harder edge of online safety now. So we're going to have to go to some of the darker, more harrowing, more troubling bits of the internet to really talk about these dangers and be honest and clear about what they're like. There's a story you tell in the book, I don't know if it's hypothetical actually, where you're kind of watching two young children browsing through YouTube.
and one of them kind of just flips over to adult YouTube and types in kittens and starts watching kitten videos. You know, and there are all the kitten videos, and then suddenly there's one of a kitten being run over. And I guess you're doing that to kind of make the point that
you know, we're so close in a weird way on the internet to so much harmful stuff that can kind of almost unbidden, well, actually not kind of, actually unbidden, suddenly make its way into our consciousness. And, you know, it's pro-anorexia content, pro-suicide content. Talk to us a bit about these, the kind of like, the landscape of online harms that you see, the ones that we really need to be aware of that are out there. It's a very big question. That's a big question, yeah. So I think...
The thing about online harm is that it's very specific to the individual. And as my friend and colleague Sarah Wynn-Williams, the Facebook whistleblower who just published Careless People, points out, this is actually valued by advertisers, right? We know when someone is vulnerable, and that's a great time to sell them something. And if you can build that vulnerability, right,
there are going to be more options to sell them something. So in her book, she talks about how Meta knew that if a teenager pulled down a picture of themselves online, they were probably feeling fat, and that's a great time to sell them weight-loss products. So online harm is individual, very specific to the individual. It's amazingly effective
at targeting someone. We could talk about lots of different types of online harm, which also tend to move down a gender line. So for girls, we tend to be worried about body dysmorphic content and self-harm content. For boys, we tend to be worried about online misogyny and pornography.
And so I think that what we need to be discussing, and I don't know if you want me to talk about one of those specifically, but what we need to be discussing is not just the harm itself, but the process that we're allowing to feed this harm. I mean, take the Molly Russell inquest, the first inquest after a young person's death to find social media culpable. Just quickly tell us about Molly Russell, just for a second. So Molly Russell is the teenager who, at the age of 14, took her own life a decade ago after being fed self-harm content on Instagram. And during that inquest...
which released all this material, horrific material that was being served to this child, Meta's representative stood up in the court and said, "We have a responsibility to give voice to people who are suffering, and so it's not our place to limit their voice." They gave the classic free speech argument. But that is the wrong argument.
This argument is not about someone's right to publication. That's never been in question. This argument is about dissemination. So it's not whether someone's allowed to post videos about how to cut yourself or how to kill yourself. It's whether that should have been algorithmically fed to a 14-year-old girl. This is a dissemination debate.
And there are lots of technical techniques, like dispersion, which would mean that that content does not reach young people, that are simply not being employed. And even under the Online Safety Act, we're still dealing with a clean-up job after someone has slipped and hurt themselves in a spill. It still assumes that young people will see harm and that we should clean it up afterwards.
And that is the failing of that act. And so we can talk about individual harms, and you might even, in your own lives, be worried about very specific things. We can talk about boys and online misogyny. We can talk about Adolescence, the film. But it all comes back to this unethical structure.
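Regehr doesn't define dispersion in detail here, so what follows is only a guess at the family of technique she is pointing to: a minimal sketch of a feed filter that rations sensitive-topic items rather than letting them cluster, and withholds them from minors entirely. The topic labels, age threshold, and cap are all hypothetical.

```python
from typing import Iterable, Iterator

SENSITIVE = {"self_harm", "eating_disorder"}
MAX_SENSITIVE_PER_SESSION = 1  # hypothetical dispersion cap

def disperse(feed: Iterable[tuple[str, str]], user_age: int) -> Iterator[str]:
    """Yield feed items, rationing sensitive topics instead of
    clustering them, and withholding them entirely from under-18s."""
    served = 0
    for topic, item in feed:
        if topic in SENSITIVE:
            if user_age < 18 or served >= MAX_SENSITIVE_PER_SESSION:
                continue  # drop rather than amplify
            served += 1
        yield item

feed = [("pets", "kitten video"), ("self_harm", "post A"),
        ("self_harm", "post B"), ("music", "clip")]
print(list(disperse(feed, user_age=14)))  # ['kitten video', 'clip']
print(list(disperse(feed, user_age=30)))  # ['kitten video', 'post A', 'clip']
```

The contrast with the Online Safety Act's approach is the point: a filter like this acts before dissemination, rather than cleaning up after the harm has been served.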
Thanks for listening to Intelligence Squared. This episode was produced by myself, Conor Boyle, with production and editing by Mark Roberts.