
Truth Hurts: From Conflict to Connection, with Dr Julia Ebner

2025/5/27

Intelligence Squared



Welcome to Intelligence Squared, where great minds meet.

I'm producer Mia Sorrenti. Have you ever clashed with someone on social media or even a loved one? Conversations can easily escalate in a time where everyone is consuming different media with wildly different messaging. Today's episode is the recording from our recent live event, Truth Hurts, From Conflict to Connection, the second installment of our Critical Conversations series in partnership with Sage & Jester.

Sage and Jester are the arts production company who create immersive experiences designed to entertain, enlighten and help you harness your internal BS detector, arming you with the tools to question, challenge and pause before you believe what you see and read.

Live at the Pleasance Theatre, host Sophia Smith-Gaylor spoke to social cohesion researcher Dr. Julia Ebner to explore the process of radicalization and extremism in the internet age and explore tips for tackling the most divisive topics in our day-to-day lives and how we can all become better conversationalists. Now let's join our host Sophia Smith-Gaylor with more.

Hello, everyone, and welcome to this event with Intelligence Squared, in partnership with Sage & Jester, who you were just hearing from, the arts production company who create immersive experiences designed to entertain, to enlighten, and to help you harness your internal BS detector. Did you all bring your internal BS detectors with you today? I hope so. They will arm you with the tools to question, challenge, and pause before you believe what you see and read, which is very in theme with what we'll be discussing tonight. They do, by the way, in London, have a bold new immersive theater experience called Storehouse, which opens in June. And I think you've been given flyers that have five pounds off the tickets for that. So if you enjoy tonight, there's more in store.

And this is the second event in the Critical Conversations series. We'll be having another one tomorrow, all about giving you the tools to handle the everyday experiences to combat the misinformation crisis we find ourselves in. But tonight, what is tonight about?

Tonight's about how to approach difficult conversations, whether it's in this polarized world or at the family dinner table. Anyone had an argument with a family member this year about something weird they believed in? Raise your hand. Whoa, that's quite a lot. My dad's raised his hand. That's interesting. Thanks, Dad.

I'm Sophia Smith-Gaylor. I am a journalist. I've spent a lot of my career debunking or addressing misinformation. And it is an absolute delight to be here with Julia Ebner, the Dr Julia Ebner, leader of the Violent Extremism Lab at the University of Oxford Centre for the Study of Social Cohesion. She's a senior research fellow at the Institute for Strategic Dialogue, where she has led projects on online radicalization, terrorism, conspiracy theories, you name it, she's probably looked at it. She's an award-winning and internationally best-selling author of several books, including Going Mainstream: How Extremists Are Taking Over. And she's given evidence to numerous governments

and advised intelligence agencies, tech firms, and international organizations such as the UN, Europol, and NATO. But tonight, it's you lot. That's who she's going to be giving advice to about combating and dealing with misinformation. And Julia, my first question for you is, what brought you into this space? Because it can't always feel like the nicest space to be in.

Yeah, absolutely. I started looking at jihadism, actually, when ISIS was just at the height of their power. They were recruiting foreign fighters from Europe, from the UK. And I looked at radicalization pathways. And of course, misinformation also played a role back then for some of these radicalization pathways. But then I saw the backlash on the far-right side of the spectrum, where anti-Muslim hatred and resentment was really fueled by some of these jihadist attacks that were hitting the UK and Europe. And yeah, on a personal level, I've always been fascinated by group dynamics, especially toxic group dynamics.

I think that comes from my own personal background. I was bullied in high school, and since then I've just really wanted to understand what drives people, on a very human level, to engage in really hostile activities against people they hate

or people they see as enemies. And it can start basically in childhood. And we also see, of course, that youth radicalization has become more and more of a topic that the UK is struggling with, but also other countries. And I think social media has added a whole new level to those challenges that we're facing. What you were pointing to there, was it The Rage? Was that the book where you looked into jihadism and then as well as...

When was it that you published that? I read it, but it was years ago now. That was a long time ago now. That was 2017. It was my first book. And I looked at the intertwined relationship between Islamist extremism and far-right extremism. As if they're kind of, you go all the way around and you get to the same point. I remember that. Exactly. Can you explain that in more detail? Sure.

Yeah, a lot of them, a lot of the conspiracy myths and also the narratives that they build on are very similar. Of course, they find common ground in that idea of wanting to go back to a distant past that is supposedly better than the present. And they also paint the world in very black and white narratives. They believe that there is the West versus Islam or that there is some kind of

inevitable war of cultures, war of religions, or war of races. And that's something that I found both on the far-right white nationalist side and on the jihadist side of the spectrum. And I was interested in also understanding what are the psychological commonalities that the people share who then become radicalized, who then join these groups.

And even on a very individual kind of human level, I encountered a lot of similarities. And it might also explain why some individuals actually first joined, for example, a neo-Nazi movement and then swapped groups and joined all of a sudden ISIS-affiliated groups, which was, to me...

in the beginning, a complete enigma. But then when you look at it, it actually makes sense because a lot of them are looking for very similar things, for a sense of belonging or some kind of exclusive group and community that they can be a part of. When and why did you first decide to go undercover?

I did a small undercover investigation as part of that first book, but that was really, that was the first time. So I joined this Islamist organization, Hizb ut-Tahrir, at one of their meetings. And I also joined the English Defense League, the far right movement here in the UK. Define joined.

OK. So of course, it was all about setting up an identity. And in that case, it was still very much just going along to their protests. But I then did it more systematically, and actually spent months setting up credible online accounts,

a presence to be able to get recruited by some of these movements. So, for example, in 2020, or leading up to 2020, I joined a whole range of different extremist groups across the spectrum. I joined, for example, the white nationalist group Generation Identity, which is a pan-European movement. They believe in the great replacement idea, that supposedly white Europeans are being replaced by non-whites in a kind of concerted plot by the global elites. And I also joined some conspiracy myth movements, for example QAnon, and some neo-Nazi groups, as well as jihadist, ISIS-affiliated groups. And joining means, in some of the cases, I was able to also go to some of their meetups. I actually interviewed some of them in person and joined their secret strategy meetings, for example in a London Airbnb in Brixton.

For some of the groups, I wasn't able to join them offline because, for example, I joined violent misogynistic subcultures online, and it would have been very difficult for me to pretend to be a man in an offline setting. Obviously, the same is true to some extent for the jihadist movements. I think going to an ISIS meeting, apart from being very difficult, would have also just been too dangerous. Yeah.

And is there like a bureaucracy to doing that? Have you got to get permission? Because I know within the academic research space, you have to get kind of ethics stuff signed off. Walk us through what that process is like. You can't just...

go undercover in one of these groups by yourself and, like, not tell anyone? Yeah, so to be honest, this was before my academic career, so I would not have gotten this through the ethics committees, I think. Ah!

I did have my own... of course, I did it in my personal capacity. And I also did have some ethical boundaries that I wouldn't want to cross. Which were? So, for example, I would never want to help any extremist movement create propaganda or spread propaganda, or help them with any of their recruitment efforts. And I also always forwarded any plots or ideas,

anything that looked like it might be a credible threat to national security, to the authorities. And sometimes I also wrote a memo to the FBI, depending on the geography, because sometimes you just don't really know: is this UK-based, or are these users now talking about something for the US? Yeah.

Were you just doing this in your spare time? Like, how are you finding the time to do this? Yeah. Pretty sad, right? As someone who frequents many a Reddit subreddit, you know, you're my kind of person. This is actually what I do with my spare time as well.

Talking about myths and disinformation: growing up, misinformation wasn't a word I ever heard. It's actually only something that, by the time I was working as a sort of early-career journalist, everyone seemed to start talking about. Has it really shaped and differentiated how we speak to each other, in your view? I definitely think so. It's so interesting how, of course, social media has...

I think it's also tapping into something deeply human. I do think misinformation has always been around, but with every new invention and every new technology, it seems like there is an increase in misinformation that it facilitates. I mean, even the invention of the radio was exploited, as other human inventions were, to spread misinformation in earlier times.

But I do think the internet, and social media in particular, have unfortunately accelerated these dynamics and have really made us more prone to not being able to distinguish the lines between what is fabricated content and what is real.

And why is it getting harder to determine what's real and what isn't? Is it because it's become more sophisticated, or is it because our critical thinking skills are diminishing? What's the reason behind it? Both. I would say it's a combination of different factors. One of them is, of course, that our attention span has decreased, and that we're exposed to such a hugely overwhelming amount of information and misinformation.

But also, our trust is much more individual-based, and that is really amplified on social media. A lot of us will just listen to whatever the influencers that we're following are saying. And research has shown that

trust in institutions is dropping, while trust in the individuals that we rely on for our news consumption, our media habits, is actually increasing. And that also means that misinformation can be spread more easily by some of these individual influencers

that people trust in. And the established institutions have less credibility to some extent. And then you have the whole added layer of the algorithms, of course, reinforcing radical content, incredible content,

apocalyptic content, all of the content that our human minds have always been fascinated by. But now on social media, that's exactly what the algorithms are tailored to in order to maximize our attention on the platform, which is essentially the business model of the tech firms. But we've always been fascinated by gladiators fighting, or by

bloody content, or by incredible apocalyptic predictions. I'm pleased you brought up their business model, because for it to proliferate so much, it must bring someone loads of money. There must be a benefit in it for someone, or various people. So who profits from the misinformation crisis?

I think, actually, when you look into it... I mean, my expertise is mostly on the extremist movements, but of course there's also an added layer: the tech firms themselves profit from it, because they maximize, as I said, the amount of time people spend on the platform,

because the content becomes more and more radical, and misinformation is actually something that generates a lot of interaction. It just means that they can make people stay on their platforms longer. But the extremist movements have also turned this into something that's monetarily interesting for them. So, for example, some of the conspiracy myth gurus and conspiracy myth influencers run their own companies on the side to sell, well,

I don't know, protein powders and all kinds of canned food for preparing for the apocalypse, which the whole prepper community then buys. And that's very much related to benefiting on a financial level from it. And the same is true for some of the other extremist groups, or the alternative medicine communities, that would

try and sell products that are alternatives to the pharmaceutical industry's products. And of course,

the problem with misinformation is that it often builds on things that people are very emotionally attached to, or that they have a grievance about. And sometimes this might also build on half-truths. So of course there are a lot of problems with some pharmaceutical products, where perhaps some of the side effects have not been studied enough. But they take it completely out of context. They take it out of the scientific consensus about the products, and they benefit from it by then also potentially selling their own products,

which have zero scientific evidence behind them. And if you're sort of in hot pursuit of people spreading dodgy content online, what platforms are they on? And how has that differed over the course of your research? Because I imagine, when you were first beginning, what would this social media ecosystem have been like? It wasn't the vertical video era. So this is...

Maybe primarily Facebook, YouTube back then, Twitter, Reddit, and then the naughty ones, like the super dodgy ones and the dark web. Yeah. Yeah. It's definitely changed a lot and it's still evolving rapidly, but yeah,

it used to be that just a few years ago, from 2017 to 2019, a lot of the big tech platforms started to take down radical content and the radical accounts that were spreading hateful messages or misinformation.

And then, unfortunately, what emerged was almost an alternative tech space, alt-tech platforms as we call them, with Parler, Truth Social, Minds, Gab, even alternatives to YouTube like BitChute and Odysee. And all of these platforms were used by extremists,

Sometimes under the banner of unlimited free speech, sometimes also explicitly trying to provide a safe haven for extremists. And they were used very often to coordinate campaigns that would then be carried out on the bigger platforms or to radicalize people who were already part of those communities or who had already entered those platforms.

Are they still active on those, or do they even need them anymore? Because, of course, in the past year we've had the sort of reversal of interest in fact-checking from Mark Zuckerberg. We have had the Muskification of Twitter, now X. Do they even need those

spaces anymore, when it feels like mainstream is becoming a bit alternative too? Yeah, right. No, they still use them, but it's very true that we've seen a complete turnaround when it comes to platform policies. And I think it started, of course, with Elon Musk's purchase of Twitter, now X, and

in 2022, when a whole trend started, with other platforms jumping on that bandwagon and no longer actually removing misinformation. For example, YouTube then announced in 2023 that they would no longer remove U.S. election-related misinformation. And then, of course, we saw what happened in the last couple of years,

with extremist accounts returning to some of these big tech platforms, the removal policies changing entirely, and misinformation again being completely normalized on these platforms. So there is no longer a real need for these alt-tech platforms. I would say that they still sometimes use them, for example when they plan campaigns.

For researchers like yourself who are monitoring this, what risks do you face trying to do this kind of work?

It's been quite scary in the last couple of years, because I've seen that a lot of researchers in my field have really been silenced, have really been afraid of doing research into misinformation, into topics related to radicalization and extremism, because they're so emotionally and politically charged. It's very likely that even if you publish something...

well, in the academic space, of course, academic papers usually get less traction. But as soon as you publish something with a media outlet, or you put yourself out there on social media, it's very likely that you face some kind of smearing or misinformation campaign, or that

some of the people in the extremist communities try to silence you as a researcher. And even on the legal level, there are more legal threats related to it, because as soon as you make any kind of evidence-based claim about X, for example, that would criticize Musk's policies, you might face a campaign, or you might even face legal threats. So I think a lot of researchers are very careful nowadays.

Do you think something could be done tomorrow that would really help you out, that would make you feel safer about doing this kind of work? Because I appreciate that, for one thing, it could stop people going into this. But the other thing that's already happening is that it's stopping researchers from amplifying their expertise and their work, which is just as bad. So what could help you tomorrow, if someone changed something? Is it the platforms, or is it government?

I think it's a combination, but on a legal level, I would say it would be really important to guarantee researchers full access to platform data. So if we had the guarantee that we're not going to get sued if we analyze data from the big tech platforms, because it's important to shed light on these dynamics, then of course that would enable

researchers to do much better science and to understand a little bit better how campaigns form on some of these platforms, especially the ones that currently don't allow you to scrape them without paying huge amounts of money.

And the money is enormous. In order to get access to this data, we're talking at least like 40K. How expensive is it? Depends, I think. Depends on the amount of data that you want. But of course, for a lot of the projects, you need to collect a lot of data in order to make any representative claims or in order to really have meaningful findings.

And so for a lot of researchers, I think that's just exceeding their own research budgets. It's not hundreds of pounds and it's not thousands of pounds. It's tens of thousands of pounds. It is ridiculous.


I want to move now from the landscape of misinformation, as it were, to our own lives and the conversations we may be having in our personal lives. And I'm especially thinking about conversations we have with people we know. I'm not talking about getting into something in a comment section on a video with someone who's wound you up but is ultimately a stranger. When it comes to speaking to someone you know in real life, someone you love,

who you believe is now endorsing something that you know is not based on the truth, it could even be based on a premise that causes harm to others, how is it that you can engage them without alienating them off the bat?

Let me first say it's extremely difficult to do that. But I would say that the most important thing to keep in mind is that you always need to build an emotional or psychological bridge. You need to understand what is the underlying psychological purpose of, for example, the conspiracy myths that they might believe in or the radical belief that they hold.

Because usually it really depends on the individual. So a lot of them will have some kind of internal grievance about something, frustration or even trauma, that they then relate to a narrative and the conspiracy myth. And they kind of connect the personal identity with a collective identity. And that can lead to a phenomenon that we call identity fusion, where your personal identity becomes one with the group identity.

And then people start to perceive other members of the in-group as family-like, as kin-like. Oh, so if you're attacking my attitude, you're not only attacking all my beliefs, you're attacking like my people. Exactly, yeah. And the problem is also, of course...

That means that some of these even online subcultures or some of these extremist communities can become part of who you are and can also feel like family and almost replace your biological family. And that means even if you're the mother or sister or daughter of someone who's been radicalized, it's very hard to get through to that person because their identity is all tied up with that other community.

So, I mean, you began that by saying it's very hard to do. Is there a solution beyond trying to do something yourself? There definitely is. And I would recommend, depending on where that person is on the radicalization trajectory, especially if they're very deep down the rabbit hole, that it's necessary to seek psychological help and to ask professionals for help, or to inform Prevent.

But I would say, at the beginning of that journey, I do think that family members or friends or relatives can do a lot to help that person get back out of these networks, in particular when

you try to highlight, for example, experiences that you share with that person and find a kind of bridge to them on a very human level. That can often help them break through some of their very rigid ways of thinking about the world. And equally, sometimes identity fusion comes about because people feel that they've had some traumatizing or deeply emotional experience with their in-group.

If you break that apart and you actually show them, well, not all of your experiences are shared with that community, they are very different on so many levels, and you try to make a very experience-based point about their identity, that can potentially also help them get out of it. And is everyone in society, like, equally vulnerable to radicalization?

Perhaps not equally, but something I found, especially in my undercover investigations for my book Going Dark, was that I think everyone can be vulnerable at specific points in their lives. We all have periods of uncertainty, periods where we

might just be more vulnerable because of personal trauma or because of difficult experiences that we're in. And I do think that means that we're more susceptible in those moments. Even when I was doing the research myself, I found myself at some point being quite vulnerable. Perhaps it was because, as we said earlier, instead of spending my free time with my friends, I spent a lot of time in these very extreme channels, and then automatically

you become more vulnerable. But it's also about the group being able to tie some of those very deep, traumatic or negative experiences in your own life to the group narrative. I think that's quite a normal experience, because last year I made a documentary called How to Save an Incel. And it was off the premise of academic research that had asked the very interesting question of: can a subreddit...

de-radicalize as well as radicalize? Because we always hear about Reddit as a possibly radicalizing space online. And there is a subreddit called IncelExit. Are you familiar with it? There are some young men there who feel like they have actually found a way out. And I've not only been in that subreddit, I have been one of the people who's given advice in that subreddit, because it's full of lots of different people who are willing to give their time to give

advice to strangers on the internet who were trying to seek a way out of this space. And the researcher who had been investigating whether this could be an interesting tool or solution said to me that when he had begun doing this research, the pandemic had just started, and he suddenly found himself living completely alone, without the usual faces and chat and real-life interactions that he usually had.

And he was so, so involved in these spaces online, because he was researching them, that he began to sort of understand and see the world the way incels do, the way that they kind of

put everyone into a social hierarchy. I probably won't go into it tonight, but he would start looking at couples and thinking, well, why is she with him? He's, like, a low-value man. Why are they together? And this is someone who's investigating this space, someone who believes themselves to be very clued up on it. You're still vulnerable to it. That's very interesting. What happens if we think...

I'm not sure how often this happens, to be honest. But what if we think we might have a problem? So it's not that someone you know has a problem, but you have a moment where you're like, hang on a second, I was believing this thing, and I'm now wondering if it's not so chill to believe it anymore. What is it that we can do with our own sort of expression online? Social media detox, I would say. Really? Yeah. Just going off social media for a week or two.

And connecting with people in person again, because I think that really helps us sometimes to take a step back and reassess who we are, how we engage with others, how sometimes our emotions just get caught up in online debates or in the type of media that we consume. So what happens when we go back online?

I mean, maybe it's safer to stay offline, or detox forever. And then what happens if we're at a dinner table and you feel like things are getting heated?

And you kind of have a choose-your-own-adventure. Do you think, right, I will get heated too, in an attempt to break down what you believe in, in the hope that I can dispel it? Or is it best to try and control and minimize what could become a volatile scenario? Yeah.

Yeah, very tricky. Probably many people in this room have been there and can relate to this. But I do think, I mean, although our natural inclination would often be to try and control it and try and debate that person, I do think it's actually...

using a more subtle approach, and trying to get them to reach a conclusion that is perhaps not as rigid as the one they're reaching right now, that is the best way forward to really make them question their own beliefs, but in a very subtle way. And I do think it's something about prompting them to reflect on their own feelings about something, on their own emotions, their fears,

and so on, but without making them feel like they're in the psychologist's chair. And it's hard; it's a really difficult balance. In your work, have you ever looked at the concept of sacred values? Yeah. Has that been a useful framework for engaging with people at all?

I would definitely say so. Yeah, in the work that I do in Oxford, we do use sacred values. And it's also closely intertwined with that idea of identity fusion, because sacred values really become a strong part of who you are. So could you explain, for everyone who may not be familiar with the concept, what sacred values are as they've been put forward? Yeah. Well, sacred values... I mean, Scott Atran is someone who's published a lot on sacred values. And it's all about

values and belief systems that have a strong personal value, that people become really attached to. They're called sacred values, but they're not necessarily religious. It can be any type of belief. It can be a political belief. It can be any type of ideological belief.

I would even go as far as to say that some of the communities we now see, which the intelligence and security services talk a lot about at the moment when it comes to youth radicalization, those communities that glorify violence for the sake of violence, those gore-focused and aggressive online communities, even they hold types of sacred values that connect them to each other, and

this idea that, well, they would do anything for these beliefs. I mean, they would go as far as to kill their own pets, or go even further, when it comes to that glorification of violence.

And so I think it can be anything that becomes deeply entrenched in who you are as a person, and that also relates to a group that you're in. Can reverse psychology be used to appeal to someone's sacred value to de-radicalize them? Obviously I don't think it would work in the context of violence glorification, but perhaps in others, if someone's sacred value was what they perceive to be unity, for example. Yeah.

You mean what they perceive to be unity, and then trying to break apart that unity, or... The reason I ask is because I know that's something... I believe I'm right in saying that in the United States, when they were considering how to tackle anti-vax ideologies, they looked at how you could appeal to someone's sacred value that was otherwise being manipulated to be anti-vax, and manipulate it effectively the other way. Yeah.

Is that something? I think that is definitely an approach that could be promising. And especially if you manage to attach some emotional value to something that is seen as part of the out-group, where you kind of mix up what is considered part of the in-group and what is considered part of the out-group.

Just because often these boundaries between in-group and out-group become so rigid, anything you can do to break that apart is a really good step in the right direction, I think. How can we cultivate more empathy?

Well, this is exactly about our own in-group and out-group thinking too. I do think it's really important that we continue to have empathy for people on the other side of the political or ideological spectrum, people we disagree with. I think without that

understanding of where people we disagree with are coming from, it's really difficult to mend some of the societal divisions we're facing. And we see that with things like, I don't know if you saw it, but last year the Financial Times published an article with

a statistic about young women and men drifting apart in terms of ideology. Men, on average, are becoming more conservative and right-wing, and women, on average, and again, this is the average, it might not be representative of this room, are becoming more left-liberal leaning. And I think it's really important also for the sake of the future of humanity. Imagine if people also just no longer dated, for example,

people on the other side of the spectrum. You see this in other surveys, done by OkCupid, which showed that most people don't want to date someone on the other side of the spectrum. I mean, this is quite bad news for... Dating apps actually include more of this information now as well.

They introduced this as a method of screening, where someone sits politically, whereas I don't think they always did that. Yeah, that's right. But I think the ways in which we interact online, and also offline, reinforce some of these trends of us drifting further apart, the trends of hyperpolarization, of affective or emotion-based polarization.

So I would say that empathy is extremely important for us to reverse that. And I think everyone can practice empathy in their own environment, in their own engagements, whether that's online or at a dinner table discussion. And if there is one thing that our audience members could take with them to do tomorrow morning,

that you think would help give, maybe not even themselves, but those they are around, a protective film against misinformation, what could be one thing that they can do, other than not going on social media that day? I think...

I mean, Sander van der Linden at Cambridge University has a really good approach for basically inoculating the wider population against misinformation, using similar principles as with vaccines: trying to make people understand the dynamics underlying the spread of misinformation, and the manipulation tactics that are being used.

And I think really trying to understand how you yourself, or your friends, or your family might be exposed to this can be a first step: trying to basically inoculate yourself, or get the vaccine, against misinformation, to brace yourself for toxic group dynamics,

and to understand how our identities, on an individual level, on a group level, on a societal level, are changing in the online sphere. And of course, also how AI plays into that, because that's a whole discussion. To begin with this inoculation that you've described, we actually have a couple of questions for you. This is the interactive part of the evening. You'll see on the screens behind me a QR code, which I'll now invite you to scan.

You're going to have two multiple choice questions. Please click the link as well if you're watching us via live stream. I can see everyone's got their phones up. Brilliant. Scanning away. You're going to be asked two questions. I'm now going to read the first one. When you hear a viewpoint you strongly disagree with, what's your first instinct?

A. Ask questions to understand. B. Refute it with facts. C. Laugh it off. Or D. Change the subject. What are you picking tonight? Very empathetic crowd, you lot. You ask questions to try and understand. What's your view of the data in front of you, Julia?

This is great. Well, I do hope that everyone goes out and does this in their next dinner table conversations. Of course, refuting with facts and laughing it off can sometimes have an important role too, but I would say that asking questions is a really good starting point. Humor can sometimes be helpful, but sometimes it can also push people further away, because

people feel like they're being humiliated or not taken seriously. And just changing the subject can also sometimes be helpful. But overall, I'd say ask questions to understand better why people might hold certain beliefs. Refuting with facts can be hard as well, because someone says something and you know it sounds slightly weird or off, but you don't necessarily have the data at hand to say,

This is wrong because of blah, blah, blah. And the problem is, even if you then look up the data, they might not believe the source, even if it's a very trusted media outlet or a statistic from an academic paper, because their conspiracy myth might already cast all of these institutions as part of the bigger conspiracy. Second question.

You may have to scan again. If it's not on your screen, scan again for the second question. Which is more dangerous in today's culture? Oh, you've already been voting. A, misinformation spread seriously. B, misinformation spread as a joke. C, people who can't tell the difference. Or D, social media algorithms. Wow. Wow. I'm saying wow because...

Yeah.

But I think it also shows a really well-informed audience, because it is very true that not being able to distinguish between satire and actual misinformation is the trickiest part, because a lot of the extremist movements that I look into use satire or humor as a way of circumventing laws and spreading their misinformation. And we've even

seen that on a bigger level with, for example, some of the videos that were created of Kamala Harris: satirical deepfakes were spread with people not realizing that this was a deepfake. It was originally meant for satire and humor, but people took it seriously. And that was quite a dangerous path, because people didn't realize that she didn't say that.

And then social media algorithms, I completely agree as well. I think that is the source of a lot of the challenges that we're facing, in particular because misinformation can spread so much more quickly. And we saw it with the Southport riots, how quickly it went from that first piece of misinformation, shared on Telegram in the Southport wake-up group after that terrible, terrible attack

in Southport, to the point that people mobilized and went into the streets. It took not even a day; it took hours.

So I'm not surprised you've recommended a detox, which I think I will also enjoy this evening. Thank you so much, Julia, for a really fascinating conversation. We're going to hand over now to the excellent folks at The People Speak. You have two options. You can stay here in the main room and witness tonight's topic in action in a battle of the in-laws at a wedding.

No less. Or Talkaoke is happening in the bar, where you can delve deeper into these conversations with your new friends from tonight. And you can decide which you attend while the set changes over. I'd like to thank you, Julia, for a really wonderful conversation. Make sure to visit sageandjester.com for all the exciting experiences they've got coming up in the fight against the misinformation crisis. Thank you for coming. Thank you. Thank you.

Thanks for listening to Intelligence Squared. This episode was brought to you by Sage and Jester. Make sure to visit sageandjester.com for all the exciting experiences they have coming up in the fight against the misinformation crisis. This discussion was produced by myself, Mia Sorrenti, and it was edited by Mark Roberts.
