Reports of image-based sexual abuse in the UK have increased tenfold over the past few years. Women are five times more likely to be victims of intimate image abuse.
But the true scale of the problem is probably larger as many victims do not come forward. Wait, are you talking about revenge porn? Well, yes, I suppose I am, but I am trying to deliberately not use that term. Oh, why? Well, for a while, I've been wanting to do a MediaStorm episode on the scale and range of revenge porn. I've been wanting to delve into how the media discusses it, what it entails, what we can do about it.
And then the more and more I delved into it while researching, the more and more I realised that the term revenge porn just doesn't even come close to describing the issue at hand. In fact, it's almost downright offensive now that I think about it. And as we always say on MediaStorm, if we can't even use the correct language, we do a disservice to the community experiencing the issue. Okay, okay. I'm intrigued. I think I sort of see the problem. And I'm glad. And I'll let...
our two brilliant upcoming guests tell us more about what terms we should or should not be using. But first, let's lay out the problem. What do you think revenge porn, or as I'm now referring to it, image-based sexual abuse is?
The thing that springs to my mind immediately is like an ex-boyfriend sharing images online with his friends of an ex-girlfriend that were maybe once sent with consent to that specific person, but were not supposed to be viewed by others. Yeah. And I would say that that's probably what most people think of when they hear the term revenge porn. And that certainly happens and is a big part of image-based sexual abuse.
But there are unfortunately many different forms or aspects of non-consensual intimate image abuse. There's the threat of sharing intimate pictures. There's sharing pictures or videos without consent. There's uploading images or videos onto porn websites, with money being involved, or the porn industry benefiting from it.
There's the culture behind it all. For example, sometimes the pressure to send these images in the first place. The End Violence Against Women Coalition found that 9% of girls aged 13 to 16 said they felt pressure to share images of themselves that they're not comfortable with.
I mean, I remember how big an issue this was at school and how destructive it was when, and it happened to a bunch of friends, girls my year, you know, photos that they sent guys were then
predictably sent around the whole year and then the slut shaming and the cyber bullying that came from it. But at least back then, you know, we were sending really grainy shit quality photos on flip phones. Today, every 13 year old, I don't know, young teenager has a smartphone with an HD camera and access to the whole universe of the social media cloud, whatever's out there. Yeah, absolutely. I completely agree.
And actually, that rise in technology and mobile phone usage has played a part in this. When we were all stuck in lockdown and on our phones 24-7, the Revenge Porn Helpline said that the number of reports they received doubled in 2020, reaching a record high number of cases. And advancing tech has only worsened things for women in this case.
there has been a huge rise in AI-generated image-based sexual abuse, also known as deepfakes, being thrown out into a world where legal frameworks cannot seem to keep up with this rising technology.
And bear in mind, it was only 10 years ago, in 2015, that non-consensual intimate image abuse became illegal in the UK. In many countries, it still isn't illegal. Are you serious? For our entire school and uni experience, it was legal? We were not protected. It seems that...
At every turn, not just in tech spheres, the legal frameworks and the justice systems are always one step behind. What are we missing in our reporting? What can we do about it? And are we focusing so hard on isolated incidents we're failing to see the bigger picture?
And that original definition in law of intimate image abuse is way too narrow. It leaves many loopholes. The problem is prolific, the software searchable and oh so easy to find, the number of victims...
He was trying to make me out as, you know, a jealous attention seeker. We can reveal so-called revenge porn is rising in the UK. Welcome to MediaStorm, the news podcast that starts with the people who are normally asked last. I'm Helena Wadia. And I'm Matilda Mallinson. This week's MediaStorm. Image-based sexual abuse. Not your revenge porn.
Welcome to the MediaStorm studio. Joining us today are two very special guests. Our first guest is an award-winning campaigner and activist. She is the co-founder and director of #NotYourPorn, the movement fighting for policy changes and protections for victims of image-based sexual abuse. Welcome, tuning in from North Wales, Elena Michael. Hello, thank you so much for having me. Really excited. Yes.
Our second guest, like thousands of people, discovered intimate images of herself published online without her consent. She couldn't trace who leaked them. The crime remains unpunished and the images remain online.
This experience spurred her to found Image Angel, a technology that can be used to identify where non-consensual images leak from. We'll hear much more about it later on in the episode, of course. But for now, welcome, Madeleine Thomas. Hello. Thank you for having me. In those introductions, we used terms such as image-based sexual abuse and non-consensual intimate images. But in the media, these terms are often put under one label: revenge porn.
Let's begin with Elena. What do you think of the term revenge porn? I think the first thing to understand about any term that's coined by the media is
that it's only going to offer maybe like 5% of what actually happens to people who are subjected to this kind of abusive behaviour. So one of the biggest problems with a term like revenge porn is that it's not always about revenge when somebody shares or takes images non-consensually, and it's certainly not porn. So it's limiting our understanding of who might be a survivor, but it's also limiting our understanding of who might be a perpetrator.
And we want terms that the public understands, so that they grasp an element of the crime. At the same time, we need to move away from this language.
It's incredibly derogatory and it does not encompass the full picture, spectrum, magnitude of the kind of harm that people experience. Yeah, Madeleine, in some of our correspondence before this episode, you used the term image-based sexual violence. Is that your preferred term for the crime? For what happened to me, yes.
but there are any number of experiences that go along with this and it could be to do with stalking, it could be to do with digital harassment. So I tend to go with the term image misuse because that is a bit more of a cover-all. It doesn't denote the harm, that's the problem, but we can understand it a bit better when we say image misuse. It's funny because I had never even thought about the fact that the term revenge porn was in any way...
problematic or insufficient until we did this episode. And then as soon as Helena said it,
I was like, hold up. Yeah. Porn. Sorry, you are not in a porno that you don't consent to be in. And porno implies something that's designed for pleasure. That is not porn. That was the thing that hit me straight away. And then revenge implies that the victim did something wrong in order for somebody to seek revenge. Yeah, it's problematic on so many levels. When we talk about the term revenge porn being useful,
Perhaps it is somewhat useful in the way that people know exactly what we're talking about when we use the term revenge porn. For example, there is a helpline dedicated to victims and survivors called the Revenge Porn Helpline. Is it in that way useful?
Well, it's a cyclical argument. So it's useful in the sense that it's a popularised term that people understand, and therefore we can catch survivors and we can help them. But in terms of its overall use within media, within policy, it's really incredibly damaging. Like, a lot of the work that Madeleine and I have been doing policy-wise has been limited, and the understanding of law and policy makers affected,
because of use of this kind of term. It's also similar with terms such as deepfakes. There's such an inherent bias with the term deepfakes that we don't understand that there are often two victims of deepfakes. There's the person who's been non-consensually generally put into some sort of porn video that they've not consented to, but then there's also the body of sex workers that are not
seen as having rights to choose where their images, their work or their videos go. And these kinds of terms, in terms of the wider landscape damage they're doing with law and policy,
It's huge. Yeah, and editors have to recognise the power they have. They can't just say, we speak to people in the language they understand. Well, no, people use the language that you use. You know, no one has more power to define the vocabulary than the mainstream media. Elena, I think this is a good time for you to tell us about the #NotYourPorn campaign. What are its aims? What does it do? It was founded in 2019 by Kate Isaacs,
who was supporting a friend who had her images non-consensually shared on Pornhub. And at the time, there was also a download function. So by the time that Pornhub actually got round, I think it was between six months and a year later, to taking that non-consensual content down, the damage was already done. She was trending in the UK, and also Europe, I believe, in the top three videos, but she had obviously not consented to being on Pornhub.
So we are, you know, our roots are super grassroots. Me, myself, I'm a survivor of intimate partner violence. The focus was always pushing survivor stories into the press, into policy spaces, because we learned very, very quickly, like people like Madeleine are the experts. They know how to fix this problem. And also they can tell you in intimate detail what is wrong with the system.
So we've evolved since then and we've become more proactive to help people campaign, but also not re-traumatise them because the burden on survivors to fix this problem, unpaid, around also managing their own trauma, also managing, dealing with the police and the CPS and the stigma and all the rest of that shite that comes with it, basically, is huge. There's so much expertise there and there's so much passion and fire, but...
The system that we work in is that survivors' stories or narratives are not necessarily given the weight that they absolutely should, and we're fighting to change that so that survivors are leading the way along with their partner organisations and, you know, academics.
Wow, you're just preaching the media storm mission over here. And you know, that survivor-led approach is particularly important when you dig a little deeper into the effects that image-based sexual abuse can have on victims. Research has shown that victims of this crime can suffer long-term psychological, personal and social consequences, including severe emotional distress, anxiety and depression. There can be a huge sense of powerlessness involved.
Madeleine, before we talk about how you reclaimed that power by creating Image Angel, could you describe to us your experience with image-based sexual violence? I feel like I still need to caveat my experience because when your listeners hear this, they'll go, oh, well, that makes it different when I explain my story. Perhaps it won't. But in my experience, it's...
I'm still dealing with the shame of how my images came to be. I had a very crap job, actually. It was a crap job. And my husband and I had a child and my crap job didn't pay for childcare and for my life. So I thought, I want something better than this.
And he was wonderfully supportive and he said, yes, well, you know, I will help you. I'll do whatever we can. So I got a job doing camming, which is like bizarre, strange, but it was wonderful. It was fun. And I earned more in that 20 minute call than I would in a day. And I'm having fun and I'm staying at home. I mean, where is the harm? I'm having consensual conversations with people.
And at one point I sold some images to a group of people, you know, just some sort of saucy nudes, nothing terribly explicit, in the understanding that I was selling a consensual moment for them to enjoy those pictures. And I thought I look bloody great in them, actually. And...
That was the last of it. I never thought any more of it. It was a consensual moment of an exchange of trust as well as exchange of financial, you know, remuneration. And then it wasn't until probably about six months after that, that somebody said, oh, you do realise you're on this website, don't you? And I'm like, what? No, absolutely.
Why? And I felt stupid. Of course I should expect this to happen to me. Of course it's par for the course. And then this part of me was going, but why? Why do we as society let somebody take your image just because? And to write shameful things about you and to hurt, shame, embarrass, humiliate, dox, all of these things that other people have experienced and just the minutiae of that, it's so complex.
But to experience it is gut-wrenching. And I'd love to know how your audience is now feeling. Are they kind of going, well, you know, what was she wearing, essentially? Because it is the modern day equivalent of what was she wearing? Well, if you put those images out there, of course that's going to happen. And I too felt that. And I still do. I'm still dealing with that.
But I do believe that it's a fundamental right that I should be able to earn in a way that suits me, my family, my children, and still feel safe doing so. In the same way an Uber driver has the ability to drive and be safe in their cab, or a hairdresser has basic rights and labour law protection in their workplace. This is a method of earning for me, and I should have the right to have those images removed because I consented to A, but I did not consent to B.
And so it led me down a path of pain, hurt. At points I thought my life is not worth living because of the shame and embarrassment that I felt.
seeing those images and being unable to take them down. So I contemplated ending my life. I contemplated moving to Russia, finding whoever's hosting this and tearing it down myself. What can I do? And I was faced with the solution, which was lump it. You've made your bed, now lie in it. And I don't think that that's enough.
And I want to change the world now. I don't want anyone else to experience that. There shouldn't be a trade-off of rights. There is this stigma around sex work that because you put your body out there, therefore everything that happens to you is your consequence. We live in a world where
people's bodies, predominantly women, are commoditised. And yet a woman of her own volition, making choices about her body, is seen as, well, you reap what you sow, which is completely, you know, that is the narrative we have to cut through. And I see ripples of it just looking in consensual relationships, for example, between partners or perhaps, you know, someone's posted a nude to one individual. They have maybe...
a tenuous relationship, there's this level of entitlement to do whatever you want to somebody's body. And we're part of the movement that says a woman's sexuality is her choice, whether that's for profession, whether that's personal. And if any fundamental foundation for law or policy is working on this idea of stigma and shame and one survivor's rights is more important than another, we've got a real big problem.
That's something that struck me. Madeleine, you were thinking, oh, what are people going to think when I...
include the detail that I was working as a sex worker. But what I hear when you tell that story, in the intersection of being a sex worker and being a survivor of image-based sexual abuse, is this added layer of exploitation: on top of the sexual exploitation, there is the economic exploitation. And indeed, this is faced by anyone whose image is non-consensually put on a porn site, because someone is profiting there. So
Someone is making money off that. And that is something you provided for a fee. That is your labour and you are not being paid for it, but someone is. And that amounts to forced labour, modern slavery, trafficking. Absolutely. So, yeah, I mean, I see there are so many, so many layers. Yeah. And also there's this strange layer that I'm coming to terms with, which is, you
know, the VAWG sector, the violence against women and girls sector. I feel like my story, my voice, is seen as problematic in violence against women and girls sectors because I'm seen as part of the problem, because I commoditised my image. So I'm walking a very, very fine line between acceptable and unacceptable. But when you did share your story, it was...
filled with an emotion that is so real and relatable and frankly uncontroversial. And we do not see that nearly enough in the coverage of this topic. And I think that that brings us on to the media and how they report the subject of image-based sexual abuse.
Victims of this abuse are overwhelmingly female. Countless studies have shown us that women make up between 80 and 90% of victims of this crime. And this is not to do a disservice to male victims who exist and who deserve their space in this conversation, but I say it to show that this crime at its core is gendered. Elena, back to you. Do you think that the media does a good enough job at reporting on the gendered aspect of
image-based sexual abuse? I think that there are some phenomenal journalists that really understand the issue and understand that it needs a multifaceted approach. And it's interesting to me, not always, but the majority of the journalists that are doing a good job are women. You know, just to name a few, you've got Adele Walton, who's also a campaigner. You've got Lydia Morrish, you know, who's just phenomenal. You've got Lucy Morgan from...
Glamour, who, you know, Not Your Porn has been working with, and the End Violence Against Women Coalition. There are phenomenal people, but on the whole, no. I don't think that our audiences are completely one-dimensional, unable to think critically for themselves,
and need to have this really watered down, diluted version of the issues, because oh, it's not, it's not nice with their morning coffee. But you've also said, you know, Glamour, a women's magazine, if it's a question of speaking to your audiences, you know, should this really just be in women's magazines to women's audiences? Who are the audiences that really need to be educated about image based sexual abuse, if it is as gendered as the
data tells us it is. I think a portion of the harm may be coming from how we frame people who have experienced this, because whenever...
Whenever I do something at home, we always go, sex worker does something because it's such a headline. Whatever I do, sex worker smashes plate, sex worker smashes plate. It becomes so sensationalized because we're searching for clicks. So as soon as a sex worker does something, it becomes this sensational, I should click on that and find out what's going on because that's saucy. Absolutely. Yeah. So I often talk on this podcast,
slash every day of my life about how in cases of
fatal domestic abuse and violence against women and girls, they're reported on in the media as isolated incidents. We'll hear a lot about certain specific cases, but we'll rarely hear about the wider context. In the case of fatal domestic abuse, the wider context is that three women a week in the UK are killed by a man. In the case of image-based sexual abuse, the wider context is that nearly a fifth of women in the UK, 17%, have experienced image-based sexual abuse. That stat is from the End Violence Against Women Coalition. Madeleine, do you think the media does enough to explain the culture of misogyny behind image-based sexual abuse and to point to that wider context?
No, they don't, because it's not sexy, is it? It's not clickable and it's not going to get you the ad revenue. So no, they don't. What would be the benefit? It's very difficult to
see really hard-hitting conversations happening without there being some kind of showbiz angle, or a documentary that pulls a couple of strings and ruffles feathers, because that's what gets the viewers. Yeah, there is a place for a showbiz angle, but that can't be the only narrative. There are so many facets of how this behaviour exists, how this abusive behaviour plays out, how it becomes...
a part of a pattern of behaviour and attitudes towards women and girls. And those nuances are crucial to being able to tackle this problem. And I think that that's part of the wider conversation that we need to be having, that the press play a fundamental role. And I also really get frustrated with some of the journalists I've worked with over the last decade
several years, with the way that they treat survivors. You know, this is work, what they're doing. You're not entitled to their story, unpaid, to manipulate and say whatever you want and then not do it justice because you're only going to give a one-dimensional view. That's really deeply problematic, because you're essentially replicating their harm. It's just another form of entitlement to exploit somebody's story
in order to put out whatever it is you want to put out. - It's a huge sort of sticky issue in journalism. A lot of journalists, documentary makers would say, "Oh, you can't pay someone for their story." It incentivizes exaggeration, but when you are
consulting experts by experience about these topics. It's a very different case. I mean, at MediaStorm, it is really important to us and has been from the beginning that we pay our guests to share their lived experience and expertise.
And in some ways that makes us actually really quite unusual in the media. And the people who do get paid for their stories, by the way, are, you know, the friends of celebrities who will spill the beans, and often do just exaggerate and make up stories for those cash cheques. But it is not the people who are exposing themselves and sharing intimate, intimate stories in order to push much-needed policy change. So to provide that context that is missing from the media,
It is that one in 14 adults, which is equivalent to 4.4 million people in England and Wales, have experienced threats to share their intimate images without their consent. That stat is from Refuge. Given that stat, our next question would be, you know, does this topic get enough coverage in the media? We have an example we want to talk about. There was recently a very, very important story on this topic that was broken by The Observer. They analysed court records and put in freedom of information requests
And they found, this shocked me, that perpetrators of image-based sexual abuse are being allowed to keep the explicit images of their victims on their devices after the investigation. And this is because of a failure by prosecutors to obtain orders requiring their deletion. This story broke in late February and of 98 cases concluded in the Magistrates Court in England and Wales in the previous six months, that was 98 cases, just three...
resulted in a deletion order. This is a huge story, and a shout out to The Observer and The Guardian, who really are the outlets that seem to report on image-based sexual abuse and online abuse far more than any other news outlet. This story is an example of what the media is for, right? The Observer using their truth-seeking power to call out these loopholes that affect marginalised people. And as I said, by all accounts, this is a huge story, but it didn't really make a splash elsewhere in the media.
One month later, on the 23rd of March, The Observer provided an update to the story, stating that the Crown Prosecution Service are now going to update their guidance on image-based abuse crimes to stop perpetrators being allowed to keep these explicit photos of their victims. Now, this update was picked up in one other place:
The Telegraph, who reported on the story by praising the new CPS guidelines and missing out the context provided by The Observer in the first place: that only three of 98 cases in the previous six months resulted in one of these deletion orders.
In fact, there's no mention in the whole Telegraph article as to why the CPS is suddenly updating their guidelines. Instead, the article is mostly a direct quote from the CPS patting itself on the back for this update. And this is just one example of the lack of traction an important story about image-based sexual abuse can get. And it's also an example of the erasure of victims. Elena, why do you think this happens in our media?
Well, the first thing to say is a huge shout out to Shanti Das for doing both those stories. That is an example of comprehensive, nuanced journalism. Why do I think it happens? It's the lack of understanding of how these moving parts fit together. For example, what I would have loved to have seen in the Telegraph article is the criticism that
deprivation orders for devices apply, from what we can tell from the guidance, to only one device. But we know that perpetrators tend to have multiple phones, multiple devices. Wait, what? Surely it's not just stored on the device.
We all have multiple devices. We all have stuff stored, like, on a cloud. The cloud, yeah. I know. It's frustrating to me beyond belief. The other thing that really irritates me, if you even get to the point where the CPS is going to prosecute, which let me tell you is an absolute nightmare, is that in the course of the investigation, police officers routinely do not confiscate all devices. Right. And then, you know, even if you go through the investigation and you get to the court, the charge rate
for trials is incredibly, incredibly low. Data obtained by Refuge shows only about 4% of intimate image abuse cases reported to the police currently result in the perpetrator being charged. Refuge also noted that we urgently need trauma-informed training for all sections of the justice system, from the police to the CPS to the judiciary. Did you report what happened to you? No. Why not? I
felt that I don't know who has done this, so how can I chase it? I also felt like the shame that I was feeling would just be reflected back at me. And I'm still struggling with it. I still feel icky. And I really don't want to. But it's so deeply embedded in me to feel shame about that. But I'm pretty sure if I had gone to the police, and I have heard from many people who have said,
they're told, well, you probably shouldn't have done that in the first place. There's not really much we can do. Have you tried doing this? Have you tried getting it taken down under copyright? Which is not successful.
So unfortunately, the frameworks don't exist to support people who are going through this. And that's not to say that there aren't phenomenal police officers out there. I have worked with many, but an overwhelming proportion of the people we've worked with have had negative experiences with the police. And I can't tell you how many times we've seen cases where police officers have, one, run out of time for pursuing a particular case; two, don't understand the nature of the offence, so the survivor is doing all of the legwork; three, don't investigate properly, aren't able to get particular sites up on their computers, so the survivor is having to go through it themselves. It becomes a full-time job. And...
I've also sat in court cases and listened to or read through transcripts of court cases where I'm listening to the questions of defence barristers and I'm thinking, how on earth are you still practising? And there are times where I have to really bite my tongue and be like, just because you are an expert in criminal law, you have completely misunderstood the nature of the offence and what the evidential threshold is and what the components of the offence are.
Yeah, and our media really are not helpful in this case because one of the things that I find very confusing when reading about this topic in the media is the legality and the laws. I'm never really entirely sure in what ways victims are protected or not protected when I read articles about image-based sexual abuse. You know, if I were to go by what I have been reading recently
online on the topic, I've read that the Online Safety Act is prioritising the seriousness of image-based abuse. Elena, maybe you can tell us what the Online Safety Act is, and whether that reading is accurate. So the Online Safety Act is a really ambitious piece of legislation. It's huge. And there were some things added onto it, such as the amendments to image-based abuse laws, but primarily,
it gives Ofcom the power to enforce duties against companies for hosting this kind of material, and is allegedly holding them to account. The way that Ofcom is reading that duty is incredibly narrow when it comes to IBSA and ending violence against all women and girls, regardless of their profession, online. This is where government comms plays a huge role in creating
and muddying the waters of what's actually happening. When the Labour government came in, they announced they were basically putting in place a statutory instrument, which is a secondary piece of legislation, to include IBSA as a priority offence, which basically means that it's a really important offence. It's on the level of some of our worst crimes, basically. However, the piece of legislation had already done that. Government comms decided to make this, like, look at what we're doing, convincing the public that you're apparently tackling violence against women and girls. But that's not what you've done from the start. And so this is one of the things that really worries me about the messaging that people, the general public, are getting about what their protections are. Because half the time I don't even know what's going on, but I'm thinking, one, where's the teeth? And two, other than a press release, where have you done any of the substantive work?
Now, something else the Online Safety Act does is make the sharing of AI-generated intimate images without consent illegal. This brings us onto the topic of technology. As technology advances and rapidly evolves, so does the nature of image-based sexual abuse, IBSA, leading to a rise in AI-generated intimate images or, as they're widely referred to, deepfakes.
Now, according to online security experts at Security Hero, between 2022 and 2023, deepfake sexual content increased by over 400%. Madeleine, you work in this area of tech-based protections. Are the measures in place that we're seeing going far enough?
Can you also tell us about your solution at Image Angel, what it is and how the technology you propose could help victims? It's not actually a tech issue. It's a society issue that we can change through tech.
That's where I'm trying to kind of knock on doors and explain to people, this is how we can make this solution. But what we have right now is the Online Safety Act asking the platforms that host content to make sure that they're protecting their users from harm. So they're making sure that people who are accessing these sites, where harmful content may be, aren't children. Is that enough? I don't know. The tech I have made is primarily a deterrent for...
So it is software that a platform can install and it gives a nudge to your average user to say this piece of content is protected. It is assigned to you. You are the guardian angel of this image. You don't own it.
But it is yours to look after. And should you share this somewhere, we'll be able to find out who you are. If you don't share it, nothing happens. No problem. You keep it on your phone, fine. If you keep it within your private space, that's fine.
But this moment of trust that you have with somebody on a dating app, for example: you send a picture, a saucy nude, they send one back. Both of you are responsible for and entrusted with that private moment.
And if you share that, you are committing a criminal act. And Image Angel will be able to find out your name and address and can knock on your door. Not personally, but we can give it to the authorities, who can knock on your door. We would be forensically able to prove that that person shared it non-consensually.
Therefore, they wouldn't have done it, or they would have been less likely to do so. And that real-world impact is going to start a step change in society. It's going to start people being accountable for their actions online. So you really want the platforms, such as dating apps and social media platforms, to sign up to Image Angel? Absolutely, and I don't see why they shouldn't. This is protecting people on their platforms, and that is their duty of care under the Online Safety Act. This type of preventative technology is preventing harm from happening, not mopping it up once it's happened, not handing out tissues and going, "we should probably do more". This is stopping it before it even happens. It's stopping that user in their tracks and saying: you probably don't want to do that, because we're going to find you.
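Image Angel has not published its exact mechanism, but the per-recipient forensic marking Madeleine describes can be sketched conceptually. The minimal sketch below is an assumption, not the product's real method: it uses simple least-significant-bit embedding over raw pixel bytes, and all function names are hypothetical. The core idea it illustrates is that each viewer receives a uniquely marked copy, so a leaked copy can be traced back to the recipient who shared it.

```python
# Conceptual sketch only: per-recipient forensic watermarking via
# least-significant-bit (LSB) embedding. A real system would use a
# robust, invisible watermark that survives cropping and re-encoding.

def embed_recipient_id(pixels: bytearray, recipient_id: int, bits: int = 32) -> bytearray:
    """Return a copy of the image bytes with recipient_id hidden in
    the lowest bit of the first `bits` bytes."""
    marked = bytearray(pixels)
    for i in range(bits):
        bit = (recipient_id >> i) & 1
        marked[i] = (marked[i] & 0xFE) | bit  # overwrite only the LSB
    return marked

def extract_recipient_id(pixels: bytearray, bits: int = 32) -> int:
    """Recover the hidden id from a (possibly leaked) copy."""
    rid = 0
    for i in range(bits):
        rid |= (pixels[i] & 1) << i
    return rid

# The platform embeds a unique id per viewer at delivery time. If the
# image later surfaces elsewhere, extracting the id identifies whose
# copy was shared, which is the deterrent described above.
original = bytearray(range(256)) * 16          # stand-in for pixel data
alices_copy = embed_recipient_id(original, recipient_id=48613)
assert extract_recipient_id(alices_copy) == 48613
```

Because each delivered copy differs only in imperceptible low-order bits, the deterrent works without visibly altering the image; the trade-off is that naive LSB marks are fragile, which is why production systems use more robust schemes.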
Finally, to wrap up, let's look forward. How do our laws surrounding image-based sexual abuse compare to those of other countries? And are there any immediate steps that we can take from other countries, perhaps? Eleanor? So the comprehensive image-based abuse law that we're fighting for at the moment would fully plug all of the gaps that we have in our current system. So there are five main asks, which are things like improving the criminal laws, classifying this kind of content as illegal to put it on par with child sexual exploitation material. But we also need to improve civil laws.
You know, we need to be able to go into an online portal, submit a takedown notice, and within a few hours have that image taken down, because it's legally mandated. It needs to be accessible. Then thirdly, we need to improve sex education. We need to be having these conversations not just at schools but with every key stakeholder. The NHS needs to understand that this kind of abuse is abuse and therefore warrants psychological support. We also need...
to have a sort of commissioner-type role like we see in other countries. And finally, we need support for specialist services. A recent report by the Office for National Statistics showed that the government had actually considerably underspent its VAWG (violence against women and girls) budget. Yet they're telling us that they are trying to tackle VAWG. How is that possible? All of these harms and communities and behaviours
are operating at such a fast rate, we are already 10 years behind. So we need to start getting ahead of it and stop thinking about the politics of it and legacy building. We need to see the teeth of these policies. I want to see action, because it's only going faster. And the more content is out there, the more we feed into these massive monoliths of data and content, and we are contributing to a society that just sees it as throwaway, as offhand, when what you're looking at is private.
That is protected. It's a special moment. So, speaking of action, let's give our listeners calls to action. Eleanor, let's start with you. Can you tell listeners where they can follow you and your work, and if you have anything to plug? Yeah, absolutely. So you can follow us on Instagram at the Not Your Porn handle, except the O in porn is an X, you know, so it's a different one on Instagram. And also on X, formerly Twitter, where Not Your Porn is spelt normally. The thing that you can do to support us at the moment: we currently have a petition that is being headed by
the survivor campaigner Jodie Campaigns, for an image-based abuse law. It has over 70,000 signatures. If we can get it to 100,000, we can keep pressuring the government even further to take this seriously. So please, I urge you all to sign it, and obviously I'll give you the link. And Madeleine, same question: where can people follow you, and do you have anything to plug? My plug would be: demand more. Demand more protection and rights. And where can people find me? They can find me at imageangel.co.uk, but we also have imageangel.co.uk/request. You can type in the platform that you are working on, playing on, dating on, and you can send them a direct mail that says: hey, install this for my protection. So I would request that if you care, if you want to be safe, make some noise.
Thank you for listening. If you want to support MediaStorm, you can do so on Patreon for less than a cup of coffee a month. The link is in the show notes and a special shout out to everyone in our Patreon community already. We appreciate you so much. And if you enjoyed this episode, please send it to someone. Word of mouth is still the best way to grow a podcast. So please do tell your friends.
You can follow us on social media at Matilda Mal, at Helena Wadia and follow the show at MediaStormPod. MediaStorm is an award-winning podcast produced by Helena Wadia and Matilda Mallinson. The music is by Sam Fire.