
#956 - Laila Mickelwait - How Pornhub Became The Internet’s Biggest Crime Scene

2025/6/19

Modern Wisdom

Topics
Laila Mickelwait: About five years ago, I discovered that Pornhub allowed anyone to upload content without verifying identity or consent. This left the site riddled with videos of real sexual crimes, including child sexual abuse, adult rape, revenge porn, and other non-consensual content. In 2019 Pornhub had 6.9 million videos uploaded and 170 million visitors per day, yet its moderation system was effectively nonexistent: 10 people were responsible for reviewing millions of videos, and moderators were told to click through videos quickly rather than to make sure no illegal content got through. I tested Pornhub's upload system and found that uploading took only a few minutes, with no ID verification or consent form required. I launched the "Trafficking Hub" hashtag on social media, because anyone whose non-consensual act is monetized is a victim of trafficking. I wrote an op-ed and started a petition to shut down Pornhub and hold its executives accountable. To date, the petition has received 2.3 million signatures from around the world, with 600 organizations involved. Many victims have contacted me to describe how they were exploited on Pornhub and to ask for help. Through litigation and public pressure, we eventually got the credit card companies to cut ties with Pornhub, which forced them to delete 91% of the site's content. Pornhub hoped that deleting the content would win back the credit card companies. The main issues Pornhub faces are consent and age, but they cannot tell whether a video was uploaded without consent, so their only option was to delete unverified content. As of September 2024, Pornhub is being forced to verify the age and consent of the people in its videos. Pornhub has been sued by nearly 300 victims and could face billions of dollars in damages. Having these videos uploaded online causes enormous trauma: being recorded and distributed around the world is deeply painful, and victims live in constant fear that their videos will be re-uploaded. Pornhub is a conduit for criminal acts; many people are angry at Pornhub, and because the actual perpetrators are hard to find, Pornhub absorbs much of that anger. The actual abusers must be held accountable, and so must Pornhub. Pornhub's executives deliberately set policies that enabled abuse: they offered a VPN and allowed anonymous uploads, masking perpetrators' locations, and for 13 years they did not report child sexual abuse to the authorities. If Pornhub had reported child sexual abuse, more perpetrators could have been caught and more children saved. Civil litigation makes it possible to obtain the other side's communications, emails, and internal policy documents; a court accidentally released thousands of pages of internal files that were supposed to remain sealed, so we now hold a large amount of internal information, including sworn depositions.


Transcript


Pornhub is not a porn site. It's a crime scene. What's that mean?

It means exactly what you just said. So what I discovered about five years ago was something millions of people already knew: how little it took to upload to the world's YouTube of porn. So this is user-generated porn, the biggest porn site in the world at the time. Actually, it was the fifth most trafficked website in the world at the time. I made this discovery that all it took to upload to Pornhub was an email address, that anybody in under 10 minutes could upload to the site.

And they were not verifying ID to make sure that these were not children. And they were not verifying consent to make sure that these are not rape or trafficking victims. And because of that, the site had actually become infested with videos of real sexual crime. So we're talking about child abuse, child sexual abuse material, we call it. This is child rape. It's also self-generated child sexual abuse material where children would be

filming themselves and sharing it. And then that would get uploaded to Pornhub, which is completely illegal to be viewing and distributing that content. To adult rape, unconscious women, completely drunk, non-consenting, all the way to what we used to call revenge porn. So this would be image-based

sexual abuse. So all kinds of non-consensual content and even copyright violations, where this is illegal content because it was stolen material that was being uploaded to the site.

So that was the state of Pornhub. And, you know, at the time they had six point nine million videos that were uploaded in 2019. And my fight to hold them accountable for these crimes started in 2020. At that time, they had 56 million pieces of content uploaded to the site. And they had actually one hundred and seventy million visitors

per day, 62 billion visitors per year, and enough content being uploaded that it would take 169 years to watch if you put those videos back to back. So that's how much content was being uploaded. And mind you, this is now anybody with an iPhone. So anybody anywhere in the world that had a camera could film a sex act with no checks whatsoever.
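
A rough back-of-envelope check of those figures is sketched below; the average video length is an assumption for illustration, not a number from the interview.

```python
# Back-of-envelope check of the scale figures quoted above.
# Assumption (not from the interview): an average video length of ~13 minutes.
videos_uploaded_2019 = 6_900_000
avg_video_minutes = 13  # assumed

total_minutes = videos_uploaded_2019 * avg_video_minutes
years_to_watch = total_minutes / (60 * 24 * 365.25)
print(f"~{years_to_watch:.0f} years to watch back to back")  # ~171 years, close to the quoted 169

visitors_per_day = 170_000_000
print(f"~{visitors_per_day * 365 / 1e9:.0f} billion visits per year")  # ~62 billion, matching the quote
```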

Using a VPN, even, to be even more anonymous, they could upload this content to Pornhub and it was infested with videos of crime. Why was this your job to find? This seems like 62 billion visits per year. One of those 62 billion visits could have sprung somebody else into action. Why was this you?

I mean, this is one of the things that really kind of amazes me even now is that this was something that was hiding in plain sight. So this was under everybody's noses. Like, really, anybody could have sounded the alarm on this.

And it's amazing that it took from 2007 to 2020 for it to get any attention that this was actually going on. And, you know, I am actually honored to have the opportunity to shine a light on this and to be helping the many, many countless victims now who've had their trauma

immortalized on this site. And so I don't know why it took this long for it to come to light. But there's this saying that I love, and it's an idea whose time had come. And I think that that's actually true. It was an idea. Trafficking Hub, which is the movement that I started to hold Pornhub accountable, started with a hashtag on social media and grew and went viral.

But I think Trafficking Hub was an idea whose time had finally come and enough was enough and it was time to expose what was going on. What's the story of Pornhub across the years? Huge company owned by an even bigger parent company. What's the arc of how they ended up where they are?

Sure. Yeah, it's so interesting because you think about Pornhub as a solo site often, but really what people don't know is that Pornhub is owned by a parent company. Most people in the world have probably heard the name Pornhub.

you know, they had spent millions and millions of dollars to become a household brand name for porn. They had done things like massive PR campaigns to save the bees and save the whales and, you know, clean the oceans and even donate to breast cancer awareness. And they had this whole arm of Pornhub called Pornhub Cares, which was like this philanthropic arm. They're walking New York Fashion Week. They, you know, have faux commercials on Saturday Night Live. So

everybody pretty much understood the name Pornhub. Now, if you talk about the company that owned Pornhub, well, that's a different story. So most people had never heard of the parent company of Pornhub, which was called MindGeek. Now, MindGeek had essentially rolled up the global porn industry under one

huge international multi-billion dollar corporation, because with a $362 million loan, they had actually bought up most of the world's most popular porn sites and brands. So there was a hedge fund called Colbeck Capital

that had 125 secret investors, that included JPMorgan Chase, that included Cornell University. And they had loaned these hundreds of millions of dollars to the company, which was Manwin at the time, so I'll tell you the history of that, to buy up the world's most popular porn sites and brands. So they actually owned everything from

Playboy Digital to Pornhub and all of its sister tube sites. So that would include RedTube, YouPorn, GayTube, ExtremeTube, XTube, PornMD. I mean, I could go on and on. Massive amounts of tube sites that operated all the same way.

But MindGeek used to be a company called Manwin. And before Manwin, it was a company called Mansef. And there was, you know, these men in Montreal that started Mansef in 2007. They had purchased the website Pornhub.com for about $2,500 at a Playboy Mansion party.

And they launched Pornhub. But it was a man named Fabian Thylmann, so he's a German entrepreneur, that actually put Pornhub on the map. So he was a very shrewd businessman. And he had this idea that he wanted the world to be able to access free porn. And so he kind of took Pornhub from a somewhat popular site to...

the brand name that it is today. And then he actually got in trouble for tax evasion. So the company was sold, because originally the owners at Mansef were in trouble for money laundering. So they had to suddenly sell the site and they sold it to Fabian.

And then Fabian got in trouble for tax evasion. And then he suddenly had to sell the site. And so he sold it to VPs at his company. And now they're in trouble again. So criminally charged by the U.S. government. And now they're trying to sell the site again to a hastily concocted private equity firm. And so that's kind of the story of Pornhub, where you have this history where they get in trouble with the law.

They rebrand each time, they name the company something else and sell it and sell it again. And this time they were criminally charged for intentionally profiting from the sex trafficking of over 100 women in California. So now they sold MindGeek and now they call it Aylo. So it's a new name, but it's the same company. It's the same website. And many of the same owners and executives have been involved from the very beginning that are still there in Montreal today running the site.

It's kind of like a dirty penny that keeps on getting passed around or like a cursed penny or something that, you know, the holder ends up getting in hot water in some way. And what's the story of how you found yourself embroiled in this? I don't understand how somebody just sort of stumbles upon creating a movement that causes the biggest porn site in the world to end up being basically shut down.

Yeah. So I have been in the fight for many, many years before the fight to hold Pornhub accountable began, the fight against sex trafficking. So I've been involved now, it's almost 20 years that I've been involved in the fight against sex trafficking and child sexual exploitation. So it was in the context of that work.

that I happened to test the upload system for Pornhub as I was investigating the site, because I was really concerned by some news stories that I had heard at the end of 2019. So I was paying attention to the headlines because this is my work, right? And so there was a very concerning story that I credit with sparking the launch of the Trafficking Hub movement.

And it was about a 15-year-old girl from Broward County, Florida, who was missing for an entire year. And she was finally found when her distraught mother was tipped off by a Pornhub user that he recognized her daughter on the site. And she was found in 58 videos being raped and trafficked under an account named Daddy Slut.

And he had impregnated the teen girl and she was finally rescued from his apartment when surveillance footage from a 7-Eleven was matched up with the perpetrator's face in the Pornhub videos. And they actually found the girl and rescued her. And then at the same time, the London Sunday Times had done an investigation into Pornhub.

And they found dozens of illegal videos on the site within minutes, even children as young as three years old. And at the time, they had called out Heinz and Unilever for advertising on Pornhub, shamed them for doing that. They actually apologized. They took their ads off the site. At that time, PayPal cut ties with Pornhub. And it was just this really, you know, kind of shocking moment for anybody in the anti-trafficking and anti-child exploitation space to finally...

say, hey, what's actually going on here? And we're hearing stories of children being abused on Pornhub, very young children, in fact. And that is when, you know, I just couldn't get those stories out of my mind. I kept thinking about them and thinking about them. And, you know, at the time you see Pornhub in the headlines all the time, you know, just getting so much press. And I said, how, just how in the world is this happening? And that's when, you know, late one night I was putting my own, you know,

fussy baby back to sleep in the middle of the night on February 1st of 2020. And I was thinking about the story of that 15-year-old girl. And that's when I said, look, I'm just going to see what it takes to upload to Pornhub. And I just took a video of the rug and the dark room and the laptop keyboard and tried it for myself and realized it only took 10 minutes. It took a few clicks,

no ID, no consent form. And then I started really paying attention. So that's when I launched the Trafficking Hub hashtag on social media. I mean, I only had a few thousand followers at the time from all of my advocacy work that I was doing. And I shared the Trafficking Hub hashtag. And the reason why Trafficking Hub is what came to my mind is because anytime you monetize

a non-consensual sex act, the person in that video

is a victim of trafficking. Now on Pornhub, it's free porn, but it's not free. I mean, these are heavily monetized porn videos and they're monetized mostly with ads. So they were selling 4.6 billion ad impressions on Pornhub every single day. And that's how they were monetizing these millions and millions of videos, including

child sexual abuse, rape, and all forms of nonconsensual content. So that's why I said, hey, this is a trafficking hub. We have to hold this company accountable. And the hashtag just started to catch on. And then I said, okay, this has to go bigger than my tiny social media following. I'm going to write an op-ed about it. So I wrote an op-ed. I sent it to a few different outlets and the Washington Examiner actually decided to publish it.

And then that kind of started to get a bit of virality. And so people were reading it. They were horrified as to what was going on. And then one of my followers just said, hey, you need to start a petition. And if you don't start one, I'll start it. So I said, OK, I'll do it. And so I started the Trafficking Hub petition to shut down Pornhub and hold its executives accountable.

for enabling trafficking, and that started to go viral. And today we have 2.3 million signatures on the petition from every country in the world. We've had 600 organizations involved. Hundreds of survivors have come forward.

at one point, survivors were coming forward to me on a daily basis saying, I was exploited on Pornhub. My videos are still on Pornhub. I was a child. I can't get the videos down. Please help me. I'm unconscious in this video. You know, all of these things. And we're able to connect them with lawyers. And then, you know, since then, thousands of media articles have been written

exposing the criminality of Pornhub. And one of the most important things that we did throughout this campaign to hold them accountable was go after the credit card companies. So we knew that the Achilles heel of Pornhub was the credit card companies because, you know, without credit card companies, you don't have a very profitable online business. And actually the former owner of Pornhub reached out to me

in the midst of this viral campaign. And he said, listen, if you really want to get Pornhub, you have to go after the credit card companies. And that's what we did. And eventually, enough pressure through litigation, through lawsuits, through public pressure, through the articles that were being written about this, especially The Children of Pornhub by the New York Times, the credit card companies finally cut off Pornhub and they were forced to delete 91% of the entire website.

Why did credit card companies cutting them off result in them deleting content? Did that mean that credit card companies would reactivate payments? So they were hoping that would be the case. So they understood that their site was completely infested with crime. They didn't know what was consensual and what was not consensual. They didn't know who was 16 and who was 18.

They were just guessing. So, you know, one of the things that we understood was that their moderation system was a joke. So we had moderators who came forward and they exposed the inner workings of how they were, you know, trying to vet the content. And basically, what was the process? So it was 10 people in Cyprus. So imagine this. So 10 people who were in charge of millions of videos. Now, it wasn't just Pornhub. Remember, I described MindGeek, right?

They were in charge of vetting all of the porn tube sites. So again, YouPorn, RedTube, Tube8, XTube, all of them, 10 people per shift. And they were just clicking through. They were expected, like they were actually reprimanded if they didn't click through at least 700 videos per eight hour shift. But some of the more experienced moderators were clicking through, you know, 2000 videos per eight hour shift with the sound off.
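
To put those quotas in perspective, here is a small worked calculation based only on the shift numbers quoted above:

```python
# Review time available per video at the quoted moderation quotas.
shift_seconds = 8 * 60 * 60  # one eight-hour shift = 28,800 seconds

for videos_per_shift in (700, 2000):
    seconds_per_video = shift_seconds / videos_per_shift
    print(f"{videos_per_shift} videos per shift -> ~{seconds_per_video:.0f} seconds per video")

# 700 videos per shift  -> ~41 seconds per video
# 2000 videos per shift -> ~14 seconds per video
```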

And, you know, they were told, like, essentially the moderator said, our job was not to make sure that illegal content wasn't getting on the site. Our job was to just make sure that as much content could go through as possible. So think about that. And so that's how the site was: they actually had no idea how many of these videos were rough sex and how many of them were rape. So the only thing they could do... Am I right in saying that the two main

issues here, one is consent and the other is age? Are those sort of the two big buckets? Yeah, it's because age, again, like a pediatrician can't even guess on a consistent basis who's 16 and who's 18, right? I mean, they had very young children on the site. One of the stories that

is probably one of the worst that I've heard: it's of a 12-year-old boy from Alabama who was drugged and overpowered and raped by a man named Rocky Shea Franklin. And Franklin filmed 23 videos of the assaults, and he uploaded those videos

to Pornhub, and the police went after, you know, the site to take those videos down when they found out, and they were ignored multiple times for seven months. Those videos stayed up, even though police were demanding that they come down, getting hundreds of thousands of views, monetized views, mind you, making money for the owners of Pornhub. But so there's 12 year olds on the site, three to six year olds on the site. But most of the victims that

came forward were underage teens. So they were young teens and teens who are under 18. And again, that's because, you know, like they were just obviously not vetting the videos at all. But even if they were looking at a video, there's no way they could tell who was 15 and who was 18. So yeah, so you're right in saying it's underage issues and then it's consent issues, right? Because

Also, how in the world could they tell if a video was non-consensually uploaded but consensually recorded? There's literally no way they could tell. So the only choice they had at that point when the credit card companies cut them off was in an attempt to woo back the credit card companies, they said, we have to delete all of the unverified content off our site. And so today they have actually taken down 91% of Pornhub. So they took down

over 50 million images and videos in what the Financial Times has called probably the biggest takedown of content in internet history. And they still have to delete more. So they're going to delete more next month. So by June 30, they're going to have to take even more content off the site. And that's because the content that they left on the site was from verified uploaders. Okay, so

Listen, the verified uploader doesn't take care of the problem. So they had some videos on the site that they had actually verified who the uploader was. And Rocky Shea Franklin, who I just told you about, he was a verified uploader. But that didn't mean he wasn't uploading victims in his videos. So they're going to have to take down a lot of the remaining content in the next month. So we'll see how much that ends up. How much do you think will be left?

I mean, it's hard to say. You know, as of September of 2024, they've been forced to start verifying

the age and consent of people who are in the videos, so the individuals in the videos, for the new content being uploaded. And that's because they've been sued. So they've been sued now by nearly 300 victims in 27 lawsuits. And that includes class actions on behalf of tens of thousands of child victims. These are certified class action lawsuits. They have one in Alabama, one in California.

And they're just getting, I mean, they could have potentially billions of dollars in damages for what's happened to these victims. And, you know, as to the damages, sometimes people think of it and they kind of minimize it as, oh, this is just online. You kind of think of it as pixels on a screen, and the actual victim is not humanized in the way that they really should be. But I think one of the things that we have to think about is the trauma

that they face when these videos are uploaded online, because it's one thing to be raped or abused as a child, but then when that's recorded and then it's distributed to the world,

And it's distributed with a download button. So they had a download button on every single video on that site. So anybody could then download onto their device the worst moment of that victim's life and then re-upload it again and again and again forever. They just have to then engage in this sadistic game of whack-a-mole where they're constantly in fear of

Who's going to upload their video to the Internet now? And they call it the immortalization of their trauma. They say, you know, one victim said, my abuser put me in a mental prison, but Pornhub gave me a life sentence. And so the severity of this, when you think about the lawsuits and this going to trial and the facts being put before a jury, I mean, this could be massive, massive damages.

In other news, Shopify powers 10% of all e-commerce companies in the US. They are the driving force behind Gymshark and Skims and Allo and Neutonic. And that is why I've partnered with them, because when it comes to converting browsers into buyers, they're best in class. Their checkout is 36% better on average compared with other leading commerce platforms. And with ShopPay, you can boost your conversions by up to 50%. They've got award-winning support that's there to help you every step of the way. Look, you're not going into business to learn how to code,

or build a website, or do backend inventory management. Shopify takes all of that off your hands and allows you to focus on the job that you came here to do, which is designing and selling an awesome product. You can upgrade your business and get the same checkout that we use at Neutonic with Shopify by going to the link in the description below and signing up for a $1 per month trial period, or by heading to shopify.com slash modern wisdom, all lowercase. That's shopify.com slash modern wisdom to upgrade your selling today.

I mean, horrifying, but there's sort of two big buckets again of crimes that are happening. One being the actual incident, presuming that somebody isn't of age, didn't consent during the act, didn't consent to the recording, didn't consent to the distribution. And then you've got the actual distribution on Pornhub's side. I get the sense that a lot of the

ire and sort of hatred and vitriol and stuff that's directed at Pornhub is because Pornhub are a conduit for whoever did the crime, and a lot of the times we can't tell

who this person is, how do we find them, where are they? Investigation and so on and so forth. Very difficult to do. I know sometimes people wear masks or purposely blur faces or, you know, do things that mean that you can't see who the potential perpetrator is. So, yeah, Pornhub definitely going to feel an awful lot of wrath from everyone. Yeah. And I mean, to your point, there are multiple levels of perpetration

in this issue and what's happening. And for sure, the person who actually did that abuse, who filmed it, they have to be held accountable 100%. I mean, we want to see accountability across the board. When it comes to Pornhub, I mean, the facts that have been uncovered in legal discovery, I mean, Nick Kristof of the New York Times, I mean, he wrote a scathing expose in 2020 called The Children of Pornhub that featured the story of one particular victim. Her name was Serena. I'll just...

share her story because it's an important story. She was a young teen, so she was 13 years old, and she was from Bakersfield, California. An innocent teen. I mean, she'd never even kissed a boy before. She was a straight-A student. She...

had a crush on a boy older than her and he coerced her and convinced her to send him some nude images and videos of herself, which she did. And she shared those with him and then he shared them with classmates and they got uploaded to Pornhub where they got millions and millions of views. And she would beg for those videos to come down and she would be ignored.

Because they only had one person. So we uncovered through the legal discovery process that out of the 1,800 employees they had working for Pornhub and MindGeek, they employed one person to be reviewing videos flagged by users as containing rape,

child abuse or other terms of service violations. So they had one person, they had a backlog of 706,000 flagged videos. So they also had a policy where they wouldn't even put a video in line for review unless it had over 15 flags. So a victim could actually flag their video 15 times and it would never even have been put in line for review. So

Serena would beg for them to come down. If she would get a hold of anybody, they would hassle her and say, prove that you're a victim. Prove that you were underage in this video. And if she eventually got it down again, it would just get uploaded again. So this sent her on a spiral of despair. She ended up dropping out of school because she was being bullied. She got addicted to drugs to try to numb the pain.

She ended up trying to kill herself multiple times. This is very common among victims of image-based sexual abuse. So the suicidal ideation rate for these victims is about 50%.

And then she ended up homeless, living out of a car. 50%? Yes. 50% have suicidal ideation. So they think about it. So they think that ending their life might be better than enduring the pain of constantly having their trauma on the internet. And so that was, you know, the trajectory of Serena. But if you think about the intentionality, so going back to like,

Who's responsible in this situation, right? We have the individual perpetrator, but then the executives making the decisions, the intentional policy decisions. And we know they're intentional because we've uncovered email exchanges and messages and, you know, all of the communications and policies that they had put in place

to enable this abuse to happen. So, I mean, even all the way from having a VPN, where they offer a VPN to people. So they weren't just not checking ID and consent. They were allowing you to anonymously upload, but then you could also access the site with a VPN. So law enforcement need an IP address in order to locate a perpetrator. That's how you actually locate a device. So if you use a VPN, well, then you're masking your location.

But not only that, like they were not reporting child sexual abuse that they were aware of to authorities for 13 years until we finally held their feet to the fire and exposed them. So it's actually mandatory in Canada where they have headquarters to report. When you know about child sexual abuse, you have to report it to authorities.

And they were not reporting. They were not reporting for 13 years, even though they were aware of children who were being abused on the site. And so then you think about that. It's like how many perpetrators could have been apprehended and how many children could have been saved from years of abuse if they were actually reporting the videos to authorities like they should have been, but they were hiding it from the public.

How damning are the internal documents? How did you get a hold? How does anyone get a hold of the emails of a company? What's the story of getting behind the scenes? Yeah. So one of the amazing tools through civil litigation is being able to get behind your opponent's communications, exchanges, emails, text messages,

all kinds of internal policy documents. So as a civil litigation progresses, they have this period of what they call discovery. And basically they can compel the company, and they have compelled the company, to release

documents. And it's hard for lawyers. They do not give this stuff up easily. They put up a fight. But these are amazing attorneys that are representing these victims. And they've been able to get this information, messages. Now, an amazing thing happened a few weeks ago. And this was the basis of Nick Kristof's recent article. So I told you about the Children of Pornhub. But actually, he just released a follow-up

And it was because the court in Alabama for the child trafficking class action lawsuit accidentally, so the court accidentally released thousands of pages of internal documents and communications and messages and emails that were supposed to be sealed. So they had actually accidentally unsealed all of this information. So now we have...

I mean, it's an amazing amount of information. Depositions where they actually deposed the managers, the employees, the executives, the owners, under oath. It's a crime if they actually tell a lie in these depositions. And, you know, one deposition is like a 500 page deposition.

And all of this put together, the question becomes how in the world are Pornhub's executives not in prison? And I honestly think after this release of this evidence, they will go to prison. I feel confident that we will see this company properly criminally prosecuted. Why aren't they in prison? Sounds to me like a relatively open and shut case. There's already been investigations. This weird...

lily padding thing where something goes wrong and then it's we're rebranded over here and then we rebrand a little bit more and there's a tiny exec change but most of the people that are behind the scenes all stay the same and it kind of doesn't really matter who's been switched in and switched out

Is it just taking a long time? I guess it's only been five years to do this. It's a big investigation. Is it just the kind of slow lumbering behemoth that is legislation happening? What's going on? I mean, there is this saying that the wheels of justice turn slowly. Right. And I think that's true. I think the wheels of justice turn slowly, but they turn.

And I think especially if we keep the pressure on, they turn. I think that when we focus on something, when we give it attention and when there's a public outcry about something, then like the squeaky, what's it? The squeaky wheel gets the oil or whatever those, the saying is. But yeah, if you, if we can continue to put pressure on those in power to do their job, uh, then I think that we will see it happen. And, uh,

And so I think that's a matter of time. There was a company called Backpage and the fight to hold them accountable for child trafficking on that website. I think that was a 10 year fight.

So we're at five years now from really starting to shine a light on this. And I really believe that if we can keep it up, that public pressure coupled with civil litigation to continue to hit these companies where it hurts in their bank accounts, that we will see the outcome of justice really being served. And why is that important? I think, you know, why is it important to hold Pornhub and its parent company accountable? You know, people might say, well, this is just one of so many different sites, right?

This is just one website. That's true. But there is something that is real and it's called deterrence. And one of the most important things that we can do to prevent abuse is to deter future abusers because at the end of the day, this is a risk benefit calculation for what I call corporate traffickers. And this is about money for them. And they're just saying, you know, is what's going to happen to me,

the cost of doing business? Or is it worse than that? Like, will I face real and serious consequences? And when they understand that they will face real and serious consequences, they'll make different decisions. They don't have to distribute illegal content on that site. They can, although it's expensive and it's not easy, they can make the decisions to prevent that, to put in those safety policies, and they have to be forced to do it.

And so what we're seeing right now is the power of deterrence. Like right now, Pornhub's biggest competitors are proactively seeing what's happening on Pornhub and they're actually taking down illegal content from their sites. They're changing the way that the upload process works. Fear of being hit with the same kind of litigation. Right. Okay. Exactly. I think, yeah, the obvious question is Pornhub and even

Aylo, ex-MindGeek, aren't the only adult websites in the world. So you shut this thing down and it goes elsewhere. And I know that you're pushing for Pornhub itself to be shut down entirely, as opposed to just meeting the standards of moderation that you would be happy with. Is this...

Because taking down Pornhub would be a very loud shot across the bow for everybody else. And then presumably moving forward, you want what kind of moderation? Why shut down, not just moderation, exclusively on Pornhub? And then what does a healthy porn moderation process look like? Yeah, those are great questions.

From the beginning, the call to action has been to shut down Pornhub, and I absolutely mean that. I didn't say it lightly when we started, and that is because the level of harm that has been done by this company to so many victims since 2007, with impunity, with intentionality, on purpose, for profit, is absolutely unacceptable. And the only just outcome is for the site to be shut down, for

reparations to be paid to all victims, significant reparations, and for there to be criminal prosecution. And that's what justice served looks like. And justice is important because that's how victims can heal when they see that what happened to them was recognized and it was paid for. And so that's important. It's also important, like I said, to be a deterrent to future abusers.

So they understand that there will be consequences if they act in the same way. And in that way, we're going to help other websites not act in the same way that Pornhub has. But again, it's not enough to hold one company accountable. And don't forget, holding Pornhub accountable is also holding probably most of the world's most popular tube sites accountable because they're all owned by the same company.

But going forward, we need policy to make sure that this doesn't happen in the future. And that's why I am a strong advocate for age and consent verification policies, because the crux of the problem here was unfettered, unmonitored uploading on user-generated sites, right? And so the solution is pretty simple. It's

Verifying the age, ID, and consent, documented consent, of every person in every video on every website that per terms of service

allows user-generated porn. And this can be done at scale. So we have the technology to be able to do this at scale. It's very doable. How do you do it? What does the technology do? Yeah. So there's numerous companies that do this. One of them that Pornhub is currently being forced to use is called Yoti. And what they do is they do a biometric scan coupled with verification of government-issued ID

in order to verify the person in the video. It includes a liveness scan. And so there's a liveness scan. A liveness scan? Yeah, so you move when you're doing the scan of your face, so you can make sure that you're not just putting somebody's picture up there. So, I mean, there's different ways that this can be done.
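
As a rough illustration of the kind of third-party age-and-consent gate being described, here is a minimal sketch; the types and function names are hypothetical and invented for illustration, and do not represent Yoti's or any vendor's actual API.

```python
# Hypothetical sketch of a third-party age-and-consent check on upload.
# The interface and field names are invented for illustration only;
# they do not represent Yoti's or any real vendor's API.
from dataclasses import dataclass

@dataclass
class PersonCheck:
    government_id_verified: bool   # ID document verified by the third party
    liveness_passed: bool          # biometric liveness scan, not a static photo
    is_adult: bool                 # age derived from the verified ID
    consent_documented: bool       # documented consent to appear and to publish

def may_publish(people: list[PersonCheck]) -> bool:
    """Allow publication only if every person in the video passes every check."""
    return all(
        p.government_id_verified and p.liveness_passed and p.is_adult and p.consent_documented
        for p in people
    )

# Example: one fully verified adult plus one unverified person -> upload is rejected.
print(may_publish([PersonCheck(True, True, True, True),
                   PersonCheck(False, False, False, False)]))  # False
```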

But the technology is there and they can do this quickly, efficiently. Now, it costs money, right? So the one who's going to pay for this is the porn companies who have to implement these third party checks. And I think third party is so essential because I would never, ever want anybody to give their ID to Pornhub. I mean, they're actually facing a class action lawsuit for the exploitation of user data.

So what have they done? What's the story behind the user data? Yeah. So apparently what was happening was that they were obtaining and selling the user data of millions and millions of people who were visiting their sites to third parties, without consent. And so they're facing a class action lawsuit for that.

In other news, this episode is brought to you by Function. Did you know that your annual physical only screens for around 20 biomarkers, which leaves a ton of gaps when it comes to understanding your health, which is why I partnered with Function. They run lab tests twice a year that monitor over 100 biomarkers. They even screen for 50 types of cancer at stage one. And then they've got a team of expert physicians that take the data, put it into a simple dashboard and give you actionable recommendations to improve your health and

lifespan. They track everything from your heart health to your hormone levels and your thyroid function. Getting your blood work drawn and analyzed like this would usually cost thousands, but with Function, it is only $499. And for the first thousand Modern Wisdom listeners, you get $100 off, making it only $399. So right now you can get the exact same blood panels that I get and save $100 by going to the link in the description below or heading to functionhealth.com slash modernwisdom. That's functionhealth.com

slash modern wisdom. They are fully fucked, aren't they? Like, they are so fucked, dude. Holy shit. Like, how many different ways? I don't know. Maybe, maybe it's the case that we will look back on and think that they were kind of the first through the door, Wild West frontier style

porn company that just made all of the errors, right? It was, look, this was before we had the, uh, Laila Mickelwait Act of fucking 2028 or whatever. You know what I mean? Um, it was before we had the correct barriers in place, technology, uh,

had enabled this kind of user-generated porn uploading, and it had done it at such a pace, and no one had any idea what was going on, and everyone was making money, and lots of people were enjoying free access to porn on the internet from mobile devices and their laptops. And then, and then we realized just how sort of rotten the core of this was. Uh, and maybe we'll look back and go, wow,

Pornhub and Aylo are a shining example of all of the different ways that you can get this stuff wrong online. But it is kind of impressive that...

It actually genuinely is impressive to have one company that has accumulated... They're like the neutron star of making errors with this stuff. Like, how many... They're the LeBron James of getting... You know, they're like the GOAT of fucking up. But, I mean, it's funny that you say that, though. Because if they had just been left alone, I mean, they were...

So popular. I mean, people were wearing their apparel proudly in public. Culturally. I mean, look, that's the power of brand. That just shows, obviously, you need a product that backs it up. But if you are, they're the Apple of porn, right? They're the first mover advantage. You think mobile phone, you think Apple; you think porn, you think Pornhub. Exactly. That's absolutely true. And the thing of it is this, is that I think a lot of people today still have no idea

that this ever happened, that they're facing all of these consequences for the horrific actions that they have deliberately done. And, you know, we're talking about like the, you know, the wild west of the internet and this and that. And, you know, the thing that really I think is important for people to understand about Pornhub based on all of the evidence that we've uncovered, like I said,

is the knowing intentionality. It's that, you know, that these were decisions that were made where it's not like they were completely oblivious to the children that were being exploited on the site or the rape victims. I mean, you know, there's pages and pages in these recently released documents, accidentally released, where there's just years and years where they had people filling out their contact form and saying, you know,

please take these videos down. I was unconscious in this video. I was raped in this video. I was a child in this video. Or like, this is my friend. She's 15 in that video. She doesn't know that this was uploaded. You know, take it down. And these were, this was for years that they knew about that. So, you know, if you knowingly distribute

whatever, underage sexual material. Look at me trying to sound like I know in terms of legislation and stuff. I've heard this sentence before, right? Not said at me. I've seen other people have this sentence said at them before. If you knowingly share underage something, you get in trouble, right? Like you're really, really fucked. It's actually a crime, yeah. Is there...

Is there a particular different type of carve out or was there a particular different type of carve out in the same way as whatever that article was that said we are not a curation site, we are a pipeline utility? Right, you're talking about Section 230. Yeah. So yes, yes. Section 230, thank you. Yes, Section 230 of the CDA. So the Communications Decency Act that essentially created a loophole for sites that

allow users, right, to upload things. And so they say that they're not responsible for things that other people uploaded. Basically, they consider themselves to be like a neutral party. They're, you know, they're just the telephone, they're the wires of the telephone and they're not responsible for what people are saying. That's not the case with Pornhub. So Pornhub has actually tried to argue

Section 230, they tried to get all of their lawsuits dismissed. I should have been on the legal team. I could have been a part of the legal team. I don't want to be a part of the legal team, but you know what I mean? They tried it. Yeah. No, they tried it and they've lost. And the reason why they've lost every single, except for one lawsuit that unfortunately was brought by a victim herself. She didn't have an attorney, but all of the others. And again, you know, there's been dozens of these lawsuits. And in every case, the judge has said,

Absolutely not. You do not get dismissed based on Section 230. And that is because they were actually creating. So they were part of creating the content and curating the content and promoting it and duplicating it. So they were taking the content from Pornhub and they were actually uploading it to their sister sites. They were also creating thumbnails of the content, both legal and illegal.

They were recommending and they were helping people reach illegal content by suggesting things like, you know, minor and childhood and whatnot in titles and tags. In fact, in those uncovered legal discovery documents, we have communications where employees recommended taking certain words and keywords off the site, and they actually refused: minor, childhood, wasted.

Things like that, where they were acting intentionally, and they were tracking to the dollar how much they were making on these categories that included illegal content, like teen, which was one of the most popular categories on Pornhub and the most profitable of all the categories. And so they didn't want to take down any of that. So because of that, they have lost Section 230 protection.

Right. What were some of the most surprising

allyships that you made when going through this? I have to assume, I mean, like Bill Ackman got involved, but I have to assume that there were other, even more left field people than Bill Ackman. Yeah. I mean, probably the most surprising to me was when the former owner of Pornhub came forward to help, you know, in the summer of 2020. Question on that. Got to interject on that. Yeah. How much is that someone who's

Begging like, oh, please. Like, the lady doth protest too much. I must be the white knight that can come in and save you. It's the dude. Yeah. So, yeah.

Yes. Covering your own tracks.

And the former owner, Fabian, told me who it was. So I just, I asked him, who's the secret

shareholder of Pornhub now. And he told me his name was Bernd Bergmair. And he was a businessman that grew up in Austria and then lived part-time in Hong Kong and part-time in London. But he was the majority shareholder that was hiding his identity forever. And so he was able to be found and exposed. And today he's being sued personally by dozens of victims. But then again, also he said, listen, go after the credit card companies because they're the Achilles heel of Pornhub. And that's exactly what we did.

Other than that, you know, obviously Bill Ackman, he was a surprising ally. He had read the Children of Pornhub New York Times article, and he was incensed because he has daughters of his own. And so he wanted to get involved and help. And so not only did he start tweeting about this, but he also reached out to the CEO of MasterCard because he knew him from the tennis circuit. And so he actually stepped in to help

convince the credit card companies to cut ties with Pornhub. And actually they did at the end of 2020. But what we found out was that two weeks later,

They quietly snuck back. So they actually snuck back to the advertising arm of Pornhub. And it was another two-year fight to finally get them to cut all ties with Pornhub. And Bill Ackman played an important role that whole time. He helped put the pressure on. There was a lawsuit against Visa. So this was key to this whole story.

So Serena, who I told you about, the victim, she not only sued Pornhub, she also sued Visa. And Visa actually lost their motion to dismiss their case. And it was the pressure of Visa losing their motion to dismiss their case, coupled with Bill Ackman helping myself and Serena's lawyer get on CNBC's Squawk Box with Andrew Sorkin, which was this, you know, in America, it's a very popular,

you know, morning financial show. And for 17 minutes, we were able to call out the CEO of Visa, Al Kelly, saying, what in the world are you doing? Why are you monetizing child sexual abuse? And it was just enough pressure that finally he actually made a personal statement. And he said, I'm a father. You know, we're going to withdraw our services from Pornhub.

And so that's what happened. But other than him and Fabian, I think probably the porn stars and the porn performers were just...

very helpful allies that maybe people wouldn't normally think would be part of this, but they were essential in the fight against Pornhub. I mean, there were allies that were porn performers and porn producers, and they came to me and we would talk for hours. And one of the things that they would share is their own struggle with having to spend hours on Pornhub every day, trying to scour

Pornhub and the tube sites, so the other sister sites, trying to take down their own stolen content. And when they were going through Pornhub, they actually were finding so much illegal content and they were sending it to me. And so they documented it. Yeah, I suppose they are. They have an incentive to get content that they didn't consent to being uploaded taken down, from a monetization, copyright

standpoint, but you're observing professionals, adults, right, that maybe even have a management company behind them and have preferential access to Pornhub's moderation team and stuff like that. They are struggling to keep on top of this game of whack-a-mole. So if you are

that 15-year-old girl with a 19-year-old ex, trying to chase down your thing, trying to keep it from your parents. You don't want mom to find out. You don't want dad to find out. You're already ashamed. You're doing this on your own. You're spending all of this time. You're in your room. You're worrying. You're, you know, yeah, it's insane. It is absolutely insane. And you're absolutely right there. I mean, imagine the professionals who do this for a living. Yeah.

struggling and frustrated at the fact that every day they're having to scour these sites. And so, you know, the adult industry actually hated Pornhub and the PornTube sites for what they did by allowing just unregulated free porn to just flood the internet, which included so much of their copyrighted material. So

They actually want age and consent verification when you're talking about kind of the professional porn industry. And they've abided by that for many, many years. Like, it has been the standard in studio-produced porn. You call it brick and mortar.

Porn Valley, right? Good old brick and mortar porn. The brick and mortar porn, like Porn Valley porn, where they've actually had to abide by this law called USC 2257. And basically in the United States, and they abide by this internationally because they don't want to be in trouble for distributing it in the US, the DOJ can actually inspect a porn company's records. And they are mandated to make sure that they have the ID

of everybody who's in a scene to make sure that they're of age. And this was enacted in 1988. And the DOJ can inspect those records. And if you don't have those records, it's actually a criminal offense and you can be criminally charged for not having that. And they've accepted that and actually have, for the most part, abided by that. It's just with the advent of the internet and the free user-generated technology

porn site model that the law has not been abided by. Now, thinking about criminality, one of the things that's illegal is to distribute

content under USC 2257. So, you know, I mentioned the download button. So Pornhub had a download button on every single video. They were distributing that content from their servers onto the devices of people around the world, including illegal content, and they were not checking. So they could also be held criminally liable for millions of violations of USC 2257. That's something to note as well. Have you ever got to sit down with the people behind Pornhub?

You've ever been in a room face to face with them? I have not. No, no. And when this... How do you imagine that would go? Well, I imagine it would never happen because one of the things that they did when I started this campaign...

to hold Pornhub accountable, one of the things they did was engage in attacks, smears. Like they just tried to discredit the work that we were doing. They tried to discredit the Trafficking Hub movement, whatever ways that they could, they were doing that. I mean, they've done what we call dirty tricks. Like they've engaged in dirty tricks to try to silence instead of address it. They wanted to silence it,

because they knew it would be expensive if they actually made the changes that were necessary to stop the illegal content from being uploaded. So no, they didn't want to engage. They wanted to silence. And they've done some pretty horrible things, not only to me, but victims have faced some real hardships as well for speaking out. This episode is brought to you by Element.

Summer hits different when you're actually hydrated. That tired, foggy, muscle-crampy feeling most people chalk up to heat or exhaustion? It might be electrolytes, and plain water alone won't fix it, which is why Element is such a game changer. It's a zero-sugar electrolyte drink mix with sodium, potassium and magnesium in the exact ratio your body needs. No sugar, no food dyes, no artificial junk. And this summer they've dropped something new, Lemonade Salt. It's like a

lemon popsicle that grew up, got its life together and started supporting your adrenal system and your hydration.

And best of all, there's a no questions asked refund policy. So you can buy it, try it for as long as you want. And if you don't like it, they'll give you your money back and you don't even need to return the box. That's how confident they are that you'll love it. Plus they offer free shipping in the US. Right now, you can get a free sample pack of all eight flavors with your first box by going to the link in the description below or heading to drinklmnt.com slash modern wisdom. That's drinklmnt.com slash modern wisdom. What has, has there ever been...

direct Pornhub response to your work? Have you, has there ever been, have they interacted with that stuff directly? I mean, one of the things that they, are you talking about like their responses in the media or? Everything. I mean, have they reached, have they tailed you with private investigators? Have they tried to countersue for you accessing stuff? Well,

Well, they've never tried to countersue because here's the thing. If they were subject to legal discovery, I mean, they know that they're going to be in just hot water. And, you know, if they were to engage in a defamation lawsuit, right, from the very beginning, you know, they could have done that. But the problem is when you're telling the truth,

I mean, that's the ultimate defense. And so, I mean, and absolutely 100% everything that I've been saying, and it's not just me. I mean, really, this has been a movement of so many people, hundreds of organizations, hundreds of survivors, attorneys and lawmakers and law enforcement and lawyers and businessmen and so many people coming together that, no, they have not done that. But

Yes, I have faced, you know, and some of this cannot be tied directly to the company, some of it can and some of it has been, but yeah, there's been a lot of backlash, you know.

Doxing, hacking, online smear campaigns, media smear campaigns, letters being sent to my house with my children's names, middle names, saying, we're watching you, you're going to get somebody killed. Even things like getting reported for

child sexual abuse material distribution myself. So they, you know, people who we know are directly tied to Pornhub put in fake police reports about me to actually get me investigated, but it didn't go well for them because...

Obviously, when they looked, they didn't find anything. But besides that, they heard a lot about Pornhub. And so the police that were investigating me ended up becoming allies and saying, hey, we're on the same page. We're on the same team. And how can we help you? So that didn't go well for them. I have to imagine that the valuation arc of Pornhub looks like the saddest could-have-sold-at-the-top investing opportunity of all time.

Well, we know some numbers now. So it was a multibillion-dollar corporation. And just a few weeks ago, some information was released from court documents where we understand that the site was

"sold," and I put sold in quotation marks because it wasn't actually sold. It was just sold on paper, but it wasn't paid for. But the sale price was $400 million. So it has lost a significant amount of value as a company for sure. Okay. I guess the question, Pornhub is the biggest, largest mole that needs to be whacked.

It's a shot across the bow of people who are maybe going to do something similar. It hopefully will be a massive deterrent. Presumably, there needs to be some changes in tech and/or regulation to make this more scalable, sort of like scalable protection. I'm aware that,

you know, the purest approach would be this is on the tube sites. They just need to be very strict with their moderation and so on and so forth. But we need to be realistic and kind of enable that, enable moderation to be made as easy as possible from a tech side and then increase the level of deterrence from a regulation side. It seems like kind of those are two important routes to go down. So what's the...

What does the future look like with that? Yeah. Yeah. So that is such an important question. And we have to think that way because, at the end of the day, we need to make the Internet a safer place. And yes, we need to hold these porn sites accountable. Like you said, the justice and the deterrence. But how do we at scale help prevent this across the many different

user-generated porn websites and other sites that may not be porn websites, but per terms of service, they allow and they distribute user-generated porn? And that is mandatory third-party age and consent verification for every person in every video. But the scale solution, the at-scale solution, isn't just that governments implement this policy, because

These are international corporations, right? Every website is pretty much operating in every place, every country in the world. So if we have that policy in the U.S., well, we have to implement it in Canada and then we'll have to implement it everywhere, which we should do. But I think that the at-scale solution is to have the financial institutions, so Visa, MasterCard, Discover, PayPal, say:

We don't do business with user-generated porn sites that don't verify the age and consent of every individual in every video. And just like they have anti-money laundering policy, they can have anti-online exploitation policy. And we know that these websites are highly motivated by credit card company demands. I mean, that's exactly why Pornhub took down 91% of the entire website was because of the credit card companies.

So we know the power that they have. And when they enact that policy, it's instant and it's global. And I think that's going to be the most effective way to get all of these websites into compliance, to start verifying age and consent. Yeah. In Texas, where I am at the moment, there's all manner of age verification stuff being debated right now. It seems like that's even in the news at the moment; there's stuff bouncing back and forth.

What is, you know... I have to assume restricting access matters as well. We haven't even talked about that. We literally haven't talked yet about: what about exposing porn to people who are underage? That's like an entire other world too.

The other world. And that's the debate right now in Texas that's going on. And there's this movement across the United States and in other countries. We're seeing it in Europe, we're seeing it, you know, in the UK and Canada, where legislators and the population are understanding the harm that is being done to children through unfettered free access to these tube sites, to these porn sites, and just to porn online. And so, yeah, there are age verification laws that have been enacted. And right now in Texas, they've enacted mandatory age verification for users. So that's, you know, people who go to that site, they're going to have to verify they're an adult to get onto the site.

And Pornhub obviously does not want this to happen. So they're shutting themselves down in states across the U.S. that are implementing age verification for users in protest of this policy.

And why? It's because it's expensive for them. They have to pay to get every user verified. But that's the cost of running a porn site and making sure that children aren't all over your site, both behind the screen and in front of the screen, because this is a form of secondhand sexual abuse, for a child to have to access and witness what's happening on these sites.

Like I said, so much of it is legal, so much of it's illegal. Some of it's illegal on the homepages of these sites. And some of it's legal, but it's pretend illegal. So there was a study published in the British Journal of Criminology in 2021. And they looked at 150,000, I think, was the number of videos on the homepages of the most popular porn tube sites: Xvideos, Xhamster,

Pornhub, these free porn sites, and they were analyzing what's showing up to just anybody who may accidentally or intentionally land on the homepage. And they were finding that one in eight of the videos was depicting sexual violence. So, you know, some of this may be pretend, like, you know, what I said: there's no way to tell what's rough sex and what's rape. There's no way to tell who's 15 and who's 18. But so much of this was also the teen content that children

are witnessing as their sex education from as young as 8, 10 years old. I mean, I get messages all the time, especially from men, who say, "I was addicted to the free porn tube sites when I was even 6 years old," eight, 10, and they've been addicted ever since. And it's shaping their sexual template, right? This is where they're saying, what's normal? What is sex supposed to look like? What is it supposed to be like? And they're seeing so much of, you know, things that I wouldn't wish on my worst enemy to have to witness, that I've seen on these sites.

I have to imagine that this is going to get even more complex as AI renderings of porn arrive, stuff that's so lifelike that you can't tell. Is it ethical to put out non-consensual AI porn? Because there hasn't been anybody's consent that has been crossed, but there is an ethical essence of, well, this potentially increases real-world harm by changing people's expectations. There's something about just the virtue of a person being represented in this way that also kind of should be protected. The ability to take photos of people and then recreate videos that aren't them, but like them. Do you own your own likeness when it comes to this sort of protection? So, I mean, it is like the real front lines of this at the moment.

Absolutely. And there was just a law that was passed in the US called the Take It Down Act, where

now it's federally a crime to upload even AI-generated non-consensual content. So this would be like deepfakes, where people could have their face superimposed onto porn, and it looks realistic and it's being distributed. So, you know, that's illegal. But also, you know, what's

important, too, that parents might not even realize when it comes to AI-generated content like this is that now there's the ability for predators, abusers,

anybody to take an image even of a child. So if you have an open social media account and you're posting pictures of your children, they could take that image of the child's face and put that, you know, into an AI-generated child sexual abuse material video and make it look like it's actual abuse of that child. And that's actually happening.

Now, in certain countries, even the depiction of child sexual abuse is illegal. So in Australia, in the UK, in Canada, even if it is somebody over the age of 18 that is being used in a video and they look like they're a child, that's illegal.

But in the US, you know, and we're not talking about AI right now, the depiction of a child by a person that is over the age of 18 is not illegal. That was made legal in a case called Ashcroft versus the Free Speech Coalition, unfortunately. But in other countries, that's illegal. But yeah, there's this whole frontier of what's going to happen with all of this AI-generated content. And I actually think

that at least on websites that distribute user-generated content, by having age and consent verification policies in place, you can actually prevent even the AI-generated content from being distributed. Because that would have to cross the same filter. Yeah. So if you have somebody's face superimposed on a deepfake, how are you going to get the ID and actually have a verification of that person, to verify their government-issued ID

and consent? It would stop that from being uploaded. That's an interesting single solution to multiple problems. Yeah. How do you think about the sort of user-monetized platforms, OnlyFans, admireme.vip, stuff like that? How do you think of that as contributing to this sort of ecosystem at the moment? There's a lot of moral panic around the normalization of regular people becoming sex workers online and, you know, all of the objectification that comes along with that.

Do you think about, have you got concerns around that? How do you think about that sort of working into this world? Yeah, well, I know that we have seen some really concerning reports from the BBC, multiple reports from the BBC and from other news outlets that have been investigating the subscription sites. And so some of those, like the OnlyFans model, have actually, you know, had children and victims who have been abused, and even that content, you know, behind the paywall on the subscription sites. And even victims that I have had come forward to me, that have been abused on Pornhub or the tube sites, some of them have also been abused on OnlyFans. I think that they've tightened up a lot of their regulation now. Again, the power of deterrence, right?

Starting in 2020, when they saw what was happening with Pornhub, I definitely know that there was a change in policies and the way that they were checking who's in those videos. But it doesn't seem like it's perfect, by any means. And I think minors are being abused on the subscription sites, for sure. I know that that's true. And so, I mean, it's just a real concern that a lot of this, again, is self-generated,

where it's not that a child is out there getting raped and having an abuser post their content. It's become very normalized for children to be sharing nude images, not realizing the harm that that could do to them, the way that the internet is forever. And there was a study done by Thorn. So Thorn is a big child protection organization in the United States. They focus on CSAM online, child sexual abuse material. And they surveyed over a thousand children.

And they found that one in seven nine-to-12-year-olds said that they had shared a nude image or video of themselves with somebody else. One in seven. One in seven nine-to-12-year-olds had shared a nude image or video of themselves with somebody else. Holy fuck. I am so glad that I had a Nokia 3410 when I was 14 years old.

Like, you know what I mean? Yeah, I think the same thing. I do. I mean, it's so hard to be a child these days. Perilous minefield of bullshit. That's a question I had: do you know who Jeffrey Katzenberg is? He's the guy that founded DreamWorks with Steven Spielberg. He did Aladdin. He did The Lion King. He is now pushing this thing called Aura, A-U-R-A. And it is, I mean, it's

To be honest, it's pretty mind-blowing what it can do. It's a security app, I guess, but it allows parents... you just install it on your child's phone and it uses sentiment analysis to work out whether kids are accessing or messaging stuff that isn't good. If they receive adult images, or send adult images, or take photos of adult images, it sort of pings the parent immediately. So it doesn't restrict the use of the phone all that much, and it's not an overbearing level of supervision, but it allows the parent to sort of be notified about what's going on.

It can even work out the mood of the kids based on the geolocation of where they've been. So it'll say: when you go to football for an hour, you type less hard, you hit the screen less aggressively, and less aggressive screen hitting has been associated with lower cortisol, which means that you're typically in a better mood. On the nights when your child doesn't use their phone for half an hour before they go to bed, they stay in bed, i.e. they don't use their phone for a bigger window, which we can correlate with better quality sleep. It is the most...

It turns a phone into kind of like a wearable, like a biometrically informed wearable device. The whole thing, I think pretty much everything's done locally, so it's not like they're sending this up to the cloud, security-wise. So I just think, you know, these kinds of... I know there's always this arms race of tech versus tech. But that was the first time that I sat down, and he had Hari, his co-founder at this company,

And they just kept on telling me more and more of this. Oh, yeah, we can work out, you know, how hard they hit the screen is an indication of their level of autonomic arousal and whether they're stressed. So there are some amazing... Yeah. I mean, the capability of technology to help solve the problems that technology creates

is amazing. Um, and I mean, there are even some apps and different programs now for children's devices that can prevent even the filming. So the camera itself could detect whether it's filming a nude image or video and actually stop it from ever being filmed in the first place, if it's a child's phone, that kind of thing. I mean, that level of prevention...

But yeah, I mean, the ways that we can implement technology to help is amazing. But the harm that children face, the danger that children and young people and, you know, even adults, right, face online is at incredible levels. And I think...

you know, the more that we can talk about and at least have young people understand the consequences of distributing content online. And that, yeah, it's nude images and videos, but it's also things that we say online that

we take so lightly sometimes, even things we say or things we distribute online, not realizing the internet can be forever. And that maybe you'll regret saying that 10 years from now, when your employer goes looking and there's a screenshot of you saying whatever. But just to have more of a sense of, I don't know,

I guess, a gravity about the way that things do get immortalized online and the consequence of that, especially when it's a nude image or video. Because sometimes there are so many people that think that, you know, it's fun right now and I'm going to do this. But they don't realize that maybe they won't want those nude images and videos online 10 years from now. But then, you know, it's forever. So.

Informed consent, I think, is a really important thing as well for people to not just consent, but understand exactly what they're consenting to when they're sharing those kinds of images online. There is a weird type of... I think in the history of ideas, it's called conceptual inertia. So you can imagine...

There is a time when it's proposed that maybe the entire solar system and universe doesn't orbit around the Earth. Perhaps the Earth orbits around the sun and this is a total heresy and we can't believe that this is the truth. And then, you know, evidence continues to come forward. And you can say somebody proposes a thing that most people aren't sure whether they agree with. And then slowly, maybe evidence or data or science catches up with this and they go, OK, this person wasn't talking bullshit. This is actually legit. This is the way it is.

But there is still this huge lag, and it even happened with that revolution, this huge lag for just most people to use the right language. And given that we're talking about the internet being around for two decades, sort of widespread porn being around for one, one and a half, something like that, and you think, okay, is it any surprise that cultural norms and...

expectations and understandings of behavior, and the way that parents communicate with their kids, that these things are taking time to catch up? And, you know, it's people like you that are applying a nitro boost, like a turbocharged thing, to say: hey, these are all of the areas, these are all of the different bits of weakness and vectors where shit can go awry, and don't fall down that fissure over there, and we need to be worried about this thing. And, um, yeah, I mean, you're a trooper. You're a real hero for putting this stuff together, I think, you know. But God knows what would have happened if it hadn't been for you, and it certainly seems like

the hashtag and the movement that you put behind this have definitely expedited this process. So, yeah.

Well, thank you. I always, you know, want to pass that on, because I know that, shout out to the survivors who have spoken up, without their voices and their bravery to speak up and to share their stories, their powerful, powerful stories, none of this would have been possible. And so many of them have done that at risk. It's hard to talk about your own exploitation, but they have done it because they don't want this to happen to others. And so I just, yeah, thank you for that. And I would love to pass it on.

They've got a powerful ally in you. And my intention is to spend the rest of my life not being the subject of an investigation that you do. I do not want to be on the other side. I'm sure you will not. I guarantee that. Yeah, yeah, yeah. Look, tell people where they can check out your stuff online, support you, do all of the things. Of course, yeah. So,

many people are still signing the Trafficking Hub petition, and it is still a powerful awareness tool. And it is a way that so many people are getting this message. So you can go to traffickinghubpetition.com and sign that and join others. You can also...

So I wrote a book about this story that was released last summer, called Takedown: Inside the Fight to Shut Down Pornhub. And you can buy that book, and all proceeds, 100% of author proceeds from the sale of the book, go to the cause, go to the Justice Defense Fund, an organization that I founded.

And in the book, you will go through this journey with me. It's written in a way that a lot of people are calling a true crime thriller, where, you know, it's first person, present tense. And from that moment on February 1st at night, when I tested the upload system, you go on the journey with me all the way through.

You will understand this issue not only in your head, but you'll understand it in your heart. You will experience it with me. And so hopefully you'll be inspired by that. So you can do that at takedownbook.com. And you can join what we created called Team Takedown. So this is a team of dedicated activists that are saying, look, yes, we're going to take down Pornhub, but we're going to work to take down illegal content across the internet to make the internet actually a safer place for our children, for generations to come. So you can do that. And my organization is called the Justice Defense Fund, and you can go to justicedefensefund.org. You are doing God's work. I appreciate you very much. Thank you. I appreciate you too. Thank you. Thank you.

If you're wanting to read more, you probably want some good books to read that are going to be easy and enjoyable and not bore you and make you feel despondent at the fact that you can only get through half a page without bowing out. And that is why I made the Modern Wisdom Reading List, a list of 100 of the best books, the most interesting, impactful and entertaining that I've ever found. Fiction and nonfiction and real life stories. And there's a description about why I like it and there's links to go and buy it.

And it's completely free. You can get it right now by going to chriswillx.com slash books. That's chriswillx.com slash books.