Imagine if you had a co-host in your life, you know, someone who could help manage your every day and do the things that you don't have time for. Well, unfortunately, that's not something we can opt into in life, but it is something you can opt into as an Airbnb host.
If you find yourself away for a while, like I do, maybe for work, a long trip, or a big life adventure, a local co-host can help you manage everything. From guest communications to check-in to making sure your place stays in tip-top shape, they have got you covered. These are trusted locals who know your area inside and out, giving your guests a warm welcome while you focus on your own starring role, whatever that might be.
You know that I love how hosting on Airbnb helps you monetize your home, an asset that you already have. That is a holy grail. And as a longtime fan of Airbnb, I have been telling everyone I know that they should be hosting too. But some of my busiest friends have been overwhelmed by this whole idea of hosting. But thanks to the new co-host offering, they have finally signed up.
So if you've got a secondary property or an extended trip coming up and you need a little help hosting while you're away, you can hire a co-host to do the work for you. Find a co-host at airbnb.com/host. This episode is brought to you by Progressive Insurance. You chose to hit play on this podcast today. Smart choice. Make another smart choice with AutoQuote Explorer to compare rates from multiple car insurance companies all at once.
Try it at Progressive.com. Progressive Casualty Insurance Company and Affiliates. Not available in all states or situations. Prices vary based on how you buy. I'm Nicole Lapin, the only financial expert you don't need a dictionary to understand. It's time for some money rehab.
Today I'm joined by one of the bravest voices in tech journalism, Laurie Segall. And as you're about to hear, I've known her for about 100 years. But you know her too. You've seen her on CNN, 60 Minutes. And if you've been following her reporting over the last few years, you've probably found yourself both captivated and terrified. Laurie's latest work uncovers one of the darkest corners of the internet: deepfakes, specifically the dangers of AI-generated images of real people in fake sexual acts.
In our conversation, she explains the common deepfake crimes and scams and how to protect yourself. We also talk about her totally insane investigation into Mr. Deepfakes, a man she calls one of the most dangerous people online, and what happened when she actually tracked him down and confronted him. And finally, we talk about the bigger picture: what this means for women, for our democracy, and for the future of AI. Honestly, my takeaway: it's definitely still the Wild West.
I'm so happy to be here with you. I'm so happy to say welcome to Money Rehab. Thank you. We've known each other for 100,000 years. Correct. We worked together at CNN 50,000 years ago. Correct. But when I saw online that you were sextorted, first of all, I wanted to kill that person. Yeah. And then I was like, what is sextortion? And can I be sextorted?
Yeah, the answer is, unfortunately, like any of us can, which is terrifying. That's like the reality of the world we're entering. So what is sextortion? This would be like if you're a child and someone reaches out on like Instagram and pretends to be your friend. And let's say you have a teenage boy. By the way, all this is going to seem really dark, so I just...
Sorry. Let's say, like, a teenage boy: a pretty girl reaches out, starts trying to get him to send some kind of provocative image. And the next thing you know, they say, if you don't pay me
X amount of money, I'm going to send this to every single person you know. But your child never even took the image, right? They were never even tricked into it. It's a deepfake of them. And it doesn't matter that it's not real, because it looks real. These types of sextortion campaigns are so horrific, and children are ending their lives. And this isn't just children; I'd say this is happening to adults too. This has been an ongoing thing for a while. So what's happening with the rise of artificial intelligence and deepfakes is basically the democratization of these types of scams. And none of us are safe.
Right.
Right? So someone could say, "I have a nude image of you and I'm going to pass it around to everyone." And I always look at this from a victim standpoint. You can't just say, "Oh, it's just not me," because you'll see that image. It looks like you, and to an untrained eye, it's you. It could tarnish your reputation. People might not know it's not real. So these images aren't real, but the impact is real. We always want to shout this from the mountaintops: this is what's coming, and we need tech companies to do a much better job. So let's talk about it. So somebody reached out to you. And then what happened?
This is like a very common scam. And I just happen to know a lot of good-guy hackers from my days going to hacker conferences as a tech journalist, so I was able to forward this and be like, what is happening? But someone reached out and said, I have images of you, I've been able to hack into your device, and if you don't give me X amount of money, implying they had intimate images of me, I will put this out there.
And what they do is I think they put some identifiable information about you, like your home address or something like that. You thought it was, like you thought maybe somebody did hack into your phone. For a second, and this is what happens, and I'm like a longtime tech journalist, right? Like I am like, I got this. Like I don't have to freak out. And even me, I was like, oh,
"Oh my God, do they have some, what do they have?" And you start questioning, your breath gets short. I literally had a security guy come onto my computer. I like had him remote into my computer and look for any malware. I wanted to be completely sure. And he was like, "No, these scams are actually going around." So I posted about it. And the next thing you know, I get all these people messaging me privately.
being like, this is happening to me. I was so scared. And one of the things they say, and I love that we're really starting out strong on this, they'll be like, we saw you on a porn site or something and now we've remoted in. People are embarrassed to talk about it. So it's just wild.
It is a Wild West right now, and there are so many of these different types of scams going around. We are in the Wild West of scams that are only made so much more believable by artificial intelligence, right? Like, parents getting calls from what they believe are their children because their voice has been imitated using artificial intelligence, 'cause it takes 30 seconds of a voice sample to be able to mimic that voice.
This is, I think, the world we're entering at all these different levels, where our identities are up for grabs and AI can just mimic our most intimate features: our face, our bodies, our voice. And so it's a bit of a Wild West, and I think we have a long way to go with educating people on it. For sure. My husband and I even had this conversation recently where we said we needed a safe word. Right. So if somebody gets a call
saying that one of us was kidnapped or, I don't know, I can't even imagine, we probably don't even know what could happen or will happen. Like, say, strawberry. That's not our safe word, but something like that. - 100%. Hilariously, we all need human passwords.
This is literally what one of the top security guys said to me. I was like, how do people protect themselves? He's like, human passwords, safe words. I called my mom and I was like, if you get something like this from me, which it seems crazy, to have to call our parents and say, if you hear an AI-generated voice or you're not sure if it's
me, this is the word you need to say. This is our human password. In an interesting way, it's almost like our humanity is the thing that we're hoping will help us pull through in this weird time. The analog way, I guess, in this brave new world. So then what happened? You brought in the security expert. You are used to tracking down the criminals and the scammers. So this was all kind of a precursor to a larger investigation we've been working on. When I say I get obsessed with topics,
Like, this is for better and for worse. And I think three years ago, I became obsessed with this idea. Someone had mentioned to me there was this really shady site online, and it's a deepfake pornography site. So literally, it looked like—and I mean, this is dark—but it's like you were watching
sex tapes, essentially, of many women in the public eye, even though they never made them, even though they would never consent to something like this. But you were looking at hyper-realistic deepfake pornography, and if you were even kind of a public figure, there was a chance you were on this site. And I remember going to this being like,
wait, this is insane. And then I started looking into it. At the peak of it, 18 million people were going to this site on a monthly basis. So I'm like, none of these women consented to having their image and likeness used. And like, you couldn't tell if it was real or fake, although like we know it's not real, but that harm is very real. And I just remember being like,
why does this exist? Like, why are 18 million people allowed to go and see this? And these women have no control. And ironically, it was a lot of women in power. So- Wasn't Taylor Swift on it? Yeah, Taylor Swift was. There were so many people whose likeness was taken and used on the site. And this site, I became obsessed with it because I was like, okay,
It's not just about this shady site on the internet; it was a platform, right? So it's not like people just went and saw these horrific videos. They could also create them, or pay people to create them. And so it became a whole platform and an ecosystem where
the whole idea was sexually explicit deepfakes: saying, "Oh, I like that woman. I want her doing this with this person. I don't care what she says. I'm just gonna use AI to make my dreams come true." Like, your wish is AI's command. That was what this site was, and it was called Mr. Deepfakes. And I remember they also had, like, training manuals.
So it wasn't just about these public figures, these women. It was about training young men how to do this and take it into their schools or take it into their workplace. You'd look on the message boards and it'd be like, oh, I want to do this to my sister-in-law. I love tech. I love artificial intelligence. I think it's going to do incredible things. But this is ground zero for what happens when it is misused and used as a weapon against women and girls and,
eventually all of us, right? So I became very obsessed with Mr. Deepfakes and tracking him and it took us all on an investigation that was very wild and never a dull moment. Spoiler alert, we found Mr. Deepfakes. Yay! Yes. It was probably a couple years ago and I'm like, "We should just start talking about this on the internet."
and explaining why people need to care about this shady site. So we said, "Okay, I believe this is one of the most dangerous men on the internet, the person behind this, and we need to know his name before it's too late." Because why should this person who has harmed so many women be afforded anonymity? This site had been up and running for seven or eight years, and he was anonymous. So you had no idea who was doing this.
I just thought, "Let's find him. Let's just try." On my team, we have some incredible investigative journalists that came with me from my 60 Minutes days. One of my colleagues, Nicole, she could be an FBI agent if she wanted. She's wonderful. We started talking about it, and we went out. And I remember I started talking about it at a moms' conference, and all these moms got behind us.
with this idea that this might be about a shady deepfake porn site, but actually this is about the future of bullying. This is about what could happen in your schools, with young men doing this to women, thinking it's okay. This is normalizing a new type of abuse. And I think that message really resonated with a lot of people. And I remember I was getting my nails done.
And all of a sudden, I didn't even know I had another inbox on TikTok, but I was on TikTok looking at the other inbox, where messages sometimes get filtered. And this security and legal company called Sedenti reached out, and a guy named Jordi was like, we have...
a tip, we believe we have found him. And so I'm like, okay, I'm not sure if this is 100% real, but obviously we're vetting it. And we ended up getting a dossier that had, I want to say, 17, 18, 19 different data points. I brought in another security firm. We all basically tracked him down via social media, via the names we were given. And there were so many connections, because anytime you do something on the internet, you're just not hidden. This is what I've learned
through all my years in investigative reporting: covering your tracks is actually very difficult, and you will make mistakes. And, you know, he made mistakes years ago. There was an 8chan post with him. 8chan? It's like 4chan, this message board where people post crazy theories and memes and cultural things, and it's a place where a lot of, you know, internet lovers, for better and for worse, go and say some of the weirdest stuff, and great stuff too, but it's a weird place. He had an 8chan post. We had him talking about a car, a red Mitsubishi. We were able to track it, and we ended up in front of his parents' home with the red Mitsubishi. All sorts of crazy investigation went into it. And we tried to reach out to him many, many times. He wouldn't answer. He took down all his social media. We reached out to friends and family. And then finally we said, let's go, let's try to find him and talk to him in person.
Hold on to your wallets. Money Rehab will be right back. And now for some more Money Rehab. I found out he worked as a pharmacist in a hospital, like, helping people. I found out that he was the man that had really helped create this site that enabled so much, I would say, digital abuse against women. He
had a wife, he had a new baby. Like, he was really living a double life. And we showed up outside the hospital. We were able to call the floor he worked on, figure out exactly when his shift was starting. We were there the next day, and we confronted him. And so it's been a pretty wild journey just to say we shouldn't live in a world where this type of thing is enabled. And it's interesting 'cause
When we confronted him, I knew we would have 30 seconds. I knew that he wasn't going to want to speak to us. And I knew he would know exactly who I was, because I'd been reaching out to him for months before. And he saw me, and he just started walking incredibly quickly towards the door. And I just remember asking for comment. Legally, I want to ask for comment, right? We have all this evidence.
I asked him, I said, "I want to understand how someone who's a father and a son can create this type of thing that perpetuates this type of abuse." And as the doors were closing, I said, "The harm is real." And did he say something? He wouldn't say a word. I've interviewed, like, some categorically sketchy folks in my career.
But I was really shaken by how he looked at me. And that was just part of our investigation. We did so many things to really fan out. And we presented our findings to lawmakers around the world, started talking about why this mattered. And when this happened to Taylor Swift, I want to say in January of 2024, I hated this thought, but I thought, maybe now people will pay attention. It happened to one of the most powerful women in the world, which is horrific. And it shouldn't take
this type of abuse happening to Taylor Swift for people and lawmakers to pay attention. But it did help, I would say, people be like,
"Oh, this is the language behind it. This is why it's bad." And I think we were able to speed up our investigation, and so it's been never a dull moment. And then I got pregnant and had a child in the process. But did that change how you viewed this? And bringing a child into this world? It's a really good question. I think when we were initially out, I was thinking about it because we just had Mother's Day, and I was thinking about having a child. And I remember thinking, like,
If we are not careful, it's not just about the victims, right? We are gonna train a whole new generation of abusers, of young men who grow up and think that, "I can nudify this girl from class in a couple clicks using artificial intelligence."
And that was very much on my mind as I was thinking about wanting to have a child. Like, God, I remember, and this is probably way too much information, but when we were in the hotel room the day before tracking him, I was literally tracking ovulation. It was so top of mind, thinking about what happens for our children. I just feel like we have to do better for them. And so it was wild. We went out there, and a couple of months later I found out I was pregnant, and this felt so personal
to me. I just don't want my child to grow up in a world where people think they can control women and girls, because it spreads out from there. And we had a team of women in the field, which is pretty incredible. And the producer I was working with, who worked on "Mostly Human," which was my show at CNN, she was six months pregnant in the field. And we had this moment.
She still wanted to come. I was like, "Are you sure you want to come?" She's like, "A hundred percent." I'm like, "Okay." We're doing like car stakeouts and she's literally six months pregnant at the time. And I remember we had confronted him at the hospital and he left through another door. Like he was able to get out. He took some kind of car out because we were right near where he had parked.
We didn't know what his home address was. I remember it feeling like a little bit of a dead end. We came all the way out here. We wanted to get some answers. We wanted to ask for some kind of comment and understanding of how you could have created this thing that became so big without any accountability. I'll never forget, we were in the car, and Roxy, the producer I was working with, she was like-- 'cause we had just figured out he was a dad, because we had gone to his parents' home and we saw a baby seat, like a car seat, in the car. And I was like, "Is Mr. Deepfakes a dad?"
And she was like, "Let's call a local toy store and see if he's registered." Like, pregnant Roxy is saying this, and I'm like, "Oh, that's actually probably not a bad idea." And we ended up calling. Mr. Deepfakes was registered, I guess for his child, and we were able to somehow get his home address from that.
It took, I think, a lot of women thinking the way only a pregnant person would think. We're trying to figure out a better future for our children. And the reason I focus so heavily on Mr. Deepfakes is because it's not just about Mr. Deepfakes. It's about the future of consent and bullying, and being able to create a better world for our children. And I think that was really personal to me because...
I was thinking about having a child during this investigation. Then I got pregnant, and then I had a child. It's been a wild journey, but it makes it, I think, really meaningful that the site is now down. As of the last couple weeks, the site was down, and I would say part of it was probably us showing up at his door, and part was other people beginning to understand who he was. It took people creating friction, Google deranking the site. So it took all this friction, but it was such a win.
Because I think so many people sometimes say, "Oh, it's a game of whack-a-mole. You take one down, there's gonna be so many others." And I just don't buy it. Do you know how many women are gonna sleep better tonight because of this? And if it's a game of whack-a-mole, we just whacked a giant one. So that makes me sleep better. Yes, me too. Do you know if he had a boy or girl?
I don't know. It's messed up in both ways. I think he might have had a boy, but I'm not positive, from the little investigating we did, which is just crazy to me. And I might sound like a total crazy person, but I always try to understand the why. I think it's too simple to be like, you're just this terrible person and you've done this thing. I think the more interesting answers are actually in trying to understand the why.
He reminded me a little bit of Ross Ulbricht from Silk Road, the guy who created one of the largest sites on the dark web, where illegal things were bought and sold. Ross very much had this libertarian ethos of, this is kind of the future of the internet, and all these things.
I can't speak for David. That's the name of one of the creators of Mr. Deepfakes, according to all of our evidence. I can't speak to the why, but I do think that it started as more of a hobby and an interest in deepfakes and also porn and all this stuff. And I don't know if there was just a lack of empathy, if maybe he didn't believe that the harm was real. I think those walls closed in on him. I think the stakes got higher.
As the site got bigger and as people started talking about it more, and as more people started saying, this is really harmful, he never shut it down until a couple weeks ago when it was shut down. So I have no idea where he is now. Did he get fired? I don't know if they fired him. There was a report that he could potentially be overseas. I don't know. Does his wife know? I have...
I had that question too. I mean, I reached out to her after all of this happened, once his name was out there and the site was down. She hasn't responded. I did at one point show up at his parents' home. I always think it's important for it not to feel like, oh, gotcha, I'm going to get the bad guy. It felt sad. He grew up in a beautiful neighborhood where kids are playing on the street. I didn't get the sense his parents knew, but I don't know. You talked to them?
I spoke to his father very briefly before he went inside. And I didn't want to, how do I say this? I didn't want to stay for too long or be harassing at all. There's always this fine line... Between the reporters that are... Between that, but I never wanted to be that. We've seen that done without any empathy. And I'm not saying I need to have empathy for this, but I think empathy is the thing we lack in so many instances. It's the whole reason I think we're seeing a problem in sexually explicit deepfakes. People don't realize
that there's real harm here. And so-- A person and a family. Yeah. I like to think that we showed up with a certain amount of empathy, being inquisitive without harassing his family. I think I walked away feeling really sad. How did it affect you? I think I get frustrated sometimes because it's like,
I thought for so long, "This is why it's so big, right? It starts here, then it goes to schools, then it goes to democracy where we can make anyone say anything, and then it goes to conspiracy." So I always like to be like, "How do I frame this to different people?" And I think sometimes it can get frustrating to be able to be like,
No, it's not about the shady site. It's actually about safety and consent. And it's about a tech threat that you don't realize. It's not what all the tech bros are talking about, which is AI becoming conscious and like Terminator. I'm like, no, no, no. This threat is already here and it's impacting your children. And I think
Sometimes that can be frustrating to me, because I'm sometimes a couple years ahead on this, and I feel like I talk about it and people are like, "Huh?" But I do think people are really understanding, and I don't blame them; it's a weird one to wrap your head around. But yeah, I think you have to compartmentalize in certain ways. I said this to my colleague this morning, because we were speaking to a woman whose son ended his life after an AI sextortion scam. Someone using AI did exactly what I explained at the beginning of this, and he ended his life. And
- How old? - I think he was like 14. He was a teenage boy. And I said to Nicole, I'm like, "Because we're going into turbo mode, we're like, 'Go, go, go.'" And I think sometimes if I sit,
I'm like, "Man, like, I have a boy, right? That's so messed up." And I can't almost, it sounds whatever to say, like, I can't sit in it for too long, but I think feeling it is probably the most important thing. And how do we, as tech, like, for my company, and, like, trying to, like, tell stories about technology, like, how do we produce humanity and how do we produce empathy
and just use tech as our way to do that. I think part of that is you have to feel it and you have to like not just be outraged, but be organized about that outrage and be able to tell that story and let other people tell their stories and see them. So I don't know. That's a roundabout answer to say I think I do OK with it. Good days, bad days. And I think it's weird when you have a child and you just look at
your child. A child is so innocent and amazing, and you're just obsessed with your kid, and you're like, I don't want you to see this world. I want you to have the best world. Hold on to your wallets. Money Rehab will be right back. And now for some more Money Rehab. It's a reality I think we're going to see in a few years, because you're always ahead on these trends. Like, when you and I were growing up, guys still looked at
-Playboy. - Totally. And then they moved into porn online. And we've seen how that's affected men. And so is the next generation gonna be involved with user-generated AI porn? I think that's the thing that's so scary, which is like,
At least with Playboy, they consented. There are all these issues that we think about when it comes to this, but now the big thing, one of the biggest questions about the future of artificial intelligence, and we're seeing this play out in Hollywood, we're seeing this play out literally with writers, we're seeing this play out everywhere, is consent. Did you consent to have your materials uploaded? Did you consent to all of these things? So when we look at this through the lens of consent, the question is:
should anyone have the power to make anyone do anything without their consent? And I feel like this is a no-brainer. The answer's no, but it's a Wild West. And oftentimes by the time we're having the conversation, it's too late to have the conversation, because in the time that Mr. Deepfakes has risen and also fallen, all these nudifying apps have popped up, right? All of these apps that allow people to do this with such low friction. You don't have to be high-tech
to do this. It's just a couple clicks. And now we're seeing the conversation around that, and thankfully the laws are catching up, but the genie is certainly out of the bottle at this point. In that time, it sounds like this horrific story of a young boy killing himself
came from another site. So you're playing this game of whack-a-mole. There are obviously other moles. Yeah. And I think it's, how do we educate parents now to say, okay, what are the conversations we need to have with our children so we can keep a really open environment? If something like this happens to them, they don't feel embarrassed. They don't feel ashamed to come to us and say, "Hey, I received this photo and I didn't take it, or I did." Who cares? Being able to even be prepared for these types of things
so we can get in front of what's gonna be inevitable. You have these groups online that are now targeting folks. And so Mr. Deepfakes was just our way into talking about a deepfake world where we can't really believe anything we see, where our likeness is weaponized against us, where our most intimate qualities are mimicked by artificial intelligence.
And that can seem scary, but the biggest thing for me, honestly, is giving people agency. There's also stuff we can do to get in front of it. The idea that Mr. Deepfakes
is down, there was so much friction created that it legally made it very difficult for this site to operate the way it was. That's agency. That's saying we're just not going to live in that world. We can actually make changes, and AI can work for us. It doesn't have to work against us. It's a tool. There are so many amazing things that AI can do. You're so into the tech world. You've covered it for
a couple of decades. You know all the major tech founders. What about Meta, TikTok? What did they say? Some of these companies have done better than others, right? With a closed model, it's also harder to have AI generate these types of images, and they have worked very hard against this, right? But some of these other open-source models make it easier for this type of thing to happen. The thing that I've been obsessed with, and I feel like this is my next thing,
came as I started digging into Mr. Deepfakes. I was like, "I want to talk to other women who have been victims of this and survivors of this." And I spoke to a woman recently. Her name is Bree, and she is hard to describe as anything other than joy in the morning. She is a local meteorologist outside of Nashville.
She is loved in her community. I feel like she's a person who walks down the street and people hug her because in news right now, the meteorologist is the least controversial figure ever, right? And they are in your basement with you when there's a tornado telling you what to do. So she's really loved.
And I got in touch with her after seeing that she was trying to get a law passed in Tennessee, after, all of a sudden, on Facebook, she would post something and then a fake Bree, a fake profile that seemed like her, would respond to her fans and say, "Hey, reach out to me on Telegram, more soon." And she had someone message her and say, "I think your husband has been sending nude photos of you out." And she was like, "No, he hasn't." And...
they were using deepfakes of her to make it look like she was nude. And they would get her fans to go on a Telegram account, and they would send this. One of these scammers said, "Meet me at a hotel in Nashville. Pay me, like, X amount of money, and here's a taste," and sent these images of her.
She was literally shocked by this. And then another one reached out to one of her fans, got them on Telegram thinking it was her, and said, "Join my VIP fan experience." There's another one that said, "I'm in a terrible relationship and I can't get out. There's abuse." Lies. All of these are lies, but they were preying on her fans and utilizing AI and sexually explicit deepfakes. And they used an AI-generated video of her to say, "No, it's really me."
And all of a sudden, we started looking into it. And we worked with a security company called Vermillio, and they did an analysis of how many fake profiles of her were out there.
There were 5,000 and counting. So she was living this whole fake life on the internet where people were profiting off of her likeness. They were sexualizing her. They were doing all this stuff. And she reached out to Meta many times with these profiles. And she told me the woman there said to her, "I don't know what Telegram is." I was like, you know what Telegram is. Oh, I would love to look in your Telegram, by the way. Oh, my God. I've been talking to one of her scammers for weeks now. It's wild. As yourself?
As a fan, to try to understand. But I think the biggest thing is, we don't even realize it, but our identity has been taken, and AI is front row center here. We could be living these fake lives on the internet and not even realize it. Selling crypto, selling sexually explicit deepfakes, all of these things. Because it wasn't just her. She started talking about this, and all these other meteorologists came out and said, this is happening to me, I realized it was happening to me. There are multiple fake Lauries out there selling crypto scams. Can we check if it's happening to me?
Yes, 100%. This is my latest obsession. I think we are living these fake lives out there. And I am sure the tech companies know. People are reporting it. I think they know about this, and I think this is the tip of the iceberg. Deepfakes are our way into talking about a whole deepfake reality where none of us are immune. And we're just beginning to see that. Thank you so much for the work you do. I am officially scared. And a lot of...
this extortion or sextortion is around money. They want money in crypto. And so we end our episodes by asking all of our guests for a tip that listeners can take straight to the bank. So how can you protect yourself?
I would go back to what we were talking about at the beginning, because it's a real, tangible thing: this idea of a human password. And also monitoring your accounts and making sure there aren't those small charges, right? If there's a small charge on your account and you're not sure where it came from, oftentimes this is what scammers will do: they'll try to see if they can get away with a little, and then they'll go and charge a lot. So that's definitely one thing. And then
really try to talk to the people you love. Tell your parents. Tell your friends:
These links that are coming up, these text messages that you are getting, these emails: you have to be 1000% sure before clicking and sending your information, because now they are personalized. These scammers are getting better and better. They make it seem high-stakes. And I hate to say this, because I don't want to end on a sad note, but question everything. If you need to, if you're getting something from the bank, call the bank.
Actually, not the number from the text message they send; look up your bank's number online and call it, or go in person, right, and triple-check, because these scams are getting really sophisticated. They feel very personal and they're coming from all directions. And I think being able to understand that is going to be really important for the future. Listen, I just got identity thefted. And so I'll just tease this: we're going after you, Mr. Identity Theft. We are coming for you. That's our next episode. I know.
We're going to find you. We're going to your place of work. We're going to track you down. Go to your parents' house. I know. Look in your car. Yes. Yes. So you're going down. Yeah. Money Rehab is a production of Money News Network. I'm your host, Nicole Lapin. Money Rehab's executive producer is Morgan Levoy. Our researcher is Emily Holmes.
Do you need some money rehab? And let's be honest, we all do. So email us your money questions at moneyrehab@moneynewsnetwork.com to potentially have your questions answered on the show, or even have a one-on-one intervention with me. And follow us on Instagram @moneynews and TikTok @moneynewsnetwork for exclusive video content. And lastly, thank you. No, seriously, thank you. Thank you for listening and for investing in yourself, which is the most important investment you can make.