What's up, everybody? Welcome back to The Honest Drink. I'm Justin. You can always reach us at thehonestdrink at gmail.com, Instagram, or WeChat. This episode is hosted by Howie, Eric, and myself. And the three of us talk about the documentary called The Social Dilemma. It's a film that shines a light on the darker, more sinister side of social media technology and how it's taken on a life of its own.
and how it methodically and purposefully manipulates your behavior, your emotions, and even your politics. The film explains it much more intelligently than I can, but I think the overall message is that we are no longer the ones using social media. It is, in fact, using us. Our behavior is the commodity, and it's been that way all along by design.
This is pretty much a topic that touches all of us and it was really fun to talk about. So without further ado, here we go. ♪
Yeah. I don't know, man. What do you mean you don't know? I saw The Social Dilemma last night. And I told you I saw Hereditary. Fucking scary as fuck. It's not scary. It's scary, but it's just more disturbing than it is scary. Hereditary. Yeah. I just saw it a while back. And The Social Dilemma, I guess. That's what I was going to say. The Social Dilemma is a scarier film for me than Hereditary. Yeah, yeah. Yeah, they're both pretty disturbing. I guess The Social Dilemma, because it's real life.
is even more disturbing. Have you seen any of these, Eric? I saw Hereditary a while back. That's the one, I don't really remember it very well, but it's like in a house. Most of it's set in a house and stuff like that. And it's kind of some weird shit. I fell asleep halfway through it. Okay. But I just remember one scene when they're driving that was pretty gory. Yeah, yeah, yeah. But that's where the whole story starts.
Right, like that's not even the climax. I only watched up till that point. Because they had like a miniature... Did they have like a miniature house or some shit? Well, yeah, she was working on a miniature house, and the whole thing is filmed in a style that makes you feel like they're all in a real-life dollhouse kind of thing. Um, yeah, the whole head-out-the-window thing, that's the starting point. That's the catalyst for how everything starts unraveling.
But yeah, I mean, the social dilemma was a lot more, I guess, disturbing. But what's interesting is that it's nothing like we didn't know. Like a lot of the points they make, we've talked about on the show before. You know, we've talked about algorithms, how we're controlled by algorithms. We talked about the effects of social media. We've talked about...
the rise of, you know, teen suicide and stuff like that. We've mentioned them on the show a bunch of times, but like when you really watch the documentary, it kind of puts a new spin on it. It gives you a little bit more information and depth into hearing from like actual industry people about it.
And it makes it feel a lot more dire, right? I got the same feeling after watching The Social Dilemma that I did when I watched- Game Changers? Yeah.
No, not Game Changers. Al Gore's... An Inconvenient Truth. An Inconvenient Truth. Back in the late 90s or early 2000s? No, it wasn't. 2006, I think. Yeah, sometime in the 2000s. But Al Gore's documentary about global warming and, you know...
environmental effects that humans have on Earth. Yeah, An Inconvenient Truth. I got the same feeling afterwards. You know, that same kind of dire, like, oh, shit. Like, okay, we got to take action. You know what I mean? Well, Eric, you haven't seen this one, right? No, I haven't. So can you tell me...
what, it's a documentary, right? All I know is that it's kind of a documentary. He's talking about like the impact of like social media. So I don't think like you're going to be spoiling it for me, right? Or... Well, I think we could talk about some of the topics that are on it. I think we could talk about it because Eric himself has talked about a lot of these topics and I think nothing in this documentary I think will surprise you necessarily. And plus you work in the tech industry. Do I? Well, you do. Can't say where, but you do. Vegetable, fruit. Yeah. So...
But yeah, I don't think, I think for the purposes of like the listeners, like we won't spoil like, it's not like a movie like Hereditary where we wouldn't want to, you know, like reveal the plot. Yeah, there's nothing to reveal because it's a documentary. It's just like you should watch it anyway. Like everyone should be watching it. And if anything, this documentary is not meant to be revealing things. It's actually meant to be spread because people need to watch it. And people need to talk about it. I think it's okay. And because I haven't seen it, I think I can take the kind of the position of,
of our listeners. So like, tell me, I guess, what's the movie about? What's the argument being made, right? Cause I just thought of Game Changers. I haven't seen it, but I thought of Game Changers just based on the start of this show, because it's, you know, trying to make an argument and it's causing a lot of conversation, although they're very different, obviously, it sounds like. But what's the argument they're trying to make?
Yeah, that's a good question. I mean, they put it so much better than I can put it in the actual documentary. But if I'm going to take a stab at it, I guess the basic argument they're trying to make is that obviously we all know with advances in technology and social media, all these apps and platforms like Google, YouTube, TikTok, Facebook, Twitter, Douyin, like, you know, all these things.
There is a very dark side to all of that. And while I feel like we all kind of already knew, like, yeah, you know, we spend too much time on our phones. We get addicted to these apps. You know, we engage less in person. Our social skills start declining over the generations. But watching this documentary, it puts so much more of an urgent kind of
need to address this issue and to have this conversation openly and publicly. And I guess the main argument is that it's no longer a technological tool that humans use, right? Because a tool is very passive, right? Like a bicycle, right?
Like they use a bicycle as an example, like you ride a bike when you need to ride a bike. If you don't, the bike just sits there, doesn't do anything, just like any other tool that can help people. And while technology and social media kind of started off as a tool to help you,
to empower you, it has completely turned into something that's no longer a tool and it has its own agenda and it has its own means to manipulate you proactively. So whether it's push notifications that get your attention, whether it's the algorithms that slowly manipulate what you see, right? Recommendations. Recommendations, tagging, it's all these things that actively manipulate
actively seek you out. Now it has, it's like, its own goals. It has its own goals, and it has its own, like, um, what's it, like, basically its own spirit, its own way of pursuing its goals. Its own... which is you. Well, how's that different than... It has its own mind now. How's that different than, like, marketing in general? Isn't that what marketing is?
Yeah, but you take this to the nth degree, right? So marketing, you can pass by a billboard, you can see a commercial on TV, fine. This is because we are so attached to our phones and on social media, this is actively manipulating you into changing the way you behave over time. And we don't notice these things, right? And a great example he used was like in this documentary, it was like, take Wikipedia, for example, right?
or in China it's Baike, right? Is that the Chinese? - Baidu Baike. - Baidu Baike, right? It's a Chinese Wikipedia. And Wikipedia is one of like the only things on the internet that's kind of universally the same for everybody. So whether the definition is accurate or not, that's not the point. The point is when you go and you search for something on Wikipedia, it shows up the same for everybody. Now imagine if you were to search for something on Wikipedia,
And based on where you are geographically, based on your search history, based off of all the other data they can pull off of you personally from online, your online activities, it gives you a personally tailored definition of whatever it is you're looking up.
And it does that for every other single person on earth. So what starts happening is everyone starts working off their own set of facts that are different and manipulated based off of all this data it's drawing from you.
And that's how a lot of these algorithms work, where now everyone is getting a different set of facts and working on a different reality. Right. And you can go down the street, or drive around in your car, or look at the media and news and be like, why do people believe this? Why are people acting this way? Isn't that so stupid? Like, don't they see all this information I'm seeing?
Well, the truth is, no, they're not seeing the same information you're seeing. They're actually getting different information that's splitting and driving people further and further apart. So it's like we always mention echo chambers, but take echo chambers on steroids and it's creating these echo chambers where it's manipulating people based off of what they actually like to see.
or their watch times, and driving them further and further into that extreme over time. And if you're on this side, it's driving you further and further into that extreme over time. And this guy gave a great example. He's like, at scale, if you take the whole floor and tilt it one way, yeah, you still have the power to kind of climb to the elevated end,
But it'll take a lot of effort for you to do so. And at scale over large populations, the majority of people are just going to slide down to one end. So it's like, it's about manipulation, really. And it's about how we have less and less control. And even more so is that most people don't even know they're being manipulated.
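As an aside, the tilted-floor argument is easy to see in a toy simulation. This is our own illustration, not anything from the film: give everyone the same random walk, add a tiny per-step bias as the "tilt", and the population average slides, even though any individual can still fight it.

```python
import random

def average_drift(steps=500, bias=0.02, people=2000, seed=0):
    """Toy model of the 'tilted floor': each person takes unit steps
    left or right. A small bias (the tilt) makes a rightward step
    slightly more likely than 50/50. Any individual can still end up
    on the left, but at scale the population slides right."""
    rng = random.Random(seed)
    total = 0
    for _ in range(people):
        pos = 0
        for _ in range(steps):
            pos += 1 if rng.random() < 0.5 + bias else -1
        total += pos
    return total / people

# With no tilt the average displacement stays near zero; with a 2%
# tilt it drifts by roughly 2 * bias * steps = 20 units.
```

The point of the sketch is that the per-step nudge is tiny, 52/48 instead of 50/50, yet over enough steps and enough people the aggregate outcome is essentially deterministic.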
And they tie it into all the division in the world and misinformation, which I see all the time. I see a lot of misinformation. That's like the big secret: you don't even know you're getting manipulated. And that's what this documentary really was trying to touch upon, using different examples. Because these are all engineers that literally created these things.
For example, the like button, you know, or the algorithm behind the like button, or recommendations, stuff like that. These are all pioneers in the tech industry, you know, in America. And so these are the guys that are stepping out in this documentary and saying, hey,
well, yes, I did this, but I did not think about the cause and effect. And it's not to paint these engineers and these scientists and these people who work at these tech companies as evil. It's not that at all. It's that they invented things that had unintended consequences they couldn't have foreseen. And even worse, that they're not in control of anymore. Like these algorithms that we were saying, like,
really, they don't really know how it works. Like, yes, they created these algorithms, but these algorithms become so complex that they actually have a mind of their own. And that even the people who created them don't really fully understand how they're working right now. One of the things that I remember that...
when I was watching the film that really stuck out to me was, I forgot what the guy's name was, but he's the guy that created the like button, right? From Facebook. Yeah. And I remember he was saying when he and his team were creating that technology, they were coming from a very positive place. They're like, a like, that's a positive thing. We wanted to spread joy. Yeah. We want to spread joy. We want people to share likes and that's a positive thing. Right. Yeah.
And they did not know that because of this, huge numbers of younger people would end up getting depressed, feeling anxiety because they're not receiving enough likes. Cutting themselves. Yeah. And suicide. Yeah, which totally played into the, I guess, the mental...
makeup of a young person needing to feel accepted, needing to feel part of something and the like being a part of, you know, like I feel liked by people. Wait a minute, I just posted something and no one's liking my thing. Does that mean that they don't like me?
And not thinking in that way that- And when you do get those likes, you want it again and again. You want it again. What's the next thing I'm going to post, right? This documentary has a lot of great quotes, right? And one of the quotes they showed in this documentary was that there are only two industries that refer to their customers as users. There are only two industries. One is illegal drugs, and the other is tech and social media.
Those are the only two industries that actually refer to their customers as users. Yeah. It's insane. Another great quote was, they said, if you are using a product for free, then you are the product. Yeah. Right? Yeah. And I was like, oh, damn, shit. If a product is being sold to you for free, that means you are the product. Well, it's almost like also going back to, I don't know if your parents ever shared this with you, but...
My mom would always say to me, if anything seems too good to be true, right? Which is a very common saying. If anything seems too good to be true, it likely is, right? So if you're... Because actually, in my opinion, The Social Dilemma, even though the main focus was about social media...
I don't know, I feel like it links into gaming as well, mobile gaming especially, as well as even shopping platforms, right? Because they all use the similar idea. They all use different techniques to tap into people's biological, natural behavior. They all prey on the same thing. Yeah, exactly. And to me, it's all one and the same.
And for me, for example, I do play some mobile games. And one of the things that I did give up, as you know, a few years ago, is like, you know what? I used to be very addicted to video games. I'm not going to buy the next PlayStation. I'm not going to buy the next Xbox. I'm going to stay away from those because I know when I get them, I play for hours, right? Yeah.
But, you know, I can install a couple of mobile games, you know, because they're quick. You know, they're little. You know what I mean? And it's simple. And then I do have those couple of games which are, they require 10 minutes of my time, 5 minutes of my time, and that's it.
But they're also free to play. And they also tap into many people's psyche of, well, I kind of want to play more. But for me to play more, I need to spend some money or whatever. Next thing you know, people are on their games, like these mobile games, for even longer periods of time and spending extra money, which relates to gambling. You know what I mean? So if Wikipedia is like...
I mean, a positive example or at least a neutral example, right? Because everyone kind of gets the same thing. Then, like, what are some other examples? Because podcasts are too, right? Because, like, what you're saying is, like, personalization is being weaponized against people, right? To kind of hack into their brains and to, like, to, you know, leverage their neurobiology and then to be able to establish, like, new behaviors. And then once I can kind of, like, establish these new behaviors at the individual level, then I can start...
making money off of you, basically. I mean, I would assume that, like... My profit. Yeah. Making money is the end result. One of the end results, right? Like, it's the main... Yeah, number one. Um, but like, what are some counterexamples? Like, should we listen to more podcasts? Like, maybe THD is actually very sinister as well. We're embedding
messages into our podcast. - Well, it's about the ubiquity of social media, right? So that's like going back to your example of gaming and things like that, I draw a line, even though yes, they are preying on the same kind of human psychological weaknesses, I guess. But I draw a line where gaming is not as ubiquitous in terms of like every single person is gonna be using this.
And number two, it's not about... Gaming has nothing really to do with the information and your worldview. Whereas social media is really shaping your worldview and the information you get about what the realities of this world really are. And that is an extremely, extremely dangerous thing. And going back to kind of your point, Eric, it's like this...
This documentary is not saying, like, oh, get rid of all your... actually, they do make an argument about deleting your social media apps. But they make a point to say in this documentary that it's not just all bad. We're not just saying there's nothing good about social media. There are a lot of great things, obviously.
And what needs to happen is not about needing other things to fill that void. It's that social media and the tech industry are pretty much unregulated right now. There are no real laws around regulating it, so these tech companies kind of run unchecked. Whereas if you're a food company, you have the FDA. If you're an airline, you got the... What's it?
What's the airline regulatory agency? FAA. FAA, right? You got regulatory bodies that make sure things are kind of above board, right? There's a standard there. Now, there's more talk about it recently in terms of implementing more of this. That's why you see Mark Zuckerberg at these congressional hearings, blah, blah, blah. But as of now, there's no real structure and no mature laws governing any of this. So it's completely a free-for-all. And again,
Many of these companies, according to their ex-employees in this documentary, are saying that there really is not much of a moral compass in many of these companies because they brought up, while they were working in these companies, they brought up these issues.
And they were kind of completely just kind of disregarded. And that's why they left to start their own thing. Like, what's his name? Tristan Harris? Tristan Harris. Tristan Harris. So, you know, so it's scary in that sense where there really is no regulation over these things. And while these are the most influential things that we have today,
They really are. I mean, they have toppled governments before. They have started, you know, genocide. They have created conflict in the world, like real tangible, physical, real world examples.
translate. It's not just a thought experiment. What are some examples that they give of, like, you know, this new sort of social weapon of mass destruction? Uh, they gave an example. Was it Myanmar? Yeah, there was, like, um, this mass killing of people in Myanmar, a certain ethnic group in Myanmar. And they were saying that Facebook is, like, huge in Myanmar. And when you buy a phone, a smartphone, in Myanmar,
the only app that's preloaded onto the phone in that region, or I'm not sure if it's specific to Myanmar, but the only app that's actually preloaded onto that phone is Facebook. So everyone uses Facebook in that region. It's like, of course, it's like how we use WeChat here, right?
And because with Facebook there's no regulation, and even in Facebook's defense, they can't even really tell fact from fiction. It's not like they know what's real and what's not and are just turning a blind eye to it. There's no way they can know either, right? So a lot of this hate propaganda was being spread around virally in these regions of the world. And it led directly to this huge genocide.
- And exodus. - An exodus of people, and it was covered by the news. - It was like 700,000 Muslims leaving Myanmar. - So, yeah. - It was ridiculous. - So you mentioned earlier, like, sometimes we don't know how the algorithms work, right? And that's scary.
This hate propaganda, was it spread by the actual algorithm and no one knew why? Or was it actually spread by a group of people? It was spread by the algorithm, but unleashed by a group of people, right? So the algorithm didn't create this content, right? But based off of how the algorithm works and how it targets people,
it pushed this upfront and spread it virally. And it's not that the algorithm had some political motive to do so, it's just that the algorithm, this is how the algorithm in its current state is designed.
It's designed to spread things that incite some sort of emotional reaction in people. - So it incites, like, sort of mob mentality, or, like, snowballs. - Yeah, well, it does it, I have to be clear, it does it indirectly, in the sense that it's programmed to understand what creates an emotional reaction from the user, right? And they know that when you are reacting emotionally, whether it's positively or negatively,
your watch times are longer. And they know from mountains and mountains of data that your watch times are longer almost 99% of the time when the reaction is negative. So when it's a negative emotional reaction, your watch times are longer. And this is the only information, or not the only information, but one of the main pieces of information, that this algorithm works off of.
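As an aside, the loop being described here (a ranker that knows nothing about the content itself, only the watch time it produces) can be sketched as a toy model. This is our own hypothetical illustration, with made-up item names and numbers, not any platform's actual code:

```python
import random

# Hypothetical items with average watch times in seconds; the only
# assumption built in is that negative content holds attention longer.
ITEMS = {"cat_video": 20, "neutral_news": 30, "outrage_post": 90}

def run_feed(rounds=200, explore=0.1, seed=1):
    """Serve items for `rounds` turns, ranking purely by the average
    watch time observed so far, with a little random exploration."""
    rng = random.Random(seed)
    totals = {k: 0.0 for k in ITEMS}  # accumulated watch seconds
    shows = {k: 0 for k in ITEMS}     # times each item was served
    for _ in range(rounds):
        unseen = [k for k in ITEMS if shows[k] == 0]
        if unseen:
            pick = unseen[0]                  # cold start: try everything once
        elif rng.random() < explore:
            pick = rng.choice(list(ITEMS))    # occasional exploration
        else:
            pick = max(ITEMS, key=lambda k: totals[k] / shows[k])
        totals[pick] += rng.gauss(ITEMS[pick], 5)  # noisy observed watch time
        shows[pick] += 1
    return shows
```

The ranker never sees the word "outrage"; after a few cold-start rounds it serves that item almost every turn, purely because watch time is the only signal it optimizes. That is the indirectness being described: no political motive in the code, just one metric.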
And it targets that to spread this. Now, it doesn't know what this content is, but it knows that people are having an emotional reaction to it, and it keeps spreading it and spreading it in the algorithm, and it reaches millions and millions of people. What they also say is they kind of relate whatever the content is to other content that may be of interest to the person. And, you know,
one type of negative content is actually highly related to this other negative content, because so-and-so or X amount of people have watched both. And so you, as a fresh person, are most likely going to like this one as well. Hence creating, it's so good at creating that echo chamber, that rabbit hole. So once you...
It's like pulling a loose thread on a sweater. Once you start pulling it, oh, it's got you, right? And then it starts leading you deeper and deeper, pulling that thread more and more and more, where you get into this hole that, over time, psychologically, it's almost impossible to get out of. Because, number one, you don't even know what's happening. Yeah. So this reminds me of, like, a couple of things, right? It's interesting to get your perspective on it. Um...
So, like, one is, there have been movies that have foreshadowed this type of situation. Like Kingsman was one. And then Snow Crash, a book that was written, I think, in the 90s, is an incredible, really awesome book by Neal Stephenson. And so in Snow Crash, basically, when people saw the screen and it was just, like, fuzzy, then their minds would get reprogrammed. And it was like, once you saw it, you were fucked. Right. Right.
And then in Kingsman, it was the same thing, right? There was some, but I don't think it was a visual thing. It was like a cell phone. Yeah, it was a cell phone thing. And then people just became like really, really, really crazy, right? And so like, it reminds me of that. And then,
you know, when you mentioned these things, it's kind of like a weapon, right? I mean, because if you think about the negative impact, it's like a weapon that causes harm. And I automatically, or instinctively, thought about, like, guns, cars, nuclear weapons, all these types of things, right? Or even a virus, you know. I think the thing with a gun, though, is it has to be operated generally by a person, right?
And so it's not as scalable. But once you get everyone with a gun, it can be very, very dangerous. But guns have good and bad, right? A gun itself is not the problem. Cars, same thing, right? You get all these people on the road, people die. So there's always these- It's who's behind it though. What? It's who's behind it. Right. There's all these unintended consequences, right? But I think one of the things that you guys are-
kind of alluding to is that these algorithms are more powerful in some ways than traditional weapons because number one, it feeds off of human reaction, right? Like there's a symbiotic relationship, right? A gun and a person, they don't interact.
And then the second thing is that because it's digital, it's like a virus. It actually spreads faster than something like COVID. And so that's like, because like with a gun, you'd have to ship it. If you wanted to kill everyone in the world, you'd have to ship a gun to every place physically. There's the physics of it. Whereas an algorithm, and because everyone is already plugged in to this mass internet, because we've already built this infrastructure. Then if I wanted to tap into people, then Facebook becomes that device
And it's instantaneous. And 20 years ago, it wouldn't have been possible because we didn't have platforms as large as like whatever Facebook, WeChat, et cetera. But now it's sort of like, it is like a matrix. It's like the movie, The Matrix, right? And in The Matrix, everyone was physically plugged in, but now we are plugged in. And because like,
99% of people have these apps and then bam, now you have this viral medium which good and bad things could happen. Actually, one of the statistics that was shared in the film, if I remember correctly, which goes back to what you just said about 20 years ago,
I remember that it was like 2010, 2011, in America, they saw a sudden spike after that of depression, anxiety, and suicides. It was 2009. I know exactly. Yeah. There's a spike from then on, a spike of suicide, depression, anxiety, among two groups of people especially.
Like young girls, teens and preteens. Yeah. Right. Teen girls and preteens. And especially preteens, it was really dramatic. It was like a 300% rise. It was just a crazy dramatic rise. And that's also when a lot of social media algorithms were getting spread out. And it was, I know exactly what you're saying. Yeah. Right. It was 2009. There's a graph of two groups of people, teen girls and preteen girls.
And it was about self-harm, like self-bodily harm, and suicide. And both graphs rose by hundreds of percent, like 300%, 380%, whatever it was. And you can see it all very clearly in this graph. It all started ticking up really rapidly in 2009. 2009, I think it was 2009. But 2009 also happened to be when these social media apps went mobile,
when everything started shifting onto mobile smartphones. That was the same year. And then boom, you see this rise. And yes, it's not like, aha, like this is end all be all, but all the evidence so far of all these research that are being done, and a lot of research is being done, points to
social media going mobile. I mean, it makes sense. Yeah. Yeah. And it's like going back to the whole matrix thing. It's like how, like they said this in the documentary too, like how do you wake up from the matrix if you don't even know you're in the matrix? If you don't even know there is a matrix. Well, didn't they say that there's another thing they were saying that, you know,
I mean, you have much better memory than I do. I remember they were saying something like these corporations, they program these algorithms and plus their overall big goal is to sort of like
nudge each person little by little, like by fractions of a percent every day. Yeah. As opposed to immediately getting you addicted, because then it becomes obvious. But if it's such an incremental increase of addiction, of manipulation, little by little by little, then within like a year, or half a year, or whatever the time frame is, you're in, and you don't even know you got in, you know? It's like getting pricked.
Little by little, you're getting pricked. And then next thing you know, your whole arm is fucked. It's like injecting you a little bit with an addictive substance, an addictive drug like heroin or something. A little bit every day. A little bit every day where you almost don't notice it. Almost don't notice it. And then all of a sudden, over a few years, you're a full-on addict. Right? Yeah.
And before anyone thinks, like, oh, you guys are kind of blowing this out of proportion, manipulation is kind of a strong word, you know, like we're playing this up. No, it really is. The way they paint this is, like, this is active and aggressive.
It's not like, oh, this is kind of happening and this is just kind of a side consequence of the technology. No, this is actually the primary goal of a lot of these tech companies. Because their customers, we think of ourselves, the users, as the customers. We're not the customers, because most of these things are free to use. We're not paying them a dime. We're not the customers. Who are they getting the money from? Advertisers.
Advertisers are their actual customers. So they are in partnership with the advertisers who are actually paying them the money because we're not. And who is the product? What is being sold? We're being sold. Our data is being sold. And not just our data, but our behavior is being sold and farmed out. And they want to change our behavior in a way where it's more in tune with their
capitalist agenda. You know what I mean? Their profiting agenda, I guess. What I find also interesting, which maybe we can talk about in terms of opinion, because right now we're just spitting off some things that we saw in the film, right? Yeah. But it lines up with our real world experience. Of course. It's not like it's out of nowhere. But here's my question, and I'm going to sort of dodge a little bit. So right now, we're discussing The Social Dilemma, a documentary film that we saw recently, and
And that's all. Well, who made The Social Dilemma, by the way? Obviously not Facebook. Well, I think, was Tristan Harris probably a producer behind it? I don't know. It's on his website. It's on Netflix. Yeah. It's a Netflix original, I feel. You can look it up.
But basically, in general, it's based off of America's stats and America... Well, the funny thing is The Social Dilemma, like, I mean, you know, just kind of reading through the description, it was released on Netflix. Yes. Right? So...
It's interesting. But I guess people do pay for Netflix. So it's a little bit less maybe insidious than something like Facebook where it's really personal. I mean, Netflix, like they personalize the choice of videos, but then every video actually has to be produced and made. And you pay Netflix a subscription. You pay for the service. You pay for the service, right? So it's a little bit more of a fair kind of thing. Yeah. And, you know, there's a wide range of information on there that you can kind of search for.
Whereas it's not actively in terms of, well, I guess it is. Okay, look, look, there's no getting away from the fact that on a basic level, all algorithms optimize what you see depending on what you like these days. That's just everywhere. Wherever you go, whatever service you use, they want you to stay longer. I mean, there's nothing wrong with trying to make money as a business.
And there's nothing wrong necessarily with kind of the morality of trying to make money off of their users. The problem is that there's no oversight. There's no control. So it's not this thing where it's just kind of gently sailing down a river. It's a freight train right now. And amongst the kind of misinformation or even information that it provides you,
Once it latches onto you, once it knows what you like, it's a freight train and it just drives you down, no stops, express lane, down that hole of what it thinks you like to watch. Mm-hmm.
So what happens is there's no more balance of anything. There's no more balance of like, okay, you know, here's one view. Here's one side of the argument. Now, let me recommend a different side of the argument so you can kind of make up your mind and get a balanced approach to it. No, it's one side of the argument. Oh, you like that side? We're going to shove this side down your throat. We're going to take you down the express lane as deep as this hole can go.
And that's the danger because it just, it's a freight train leading you down a one-way road based off of an initial kind of search. That's not entirely true because if I choose to use Safari or Chrome or whatever, I can just load up New York Times and I can see what's in there. Well, that's you proactively going out. I mean, I think we, I mean, I sort of,
you know, I hesitate to say that we're like lemmings, right? Like we're so weak as individuals that we just have no choice, right? But I mean, like my normal behaviors, I'll go on Safari, you know, I'll pull up... I need to cut you off right now because we're talking about social media. Yeah, but it's all related. No, no, no. It's all linked together. Let's stay on this. Okay. Yeah.
Even, okay, because right here's an example. Because I don't really use social media, but yeah. I know, exactly. You don't, you don't. I do, but I don't. I use private networks. You're not like always on it. No, I mean, I'm not always, I mean, I'm a victim. Like, I mean, I, cause I want to peel back like where we're being impacted because we're all impacted. Anyone who says they're not impacted is obviously being impacted, right? But there are, there, there are different layers, right? Like, there are private networks. Like,
For instance, like private networks themselves can be echo chambers if you're on WhatsApp, you're on WeChat, et cetera, right? - Of course, of course, of course. And people are gonna be affected by it to different degrees. - But when they say social media, right, what are they, are they just talking about Facebook? What are they primarily talking about?
Facebook, Instagram, TikTok, Snapchat. Twitter. Yeah, these are the ones that keep bringing up. And YouTube. And YouTube, yeah. Okay. No, but going to... Okay, I don't want to cut you off, but I am. Because you're going about the whole kind of... You were kind of just going on the topic of search engines. If you want to search for something, you're going to go to your search engine of choice and type in something. I wasn't talking about search engines. I was talking about...
I was just saying that my normal behavior, let's say that like, you know, I want to learn more about a specific topic like politics that can be extremely, you know, polarizing, right? And that's causing a lot of the challenges that we have, right? And so, you know, it sounds like one of the dangers is that social media will then amplify certain things and you keep going down this expressway, etc., etc., right? Yeah.
And I'm just kind of sharing my own personal consumption. It's like, well, I'll go on Safari or Chrome or Internet Explorer or whatever, and I'll just open up New York Times. I'll read through it. And then I'll pull up like Fox. I'll read through it. I'll pull up OAN.com.
you know, I'll pull up Breitbart, right? And I kind of just go through these things, right? So, I mean, that's one scenario. But are you saying that most people are getting their political news not from these traditional digital sources, like they've already moved past to the next generation where they're just getting everything from like Facebook directly and Facebook has become a news media outlet where people trust it? All the data, all the data,
points to the fact that more people are getting their news and information from social media than from anywhere else. So while...
Like, you don't represent the mass majority of people, Eric. Like, the way you get your information is actually quite good, quite balanced. Like, you... But you practice that. Like, you practice mindfulness. This is something you're consciously aware of, I feel. I just don't like social media. Yeah. It's not... I haven't practiced. I'm not better. But that's you. I'm not any better than anyone. The majority of people get their information through social media. There's a reason. It's because...
It's not that social media, like Facebook, is writing these articles and pushing this content. It's just a congregation of different links and opinions. Do you guys use Facebook? I'm not on Facebook. I have it. Do you use it? Once every month or two. So, Facebook is the one that gets mentioned the most in this film. So, okay. So, let's take some examples that are... I mean, because I think... I understand the argument, right? But Facebook isn't necessarily relevant to us or the people that are listening, right? Yeah. So...
What are examples in your own life where we, after watching this movie, feel like we may be manipulated by things? Because there are clearly examples, right? Like even where we live right now.
you know, we're using technology more than anyone, right? So let's not even just say social media. Let's just say technology, right? Because everyone has incentives. Everyone has an agenda, right? Where are areas in our own lives where we feel like we're being pushed towards, maybe nudged towards, you know, unhealthy behaviors and we're not aware? Have you guys thought about that? Sure, yeah. What are some areas?
Well, for me, because like you, I am not that much on social media either. But where I see it most is in other people. Like, I know that sounds kind of fucked up, but it's true. Like the way I see how other people are absorbing their information, and some of these are people who have been very close to me.
And the kind of rabbit holes, I see them going down. And I'm not saying I'm perfect and I'm not, you know, I'm not a victim of this either. I am, but I just, I am just less on social media than most other people, just like you.
And I see the rabbit hole they go down and it's this kind of vicious cycle of, okay, like you say, it's not just about social media. But we can't get away from the fact that most people are on social media more than they are on any other app on their phone. Most people are. So what happens is once they start getting influenced and their worldview starts changing on social media,
Even if they want to break out and go to like a search engine or wherever to look up their own facts and do their own research, they're doing research based off of what they've been looking at on social media and what's been shaping them on social media. Why is this important? Because in this documentary, they point out these are not isolated apps. All this information that you're putting out when you're using your phone, when you're using your computer,
It's all being collected. It's all open. So what happens when you go on Google and they do this, they name this example and point out this fact about Google. That blew me away. I didn't even know. I thought, you know how sometimes when you go on Google and you search for something and before you can finish the phrase or whatever you're typing in, it has suggestions for you.
that completes the sentence for you, right? And gives you options, right? I thought that was the same for everybody. I thought, like, today, as long as I type this in, whatever suggestions are popping up. Like the popular ones. Based on these phrases, yeah, the most popular of the day. I thought that would kind of be the same for everybody. It turns out it's not. It's not.
That is based... it's different for everybody. It's based off of your location. It's based off of your search histories. It's based off of all the other data that it's pulling from you that isn't even necessarily Google-based. It could be what you've been viewing on Facebook, all these other things.
And it tailors and completes the sentence and gives you options tailored completely towards that. So it's manipulating what you're actually searching for in search engines, even when you think you're the one being proactive, off of social media, doing your own research. And the hits that pop up after you press enter, right? It's not just a search bar. The hits that pop up after you press enter are also different for everybody.
Well, let's do an experiment. It's not the same and that's scary. Can we do a little experiment? Because like that's very fascinating. Okay, because I've noticed this as well because what I've noticed is that when I type in a couple of letters and I'm searching, right? Like somehow it's able to really quickly zero in on what I'm looking for and it's based on context, it's based on what I searched before. I see this pattern, like I've noticed this, right? And I think it's very clear, right?
it never disturbed me per se, because I'm not looking for anything unwholesome, obviously. But, um, so there are 10 things that, like... so I just typed the letter A into my Google. And so, like, sorry for those that don't have VPN, but anyways... This is, your name just keeps getting repeated. Eric. Yeah. So there are 10 results in here. Right. And I'll explain this to you, to everyone briefly. Um,
But you guys can do the same thing, right? So the first thing that pops up, the first two are Apple commercial songs. Yeah, why does Apple keep popping up? I don't know. It's weird. But I was... Why is it associated with your name? I was listening to... Somehow, like the other day, I had an idea of like, I wanted to know what songs were behind some of the commercials, right? And there's actually hundreds of them. That popped up. Angela Merkel popped up. And so recently...
I think she will not be running for another term or whatever it is. And she is this, I think she's extremely well-respected, right? Like, I mean, she is considered by, before Biden came along, like the leader of the free world. So I was kind of curious, like how she elevated herself in a small country to be so well-respected. And you kind of read her background. She's actually like a PhD physicist or something. Her husband's like a physicist.
Really remarkable woman. But I was just looking at a couple of books that I was planning to read to learn a little bit more, but I know nothing about her. But like you...
She's, by all accounts, very well respected. The fourth entry in here is Aza Raskin, who is one of the founders, along with Tristan Harris, of the Center for Humane Technology, which I was just Googling. Adam Grant, who's one of my favorite American behavioral psychologists, organizational psychologists. I've mentioned him on this show. He's written a lot of books, is on here. Amazon, Amazon Prime. There's a game, Among Us. I have no idea what that is. And then...
And then there's Amanda Gorman, who's the 22-year-old poet, who I think is the youngest poet laureate of the U.S. ever. I did not specifically search for her, so I think it's just based on the inauguration. And then Airbnb, and I haven't, you know. It was very interesting, like, why these 10 popped up when I pressed A rather than anything else, right? Howie.
I'd be curious to see what yours brings up. Well, before we do that, let me... I think also because you also read a lot about politics and stuff like that. So this is a recent... This Amanda Gorman, right? Well, it could have been Trump. I mean, not Trump, but it could have been like other A's, right? It could have been...
There are a lot of... It could have been anything. Yeah, it could have been anything. But I don't know why it's Amanda Gorman. It's all kind of... Who's Amanda Gorman again? The 22-year-old African-American poet laureate. So it's all politically based? No. Is this Google? No. There's like two that are politically based. Which one's not?
Amazon Prime? Apple Commercial Songs, Adam Grant, Amazon, Amazon Prime, and Airbnb, and Among Us. But Adam Grant, I've known you've been a big fan of Adam Grant. No, no, no. I'm saying that most of these are related to something that I've been looking at, but not politics per se. Okay.
But politics is something you look at. This is Google. Yeah. Like, and if you do like B, it's very similar. Like, I was talking to my colleague about John Micklethwait, who is the editor-in-chief of Bloomberg. And he wrote a book recently about how other countries, other political systems, have
had a much better response to COVID, like Taiwan, Singapore, et cetera. So it's not necessarily East versus West anymore, because the West was always very advanced, right? BJ Fogg, Tiny Habits. So it's really interesting, but I don't know. Bath & Body Works, right? So that probably, for you, that's random. Yeah.
Well, it's also, I'm sure, like, the more information you give it, the more it can tailor it. So when you just type in a single letter, it's going to... I mean, it doesn't have too much information to work off of there. But when you're typing in phrases, like, what is the best way to do this, or whatever? I don't know. But what Eric was bringing... But it's scary, right? Like, I mean, it knows you. It knows you. To your point, it totally personalizes it. Here is a story, true story. This happened several years ago here in China with Chinese apps and Chinese websites.
Um, because I want people to know, like, this is probably a universal thing. We're not just saying this is only an American thing, right? It's global. And when it comes to social media usage and technology, there is no country that is more sucked into that than China, that I know of. Now, this is an example from a few years ago, and it freaked me out. And James was with me, and he had no explanation for it. And it just completely freaked me out. I, uh,
I was in the office on a computer, a desktop computer.
And I was searching for something really obscure. It was like Pantone colors because we were like designing for Roxel at the time. So it was like this book of different colors that you can get for like fabrics and pens. Because there's a number, right? Pantone has a number. Every shade of color has its own code, right? And Pantone codifies it and makes it a way so everyone can be on the same page. It's like an encyclopedia of colors. That's what it is.
And I was looking at a way how to buy this encyclopedia of colors. And it's something really random, something I've never really searched for before. And so I'm on this computer and I'm searching for it. And then, okay, I search for it, I do my business, and I'm done. Later that night, I go home, and on my home laptop, a whole separate computer,
A whole separate computer, different username logged in. I was logged into my company username at the time in the office on my desktop. I go home and on my own laptop, different username, separate, completely different internet, right? Different router, different everything. I'm at home now. I go up and I go into Taobao. And what's the first thing that pops up in my Taobao search? Pantone colors. Hmm.
when I've never ever searched for Pantone colors on Taobao before. And that completely freaked me out. It's like it followed me. And it would make sense if I used the same computer and logged in again, but all of a sudden... this is a whole different network, a whole different computer, under a whole different username, and it still fucking knew.
Can I share one as well? Yeah, please. Freak me the fuck out. I'm going to share one as well, also with Taobao. And this was recent. Okay. And this was literally three weeks ago. Shit. I'm sitting with Vivi watching a TV show. Okay. Our phones are on the desk. You know, we're not touching the phones.
And it's cold. You know, we have the cold front in Shanghai. She's putting her feet in hot water. Foot bath. Foot bath, exactly. She's doing a foot bath, like a home foot bath. And then we have a dog, right? Maki. And so Maki's like next to us and sitting next to us. And she goes, wouldn't be great if there was a foot bath for dogs.
And I'm like, oh, that'd be awesome. Like a foot bath for dogs. That'd be hilarious. So you're saying it out loud. Yeah. We're like, oh, that'd be hilarious. It'd be a foot bath for dogs. Like, oh, there can't be a foot bath for dogs, right? And that was it. No search, nothing. Okay? We go to bed. I go to Taobao. What's on my fucking front page?
Foot bath for dogs. You've never searched for it before. I never searched for it. So before that, you only said it out loud. I said it out loud. Front page. First recommended row: foot bath for dogs. Oh my God, dude. That freaks me the fuck out. That really scares me. So check this out. That really scares me. So number one, let's, you know, like going back to our earlier conversation, it's not just social media, right? Like it's tech, right? Technology. Yeah. Yeah. Okay. Okay.
But social media is the only one that can really weaponize it, because everything else is more commercial-based, like selling you products. Right. Like when it comes to your information and shaping your worldview, that's when it can be really weaponized. Right. Right. That's the line I draw. Yeah. Yeah. There's a distinction in the sense that, like, going to Taobao is not necessarily going to incite you to riot or something like that. I get it. I get it.
Um, it's the framework of the platform, right? And how it's being commercial. But having said that, if Taobao ever expanded to Taobao News and all that stuff... So check this out, right? So technology is kind of like a fucking KGB spy that knows everything about you, and then sets up a time to meet you, and knows everything, and then becomes your friend.
That's what technology is. Without you even knowing. Without you even knowing. That's what it fucking is. It's like saying, I'm sitting next to you and I'm KGB. Yeah. And I'm your friend right now. Yes. That's what it is. But it's like, you know, but like KGB spies, you got to train that motherfucker up. And they have their own agenda. Exactly. And they want you to behave in a certain way. And over time, they manipulate you to behave in that way. Which is the same as the KGB spy. Yeah. The difference, so it's,
it's very insidious, but the difference is that you can't really clone like a billion KGB spies easily. Um,
whereas you can do that with an algorithm. So the algorithm is literally like a KGB spy digitally. And you know what's also interesting, because that's what I was trying to get at and going along with what you were saying as well. I mean, without naming specific names, I mean, certain governments control the media, control the topics that are being discussed and censoring maybe certain news or point of views. And that is...
utilized in order to, A, control the population, right? Make sure that it doesn't get out of hand, news doesn't get slipped out, et cetera, et cetera. Control, right? And Americans are always talking about certain governments that do that, or how horrible it is. Meanwhile,
In their home country, in our home country or whatever you want to call it, in America, this is being done nonstop. Well, they're writing the playbook on it. Exactly. Which is funny. You know what I mean? To me, it's funny. It's funny. It's fucking, really fucking terrifying. Yeah. Because, look, we get lost in this, you know, going back to what you're saying and what Eric was just saying.
Yeah, like it's kind of everywhere and we kind of see the effects. What I feel we fail to notice most of the time is because we always think naturally, we always think in very individualistic terms. Okay, so how has this affected me? What have I done? And when you put it in that light, you're like, okay, I see it, but...
I'm still doing good things. I'm still a good person. I'm not crazy. I'm not shooting anybody. I'm not inciting riots. I'm me. And so you feel like, okay, well, yes, it's there. People just need to be more mindful, but I've got it under control. What we fail to notice is, yes, on an individual level, we can feel that way, especially us who are more mindful of this maybe, but at scale, over mass populations, right?
it becomes extremely dangerous and extremely sinister. Well, you're shaping kind of landscapes. You're shaping the way people, you're shaping culture. So how is this, like one thing I like to think about is just sort of like, you know, have we ever seen this kind of thing in the past? Because we always like to, you know, be a little bit sensational and say, oh God, like this is the world's ending. This is the, you know, something we've never seen before, right? And that's like human habit. But
pretty much everything that will happen has already happened, right? How is this any different than, like, religion and the Catholic Church, or, you know, all the stuff that happened before? Because people did not have access to any other information, and it was interesting, right? Like, they were literally being forcibly, you know, kind of influenced, because their entire environment was based on a certain thing.
Yeah. How is this different? You make a great point. It might not be any different. It might not be any different, but we can't downplay the role religion has in shaping our reality today. It's had unfathomable, like, I mean, you can argue that religion has caused more death and suffering in this world than anything else ever. You can argue that.
So we can't... like, yes, it might not be any different, but that's pretty big. That's like a huge deal, if we're comparing this to the religious impact on civilization over all the eons that mankind has been around. And I'm glad you brought this up, because right after I watched The Social Dilemma, I was actually thinking, you know what? I think technology is...
I draw a comparison to the agricultural revolution, because while agriculture, the invention of agriculture, brought on a lot of great inventions, made food plentiful, farming, all these things, but...
If you research the agricultural revolution, if you read books like Sapiens by Yuval Harari, you understand that agriculture also at the same time, because there's a flip side of the coin, right? There's positives about technology too. The flip side of that coin was a lot of great suffering and changing the way human civilization lived and not necessarily in a better way. You know, in the book Sapiens, they make a really, you know, just a legendary, very famous book, right? Great book.
They make a very compelling argument that actually the agricultural revolution created more harm to human civilization than actual good. We just don't know that because this is all we know. We've grown up and this is our world. This is our reality. Having agriculture, having civilization, having urban dense areas of population settlement. This is what we know. This is normal. This is what we feel we're comfortable with.
But, if you... they do early human studies, tracing it back all the way to modern times. And the amount of illnesses, the way it's shaped the way humans are actually supposed to behave, and the way our bodies have evolved to be more hunter-gatherer than to be sedentary, sitting in cubicles with fluorescent lighting and all these things.
This was all traced back to the invention of agriculture and making human civilizations not roaming around anymore, but staying in one place and building out colonies and settlements and cities from there, spawning what we see as human civilization today. And that is a huge revolution over thousands of years. I see technology having a similar impact as agriculture did,
And also in the sense that there will be positive sides. There will be a bright side to technology, but there's also going to be a heavy price to pay at the same time. Now, technology is going to increase and evolve way faster than agriculture did. Way faster. Way faster, which means that we have less time to adapt and catch up and realize what's happening as humans because technology
Our bodies and our minds are not evolving at that rate, right? So what this means to me is that, look, right now, it's anyone's guess whether the net result is going to be positive or negative. We're not saying one or the other. But I think the discussion we need to have is that we need to really start thinking critically about this and not be oblivious
to the impact that technology is having on us, and not just paying lip service to it, but really understanding how deep this really goes and what the real impact it's going to have on us is. Because I think technology is going to change human civilization, and is already changing human civilization, similarly in the size and scope of impact to what the agricultural revolution did.
And that's the comparison I draw. - Well, what have you changed or thinking of changing ever since watching "Social Dilemma"? - Okay, so right after watching "Social Dilemma", I turned to Jessica, I'm like, I'm fucking deleting YouTube, I'm fucking deleting fucking Instagram, I'm fucking deleting all this shit. I turned around, I'm like, I'm fucking deleting it. I grabbed my phone, the first thing I do- - You started shaking. - Yeah, the first thing I do, I hop on YouTube and start watching some videos.
And looking up the social dilemma. You know what I mean? I start researching the social dilemma on YouTube. So yes, are we all hypocritical? Of course. But that's the problem. We are so addicted to this thing that we will find ways to justify using it versus doing the hard work of having an open discussion of maybe we should try to get rid of these things. It's like a drug addict.
Whatever a drug addict says, they're gonna try to make excuses and try to justify why they need that one more hit, why they need that one more dose, just to get over it, just so that they can start getting off of it. And this is the same thing we do with social media. So if using technology, like, using technology, you know, inappropriately, right? Like...
And inappropriately, you can define in many ways like using it too much or like, you know, waking up and having bad behaviors or going to bed late and, you know, using it or relying on it to boost your sense of self-worth, whatever, right? Let's just say inappropriate use of technology if it's a bad thing, right? So what does the movie say? Let's go back to the movie, right? Because all the experts on there, right? Like,
obviously, I mean, Tristan Harris, I think I've read a bit about him, right? He's a really smart guy. He's done a lot of great work. He's influenced a lot of the tech companies to actually, you know, to build safety nets into their products, right? What does the movie say? What do the experts on the movie say about what we should do, what the antidote is? At the end, I forget, but at the end, one of the antidotes...
No, at the end, they say- There's no antidote. They were saying that, but they were giving- They say you can't put the genie back in the bottle, number one. But number two, what you can do is start having this discussion about it. Start being mindful that there is this kind of influence on you, and not being, like, oblivious to it. And number three, at the very end, most of them themselves, they say, number one, I delete my social media. They're like, look at the world. Look outside your window. It's a beautiful world. You don't need social media. So a lot of them themselves have deleted social media. And what they do and what they say about-
is more about pertaining to the next generation of people. They're like, we're adults already. It's kind of like, we are what we are. But what this is gonna have the most impact on, and we saw that happening since 2009 with the teen and preteen girls, is that this is gonna influence the younger generations more so than anybody else.
So what it is, is, you know, they're very zealous about, like, protecting their children from screen time, having a talk with them, controlling and budgeting the amount of screen time they have, or even the social media that they can be on, or limiting it, or only giving them access to social media when they reach a certain age, like 16 or whatever it is. Um,
and not having that exposure before then. These are things that they're talking about in the documentary. - So backing up a step, right? - And also government regulation and laws. - I realized like I jumped into the solution a little bit prematurely. So just backing up a little bit, right? If part of the antidote or whatever it is, is to be mindful and to understand these things, right? What are the signs
that you might be using social media inappropriately. I mean, I think we need to step back 'cause I don't think that I can get on my high horse and be like, "Oh, I don't have Facebook." That's bullshit, right? So assuming that we all could be victims because it seems like a very pernicious thing that impacts everyone, what are the signs for me to diagnose that I might have an issue? - I think one of the signs, main signs they bring up in this documentary,
is that when you're on social media, if you feel that a certain content you're watching is pushing your emotional buttons, it's probably doing it by design and for a reason. You need to be aware of that. That's scary. Because you can only know that if you're actually asking yourself. Yeah, asking yourself. Otherwise, you're just going with the flow, right? You're just reacting. But as Matt Beadle told us two weeks ago, he said that if we can purposely send a signal...
right? To our prefrontal cortex and not just, you know, react. In the limbic system. With our limbic system. Yeah. Right? So, so like going back and sort of like going back to his stuff, like, okay, so we see something, it activates our limbic system, right? We get emotional and then it perpetuates. It's the fight or flight response. So what you're saying is one thing is just to recognize because we, the prefrontal cortex can observe the limbic system, right? Like, I think that's what he's kind of saying is that like,
I have two parts of my brain, this part of the brain, if I activate it, I can at least reflect, even if it's already happened, I can actually write it down. And every day I can say, okay, at the end of the day, reflect, did this happen to you? Right? Cause I think that's kind of mindfulness. So what you're saying is like, okay, find a time during the day and actually ask yourself that question.
Did you get your buttons pushed? Right? Because I feel so much better now having talked about this because I do watch YouTube sometimes and I do get into that rabbit hole or whatever, right? And then I'm like, fuck, and I don't feel so great about myself. And I realized over time, so I stopped watching YouTube at night. I just don't watch it. But I realized that just subconsciously. But for everyone, that's a great sort of litmus test. Is it making you feel emotionally reactive? Yeah.
right? I think that's powerful. Yeah, it really is. And it makes me feel better too, knowing that, okay, at least now I'm aware of it. Yeah. So the question is, when you're absorbing content, whether you're on Douyin, whether you're on Bilibili, whether you're on YouTube, whether you're on Facebook, whatever you're on, Twitter, doesn't matter. WeChat, doesn't matter. The Honest Drink. The Honest Drink right here for pushing your emotional buttons. If you're feeling emotional about the content you're absorbing,
try, at that very moment while you're absorbing that content, to ask yourself, number one, am I feeling emotional about this? One way or the other. And number two is, if you are feeling emotional about it, realize there's probably a reason, and it's probably by design, why you are feeling emotional about it. The chances are, statistically speaking,
It is designed that way to push your buttons. And look, I'm not telling you what to believe. I'm not telling you what you should think. But just know that there is an active agenda to push your buttons when you're feeling emotional. That's, I love that because I think that
The first part of the sentence is like, just ask yourself, am I feeling this way? I think that's powerful in and of itself. But what I love about what you just said, and I think this is probably really a huge, huge takeaway, is this: okay, recognize that there's a reason, that it's probably by design, right? And this is the insidiousness. This is probably what you guys got from that movie that we don't realize. Because everyone knows, okay, if I watch too much social media, it's going to piss me off, it's not good. But it's by design, right? And I think that's the scary piece.
Now, I just thought of something on the other hand, right? Because like, I use these apps, right? But like, I use certain apps that I pay for that are designed to actually encourage positive behaviors. Yeah.
because they're like habit-building applications, right? Like Fabulous is one of them, Streaks is another one, right? On the other hand, right, if you have an app that makes you feel emotionally good and helps you build positive behaviors, then you can also realize there's a reason behind the design. - Exactly. - And then you can choose those apps that help you do that. You know what I'm saying? - Yeah, yeah, yeah. - But it's really interesting, right? Because I've got this app, and so on the weekends, one of the things it reminds me of is, hey, connect with your friends.
So today I was like super happy because I built that sort of, we talk about, I think it was like, you know, Atomic Habits and stuff like that, right? There's a cue, there's a trigger, there's a reward. So I was like, oh yes, I'm gonna see my friends today on the podcast, check. Right, so then it's like, and I checked it, right? So it's really interesting that like, there is the other side to technology. - And it's gratifying to check.
Right. But I think like we're, you know, like this show, like, I mean, I think we do get caught in the rabbit hole sometimes, but generally speaking, I think we're trying to promote like positive things, right? We're not trying to scare people or anything like that. Of course, of course. And no, but there's, I'm so glad you brought that up because this reminds me of my last conversation with Gabor on this show. And-
And the reason why I say that is because I was having this discussion with EQ. I was asking Gabor what he thought because he's a leadership coach. He's a really intelligent guy. He has very unique takes on things. And I was asking him, like, okay, so what's your whole approach as a leadership coach on EQ? Because I assume that EQ probably has a big role to play in people's leadership skills, right? Yeah.
And he's like, yes, but he's very wary. He's very careful about these terminologies, whether it's IQ, EQ, these new things, right? Well, not new things, but these terms. And he's very careful about using these things because he painted an analogy to be like, yes, EQ is a thing. It's real, but it doesn't necessarily apply to everyone. And not everyone maybe needs to focus on EQ. Right.
But what happens when you sensationalize even positive topics, like we're discussing EQ and seems positive, right?
is that what happens later on is that people start downloading apps that market EQ or like practicing or improving your EQ. And what these apps are designed to do is they're designed just like everything else we've been talking about to get you addicted, to keep you thinking that you need to improve your EQ and to keep you on their app and eventually monetize you, right? So even positive things, even though they're positive and they're not as sinister, right?
There's still an agenda at play, and we realize that. And it's not to say that, look, this society would not work. Human civilization cannot work without agenda. Everyone has agenda. We have our own personal agendas. Even with the show, we have an agenda. Everyone has an agenda. Agenda isn't bad.
You just need to be aware that the agenda is there. And you need to be aware if it's manipulating you down a positive path or down a negative path. And that is a really hard differentiation to make, especially when you're so absorbed into it. And there's this kind of feedback loop that you get rewarded off of because everyone, I don't care who you are. You can't tell me that you don't have like this positive or like this...
release when you get rewarded, well, not rewarded, when you get fed back information that you already believe in, you know? Absolutely. Hence the echo chamber. We all like to hear our own thoughts reinforced and backed up. For sure. Well, you know, kind of to add to that, right? I think it's, that's,
really well said. Like, you know, agendas themselves... everything has an agenda. Social media technology is particularly sort of dangerous because the agenda is so deeply embedded, you know, into the algorithm. It's just not visible, right? And what comes to mind is that, you know,
If we study the science, so I think one of the learnings is like if we start thinking about what we can do, like potential antidotes, there's no magic bullet, but studying the science and just really being curious and reading, you know, all the scientific literature to understand how the mind works.
because it's like you can train yourself over time. And the example is like, if you're a kid, you don't know not to take candy from bad adults. So when you're a kid, you're super naive. You're walking out, there's all kinds of dangers in the world, and there are bad people out there trying to manipulate you, right? But over time you grow up, and then you realize, oh, that guy on the street corner always giving you ice cream, whatever it is,
the priest at the church, you know, whatever. Like you kind of learn that there is a hidden agenda, and then you become aware of that. So I think for adults, with this technology and these topics, the social dilemma, it's like, okay, we need to study the science, right? And I think the other piece is that then,
If you are consciously and mindfully making decisions of your life, for instance, like you go into your phone and you know what apps, you've chosen those apps purposefully, right? Like if I look at, like the app I actually use the most is Notes, like by far. Yeah, yeah, yeah, you do. And so like, so if you actually just sit down. It's ridiculous. And you schedule time every week to like just review the decisions that you're making in your life. What apps are you using, right?
Right? Like, who are you meeting? This is no different than the people you meet. If you hang around a bunch of drug users... well, no one is bigger than their environment or situation. Right? So it's like, some of this stuff to combat...
the potential downsides of technology are not anything new. It's just understanding like, what are your habits every day, every week? Where do you spend your time? Where do you focus your energy? Do you have a plan? And what is your goal? And if you know what your goal is, like, you know, Ray Dalio always asks like, what do you want? And if you're really clear on what you want, that's probably strong enough to get you to avoid doing all this stuff because clearly these things will not let you get to where you need to go. I think that's really well said. You make really great points there.
And yes, it is about kind of, quote unquote, fighting back in a way. But something that The Social Dilemma highlights as well is that the current situation is not going to stay the same. So what that means is machine learning, algorithmic learning, artificial intelligence, which is what this really is at the end of the day, is learning so much about you at
such an infinitely faster rate than you can learn about it. Most of us know nothing about it. Even though we are aware something is there, we don't know anything about it really. It is learning so much about you at such a rate that it's a completely unfair playing ground. There's no way you can really overcome it in the sense that you can outlearn it
You can't outlearn it. It's going to always be ahead of you in that regard. But I guess it's the importance of knowing, okay, number one, it's there. Number two, it's not a fair, level playing ground. Okay, it's got so many more advantages. So what do you do about it? Now, most of us are still going to be on social media, still using it. I mean, that's just the reality. I doubt any of us, at the end of this episode or at the end of watching The Social Dilemma, are really going to start deleting our social media apps.
Maybe some of us. I'm fucking deleting Facebook now. Well, I think... But it's about... Oh, you know what? It's about understanding... I don't even have Facebook on here. Because I think one of the things that we didn't really get to touch upon yet is because, once again, this is mainly based off of America and a lot of Western countries, right? That use Facebook and... Well,
We're using American companies and American apps as examples because those are like the most popular ones globally. But this is not at all limited to just America. Not at all. But I also think we live in China. It's...
It might be worse in China. I feel like it's slightly different. I mean, it's a little different here. It is. So we're discussing this topic based off of American standards. But here, I really think it's a bit different. In what way, though? In what way do you feel it's different?
Okay, so one is you have the option of this polarization of information, right? So that's number one. I mean, that's the most obvious, right?
Because that's one of the topics that was touched upon in The Social Dilemma: the divisiveness, how divisive social media can make you become. The polarization of the left and right came mainly, purely from social media. You know what I mean? But that is not going to happen as easily now
in a country like this. - That's true. - So that's number one. - That's true. - Right? So what's another topic? - Less polarization. - Less polarization. So why is this country so unified in a lot of thought, for the majority? Number two, what's another topic that was touched upon? It was not just the polarization of information, it was also about your self-representation, right? Depression, anxiety, the feeling of being part of something, right? Of being liked, you know, for example.
Now that... Validation. Validation, yes. Thank you. That... Might be even worse here. Might be even worse here. Yeah. Exactly. For sure here. For sure here. Yeah, yeah. Because there's... So it started in Asian countries where you have camera apps that modify...
the size of your face, the size of your body, the beautification of your eyes. Asian countries wrote the book on that. Yeah. Korea, Japan, China. Exactly. We wrote that. And Western countries like America did not have that. No. So that started here. Like, no, it did not start here. It got amplified here. That's so interesting. Right. So interesting. Yeah. And so one of the examples that was given in the documentary, and I can't remember the actual term, but they were saying from Snapchat,
You know, when you take a selfie or share something on Snapchat, you had these filters, right? And Snapchat is not, it's blocked here. But over there, a lot of girls were really influenced by the way they, you know, they look beautiful, quote unquote, right? Because of the filters and the likes they got. Oh, you're so beautiful and stuff like that.
But that was one of the topics and that was an actual condition that came out of that, you know, with cosmetic changing of your face and stuff like that, beautifying through surgery or whatever. I laugh because here in Asian countries. I mean, you bring up a really good point, right? And so let's look at like the similarities. I always like to look at similarities and then look at differences rather than start with differences, but it's great. I agree with you. There are differences, right? And so the symptoms are different.
So like in China, like maybe the demographic of people that would be impacted negatively, because we're talking about negative impact, right? The level, the degree of impact, et cetera, et cetera, is going to be a result of how these algorithms are manipulating and tapping into your limbic systems. And in the U.S., it might manifest itself as polarization, right?
The symptom is different, right? The negative impacts are in different forms, but the result is the same. It's still bad shit. So again, it's still the same thing. It's these algorithms have their own agenda or agendas by their programmers, and they're tapping into people and getting them to behave in unhealthy ways. And whether it's polarization in the US, whether it's validation in China, it's all negative. Yeah. Like, I mean...
like what you brought up, Howie, actually I never even thought of before. And it's super, super interesting because you can see a clear line between, okay, we're being affected in different ways, like what you're saying, Eric. And in the West, because social media is for the most part open, right? Even though we know it's not because even Trump can get banned from Twitter and all these things. And if he can get banned, then anyone can get banned. And they have banned other people.
So there is censorship, but it's censorship not by an authoritarian government. It's censorship by private companies. And the bar is a little bit different. The bar is higher. Yeah, the bar is a little different. And so it feels different when it's from a company, not from a government, censoring you. Yeah, yeah. So it just feels different, even though at the end of the day it's not really that different. But anyway, that's a whole different topic.
Because in America, let's say you are more or less open to share anything and say anything and post anything on social media for the most part, you get a lot more political divide. Whereas in China, because you can't post anything, there is a lot less, there's more political unity and a lot less political divide.
But because when it comes to non-political content, you can pretty much post anything here. Yeah. As long as it's not political. Yeah. You see exaggerated, or not exaggerated, but you see highlighted in the content the social
divide and the social pressures and the social negativities that arise, and stress, and everything that relates to kind of the social aspect of everything. So whereas, I mean, at the end of the day, if you want to take a net view, it's probably still worse in America, because it's not like America doesn't have that negative social aspect too. They still do. They have it, as we see with young preteens and teens, girls killing themselves and all this really terrible shit. Sure.
So they have the social stuff, but then they also have the political stuff. Whereas in China, you don't have as much the political stuff because that's more or less controlled, but you have a lot of that social stuff, right? So that's super, super interesting to see. Until the, and going back to a very early point you made at the beginning of this episode, until the US has stronger regulation, right?
The Chinese system inherently has regulation built in, so it limits the scope of the game, whereas the U.S. game is sort of unlimited, right? But you also made another point, like systems can learn infinitely about us at such a rate that it's unfair. However...
I think we do have that, you know, when you watch movies, like back in the day, where there's that monster, but then they all had a kill switch. So no matter how powerful the monster was, or this technology, there's always a button in their butt or something, and if you just press that button... Like fucking Star Wars, right? If you just shoot that particular part of the Death Star, it blows it all up. So it's like, no matter how powerful these systems get, you can choose not to play. You can turn the lights back on for yourself, you know?
Yeah, but the documentary makes an argument that the genie is so far out of the bottle, and the system is so capitalistic, and the way society and markets and industry have formed, that yes, technically you're right. Like technically you can... Like, you know, Facebook, they can shut themselves down. They can turn off all their servers. I'm not talking about them. They can do that. I'm talking about me. But they won't. I'm talking about...
I'm talking about being in control. I can turn it off. But that's the equivalent of telling a drug addict, I can turn it off anytime. Yeah, you can quit. You don't need to take another hit. Yeah, you can stop taking heroin. But can they? Like, the tech companies? Physically, they can. Physically, there's nothing stopping them from doing so. And that's correct. But we know that they can't really. And it's really, really tough. When we're talking about the whole kind of US and China comparison thing,
It really makes me feel like, and this might be really controversial, especially with a lot of our Western listeners and stuff like that, but I feel like actually having watched The Social Dilemma and talking about this with you guys now,
I actually feel more positive about the fact that China bans Facebook, YouTube, and these things, even though it makes my life a lot more inconvenient, because these are the apps I like to use.
But imagine if that was open. Imagine, like we saw this play out with the whole Trump administration and the election, the recent election and everything. All this disinformation spread out there, all this division, primarily really fueled by social media.
Trump was really fueling these things on his own social media, let alone all the other channels of individual users spreading a bunch of other conspiracy theories and all this. So we saw the mess that played out, right? Imagine if all these other platforms were open here in China, with this type of population, with this type of ethnography across its landscape.
It would be a nightmare here. It would literally be a nightmare here. So in a way, I know this is controversial, but in a way, I'm almost glad that those things are kind of banned, even though China has their own equivalent to Facebook. They have their own equivalent of Twitter. They have their own equivalent of everything.
But at least there's not also all these other things here at the same time, you know, just adding fuel to the fire, you know? So in a way it's kind of like, it's almost like I can understand why these things are banned. I can understand it.
So, I don't know, you know, it's a discussion, and it's a type of open-mindedness that a lot of people need to have, instead of just right away thinking: you're banning Facebook, that's, you know, people's free speech is gone, you're taking away people's liberties. Just completely jumping into the deep end there. You need to understand, look, there are pros and cons of doing this. And there's a method behind the madness.
And you just need to understand it, and you can still make up your mind. You don't need to agree with it, but at least make an effort to understand it. I think what you're saying is a general ethos that we try to spread: be mindful of both the good and the bad, both sides. I think that every generation, and I don't want to overgeneralize here, but I think that every generation...
gets confronted by massive technological change, right? Like, I mean, we can't say every generation... well, every generation starting basically with our generation. Yeah. But I mean, I think if you were to kind of go through history, every generation is at some point confronted with massive,
you know, technological change, right? Like the invention of the printing press, the invention of... there's a lot of things, right? So we don't want to generalize and say it's only our era, but let's say over the last couple hundred or thousand years, right, there have been things that have been introduced that we are confronted with, that change our lives, right?
And then it's our decision, right? It's our choice to decide how much we want to let that technology into our lives and how much of it we don't want to adopt, right? And we know that if no one had been open to these new ideas and technology, the world would be a very, very different place, right? So one person came to mind, which is Henry David Thoreau, right? Because we... Henry David Thoreau, like a very famous American philosopher, he was...
He loved nature. He wrote Walden. He was a real kind of innovative thinker, right? And there was a period of his life where he moved out of the city, and I think he wanted to see what it was like to choose his own values and live life on his own terms, essentially. So he moved out to the woods on Walden Pond, right? And...
He was not opposed to technology and progress, right? I'm kind of reading from this site that talks about his life. And it says, while he strongly felt people should not let the technology advances of the day rule their lives and substitute technology for connection with nature, he did utilize the advances of his day when they served a useful function and purpose.
So for instance, he took the train to places like Boston and Cambridge. I think that's a great lesson. It's like you have to define how you want to live your life. You want to define what your values are. And you don't want to substitute things that are truly important. Like connecting with friends and family and loved ones is really important. Being out in nature. Right?
right, is really important. You know, exercising and being healthy and fit, these are all really important things. And when technology starts substituting for these things, right, when it starts taking them over, it's not a good thing. But when technology can help you and serve a useful function, I think it's extremely valuable. And think about all the technology it takes for us now, unprecedented technology, to be able to just sit in a room, have some drinks with friends, you know, buddies, and
And then actually have people be able to enjoy and engage with us. So, you know, I think it's really a balance of what you let technology do rather than let it actually take over your life. 100%, absolutely. And what you just said reminds me of like kind of this metaphor. And it's just, I mean, in my opinion, I feel like this kind of sums it up the best in terms of
You know, when it comes to physical things like pollution and polluting the river, polluting oceans, and, you know, there's oversight on that. You can't just dump anything in the river anywhere. Whether it's China, US, you can't just start dumping toxic waste in rivers. There's regulations, there's laws surrounding that and protecting that. Well, we need to think of our information and our minds in the same way as pollution. There's a lot of misinformation out there. There's a lot of crap out there.
And it's polluting our worldview. It's polluting the way we think. It's polluting how we feel and how we see things in our reality. And there's a pollution there. And right now, there is no regulatory system regulating what pollutants you can put into the information river. There is none. And so while we see things very clearly when it comes to physical health,
Like, you know, we all know fast food is bad for you and McDonald's, Burger King, all these fast food chains have gotten a lot of crap through the years, especially with the kind of health food craze and wave that's been going on. And we know that we wouldn't feed our bodies fast food and processed food all the fucking time because we know that's killing our bodies and bad for us physically.
We need to start thinking in the same way about our mental health. And not the mental health in terms of mental illness, but our cognitive health, the information, our worldview. Really, it's your worldview, the health of your worldview. And stop feeding it junk food because there's a lot of that junk food for your mind on social media. There's a lot of great things on social media too. But we need to be aware of
And don't just feed yourself the fucking Big Macs of information. You know what I mean? We need to start being aware that there are equal parallels between what we see so clearly with our physical health and how we see things, our open-mindedness and our worldview. And if we realize that and we start paying attention to that, then maybe, maybe we have a fighting chance. Maybe.
But we gotta be aware first, and I think that's what conversations like these... Yeah. Start to have, you know, that's the effect it starts to have. I love that. I love that. And you know, the funny thing is, like, I mean, there have always been Big Macs, you know, and it's up to us not to gorge ourselves on whatever the Big Mac of the day is. But I do love a Big Mac. Chili cheese fries. Oh my. Chili cheese fries for the mind. Anyway, look...
We've been ranting for a while. I think we've been preachy-preachy on this episode.
Especially me, I've been quite preachy on this one. Not too much, not too much. But that's because I'm fresh off of watching The Social Dilemma yesterday. And as with many compelling documentaries, the day after you watch it, you're so gung-ho about it, you're telling everybody about it, you know what I mean? And then a few weeks go by and you kind of forget about it. Well, real quick, remember that analogy from Tristan? He was saying... because he was a Google employee, he wrote this open letter, or like a
presentation. Yeah, yeah, yeah. And he worked for months, day and night, on it. Yeah, yeah, yeah. He was saying how horrible what they were doing was, and that they needed to be a little bit more aware of the negativity of what they're doing, especially with Gmail and their email algorithms and stuff like that.
And then it caught fire, right, within Google. And all of a sudden, it went from, like, you know, 40 people to hundreds of people to the whole company. Thousands, yeah. To even, like, Larry Page. Yeah, he knew about it. You know, he knew about it. They had meetings about it within, like, a week or two of it getting released. So it was, like, viral within Google. But then, all of a sudden, it just disappeared. Nada. Nothing happened. No one talked about it. No one talked about it ever again. And that's a perfect analogy to what you're saying right now. You watched it.
You're really gung-ho. You're fired up. We're talking about it on the show. Two weeks later.
It's out of our mind. And this reminds me of something Faye said, when you guys had Faye on the show, because her quote always stood out to me. She's like, every day we're on the internet and we think we're going to start a revolution. We leave our apartments thinking we're going to start a revolution. But every day, it's nothing new under the sun. You know, there's like a Chinese saying about that, right? Every day you feel you're going to start a revolution, and every day it's nothing new under the sun. And around and around we go. And...
Yeah, sadly, that's kind of true.
But I hope conversations like these... But we keep talking about it. If we keep talking about it, we keep... Then it becomes... We are our own algorithm. You know what I mean? Hopefully, right? That's the goal. Yeah, and I think the revolution... I remember that. It was an awesome quote. But the revolution starts with yourself and the revolution starts small. And the thing that I'm really trying to remind myself, and I'm like, this is what you have to do every day, is gratitude and hope.
And gratitude and hope are probably two of the few things that are free and don't ask you for money. So that's one thing that I just, I'm like, yeah, I just want to be grateful. I'm grateful for you guys, for family, because they're not going to be around forever. Yeah. Amen, brother. Cheers, guys. Cheers. Ooh, we went down a rabbit hole on this one.
Yeah, I'm going to post a picture and try to get some likes. No, it's great. I need those likes. I need them. I love this show, because you guys taught me a lot of stuff, because I hadn't watched it.
Wait, who are you, Eric? Let's sign off. Let's sign off. Okay, we gotta sign off. You're so eager to get off the air. He's like, I don't want to hang out with these guys. On the whole show, he's like, I appreciate you guys. I love you guys. And meanwhile, he's like taking his headphones off. He's like, fuck, I gotta get out of here. Okay, that's it, guys. We love you. Be good. Be well. I'm Justin. We love you. Very special. I'm Eric.
And I'm going to try to get some likes. I'm Howie. And try not to storm any capitols, guys, please. Don't fall slave to the algorithm. Peace. Cheers.