
Marc Andreessen: Fire Hose of Knowledge

2024/5/16

Literally! With Rob Lowe

People

Marc Andreessen
Co-founder and venture capitalist, focused on investments in artificial intelligence and technology.

Rob Lowe
Topics
Marc Andreessen: At its peak, IBM was a vast empire with an enormous workforce and market share, yet its rigid management model, lack of innovation, and slow response to market changes ultimately led to its decline. This mirrors the challenges today's large tech companies face in AI. IBM's experience shows that even a former industry giant risks being overtaken by newer companies if it cannot adapt to rapidly changing technology and markets. Andreessen also analyzes the dilemmas large tech companies face in AI development, such as the innovator's dilemma, safety concerns, and social bias. He argues that while pursuing commercial interests, these companies also need to attend to the ethical and social impact of AI technology to avoid negative consequences from its misuse.

Rob Lowe: As the host, Rob Lowe guides Marc Andreessen through his views with questions and digs into key issues in AI development. He expresses interest in where AI is headed and raises questions about the social and ethical problems it may bring.


Chapters
AI will revolutionize access to healthcare, legal services, and education by providing personalized, expert assistance to everyone, democratizing access to professional services.

Transcript


Art in your home can instantly transform your space and bring you joy. Saatchi Art makes it easy for you to discover and buy one-of-a-kind art that you'll love. Whether you're looking to complement your home decor, fill a blank space on your walls, or start an art collection, you can find the perfect piece for your specific style and budget at Saatchi Art. Go to SaatchiArt.com today to bring the beauty of art into your home. Plus, listeners get 15% off their first order of original art with code ROB.

That's 15% off at SaatchiArt.com. S-A-A-T-C-H-I-Art.com. Ever wish your favorite TV show had twice as many episodes? Everyone knows that feeling. And so does Discover. Everyone wants more of their favorites. That's why Discover doubles another favorite thing. Cash back.

That's right. Discover automatically doubles the cash back earned on your credit card at the end of your first year with Cash Back Match. Now that's a real crowd pleaser. Everyone knows how it ends. Double the cash back. See terms at discover.com slash credit card. Before we go, I want you to give me the average person listening to this.

In two to three years, how is AI going to change their life? Oh, so... I know this is a whole podcast, what I just asked you. Hey, everybody. It's Rob Lowe. I am literally so excited about this talk. Today, we have Marc Andreessen of the VC firm Andreessen Horowitz. Marc Andreessen is one of the smartest men on the planet.

He is one of the smartest men in tech and in investing and geopolitically. He is the co-founder and general partner of Andreessen Horowitz, co-founder of Opsware, co-founder of Mosaic, cover of Time Magazine.

In 2002, he was appointed to the Homeland Security Advisory Council. Marc and his partners acquired a major stake in Skype in 2009 at a valuation of $2.75 billion, which people thought was insane. They sold it in 2011 for $8.5 billion. He knows what he's talking about. And he's a great guy, Marc Andreessen. People don't come to Marc Andreessen, nor me, for rhythm. They have different wants and needs.

I promise you this is the most unserious interview you've ever done in your life. This is not Squawk Box.

That's good to hear. I get that impression already. That is great. So when the Sam Bankman-Fried whole thing broke, you know, and like that whole firestorm happened, we're trying to figure out who that guy was. And somebody said, you know, he's a funny guy when he dances. He just, he only moves up and down. And I was like, but that's what I do. Yeah.

I've seen you. I've witnessed that myself. Exactly. That's it. So, yeah. Minimalism. You practice minimalism in your dancing. Yes, yes, yes. So, it's good that this is not, this is neither a CNBC nor a dance-off. So, I'm excited. First of all, congratulations on the raise. I mean, you just crushed it. Oh, thank you. American dynamicism. I said it. That's a good way to put it. Dynamicism. Okay.

I'm sick. I mean, there's so much to get into, but I just want to start a little bit with your, but you're a Midwestern boy like me. And when you were going to Urbana,

Yeah. Champaign, University of Illinois. You interned at IBM. What's it like walking into interning at IBM? First of all, where was the IBM that you interned? New York or LA or where were they? So it was in Austin, Texas. Oh, Jesus. So I spent a year and a half in Austin, which was a fantastic experience because it's such an incredible city and I had a great time and I still love it. Yeah.

So this is 1989, this is IBM. And IBM, there was actually a TV show a while ago called Halt and Catch Fire that was actually about the birth of the computer industry in the 1980s. And it actually did a really good job of showing you what IBM was like in those days, which at the time was a really big deal. So IBM in those days, IBM was basically a nation state. It was like its own empire. It was like the British East India Company or something in the computer industry. And

It had at the time 440,000 employees. Whoa. Which, you know, if you adjust for like economic growth today, that would be over a million. It would be, you know, just by far the biggest company. And it was just gigantic. And it was at its peak, a few years before I got there at its peak, it was 80% of the total market value of the entire tech industry.

And so it was this like incredibly dominant force. It was incredibly intimidating and scary in the industry. It was very famous for having this very regimented kind of way of operating. They had everybody famous, all the executives famously wore blue suits, white shirts, red ties, like just uniformly. And then any meeting with IBM would be like 30 guys coming in like dressed identically.

you know, like an army. Yeah, so it was this empire at the time. And then the group I was in in Austin, there were 6,000 people in my group

And then in the adjacent building was another 6,000 people working on a different project. And so it was like units of 6,000 people. They had basically offices all over the country. So they were spread out. And so they had probably 30,000 people in Austin at the time, which was one of their smaller outposts. And, you know, it's so big that you could... I spent a year and a half there. I don't think I ever met anybody who didn't work for IBM. So...

So it was just this enormous thing. And then I got there basically right as it started to tip over. And so it basically crashed really hard over the several years that followed, you know, '89 through like '94, it really fell apart. But that was at the tail end of a 70-year run. And I sort of got there right at the tipping point. How did they fumble the bag, as the kids would say? Yeah.

I think it's like they won too much. You know, so they had grown for 70 years. They had never had a layoff. And they promised lifetime employment. And so people went to work there, and everybody just assumed you worked there your whole life. By the time I got there, they had long since stopped firing anybody.

So, like, we had a guy in our group who literally did nothing. He sat at his desk all day and played solitaire. And, like, you know, there was no mechanism for—you couldn't fire anybody. So, what management had gotten good at is what Larry David calls foisting. Yeah.

You know, to try to get your low performers to transfer to another team. So I got there and I remember my manager at the time was telling me about it. And he's like, okay, we're in the product group. And this is one of the big ones. And then there's this other group, they have this group called marketing, which is actually their term for sales, but they called it marketing. And so, you know, our group builds products, and this other sales group, that you'll probably never meet, sells the products.

And then I said, what about that building over there? And they said, oh, that's the planning department. And I said, oh, that's great, so do we consult with them on our plans? And they said, no, we never talk to them under any circumstances, because that's where all of the bad people have been offloaded to. You know, they all think that they have very important jobs, but that's where all the stinkers are. And we will not speak of them again.

Oh my God.

Right. And so this is a big problem they had. It's like, even if you at IBM identify... I'll give you another example of how big it was. We all had accounts on a mainframe at the time with company email and all this stuff. And they had this thing, the org chart. You can navigate the org chart on the email. And one of the things you could do in the org chart is you could have it show you how many layers down you were from the CEO. And I punched in and I was 12 layers down from the CEO. And of course, what that means is my boss's boss's boss's boss's boss

was still six layers down from the CEO. So I'll give you another story. So they had what they called the big gray cloud, which was the, whenever you saw the CEO, you never saw the CEO by himself. You saw the CEO surrounded by like 50 guys in gray suits and they just like followed him everywhere, right? And their job was to prevent him from getting any information. Right.

Right. So, you know, it's like the president of the U.S. or something. So it's like having the Secret Service, but they're not there to, like, take a bullet for you. They're there to prevent any reality from, like, permeating your force field. Yes. Yeah. It makes sense. I get that. Yeah. Wow. Which is, you know, the thing with people who run big companies,

it's very easy to get into the mode, running a big company, where you just get no information. You never talk to anybody who knows what's going on. Everybody's lying to you all the time. And then, yeah, you're going to have dysfunction. You're going to have drift. And then, you know, at some point, life being what it is, the sins snowball and catch up to you and, you know, smack you square in the face. And then, you know, at that point, it's often too late to respond. I mean, particularly in an industry like that that was innovating and growing so quickly, to not be able to be nimble and move is...

And it was driving them crazy, because they had invented, basically, the computer, the modern computer industry, IBM especially. So they invented, you know, the idea of sort of the computer in a lot of ways, the mainframe, you know, they invented many, many aspects of what we all take for granted today. And then they were not the first to market with a personal computer, but they were the first to market with the personal computer that everybody bought, which was the IBM PC in '82. And so they, which is the story that this TV show tells, Halt and Catch Fire,

And so they thought they were on top of the world. And it just happened that there were these structural flaws and the industry was changing and really broadening out and they just became unable to adapt. And so, you know, look, they're still around today. They're essentially a consulting company today. They consult with, they sort of help big companies kind of deal with tech issues, but they're much smaller and sort of in the shadow of the level of power that they had at the time. So yeah, it's a classic kind of cautionary tale in our industry.

Yeah, because their market cap back in the day would have been what as compared to now? So at the time, the economy was smaller. There were a lot fewer computers in the world. The world wasn't as globalized. And then there's been inflation. And so if you correct for all that, so the biggest market cap of the world today, I think, is Microsoft at $3 trillion. And so I think if you adjust everything out, IBM in those days probably...

They weren't worth this at the time, but if you correct for all that, they would probably still be worth more than Microsoft today, so probably $5 trillion or something in current dollars, something on that order had they kind of held onto that position. And today? I haven't checked recently, but I don't know, $100 billion or something like that, which is a lot, but it's not what it was and it's not what it could have been.

And by the way, look, like I would say, look, they were very, you know, they were very nice to me. They, you know, I had a great experience there. I learned a lot. You know, look, many, many other companies have gone through this. You know, many other, many other industries have gone through this.

You know, as you know, like Hollywood goes through this, you know, from time to time and maybe it's going through it again right now. Yep. And it's just this thing where, you know, success begets, yeah, success begets complacency. Complacency, you know, begets, you know, sort of, if not failure, at least like massive upheaval. And so, you know, the world's not a stable place. And then, of course, our day job at my venture firm, our day job is to start startups that try to accelerate this process, right? And so we try to fund all the companies that go up against these big companies and cause them a lot of trouble. Well,

I mean, you've kind of answered my next question, which is how in the past—

You know, there was always, when disruptive technologies, and I don't think there's anything more disruptive than AI, and I know you agree with that. There was always a breakout success by sort of the non-incumbents, which is what we were just talking about with IBM. But do you think with AI, will that offer the same opportunity? Or do you feel like the Googles, the Microsofts, and the Amazons are inevitably the winners in the space? Yeah.

No, so actually, I think it's already happening. So, you know, the leader today in AI in terms of the products that people actually use, as measured by usage and revenue, is actually a new company called OpenAI. Right. And history actually repeated itself in a very interesting way from what we were talking about, which is, so Google has had an AI research lab basically for, you know, almost 20 years. And Google actually invented

the key AI breakthrough, which was a thing called the Transformer in 2017. And then they did the classic big company thing, which is they put it on the shelf and they didn't do anything with it.

Tell me what the transformer, just briefly, tell me what the transformer is, does, was. Yeah, so the transformer, it's an algorithm. So it's an algorithm or a mathematical formula. And it basically is, it's the sort of key algorithm that makes what's called neural networks work. So basically, the way that AI systems are built is they're built on this architecture called neural networks. And so they're built basically on a software architecture that is modeled after how the human brain works.

And it's not exactly how the human brain works, although it's interesting actually how close you can get, which we could talk about. There's a lot of philosophical questions you get into very quickly around AI, which is like, what is intelligence? What is consciousness? What does it mean to be human? You know, and all these things. But basically, the neural network is kind of the core architecture. The neural network is an idea that people actually came up with in, actually, in 1943 is when the first neural network paper came out. And so there were like very smart people at the very birth of the computer.

the whole idea of the computer. There were very smart people at the time who said, actually, we should probably model these after the brain and here's how we do it. But between 1943 and 2017, this was an idea, but it never quite worked. And you could get it to work in like little cases. And so like the post office had a system for like being able to tell the numbers, you know, zero through nine to be able to do like sorting by zip code by machine. So you could have these like little versions of

things that worked. So it worked just well enough where people kept experimenting with it, but it didn't work well enough where it ever became the kind of thing you see today. And then basically, this squad of very smart people at Google Research in 2017 came up with this algorithm called the Transformer, which runs on top of a neural network, basically. And basically, it sort of steers the neural network to the right answer in the way that you get when you talk to OpenAI, you know, talk to ChatGPT here, if you, you know,

If you use any of these systems today. And so it basically, it was the big breakthrough. I talked to a friend of mine who was a senior engineering leader at Google during that period. And I said, you know, if Google invented the Transformer in 2017, and then if they had run as fast and hard as they could to basically get to the equivalent of ChatGPT today, you know, just like gone for it, when could they have shipped the equivalent of ChatGPT? And he said 2019. Yeah.
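To make the idea above a little more concrete, here is a toy Python sketch of the self-attention step that the Transformer popularized, where every token in a sequence weighs and blends information from every other token. The shapes, random weights, and tiny sizes are invented purely for illustration and bear no resemblance to a production model like ChatGPT.

```python
# Toy sketch of the "attention" step at the core of the transformer idea.
# Shapes and weights are made up for the demo; real models stack thousands
# of these layers at enormous scale.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, W_q, W_k, W_v):
    """Each token 'looks at' every other token and pulls in what's relevant."""
    Q = tokens @ W_q                          # what each token is asking about
    K = tokens @ W_k                          # what each token offers
    V = tokens @ W_v                          # the content that gets blended
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # how strongly each token attends to each other token
    return softmax(scores) @ V                # weighted blend across the sequence

# 4 tokens, each represented by an 8-dimensional vector (random for the demo).
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(tokens, W_q, W_k, W_v).shape)  # (4, 8)
```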

Right. Like they had, they had everything they needed. They could have done it and they didn't do it. And then, and then here again, you get into this big company thing. It's like, Hey, why didn't they do it? And I think basically there's two reasons they didn't do it. Well, maybe there's three. So one is just, you have this thing we say that applies in situations like this, which is the motto of any, any monopoly is we don't care because we don't have to. Yeah. And so it's just like, if life is good and everything's going great, like you,

Why bother? Like, you know, it becomes an extraordinary act to do things that, you know, that require additional effort just because like everything's going great. Like why put effort into it? I think that's one reason. I think the second reason they didn't do it is because it's a threat to the core Google business model of search results.

Right. Right. Which is, and you see this today. So if you go on Google and you ask a question, you know, if you go on Google and you're like, you know, what are the best episodes of the Literally podcast? Right. It's just something that frequently happens. Exactly. You get a list of blue links. Right. You get your famous list of blue links so you can click on the one you want. And then their business model is they sell ads. And so some of the links that you see are for ads.

And so an ad might be for a subscription signup page or t-shirts or like, you know, whatever else, you know, somebody might advertise. If you ask ChatGPT the same thing, it will just answer the question without giving you the links.

And with no ads, right? Because you don't need the links anymore because it'll just answer the question. And so that is what we, in our industry, we call this the innovator's dilemma, which is like, okay, the existing business might no longer be relevant. It might no longer even be necessary in the new world. And so are you going to cannibalize yourself? And so that's issue number two. And then issue number three is everything that involves this term safety that gets used a lot now, which basically all reduces to: what if it says a bad thing?

And specifically, what if it says a bad word or, you know, says the wrong thing about, you know, whatever topic people get upset about. And we've seen that happen a bunch recently. We've seen it happen. Exactly. So, number one, like, if you basically just, like, run these things without a lot of what they call, what they sort of call human training, like, yeah, they will say things. And then, look, they

the way these things work is they're just trained on huge amounts of human data. So they're just, like, trained on all the content on the internet, and they kind of composite it together and figure out a way to process it and give you kind of smart answers out of it. And so, you know, look, a lot of people have said a lot of things about a lot of people, and, like, you know, the machine doesn't know, like, is this a person or a political movement or whatever, this or that? It's just processing words and it's just trying to give you, like, an answer that you like. And so it is very easy for it to kind of give you an answer that's going to, like, really, really offend you. Which is worse, that or,

or having an overlord of committees that are, quote-unquote, steering it, and who knows what their agendas are? So this is precisely the problem. This is 100% the problem. This is the sort of vice that these companies are in right now, and it's a real struggle. And this is on the heels of the same fight playing out on social media for the last basically decade, which is sort of there's all these questions around sort of safety and censorship, and so this is precisely the kind of key thing.

This became very visible recently with a different AI product from Google called Gemini. So Google has now released a ChatGPT-equivalent competing product called Gemini, which is live now, five years later. One of the things Gemini can do is it can generate images. And so you can type in a text prompt description: show me a picture of, make me an image of the founding fathers, for example.

it will draw you images. And then you can say, you know, give me the, you know, anime version of it, you know, give me the cartoon version, give me the photorealistic version, give me this, you know, whatever you want. You can do, you know, give me the rom-com version. Like, it'll generate any number of images. It's very clever in doing that. But,

You know, it has this problem, which is like, okay, if you say, for example, you know, give me a photo of a group of doctors, by default it'll give you a representative sampling of sort of a group of doctors as you'd see in the population today. And so if you're in America, it'll give you, oh, there's a bunch of white male doctors.

Right, and so, because that's sort of in the training data, that's, like, most of the reference images of, like, doctors, most of the photos of doctors who are practicing today are white men, and so it just kind of does that. And so they do this thing they call de-biasing, where they basically modify your prompt to basically add diversity. And so they do this thing where it's like-

You just want an answer. And they've added a layer that puts diversity into the answer, period. Yes, correct.
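As a rough illustration of the kind of prompt-rewriting layer being described, consider the sketch below: the user's request is quietly modified before it ever reaches the image model. The keyword list, wording, and function name are hypothetical, not Gemini's actual code.

```python
# Deliberately simplified sketch of a "de-biasing" prompt-rewriting layer.
# Everything here (suffix text, keyword list) is invented for illustration.
DIVERSITY_SUFFIX = ", showing a diverse range of genders and ethnicities"
PEOPLE_KEYWORDS = ("doctor", "doctors", "founding fathers", "group of people")

def rewrite_prompt(user_prompt: str) -> str:
    """Append a diversity instruction whenever the prompt seems to ask for people."""
    if any(k in user_prompt.lower() for k in PEOPLE_KEYWORDS):
        return user_prompt + DIVERSITY_SUFFIX
    return user_prompt

print(rewrite_prompt("a photo of a group of doctors"))
# -> "a photo of a group of doctors, showing a diverse range of genders and ethnicities"
```

The failure mode discussed next follows directly from a layer like this: the rewrite is applied blindly, whether the prompt asks for present-day doctors or for a specific historical group.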

Correct. And so stick with the image generation thing. So now if you ask Gemini for an image of a group of doctors, it gives you sort of a representative of demographic sampling of a group of doctors equivalent to what you might see, for example, on a network TV show. And so you've got, you know, your black doctor, you've got your lesbian doctor, you've got your doctor in the wheelchair, you've got your Muslim doctor, you've got your, you know, it sort of does that. Now, does that accurately reflect doctors as they exist today in the American population? Maybe, maybe not.

whatever, they're trying to sort of de-bias what they believe to be sort of cultural biases, even if that doesn't represent the underlying population group and they're trying to add diversity and make the world wonderful. There is a problem with this, which is if you ask it to give you an image of the founding fathers, it makes them all black.

Or if it's, you know, feeling particularly diverse, it makes them, you know, sort of black and Asian, right? And it's a little bit like, okay, that's a little bit, you know, if you're a kid in school trying to learn about the history of the U.S., like, okay, that's going to be a little bit confusing because, like, you know, they were not, the founding fathers of the U.S. were not actually black. In fact, there were actually issues at the time around, you know, the rights of, like, you know, black

black people in the U.S. at the time, right? Slavery and so forth. And it'll just bury all that and it'll just pretend that lots of people were black when they weren't at that time. It's even worse than that. If you ask it for a group of Nazi SS stormtroopers, it will also make them black. That is insane. That blew up in their faces. So what did they do? I know it was all over the news. Have they pivoted? And if so, how do they pivot without jettisoning

their core beliefs, because we know they want to, in their view, make the world a more just, equitable place, which is all very, very good. But

And that's clearly not the way to go about it. So how do they manage that? So that's what they're trying to figure out right now. And by the way, it's not just them. This is what every other big company in the space is trying to figure out. That's precisely what they're trying to figure out. And this is why I say they're in a vice. There are no easy answers on this one. Because to your point, if you don't make these modifications, from their point of view, in this mindset, you get a sort of unrepresentative sampling of what you want the world to look like.

if you do make these changes, it's just, it's very, and then look, it's like a sport, right? So it turns out people on the internet have like a real sense of humor. And so it's like a sport. Anytime any of these products drop now, like it's a sport for people on the internet to try to figure out these cases where it will do things like this. Like, so, so like, I'm quite sure it never occurred to anybody at Google to like test what happens when you ask it for, to render a photo of like, you know, Nazi stormtroopers.

It's like normal people working in a company are not sitting around trying to figure out, oh, let's have it draw Nazis. That's not a thing that you would do. But people on the internet are like, wow, let's give it a shot and see what happens. And it's like, oh my God, they're black. And then you become, it's this term internet famous, right? You become internet famous when you get one of these systems to do something like this. And so there's this, basically the internet is a hive mind that's basically trying to break everything and trying to find all the edge cases and corner cases.

And so you've got this kind of adversarial relationship between the vendor on the one hand and then all the users that are trying to basically be the one to be able to claim credit to say, aha, I got it to draw Black Nazis out.

And so this is a real challenge. And then, by the way, the other thing that's happening is just incredible. So one is just incredible scrutiny and all this from the employees. And so the employees at these big companies are very politically activated and socially activated, and they're very fired up about these things. And they really get up in management's face on this. You've got these shareholders that are very – these investment managers that are very politically kind of active, motivated. And then you've got the government. Right.

And you've got Congress, you've got all these regulatory agencies, and they have all these agendas. And then especially for internet companies, they have this kind of weird thing where – and Facebook has run into this a lot over the years, and Twitter has run into this a lot over the years – which is a system like Facebook or Twitter or Google search or any of these AI systems. Look, people use them on the internet. The internet is big. In any given month, they're going to generate 10 billion pages or 100 billion pages or a trillion pages online.

of content. But what happens is all you need, like if you want to attack one of these companies, all you do, all you need is one screenshot of something gone wrong, right? And so all you need is like the one screenshot of the Black Nazis or one screenshot of like somebody saying something horrible next to like a Verizon ad. And then what the activists do is they print out that screenshot and they take it to Congress and they're like, aha, this company is evil. I have the evidence for it.

Right. And then you roll into a congressional hearing, this happens all the time now for tech, and you're like, well, you know, Congressman, like, let me explain, like, we generated 40 billion pages last month. This was one out of the 40 billion. And then all of a sudden you're basically, like, defending somebody, like, saying the N-word next to a Verizon ad.

It's not a – there's no clean answer on this one. It's just inherently messy. And then I would also say, look, there's kind of a geopolitical kind of aspect to this, which is these products by default are built by Northern California companies that are run and staffed by the exact kind of people you would expect in Northern California companies.

highly, you know, sort of American or immigrant American, highly educated, highly refined cosmopolitan people, super smart, very generally, very progressive, very socially aware, very, you know, kind of politically activated, you know, that live in this kind of hothouse environment that we have up here where they, you know, they basically only meet people, you know, who are like themselves.

You know, but that mentality and worldview is maybe 7% of the U.S. population, and it's, you know, 0.2% or something of the global population. And so now you have this thing where this, you basically have these systems that are encoding social and political beliefs that are not representative of the vast majority of the users.

And so you can imagine being a, you know, somebody in France or somebody in, you know, Argentina or somebody in, you know, the Congo or whatever. And like it's, you know, using these products, it's just, they just have these like views on things that just seem like they're just beamed in from Mars. Yeah, for sure. 100%. Right.

Right, exactly. And so, or even through a lot of the US, people have that experience. So, you know, the classic example on this for a very long time, I don't know if this is still the case, but it probably still is, for a long time with any of these AI systems, ChatGPT or Gemini, you know, if you asked it to write a poem, you know, extolling the virtues and glory of Joe Biden, it would be happy to do so, and it would go on at great length. And if you asked it to do the same for Trump, it would completely refuse on the grounds that, you know, it can't possibly say anything nice about a hate figure. Right, and so it's like, okay, like...

you know, if you're in my world, like, okay, that's a good output. If you're, you know, if you're a Trump-supporting, you know, blue-collar guy in Arkansas, like, of course, that's, like, incredibly offensive. And so, anyway, like, we have gotten embroiled in all of these social and political controversies. And, you know, I would say so far the companies have, you know, let's just say, let's say, they have not yet figured out a formula to navigate through this. ♪

All set for your flight? Yep. I've got everything I need. Eye mask, neck pillow, T-Mobile, headphones. Wait, T-Mobile? You bet. Free in-flight Wi-Fi. 15% off all Hilton brands. I never go anywhere without T-Mobile. Same goes for my water bottle, chewing gum, nail clippers. Okay, I'm going to leave you to it. Find out how you can experience travel better at T-Mobile.com slash travel. ♪

♪♪♪

I also look at, as I was coming in, I was reading the Wall Street Journal, I guess, about the unnamed source, but clearly it was planted at ByteDance, saying that they absolutely would rather shut TikTok down in the United States than sell it because the algorithm is so proprietary. So, I mean, that's another, and it's a little bit of a pivot in the conversation, but it's still, it's a huge, it's a huge thing. Let me ask you this, why is the algorithm so proprietary?

Which is insane, by the way. I mean, we deal with a lot of that goddamn algorithm in Twitter. I'm sorry, at TikTok. TikTok. TikTok, yeah. Knows me better than I know myself. It knows stuff I want to know before I know I want to know it.

Yeah. How does it do that? I mean, is there a way? I mean, I guess if you knew it, it wouldn't be proprietary because, I mean, I guess you are invested in ByteDance, but tell me, how do they do that? Yeah, so just for clarity, just for your audience, yeah, so ByteDance is a Chinese company. ByteDance owns TikTok. Yeah.

which is the American service that everybody, it's a short video service. Everybody, you know, a lot of people know it, it's very popular. And by the way, we're not invested in ByteDance. Oh, you're not? Okay. We're not active, we don't do China, which we could talk about. We're not in that. But, you know, I'm on the board of Facebook, you know, now called Meta, and we're involved in Twitter. So we're involved in the American versions.

And then, of course, there's been all this controversy. Actually, a new U.S. law just passed this week, Biden just signed it into law, that is basically going to force ByteDance, the Chinese company, to divest and sell TikTok to an American owner or shut it down.

And, you know, it's up to ByteDance and their Chinese leadership and the Chinese Communist Party, which is, you know, who ByteDance reports to, as all companies in China do. So it's up to them whether or not they want to sell TikTok to a U.S. owner, separate it out, or whether they want to shut it down. And the news reporting has been that they may prefer to shut it down, which I think makes sense from a number of angles. And so, yeah, so this has been like a super hot topic. And so...

Yeah, so this goes to kind of how the internet's evolving. So the way I described it, so if you go back to the internet like the 1990s, 2000s, we called it Web 1.0. And the reason we called it that was it was sort of, the web was kind of the main thing people use the internet for to go online. They surf to different web pages, websites.

But the idea basically was you go on the internet, you surf the internet, and then that means you basically you go out and you go to whatever site you want. And so if I want to go to Rob Lowe's site, I go there. If I want to go to the New York Times site, I go there and I just kind of surf around. And then, you know, the Google search engine helped me find those places. And that was kind of how it worked. And then basically Facebook, there were a set of companies, MySpace, Facebook, and then Twitter that invented this concept called the feed.

And that led to what's called Web 2.0. And the key hallmark of Web 2.0 is basically the feed. Instead of you going out and finding things, everything just comes to you.

Right. And this is the experience now that you have if you use Facebook or Twitter or Instagram or TikTok, which is you just sit there on the app and you just scroll. Right. And everything comes to you. And so all that content that comes to you is the feed. And then these companies also invented this marvelous thing they call the infinite scroll, which is it used to be when you scrolled, eventually it would end. Like it would just run out of content and it would end. And if you notice now when you use these systems, it never ends.

- Never. - And you can sit and scroll like a mouse hitting the lever and getting cocaine pellets. Like, you can sit and scroll for the rest of your life and it will just keep feeding you content. So, you know, there's a big argument over whether these have become literally addictive or, you know, just figuratively addicting. And look, you know, a lot of people love this stuff, and you know, I love getting on these things, and they give you all kinds of fun things to see and look at, and people and so forth. And so there's a lot to like about it, but that's kind of how the Web 2.0 world works.

As a result, the Web 2.0 world really consolidated power in the hands of a small number of very big companies. And so the amount of time people spend on the Internet going out and surfing is much smaller than it used to be. The amount of time they sit in the feed doing the infinite scroll is most of the time people spend on the Internet. And so correspondingly, these companies have gotten very big and powerful and important companies.

The big breakthrough that TikTok had worked as follows. So if I went on Facebook, if I went on the Facebook feed, you know, even as recently as like, you know, five years ago, the way Facebook worked was I follow, you know, basically we're friends. It's called a two-directional basically friend relationship. And so I friend you on Facebook and then you would decide whether to accept the friend request or not.

And if you didn't, nothing would happen. But if both of us, it's called kind of double opt-in. If both of us said, okay, we're friends with each other now, your content is going to show up in my feed and my content is going to show up in your feed. And that was good because it's like, wow, I'm friends with Rob, therefore I want to see his stuff and vice versa. It's great. So that's called the sort of bi-directional friend connection. Twitter changed that.

The way they did it was a single-dimensional thing, which was follow, right? And so I follow Rob on Twitter. I see his stuff in my feed. He may choose to follow me back, but he doesn't have to. And what you see in the feed is just based on who you follow. TikTok did this very clever thing and said, let's not do any of that. Let's not do either one of those. Let's not do bi-directional or single-directional follow. Instead, let's just show you whatever is the best for you, no matter where it's from and no matter who made it.

And so TikTok's feed was the first feed that got programmed with just, we're just going to show you the most appealing thing, regardless of who made it. You can follow people if you want, but frankly, it doesn't matter. And then we're going to build it. We, TikTok, are going to build an AI system that's going to be so good at understanding all of the behavior of all the users in the system and all the reactions and everything that they like and everything that they respond to.

right? And all the data, and we're going to use that. It's basically an AI, and it just knows so much both about you, and it also knows so much about everybody else who's like you. And it has this kind of infinite ocean of content being created by many millions of people to draw from. And so anyway, that's why it's so good. That's why it's so compelling, because it's drawing

from everybody's content. Now, by the way, what's happened is both Facebook and Twitter have since adopted that same method, right? And so both Facebook and Twitter have thrown out the constraint of you having to friend somebody or follow somebody. Now they just do the same thing. They'll just optimize across everybody's content. And so if you go on Twitter, or X as they call it now, it's giving you content from people you don't follow right alongside content from people you do follow.

And the reason the algorithm is so important is because there's an ocean of potential content to show you. And so that process of getting it to, okay, I'm going to show Rob at this specific point in time, this specific thing, and then this thing after it, like that's this very specialized art form. And that's what this company is really good at. ♪
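A minimal sketch of the three feed models described here (mutual friends, one-way follows, and TikTok-style ranking across everyone's content) might look like the following. The data structures and the "predicted appeal" score are stand-ins invented for illustration; real systems use large learned ranking models over far richer signals.

```python
# Toy sketch of the three feed models discussed above; everything here is illustrative.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_appeal: float  # stand-in for "how much will this user like it"

def friend_feed(user, friendships, posts):
    """Facebook-style Web 2.0: only mutual ("double opt-in") friends appear."""
    i_added = {b for a, b in friendships if a == user}
    added_me = {a for a, b in friendships if b == user}
    mutual = i_added & added_me
    return [p for p in posts if p.author in mutual]

def follow_feed(user, follows, posts):
    """Twitter-style: a one-directional follow decides what you see."""
    followed = {b for a, b in follows if a == user}
    return [p for p in posts if p.author in followed]

def interest_feed(posts, k=3):
    """TikTok-style: ignore the graph, rank everyone's content by predicted appeal."""
    return sorted(posts, key=lambda p: p.predicted_appeal, reverse=True)[:k]

# Example: the friend feed only shows Marc's post, while the interest feed
# surfaces the most appealing posts no matter who made them.
posts = [Post("marc", 0.4), Post("stranger1", 0.9), Post("stranger2", 0.7)]
friendships = [("rob", "marc"), ("marc", "rob")]
print([p.author for p in friend_feed("rob", friendships, posts)])  # ['marc']
print([p.author for p in interest_feed(posts, k=2)])               # ['stranger1', 'stranger2']
```

The structural difference is that the first two functions filter by the social graph before anything is ranked, while the third ignores the graph and ranks the entire content pool, which is why the ranking model itself becomes the whole game.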

All set for your flight? Yep. I've got everything I need. Eye mask, neck pillow, T-Mobile, headphones. Wait, T-Mobile? You bet. Free in-flight Wi-Fi. 15% off all Hilton brands. I'll never go anywhere without T-Mobile. Same goes for my water bottle, chewing gum, nail clippers. Okay, I'm going to leave you to it. Find out how you can experience travel better at T-Mobile.com slash travel. ♪

Qualifying plan required. Wi-Fi where available on select U.S. airlines. Deposit and Hilton Honors membership required for 15% discount; terms and conditions apply. Look at your cup holder. It's empty and you're feeling thirsty. Head to a nearby convenience store and fill it with a Pepsi Zero Sugar, Mountain Dew, or Starry. Grab a delicious, refreshing Pepsi for the road. So in the beginning, I'm imagining...

You set up a TikTok. You've never been on TikTok in your life. You set up a TikTok account. They show you preset stuff that they know will begin to siphon you off into categories that lead to categories that lead to categories to eventually figure out what you like, right? I mean, I would, I would. It's just that simple. Well, so this goes into a lot of the politics around TikTok. So this goes into why TikTok just got banned in the U.S. or why it's being forced to divest or shut down.

which is like, okay, if you're building purely for commercial, if you have just purely commercial motives, right, then you want to do exactly what you just said. It's just like, I just want people to use this. I just want it to make people happy. I want people to feel, you know, I want them to come back. I want them to, you know, I want them, you know, whatever. I want them to, over time, I want to be able to put in the best ads and so forth.

And so I want to just, like, have the best commercially motivated service I can have. And yeah, I do that. And then, yeah, it, like, tests you, you know, it'll try you out in the early phases. It'll try showing you different things, and you react to different things in different ways. You know, so it'll see, like, everything you like or comment on, but it also just sees, like, it knows whether you watch videos to completion, you know, it knows whether you've scrolled through, it knows whether you've gone back. And so it has all these signals that it can draw on. So if the company's being run for purely commercial purposes, there's that. Yeah.
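Those behavioral signals (watching to completion, rewatching, liking, commenting, scrolling past quickly) might be folded into a single per-video score roughly like the hypothetical sketch below. The weights are made up; real recommenders learn them from data rather than hand-tuning them.

```python
# Hypothetical engagement scoring over the signals described above.
# Weights are illustrative only; production systems learn them from data.
def engagement_score(watch_fraction, rewatched, liked, commented, scrolled_past_quickly):
    score = 0.0
    score += 2.0 * watch_fraction          # watched most or all of it
    score += 1.5 if rewatched else 0.0     # went back and watched again
    score += 1.0 if liked else 0.0
    score += 1.2 if commented else 0.0
    score -= 2.0 if scrolled_past_quickly else 0.0
    return score

# e.g. watched 90%, rewatched it, no like or comment, didn't scroll past
print(engagement_score(0.9, True, False, False, False))  # 3.3
```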

But what if the company were being run by a government and that government had geopolitical motivations and felt that it was basically providing a service that was being used by citizens of a rival country?

So, for example, such a government might have its domestic version of the service show things like young people going out on the town and meeting each other and forming families and exercising and being patriotic and going to patriotic marches and being enthusiastic about the domestic politicians and so forth. But in the other country, in the rival country, it might show a very different kind of content. Right?

Right. And it might show, "demoralization op" is a term you'll hear sometimes, which is, like, maybe it will show deliberately demoralizing content. And so maybe, instead, it'll show content of, like, political strife. You know, maybe it'll show, you know, real-life crime videos. Oh, yeah. Drive-by shootings.

drive-by shootings, maybe it'll show, by the way, you know, if not outright pornography, maybe it will show basically, let's just say socially, you know, potentially, you know, damaging content, you know, portrayals of young people doing bad things. Listen, it definitely doesn't show you the highlights of the spelling bees going on. Exactly, 100%. Now, and by the way, when you do a comparison of TikTok in the U.S. to the TikTok equivalent that they run in China, there's no question it's showing you very different things. Oh, it's insane. It's insane.

It's very different things now. But here's the problem. Like how much of that is on purpose being steered by the company and by the Chinese Communist Party? And how much of that is just because the population groups are different and people in China are just interested in different things, different kind of people, different kind of culture, people interest, different things.

The problem is we don't know because it's what's called black box. It's the algorithm. The algorithm is secret. And so you as a user or me as somebody trying to analyze this, you can't look inside the black box. They don't publish it. And so it just throws up the results it throws up. You can have theories as to why it's doing what it's doing, but you can't prove anything because it's a black box. There's no incentive for them to ever open up the black box because then it's not

proprietary anymore and anybody can use that algorithm. Well, that would be the commercial reason. The commercial reason not to open the black box is that, right? It would be a threat to their business if everybody learned what they did. But also, if they were doing things that were intentionally manipulating populations for non-commercial reasons, then it would also show that that's happening, right? And so, you know, is TikTok doing that or not? I don't know. You know, I have my suspicions. Lots of people have suspicions. I genuinely don't know. You

You know, is the U.S. government willing to allow a black box Chinese service like that to operate in the U.S.? We just learned as of this week, no, they're not.

And so, again, it's one of these things where, and this is, again, obviously, the vice. This is where ByteDance, the company now is in a vice, right? Which is like, because what's going to happen, so ByteDance, the Chinese company now has a choice to make, which is, let's assume somebody in the U.S. wants to buy TikTok. It's a very valuable thing, and there are various people whose names have been floated as maybe wanting to buy it and turn it into a U.S. company, right? Break apart TikTok, have it be a U.S. company.

So let's suppose that transaction happens. You know, somebody, an American company buys TikTok. They will now have, they will now be able to open the black box, right? Because they'll now have all the code. They'll be able to see how everything works. And so if ByteDance was doing nefarious things, they could never permit a transaction like that to happen. Oh, and so suddenly they're saying maybe they don't want to sell it to the United States. Yeah.

Which is a curious conclusion because depending on the estimations, the estimations are TikTok on a standalone basis is worth $50 to $100 billion and maybe more. And so it's sort of ByteDance saying, well, maybe we'd rather literally incinerate $50 to $100 billion of value to basically prevent opening the black box. Hmm.

Draw your own conclusions. Exactly. So I would say whatever had been their plan to convince people in Washington and the American public generally that they were trustworthy on this stuff, it obviously didn't work because this law just passed. This is one of the very few laws in the last decade to pass on an overwhelming bipartisan basis.

and be instantly signed into law from the White House. I mean, I know the guy who actually wrote the law, which is a Republican congressman, actually, from Wisconsin. And, you know, so when's the last time Joe Biden, like, enthusiastically signed a law written by a Republican congressman? Like, you know, not in forever. This one passed, I don't even know, it was like 95 to 5 or something, you know, percent of the votes. Whatever their PR strategy was, it, like, didn't work and is leaving people with this suspicion that the whole thing basically was

was essentially a Chinese communist, you know, plot from the very beginning. A PSYOP. Yeah, yeah, yeah. And by the way, like very effective. Oh, and then the other danger, well, here's the other danger, right? Okay, so there's also the danger of like, and this is a big motivation for the legislation passing because this is what like congressmen and senators are thinking about and the White House is thinking about, which is like, okay, it's doing whatever it's doing today and we don't know what it is because it's in the black box. But like, what happens if China invades Taiwan?

Right. And all of a sudden you're getting into like a potential shooting war situation. And what if all of a sudden the complete pool of all kids in the U.S. who are potential recruits to the U.S. military, right, are just seeing videos of like basically and maybe by the way, AI generated fake videos of like American servicemen committing military atrocities. Right. And then all of a sudden there's no recruits anymore for the U.S. military. Right. And so there was there's another layer of deeper threat underneath this. And I think that's also what motivated the government in this case was to was to head that off.

Yeah, you can only imagine also the stuff that they know and they've gotten classified that we don't know. Yeah. So there was that, do you know the, do you know the Grindr story? Yeah.

Tell me that story. Okay, this one's fantastic. Okay, so there's a famous app called Tinder. Lots of people go on Tinder. They meet special someones, maybe multiple, maybe several special someones a week. It's got lots of controversy around that, but that's the thing. It turns out there's a gay version of Tinder called Grindr. Oh, as TV's Grindr, I'm very well aware of Grindr. Okay, goodness.

Okay, good. Well, you know, different people are aware of Grindr at different levels of specificity. Yes. So Grindr is basically the gay dating app, and it works like Tinder. You know, it's just swiping in the whole thing, but also it's location-based, right? It has this incredible location-based thing, you know, so you can find people, you know, kind of nearby, and it's this very successful thing. And so about five years ago, a Chinese gaming company, mobile video game company... Oh, and I should back up. I should say a second. If...

If you're the owner of Grindr, you have a problem, which is you can make money. You can run Grindr as a private company and it generates cash and you can make money that way. But the U.S. stock market is still pretty leery of some of these kind of sin stock things. And so it's very hard to take a thing like that public or to sell it. And Google's not going to buy it or whatever. It's just a little bit too out there. And so you have what's called the exit problem, which is like ultimately there's no way to get the money out

And so the owners of Grindr at the time had this problem. And so this Chinese video game company showed up and bought it. And it was a little bit of a stretch. And I actually met the guy who bought it. He's kind of a very interesting character, and very Chinese. And they show up and buy it. And then literally they buy it and everybody's just like, whatever, gay dating app, China, whatever, doesn't matter. And then it occurs to people in Washington. It's like, okay, hold on a second.

Like, it's on the, it's on, if you're gay and you use Grindr, like, it's on everybody's phones. Okay, you know, the U.S. government employs 4 million people, you know, something like 2 million people in the government have security clearances. You know, some number of them, I don't know, whatever the number is, 100,000 or something, have top secret security clearances, you know, and these are everything from, you know, diplomats, spies, you know, military commanders, etc.

And, you know, some percentage of them are gay and some percentage of them are on Grindr. And it knows who they all are. Unbelievable.

Unbelievable. Right? And so, and by the way, like, you can get a security clearance, you know, today, just fine in the U.S. You can be gay and you get a security clearance, but, like, you have to tell them that you're gay. Like, because, you know, in the past, you couldn't get a security clearance if you're gay because you were subject to blackmail. Today, if you just say, yeah, I'm gay, they're fine with that because it's no longer subject to blackmail. But if you haven't told them you're gay and you've kept it a secret or maybe you're married and you're secretly gay or, you know, bisexual or whatever...

Like now you're actually subject to blackmail. And so all of a sudden the Chinese know the names and the real-time locations of all U.S. government officials and everybody with U.S. security clearances that are gay and on this service.

And so, you know, there's this old Russian, KGB had this concept called the honeypot. Of course, sure. It was the Red Sparrow movie, but, you know, it's a recent example of this. So, you're the very attractive person of the opposite or same sex, whatever your preference is, and they kind of seduce you into it.

an illicit relationship. And so if you view that way, Grindr is like a giant honeypot, right? It's basically like a giant data mining exercise for all this activity. And so the U.S. government basically did an oopsie, and they basically did a forced divestiture of Grindr after the fact. And a year later, they came back and they forced...

And they forced the Chinese company to actually divest it and sell it back into the U.S. And so anyway, it's like, oh, and then I'll give you one more example. Strava, this got very entertaining. So there's this app called Strava, which is an exercise app. It's the running app, right? And so Strava is the app that you run on your phone. And if you're like a 5K guy or a marathon guy or triathlete or whatever, you run Strava and it gives you like all the different running routes.

and it tracks all your running and then it gives you all the stats of how long you ran and your speed and uphill and all that stuff. And so anybody who's into running uses Strava. But Strava is another one of these apps. It knows who all of its users are and it has all the locations of all of its users all the time. And then it has this giant data set, basically, of millions of people running

you know, all kinds of places wherever they go in the world. And so it turned out the Strava dataset was actually revealing the location of secret classified US military bases overseas because you would literally have like a group of people who were doing like a run, like a rectangular run in like a remote area of Afghanistan that was supposed to be like completely unoccupied. And if you see like 30 American service people on Strava doing the same run every morning, you know what they're doing. It's just they're running around a military base.
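The aggregation problem being described can be sketched roughly as follows: many distinct users repeatedly logging near-identical runs in a supposedly empty area is itself a signal. The coordinates, grid bucketing, and thresholds below are invented for illustration, not drawn from any real analysis of Strava data.

```python
# Rough sketch: flag grid cells where many distinct users start runs,
# which in a "remote" area would itself be revealing. Illustrative only.
from collections import defaultdict

def flag_suspicious_areas(runs, min_distinct_users=20, cell_size=0.01):
    """runs: list of (user_id, lat, lon) run start points. Bucket starts into a
    coarse lat/lon grid and flag cells where many different users keep appearing."""
    cells = defaultdict(set)
    for user_id, lat, lon in runs:
        cell = (round(lat / cell_size), round(lon / cell_size))
        cells[cell].add(user_id)
    return [cell for cell, users in cells.items() if len(users) >= min_distinct_users]
```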

Right. And so all of a sudden, right, all of a sudden, you're literally outing, you know, these classified facilities. So anyway, so the data on these services, right, you add all this up, right, TikTok, Grindr, Strava, all these things, you know, you're getting a map of basically the activity of everybody on planet Earth.

including people who are in, like, really important jobs who have, like, you know, really sensitive information who are in, like, a position of, like, you know, command and control over the military. And so, yeah, it's the sort of national security implications of all this are becoming, you know, kind of very vivid very fast. It's unbelievable. And obviously, we're doing the same on our end. Although...

We don't have our version of TikTok. We're not able to mine the Chinese population like they're clearly mining ours, I don't believe. So our equivalent, our companies like Twitter and Facebook cannot operate in China. This was another part of the whole TikTok thing, which is China will not allow those companies to operate. Actually, that's not totally true.

China will allow those companies to operate in China if they give China all the user data. And in fact, Google actually famously pulled out of China. So Google search actually used to be available in China. And then basically the Chinese government basically said, we need to be able to see inside the black box. We need 100% access to all Chinese user data

because we have to actually be able to see what all the users are doing. And of course, China has a whole censorship regime. And, you know, in China, you get, like, sent to jail for, like, saying, you know, bad things about the government, or, you know, if you're trying to, like, organize. It's actually funny. China is nominally a communist country, but they will send you to jail if you try to organize a union, like a workers' union. Like, they have no sense of humor about actual socialism. Workers, yeah.

Actual workers. They'll send you straight to jail. So if they want to track a union organizer, they would just go to Google and they just say, give us the data. And by the way, give us his location data and give us all of his Google searches and everything he's been reading and everybody he's been talking to and all of his Gmail, and then we'll go arrest him. And actually, there was a famous case with Yahoo, which at the time was one of these really big services. Yahoo in the early 2000s, before this all became clear,

Yahoo was available in China. And there's a famous case where the Chinese government got access to, I think, a dissident's Yahoo email account. And they had information in there. And then they actually arrested the guy and executed him. And the CEO of Yahoo at the time had to testify in front of Congress. And they actually made him do a ritual apology in Congress to the family of this poor guy who got executed. And so anyway, in the wake of that,

Google basically said, well, we can't operate in China because we can't operate under these constraints. Right. Because like in the US, if the government wants to come get your emails from Google, they have to come with a subpoena. Sure. In China, they just come. Right.

Right? And so anyway, so Google famously pulled out of China and said, we can't operate under that. And then every other American internet company that's tried to operate in China has been confronted with this problem. And most of them, including Facebook and Twitter and many others, have refused to cooperate with that. And so effectively, the Chinese market is off limits to our companies in a way that our markets are not.

at least in the past, have not been off limits. And so this TikTok ban is sort of an equalization of who gets access to what, although it's a big deal for us to sort of lower the storm shutters now in an equivalent way to what the Chinese have been doing for a long time. Do you think that it's the beginning of an equaling out of...

what we're doing versus what they're doing? Like, what will be the cascading effects? My sense is there will be some. Well, I'll just start by saying, just to declare, like, I think there's no moral equivalence whatsoever. Like, I think that, you know, the Chinese Communist Party is a full dictatorship where

you know, there are no, you know, three branches of government. There are no checks and balances. The courts basically just work for the government. They do whatever they want. You know, it's a military dictatorship. And so, you know, we have our issues in the U.S., but we're not that. And so there's no moral equivalence as far as I'm concerned. But to your point, I would say the policies are starting to resemble each other more and more. And I think part of that is, quite simply, that

it's becoming, I would say, a basically unanimous view in the American political system that we're in a new Cold War with China. And, you know, the background here is: from

you know, about 2000 to about 2020, the U.S. government's kind of policy towards China was to try to help them open up, and so basically have lots of trade and do lots of collaboration and joint ventures and investments. And actually the government used to, you know, encourage people like us to do more business in China. Oh yeah, absolutely.

And by the way, the theory of that was, you know, yes, China is not a free society. And yes, China is somewhat a geopolitical foe of ours. But the theory was if they become more capitalist, they will also become more democratic. That's right. Right. And so if you engage in more business with them, that's good because it will cause them to open up and become more liberal, more democratic. You know, I think what most people in Washington would say today is that that didn't work. And actually, specifically, it was Trump. So Trump in 2016 basically said that didn't work.

This is a totally unfair one-way street. They're actually not becoming more democratic. They're just taking advantage of us. This one-way trade thing is basically a gyp, and we should stop doing it. And then what's interesting is it's one of the rare issues where Democrats actually were like, oh, actually Trump is right.

And actually the Biden administration has actually been more anti-China even than the Trump administration was. And so what you have now, and this culminated in the TikTok thing, you have sort of Democrats and Republicans competing to be the more anti-China. And so what they basically say is, yeah, it's a new Cold War. It's like the Cold War with the Soviet Union. Hopefully it stays a Cold War and hopefully you just have these two different political social systems and these two different giant militaries and they just kind of each stay in their corner.

And it doesn't, you know, kind of go bad. But, you know, maybe it will. But the great experiment, the idea that all they need is more Starbucks over there and everything will be fine, is over. Yeah, that's over. Like I would say nobody in the U.S. political system believes that. But they did. But they, listen, well, my kids are older. My boys are older than your young son. But I remember when they were in high school,

Everybody was learning Mandarin and everybody couldn't wait to go over and go into business over there. And it was just, it was the wild west. And the streets were paved with gold and all of that. That was what you were taught. Yeah, that's right. Well, in fact, my son, JJ, he is learning Mandarin. So we're still on the program. So to your point, like we started the program back when everybody thought that.

Right. Back when he was like two, he started doing immersion Mandarin. So he's like seven years in now. But yeah, and my theory is either he's going to be able to go do business in China or, you know, work for the CIA. That's right. You know, either way. But it's leaning more to the CIA. As of right now, it's leaning more in that direction. Yeah, exactly. And then look, the specific flashpoint issue is this Taiwan thing. You know, China's basically declared its intent. China basically says Taiwan's never not been a part of China.

Right. Yes. You know, they actually get quite mad at you if you say that it's actually separate. And then they've declared their intent, like, very clearly that they plan to basically go get it back, which, by the way, they did to Hong Kong. So this would be, you know, sort of a second one. I got a great Chinese...

Taiwan story. This is why people come on to this podcast. They want to hear a geopolitical Chinese Taiwan story from Marc Andreessen and Rob Lowe. They're not interested in Brat Pack or West Wing lore. They don't want any of that shit. They want this, and I'm going to give it to them. So I'm lifelong friends with Kenny G. Kenny G, there's no bigger star in China

No bigger star. In fact, Kenny G has a song called Closing Time that is played in every business in China at closing time. That's so sad.

And everybody, it's like the fucking national anthem. And when they, you know, the Chinese, they do what they're told. They hear it, they get the fuck out of the malls. It's closing time. Kenny G's playing. Beloved. Kenny, and I love him, but Kenny pays attention to anything that makes it within a hula hoop radius of himself. So Kenny is somewhere in the, in the,

Far East. And he's on a walk. And he meets these very nice, energetic people who want to take a selfie with him. And he gathers... And they're all camped out. They're all... It looks like they're having a party. He thought they were having a street party. He thought. Takes a selfie with them. They're pro-Taiwanese protesters. Yep. He posts it. Ouch. He has not been back to China since...

That's terrible. That's awful. I'm laughing because it's so Chinese that they would ban him, that they would terrorize him for doing that. Done. Over. Yeah. Persona non grata. Yeah. That's amazing. They were nice people having a street party. Yeah. He thought. Yep.

Amazing. Yes. They take that rather seriously. Yes. Okay. I need you to commit to coming on the show again. Can you come on again? Sure. Of course. Okay. Before we go, I want you to give me the average person listening to this in two to three years.

How is AI going to change their life? Oh, so... I know this is a whole podcast, what I just asked you. I realize that. Totally. So I'll give you three things. So one is a fantastic doctor on call all the time. An AI doctor that is happy to talk to you at any point. Three in the morning, you wake up, you're worried about something. It is as knowledgeable and skilled as the best doctors in the world. And it will spend as much time as you want talking about

you know, whatever. And like, I don't know, there's this thing on my skin and I'm worried about it. And okay, snap a photo of it, upload it, and it will, you know, tell you what it is. Just like a world-class concierge doctor all the time. With literally the knowledge of the ages.

Yeah, exactly. Yeah, exactly. Like full, full knowledge of everything. And then the ability to, like, help, you know, coordinate care. And it's like, okay, you really need to go in and see somebody for this, or no, don't worry about it. Like all that. So that's number one. Number two, same thing, a lawyer. Everybody in the world gets a world-class lawyer

on call all the time. And that's, you know, everything from, I got this parking ticket, I don't think it's fair, and what should I do about it? To, like, you know, I need to sue my landlord, or whatever the thing is. And again, it's, like, infinitely knowledgeable and patient. But let me ask you a quick question. There has to be an infrastructure, though. It's probably one of the things you're investing in. Okay, and yes, the AI lawyer says, yeah, you got screwed on this parking ticket 100%,

but they can't do anything about it. They can only advise you. So funny, there's a whole funny story. And actually you should have this guy on too. One of our founders has an AI app that's just like this. It's called DoNotPay.

And it closes the loop because it plugs in all of the systems. So anytime there's any sort of system like court filing system or scheduling system, customer service system, it plugs into those. And so it can't do everything. You may still have to show up in court. But if you get a parking ticket and you want to get out of it, it will tell you step by step precisely what to do to get out of the parking ticket, including what courtroom to show up in at what point and what to say to the judge. Amazing.

Right. And it does the same thing when you deal with companies. So say you're trying to get out of your Comcast subscription or something and they're not letting you because they put you in, you know, infinite voicemail menus. These are the kind of angel investments, you know, where you go for 45 of them, and if 10 of them go, it's eureka. It's stuff exactly like this. Yeah.

Yeah, yeah, yeah, exactly. Just Google DoNotPay to see it, you know, for people listening. It now has, I don't know, hundreds of thousands of things like this that it will do for you. And so to your point, it won't do everything for you, but it will do a lot and it will guide you through all the steps. And so it's like a huge equalization. The way we think about it, the way he thinks about it, which I agree with, is that it's an equalization of power of the individual versus the state, or the individual versus the large company. Yeah.

And he's going to do everything he can to make that happen. He did this hysterical thing. He's totally for real, by the way. He's doing this for real. So normally it's this thing you use as a consumer for medical billing or whatever, visa applications or whatever it is you're trying to deal with in your daily life. But he wants this to be a generalized robot lawyer. And so he put out this thing. He's like, okay, I think we have the AI lawyer working.

He says, I will pay any lawyer who has a case pending before the U.S. Supreme Court a million dollars to go in and argue in front of the U.S. Supreme Court with an earpiece. And my AI lawyer will tell you everything to say. And, you know, we'll see how well it does in a live case. And the legal profession, like, freaked out. And lawyers are like, how dare you? And you're perverting, you know, the sanctity of the court system and the whole thing. And so Josh said, OK, fine, I'm going to up it to five million dollars.

Amazing.

or you're in a conflict with a hospital over a bill, and the bill comes with 28 pages of single-spaced fine print. As just a normal person, how are you supposed to confront these giant institutions? And see, this is the part in your manifesto, I recommend it to anybody who hasn't read the manifesto. Google it. It's great. But I think we need to have this discussion because this is the part of AI that people aren't really focusing on.

Yeah. Like, people with means, we have all this already. Yeah, that's right. We have it all. We have it. I have a really fancy, expensive doctor. Right. But people in South Central LA don't. That's right. But they will now. They will now, exactly. Yeah, that's right. And I think that's the part that people have not fully focused on, or they wouldn't be rabble-rousing to the extent that they are.

Yeah, that's right. Well, and by the way, look, there's going to be controversy around this because, like, you know, guess who's not going to be in favor of all this happening: all the human doctors, all the human lawyers, right? So, you know, there's a big fight that's coming in both of those professions as to whether this will be allowed or banned, right? And to your point, like, I think it's important for people to understand this, because this is one of those classic things. The people with a specific interest are organized, you know, kind of against this. Everybody else who has an interest in this actually happening, they're not organized.

And so there's some danger that these things get banned right out of the chute. And then the third one I would add, which I think is also very important, is a tutor for kids, for their education, right? You know, a teacher, a coach. And again, this is the kind of thing, what does every rich parent do? They hire their kids tutors, right? And so the kid's having trouble in algebra and you hire a tutor, and it may be, you know, some grad student from a local university, and they come in and they help the kid

you know, figure out algebra, but like most parents can't do that. And so what if every kid has an AI teacher, tutor, coach, mentor who's with them their entire educational career, who knows them inside out because it's, you know, kind of been with them and learning from them the whole time, that is completely friendly, completely devoted to the kid's success.

Right. You know, completely, genuinely devoted to the kid's success and willing to work with the kid on anything that comes up. How do I figure this out? What do I do with this? And, you know, again, it knows everything about math, and so it's going to be able to teach the kid anything that he needs to know. By the way, that's already started. The Khan Academy people, which is a nonprofit, a very successful online thing with lots of free education videos, now have an AI tutor that they provide for free, which is an early version of this and works really well. So this is already starting to happen. I have many, many, many more questions. This has been amazing. I don't want to take up any more of your time. I know how busy you are. This has been flipping fantastic. Look at all these questions I had for you that I didn't even get to because it was just such a fun conversation. We're going to do more of it for sure, though. Yes. Do it again.

We'll do it again. Amazing. Mark, this is great, man. Thank you, brother. This was so fun. Fantastic. Thank you for having me. Wow. Okay. How gnarly is Mark? When he puts that fire hose of knowledge on you, you're just like, wait, wait, I'm trying to follow. So fun. I feel so blessed to be able to talk to people like that and learn. And I learned so much.

As I said in this talk, we are going to do more of it because we need to really dig down into AI. I almost feel like it's a cliche when I say the phrase, because AI, AI, AI. But as you just heard, it's not a cliche. It's the incoming tsunami. All right. You know what time it is. It's time to check the lowdown line. Hello. You've reached Literally and our lowdown line.

where you can get the lowdown on all things about me, Rob Lowe. 323-570-4551. So have at it. Here's the beep.

Hi, Rob. This is Amy from St. Louis. Love the podcast. Love your work over the years. My husband and I have a habit of watching the West Wing when we can't find anything else that we feel like watching for the night. And we recently watched Debate Camp from Season 4. And in it, you do an impression of Josiah Bartlet. And it occurred to me, of all the impressions I've heard you do,

Oh, I have many characters up my sleeve that I have not unveiled on the podcast, for the very reason that I was scared to death to imitate any of the characters.

I remember that scene vividly. And because, look, I knew and have known Martin since I was... I've known Martin Sheen since I was 13 years old. So I know him in a way that none of the other cast members do. And, you know, you know somebody well enough. And then as an actor, when you work with somebody...

as closely and as long as all of us did on the West Wing, you know their schtick, right? We all have schtick, everybody. I don't care how good you are, you have schtick. If you're bad, you have schtick. Everybody has a schtick. My good friend, Mike Myers, used to call it a monkey trick. Mike Myers would say, you know, I'm really good at this one monkey trick. And so the question was, do I do Martin's monkey trick?

Like when I'm imitating Bartlet, like, how hard do I go after him? Right. And I think I remember going like halfway on my Martin impersonation, and I love him, incidentally. But you didn't say in your call whether you liked it, whether you... I mean, you noticed it. So that's a good sign.

But I thought it was, my Martin Sheen is pretty good, but it's been a long time since I've done it. So I'm not going to break it out here. But if you keep listening, I am going to find a way to do it at some point and do my others, which I have not yet unveiled. Thank you so much for the call. Thanks for listening to Literally. Thank you all for listening. We'll get back to stupid entertainment people very, very quickly.

Not next week, because that person is not stupid. But there will be some coming. I promise you that. I will see you next time here on Literally With Me, Robbie Lowe.

You've been listening to Literally with Rob Lowe, produced by me, Sean Doherty, with help from associate producer Sarah Begar and research by Alyssa Growl. Engineering and mixing by Joanna Samuel. Our executive producers are Rob Lowe for Low Profile, Nick Liao, Adam Sachs, and Jeff Ross for Team Coco, and Colin Anderson for Stitcher. Booking by Deirdre Dodd. Music by Devin Bryant.

Special thanks to Hidden City Studios. Thanks for listening. We'll see you next time on Literally.

All set for your flight? Yep. I've got everything I need. Eye mask, neck pillow, T-Mobile, headphones. Wait, T-Mobile? You bet. Free in-flight Wi-Fi. 15% off all Hilton brands. I'll never go anywhere without T-Mobile. Same goes for my water bottle, chewing gum, nail clippers. Okay, I'm going to leave you to it. Find out how you can experience travel better at T-Mobile.com slash travel. ♪

Qualifying plan required. Wi-Fi where available on select U.S. airlines. Deposit and Hilton Honors membership required for 15% discount; terms and conditions apply.