Here's what I would encourage people to do. Here's the thought experiment to do. Write down on a piece of paper two lists. What are the things that I believe that I can't say? And then what are the things that I don't believe that I must say? And just write them down.
What happens when startups don't just sell the tools, but decide to take over the entire industry? On today's episode, Marc Andreessen, co-founder of a16z, sits down with Jack Altman, co-founder and CEO of Lattice, to unpack how the venture industry is changing, from small seed funds to multi-billion dollar barbell strategies, and what that means for founders, funders, and the future of innovation.
Marc explains how the classic playbook of picks-and-shovels investing gave way to full-stack startups like Uber and Airbnb, and why the biggest tech companies today are not just building tools, but replacing entire sectors.
He also talks about the realities of fund size, venture returns, power laws, early-stage conflict dynamics, and why missing a great company matters far more than backing a bad one. And then it gets even bigger. Marc dives into AI as the next computing paradigm, US-China geopolitical risk, and why he thinks we're in a capital-T test for the future of civilization. This episode is about asymmetric bets, ambition at scale, and the deep forces reshaping tech and power. Let's get into it.
As a reminder, the content here is for informational purposes only. It should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any a16z fund. Please note that a16z and its affiliates may also maintain investments in the companies discussed in this podcast. For more details, including a link to our investments, please see a16z.com/disclosures.
I am so excited to be here with Marc Andreessen. Marc, thank you so much for doing this with me today. Jack, it's a pleasure. So what I wanted to start with was the topic of small funds, big funds. We had Josh Kopelman on the podcast, and he made a point that resonated around fund size, the outcomes in venture, and sort of just like looking at the math of all of it. And I think as venture funds have grown, it sort of spoke to a lot of people about like,
kind of what the plan is and sort of how tech is going to go. And so I guess to start, I'd be curious to hear your thoughts around that whole dynamic. Obviously, you know, you've got a big venture firm. And so I just want to hear kind of your perspective on this whole topic to start. So start by saying, like Josh is a longtime friend and I think is a hero of the industry.
And I say that because, you know, he started First Round Capital back in the very dark days. I forget the exact year, but, you know, back during the dark days after the 2000 crash. And in fact, you know, there was a period of time back there when the total number of angel investors or seed investors operating in tech was maybe eight.
And, you know, actually Ben and I were two of them. But, you know, this was sort of the heyday of, you know, Ron Conway and, you know, kind of Reid Hoffman and a very small group of people who were kind of brave enough to invest in new companies at a point in time when, you know, basically everybody believed the internet was over. Like the whole thing was done. And so I just think that was an incredibly heroic, brave act. It obviously worked really well. You know, it turns out buy low, sell high actually is a good strategy. It was very good. Yeah.
It's very nerve wracking when you're trying to do it, but it does work. And he had brilliant timing for when he started. And, you know, the companies that he supported have gone on to become incredibly successful. And we've worked with him a lot. So, you know, we're a big fan of his. And then second, I would say, I heard there was a discussion, but as a rule, I never read or watch anything I'm involved in. That's good. So, you know, I totally missed it.
And to summarize, basically, what he was saying is he coined this venture arrogance score idea. The idea is, you know, if you're going to own 10% of a company at exit, and you want to have a 3x fund, and you're probably going to have a power law of outcomes, you basically need your big outcome to be really big. And so how does the math shake out? And basically the question he was posing broadly is, are the outcomes going to be much bigger, can you own a lot more, can you hit a lot more winners? It was sort of that math question. So I'll say a couple
of things. So one is, look, venture is actually a customer service business in our view. So start with this. There are two customers: there are the LPs and there are the founders. And we think of them both as customers. And so, you know, at the end of the day, the market's going to figure this out, and the LP money is going to flow to where they think the opportunities are. And the founders, as you know, the best founders definitely pick who their investors are. It's actually a very unusual asset class, right? It's the only asset class in which the recipient of the capital picks the...
Yeah. You know, actually cares where the money comes from and picks it. So the market will figure this out. I think the big thing, to respond to your general point, is the world has really changed. And so, you know, modern venture capital in the form that we understand it is basically –
You know, there were examples of venture capital going back to like the 15th century or something with, you know, Queen Isabella and Christopher Columbus, and whalers off the coast of Maine in the 1600s and so forth. But modern venture capital was basically a product of the 50s and 60s. Originally, this guy, Jock Whitney from the Whitney family, sort of created the model. Georges Doriot, a Harvard Business School professor,
created a version of it. And then, you know, the great heyday of the 1960s VCs, Arthur Rock and those guys, and everybody that followed: Don Valentine and Pierre Lamond and Tom Perkins and so forth, Gene Kleiner, you know, all those guys. Basically from that period, the 1960s through call it 2010, there was a venture playbook, and it became a very well-established playbook. And it consisted of two parts. One was a sense of what the companies were going to be like,
right? And then the other was what the venture firm should be like. And so the playbook was: the companies are basically tool companies, right? Basically all successful technology companies that were venture funded in that 50-year stretch were basically tool companies, right? Picks-and-shovels companies. So mainframe computers, desktop computers, smartphones, laptops, internet access, software, SaaS, databases, routers, switches,
you know, disk drives, all these things, word processors, tools, right? And so, you know, you buy the tool, the customer buys the tool, they use the tool, but it's a general purpose technology sold to lots of people. Basically, around 2010, I think the industry permanently changed. And the change was the big winners in tech more and more
are companies that go directly into an incumbent industry, right? Like insert directly. And I think the big turning point on this was like Uber and Airbnb, right? Where Uber could have been, like Uber in 2000 would have been specialist software for taxi dispatch that you sell to taxi cab operators. Uber in 2010 was, screw it, we're doing the whole thing.
Airbnb in 2000 would have been booking software for bed and breakfasts, right? Running on a Windows PC, right? And then Airbnb is just like, screw it, we're doing the whole thing. And so, you know, Chris Dixon came up with this term, the full-stack startup, which is kind of what he meant. But the other way to think about that is the company is actually delivering the entire thing,
basically, from the promise of the technology all the way through to the actual customer. Which is basically quicker to get there. Also, I suppose you get more margin capture when you do it that way and you just get the technology seeped in rather than having to sell it through. Is that the idea? Prior to 2010, there were two kinds of tool companies, consumer tool companies and business tool companies, right? So, you know, B2C, B2B.
Right, as we called them in those days. And, you know, the consumer side was great, but, you know, selling video games and consumer software, flying toaster screensavers, it was great, but there was only so far that was going to go. And then there was the B2B side, for things like taxi dispatch or bed and breakfast bookings. The problem is you're selling advanced technology into incumbents that are not themselves technology companies.
And so are they actually going to take those tools and then actually build the thing that the technologists know should actually get built? More modern version of that is what you see now happening with cars, right? So who's going to build the self-driving electric car, right? Is it going to be an incumbent who's able to adjust, who's buying good components to be able to do that? Or is it going to be a Tesla or a Waymo that's going to do that? Same with SpaceX and NASA, I suppose. Exactly, yeah. There are many companies that sell...
technological components that go into rockets, but was any of that going to lead to the existing rocket companies making the rocket that's going to land on its butt and then, you know, be relaunched within 24 hours, right? And by the way, same thing with Airbnb or Uber: had you sold the Uberized version of taxi dispatch software to the taxi operators, would it have resulted in the Uber customer experience? And so I think basically what happened was, and, you know, as
Peter says, these things are overdetermined. So it was a bunch of things that happened. The smartphone completed the diffusion challenge of getting computers in everybody's hands. And then mobile broadband completed internet access in everybody's hands. And the minute you had that, you just had this ability to get directly to people in a way that you never had before. You didn't have to have a giant marketing campaign. You didn't have to,
you know, have a giant established consumer brand. And so there was a way to get to market that didn't previously exist. And then, look, also consumers just evolved. People, especially kind of Gen X and then millennials, were just much more comfortable with technology than the boomers were. Yeah. And Gen X and then millennials were kind of entering their consumer prime at the time this happened. And then you started having these big successes. And so you started lining up Uber and Airbnb and Lyft and SpaceX and Tesla. And, you know, you start stacking these up. And at some point you're like, all right,
there's a pattern here, right? There's a thing that's happening. And that's what's happened. And we're 15 years into that. And what's happened now is basically that idea has blown out across every industry, right? And so the tech industry used to be a relatively narrow tools, picks-and-shovels business. Today, it's a much larger and broader and more complicated business,
basically the process of applying technology into basically every area of business activity. The result of that is that the companies are much bigger. Like, when you're not just the picks and shovels but the whole company, you're much bigger. And that changes venture math. Yeah, you eat the market, right? And so Tesla ends up being worth more. Like, there have been points in time in the last five years when Tesla alone has been more valuable than the entire rest of the auto industry put together.
Right? And SpaceX is, you know, like, you go through this and Uber is worth far more than the totality of every black cab operator and taxi cab company that ever existed. Airbnb is worth far more than the bed and breakfast industry ever was. And by the way, it turns out some of these markets just turn out to be much larger than people think. Right? When we do a retrospective on our analysis over 15 years, like one of the things that's been hardest for us to do is to do market sizing.
And sometimes we overestimate market size, but it's... More often it's the other way. More often, well, for the winners. Yeah, yeah, yeah. More often it's the other way. I guess the net blend is that you underestimate it. Yeah. And this goes to venture economics that we'll talk about. So the core thing on venture bets, right, is that venture doesn't run on leverage. Yeah. Right? Nobody will bank a startup or a venture firm for leverage because there are no assets when these things start. Yeah. It's asymmetric. You can only lose one X. Yeah. Yeah.
But you can potentially make a thousand X. And so that means, right, there are two errors in venture. There's the error of commission, where you invest in the thing that fails. And then the error of omission, where you don't invest in the thing that succeeds. And of course, just in the math, overwhelmingly, the error that matters is the error of omission. And so if you run an analysis, and by the way, lots of people did this, you run an analysis that says ride sharing is only ever going to be as big as taxi cabs, that leads you to the error of omission and not making the bet. And therefore the difficulty of market sizing. Yeah.
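The asymmetry being described here can be made concrete with a toy worked example. Everything in it is hypothetical: the ten-deal payoff menu, the 1000x multiple, and the two fund behaviors are invented for illustration and are not figures from the conversation.

```python
# Toy illustration of omission vs. commission in a power-law portfolio.
# Hypothetical payoff menu for ten available deals: nine that return 0x
# and one outlier that returns 1000x on each dollar invested.
deals = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1000]

# Fund A commits $1 to every deal, eating nine total losses
# (nine errors of commission, costing 9x of capital in total).
fund_a_multiple = sum(deals) / len(deals)

# Fund B avoids every failure but passes on the outlier
# (one error of omission, costing the entire 1000x).
fund_b_multiple = sum(d for d in deals if d != 1000) / 9

print(fund_a_multiple, fund_b_multiple)  # → 100.0 0.0
```

The point of the sketch: an error of commission is bounded at 1x per bet, while a single error of omission can cost more than every failed bet combined, which is why underestimating market size is the expensive mistake.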
In your view, does that only apply up to a certain size? Or, you know, when you look at some of the rounds that now happen at huge valuations, at companies that would otherwise, you know, be a large IPO, like let's say somebody's raising 10 billion at 100 billion or something,
Does the power law still apply up there? Like, how do you think about that type of round? Or do you see venture capital sort of turning into private equity at some level at the higher end of things? Yeah. So I think there's two questions kind of embedded in there. One is why aren't these companies public? Right.
That's one question. And then the second question is like, even whether they're public or not, like, can they actually- Is it still the lose one, win 20 type of dynamic? Yeah, so I think there's a bunch of ways to look at that. So like the smartest public investors I've met with basically have the view that the public market actually works just like the private market with respect to this dispersion of returns. The extreme case I'll make sometimes is it may be that there's no such thing as a stock. It may be that there's only an option or a bond.
Right. And the reason is because there are fundamentally two ways to run a company. One is to try to shoot the moon, to try to build for the future. And then the other way is to try to harvest the legacy. Right. And if you're shooting for the moon, the big risk is that, you know, you might fail, right? It might not work. But if it works, you have this telescoping effect in the
public market just as much as you have in the private market. And historically, the returns in the public market have been driven by a very small number of the big winners in exactly the same way they've been driven by that in the private market. In fact, you see that playing out right now in the S&P 500. So one of the things I've been saying for years now is the S&P 500 is no longer the S&P 500. It's like the S&P 492 and the S&P 8.
So there are like 492 companies in the S&P that have no desire at all, right? Just watching their behavior, to really charge hard at the future. Like, they don't want to do it. They won't do it. They're not doing it. And then eight are betting everything. Eight are all in, right? And then I always say, you know, who are they? And everybody always knows who the eight are, because it's completely obvious: they're the ones that are building all the new things. And then again, if you disaggregate public market returns over the last 10 years, you see this just dramatic
you know, explosion of value among the eight, and you see relatively modest growth of the 492. So even the S&P 500 is like having a portfolio of bonds and options. Yeah. And it's incredibly barbell. And so people get cynical on this and they say, well, you know, if not for the eight, the stock market... You're like, yeah, but that's the whole point. That's the whole point. Right. If you have a healthy, functioning, capitalist economy, the whole point is some number of these things are going to go down.
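A stylized sketch of that "portfolio of bonds and options" framing: the numbers below are invented to mirror the shape of the claim (modest growth for most names, explosive growth for a few), not actual index data.

```python
# Stylized "S&P 492 vs. S&P 8" decomposition. All figures are hypothetical:
# 492 "bond-like" names grow modestly; 8 "option-like" names compound hard.
start_each = 1.0                       # equal starting value per name
bond_like = 492 * start_each * 1.3     # +30% over the period (assumed)
option_like = 8 * start_each * 20.0    # 20x over the period (assumed)

total_start = 500 * start_each
total_end = bond_like + option_like

growth = total_end / total_start - 1
share_from_eight = (option_like - 8 * start_each) / (total_end - total_start)

print(f"index growth: {growth:.0%}")              # → index growth: 60%
print(f"driven by the eight: {share_from_eight:.0%}")  # → driven by the eight: 51%
```

Even with made-up inputs, the shape of the result holds: a tiny sliver of option-like names can account for the majority of the whole index's gain, which is the barbell dispersion being described.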
This is like when someone says, ah, they're not a very good investor, but they invested in [name the hundred-billion-dollar company]. They got lucky. Well, you're like, okay. Yeah. That's the point. That's the job. That's the desired outcome. That's the thing. You know, it's like the classic joke of venture: isn't there just a way to invest in the good companies and not the bad companies? It's like, yeah, okay. For 60 years, we've been trying to figure that out. Here's a fun fact from the final analysis over the last 60 years.
Every one of the really great venture firms through that period missed most of the great companies while they were investing. The best firms in the world, whether it's Kleiner Perkins in the 90s or Benchmark in the 2000s or Sequoia in the 2010s or whatever, they just like flat out missed most of the winners in each cohort.
And on one hand, you're just kind of like, wow, can't you do better than that? But you've had these super geniuses for a very long time trying to do better than that. And, you know, we could have a whole separate conversation about why this is so difficult. The thing you said about companies building, you know, the whole stack: roll-ups are super popular.
Is it fair to take from what you said that you're bullish on that strategy, or not necessarily? And basically just, you know, to walk it out: I mean, instead of building accounting software and selling it to the accounting firms, just buy an accounting firm, become an accounting firm, AI-ify yourself, which I think is becoming a more popular strategy. Do you like that? Or is there a nuance why it's different to buy something rather than build it yourself from the beginning? What do you think of this whole roll-up thing? Yeah. Let's come back to the venture question, because I was still winding up into that. But, uh,
This is actually also relevant to that. So, yeah, there are a bunch of really good firms that are trying to do this roll-up thing. I mean, the opportunity with it is kind of very obvious. The challenge with it is that cultural change of an incumbent, a legacy company, is just really difficult. Charlie Munger was asked about this, you know, a few years ago. GE, I think, was the company that was going through a big
issue at the time. And he was asked at a shareholder meeting, how would you fix the culture at GE? And he's like, I have no idea. I don't even know how you would change the culture at a restaurant. Yeah, that's funny. Right? Like, how do you do that? It's really hard. Right? It's really hard. Yeah. And so, you know, you have to have a theory on that. I mean, the people doing it do have theories. I think we're much more oriented towards just trying to back
Well, I think it gets a little into this private equity thing. The venture-private equity blend I see happening is related, not even just in dollar size, but in the mindset here. Well, this is where I go back to my bonds versus options thing. Like, fundamentally, the way I'd always describe venture is: fundamentally, we are buying long-dated, out-of-the-money call options. Yes.
Which, like, seems completely insane, except when they pay off, they pay off spectacularly well. But a lot of them expire out of the money. And, you know, statistically, top venture capital has a 50-plus percent... Yeah, yeah, okay. Yeah, I just wanted to get your hot take. I really wanted to hear about this, but yeah, we can go back to the venture question, because I think there's a lot more in there. Okay, good. So look, anyway, what's happened is the world has changed. The number of companies being founded that are going to be important keeps expanding. The number of categories those companies are in keeps expanding. Those companies are more complicated now
because they're full stack, they're in these incumbent industries. And then the winners are getting bigger. And again, you just look at that in the market. I mean, look, the S&P eight, they're all venture-backed, right? Every single one of them was venture-backed. On any given day, any one of them is bigger than the entire national stock market of countries like Germany and Japan and the UK, right? And so the telescoping effect- The numbers are just absurd. The telescoping effect of victory is just incredible.
Right. And so what Ben and I did is we looked at it, and we started our firm kind of as this was happening, and we said, all right, this is different. You could sit here and do things the old-fashioned way, but the world is moving on. And then this goes back to the customer service aspect. The founders who are starting these kinds of companies need something different. Yeah. It's not sufficient anymore to just, you know, have
investors who are operating the way that they were investing, you know, for the previous 50 years. That's not the value proposition that they need. That's not the help that they need. And so there's a different way to do it. And so I think what's happened is the venture industry had to restructure in order to basically accommodate the change in the market.
Now, having said that, I don't think that's an argument that it's just therefore big firms win everything. That's definitely not my thesis. And by the way, that's also not how I'm deploying my own money, which I'm going to talk about because I'm living what I'm about to say. Which is I think what happens is what Nassim Taleb calls the barbell.
And the way to think about the barbell is you basically have a continuum. And on the one side of that continuum, you have high scale. And on the other side, you have high specialization. And what you see in industries that mature and develop in this way, including many industries in the last hundred years, is that as they mature and enter their kind of full state, as they kind of flower, they often start with generalists that are neither at high scale nor particularly specialized.
And then over the fullness of time, what happens is they get disintermediated and then there are scale players on the one side and there are specialist players on the other side. The most obvious example of this in everybody's lives is retail. When I was a kid, there were these things called department stores. Pretty good selection at pretty good price, but not a great selection and not a great price, right? And then sitting here today, those are all out of business. They're just gone. And it gets crushed by Amazon on one end and then like,
amazing retail on the other end. Exactly, exactly. Right. And why do you go to Amazon or Walmart? And by the way, there were even these big-box guys, you know, Toys R Us and so forth. But over time, when you go to Amazon or Walmart, what you get is just an unbelievable selection of basically anything that's a commodity, right? You just buy at super low prices, and it's basically impossible to compete with that
if you're subscale on the one hand. And then, to your point, the specialist retail experience is like the Gucci store or the Apple store, the $15 candles. - They give you some Perrier when you walk in. - Oh, they love you. Like they're so happy to see you. Exactly, right. They'll do private showings for you and they pour the champagne and it's like an entire experience. And so what's happening is, and again, you see this from a returns standpoint,
this is what's happened. This is where the value is. And then what happens is that just gaps way out and it never comes back together again. And then what the consumer does is they build a portfolio of their experiences. And so they buy things at unbelievably cheap prices at Walmart and Amazon, and then that gives them more spending money to be able to spend on the boutique. So this middle,
the bar that's in the middle, that's kind of screwed. Yes. What is the mechanic by which they're in trouble? Is it because the customers go away? The founder customers go away. Yeah, the founder customers go away. Who are neither getting sort of like the size and scale value nor are they getting like a special focus of some sort. Correct.
Exactly. Can you do focus? Can you be a specialist with a $2 billion fund, let's say? So obviously we're at scale, but we do have a specialist approach inside the scale. And so we have investment verticals; they're discrete teams. They have, in some cases, discrete funds. And by the way, they have
trigger authority; they can make investment decisions. Like, we don't run the firm where Ben and I sit and decide, is this a good investment or a bad investment; our specialists make those calls. And you basically determine that by: this is the size at which we think you can function, this is the biggest you can be and still function as a specialist in a highly successful way, and then we're just going to put a bunch of those together. Is that what defines the size? Yeah, well, so sort of, yes. But it's two parts. One is the external view: what's the size of the market opportunity? Just
How much money does this strategy, does this vertical, need? How many companies are there going to be? How complex is it? And then the other is the internal dynamic, which is, like, if you're going to have a team, you need everybody around the table to be able to have a single discussion. And that puts natural limits on how big that can get. What's your limiting
reagent to building an even bigger firm? Is it the number of productive partners that can do this, then? Conflicts. Conflict policy. That's the single biggest issue by far. So if you had all the great GPs all wanting to work here,
And you had like, that would still be the issue. Yeah, there would be issues for sure, to your point. So what's the conflicts thing? So the conflicts thing is: for the mainline venture firms, forever, meaning the firms that do Series A, Series B, Series C, especially Series As and Bs, the relationship with the founder is just so... It's too deep. It is too deep.
And if you as a venture firm invest in a direct competitor, it's just a giant issue; the founder you're already invested in will be extremely upset with you. By the way, do you think that's practical? Do you think it's all emotions? Like, do you think it's correct that firms shouldn't do conflicts? I would say, when we were startup founders, we felt this very deeply. OK, so when you're a startup founder, I'll channel the other side. When you're a startup founder, the whole thing is so tenuous, right? It's just like, is this thing going to work? There are like 18,000 things that can go wrong. Yeah.
People are telling you no every day: no, I'm not gonna come work for you; no, I'm not gonna invest in you; no, I'm not gonna-- And then your board member invests in a competitor and you're like, dagger to the heart. Dagger to the heart. And then what happens is, as the founder, you have to go to the all-hands meeting and explain why your investor has given up on you. Yes. Right? And you go in there and you do some song and dance about it, and your employees basically look at you and they're just like, you, the founder, are so weak and lame.
Right. You can't even get your board member to not invest in a competitor. Exactly. What about the marginal stuff, though? Because, like, all these companies are near each other. They blend. They evolve over time. So how does this play out on a practical level for firms? It almost never plays out the way that the founders think it's going to play out. And I say that in two dimensions. Number one, historically what we've seen is that the founders who think they're directly competing with each other generally end up not doing so, because one or the other of them changes strategies and then they diverge.
Which, by the way, is natural because it's like specialization. The companies specialize, they end up not competing. But the other thing that happens is two companies that were not competing, that you're already invested in, pivot into each other. Yeah, and then they're mad at you. Yes, and then they're very upset. And you have to remind them that, you know, you didn't know that was going to happen and it's not your fault.
And then they're still upset. And so I would say we have very low predictability in terms of where the conflicts are going to be, but that doesn't ameliorate any of the emotion at the time. And so it doesn't actually help. It doesn't help for us to explain to the founder, oh, don't worry about this guy who you think is directly competitive, because he won't be in a year. Yeah. Because you can't prove that. And the issue is in the moment. How does that impact your strategy? Meaning, like-
If you know conflicts are this huge issue and you've got, you know, a big aggregate fund and so it's very important to catch winners and then you invested in, you know, Blue Origin, which is really good, but SpaceX is, you know, bigger or whatever happens. Yeah.
What does that imply for your strategy when it comes to, like, you know, doing seeds and A's and things like that, versus saying, you know what, let's just wait till the D. Let's have D be our early stage. That's right. So the most obvious thing you do is you're just like, oh, we just need to wait, because we need to wait for clarity. Just don't deal with this whole issue. Right. Just wait. Keep delaying and keep delaying until it's obvious what the answer is. If it's big, it's going to be really big, so we can buy later. But then the problem with that is: all right, now you're out of the venture business.
Right. Because now you're doing, as you said, now you're basically doing series Ds. Now you're a pure growth investor. And by the way, there are very good pure growth investors. But like our determination is to stay a venture investor. Yeah. Because we think that's kind of the whole point. Why is it so important? Is it just because it's what you like? Or is there a strategic reason that it's important to stay doing early? So we've always wanted, I mean, that's the way we've always thought about it. We've always wanted to kind of be the founder's best partner and like to be the one who's like the closest in, the one that can really be relied upon, the one that's going to be around for the longest amount of time, the
the one who they can really trust. - And it only happens early. - Yeah, like it's, yeah, it's your early guys. And so it's hard to insert after that. And then look, the other thing is like, there are great growth firms that do invest later and have done very well, but we just think there's so much information at the early stage. Like, so for example, when we make a growth investment, because we have the active venture business that we have, by the time we make a growth investment, you know, we have either invested in the company for several years or at the very least we've met with them repeatedly.
over time and so we just, we end up with just like enormous amounts of information. And then the other thing, by the way, is, you know, there's kind of time arbitrage, which is, you know, sometimes the right answer is just like, okay, just invest in SpaceX or whatever later on. But sometimes the answer is no, there's actually a new thing, you know. - Totally. - Do you invest in the MySpace growth round or the Facebook seed round?
If you're not in the early stage, you won't know that because you won't see the early things. And then by the way, the other thing I'd just say is financially, one of the things people say that is inaccurate is they say if you're running a big fund, you're not going to have the time to spend on the early stage opportunities, because you can't justify it relative to the money you're putting in. But that's actually not true in venture, because the aggregate dollar return opportunity on early stage is just as high as any growth investment, right? Because if you get the right venture investment and you can make $10 billion on the upside case, it's definitely worth my time to spend with you. So I spend as much
time as I can with the early stage founders, you know, for that reason. So the barbell, there's, you know, there's big on one end, there's something sort of like me on the other end, selfishly, I'd love to know, like,
You know, I would assume you think it's better to be the big version. But, you know, if you were conditioned on needing to be me at the small end of the barbell, like how would you approach it? They're both good. They're both good. This is the thing is they're both good. They're both good. And if I were for some reason not doing this, I would immediately do what you're doing. Right. So that's good to hear. Yes, 100%. And then I would say I actually invest this way. So my liquid assets are basically tied up in either A16Z funds on the one side, or I run a very aggressive personal investment program in
basically angel and early stage seed funds. And it's because I believe in the barbell. I believe in the barbell so much. And so, but the conflict thing I wanted to explain because that's the issue. So the big part, like we do seed investing, it's just we have this problem every single time we're looking at a seed investment, which is like, are we really fully convicted?
that this is going to be the winner. Even at seed, it creates a conflict for a board seat. There's debates. There's always debates on this. It's like, you know, do the seed ones care as much? Do the growth ones care as much? Do the crypto ones care as much? What I tell you is it's not a logical question. It's an emotional question. And we're just very sympathetic to the founder that needs to be able to...
justify their, you know, authority. You also definitely can't ask while you're making, like if somebody asked me while they were making the investment, hey, is it okay if we invest in a conflict in a couple of years? I'd be like, what are you talking about? You know, we've done these things. We've tried to, we used to have this separate branded thing called a16z seed. And we were like, well, we have a different conflict policy on this. And it's great in theory, but it's like, no, it's a16z. And so the way I think about it basically is like the more successful you are as a venture firm, the bigger the issue this is going to be, because the more the people that you were investing in are going to care. Yeah.
And so it's just, it's like the downside of success, but like success, you know, right. The only investors you don't care whether they invest in a conflict or not are the ones where, literally, you don't care what they think about anything, right? If they just don't matter at all and everybody knows that they don't matter at all. So there's that. So therefore both of these things can be simultaneously true. Number one is we definitely do lots of early stage investing, and we do make seed bets.
But it's just also true that, for this reason, we structurally cannot do all of the seed investments that we would like to do. In fact, we can't even do a tiny fraction of them. Structurally, we just can't do it.
And so, and again, this goes back to the barbell. So that means structurally, it's the same reason why Amazon can't give you the champagne experience, right? It's the same thing. They can't, they're not set up for it. They can't do it. It's not a scaled strategy. And so what has to happen is there has to be the other side of the barbell. There has to be the specialization and intense focus and deep relationships.
Yeah. Right. Thing. And that's the role of the angel investor and the seed investor. And that's, of course, in startups, that's incredibly important because that's the most formative, right, fraught time in the life of these companies is when they're first getting started. Right. And as you know, right, half the time, these are people who haven't, you know, they haven't started a company before. They haven't run a company before. Some of them haven't had a job before. Yeah. And so, like, they need to learn a lot and they need people to work with them on being able to do this. And they need to figure out how to actually, you know, do these things.
And so there have to be, and there are like incredibly high quality seed investors, angel investors on that side of the barbell. The big firms presumably, if we succeed, we succeed by generating large numbers of aggregate dollars and a very good percentage return. The seed investors have this perpetual opportunity to just absolutely shoot the lights out, right on upside. And you can, there are seed funds that generate like 200X, 300X returns, right? And so these are both good strategies.
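The two ends of the barbell Marc contrasts here can be put into back-of-the-envelope arithmetic. All figures below are hypothetical, chosen only to show how a large fund wins on aggregate dollars at a moderate multiple while a small seed fund can post a far higher multiple on a tiny base:

```python
# Hypothetical barbell arithmetic: aggregate dollars vs. multiple.
# All figures are illustrative only, not actual fund data.

def profit_and_multiple(fund_size_m, gross_return_m):
    """Return (profit in $M, gross multiple) for a fund."""
    return gross_return_m - fund_size_m, gross_return_m / fund_size_m

# Big fund: $3B fund returning 4x gross.
big_profit, big_multiple = profit_and_multiple(3000, 12000)

# Small seed fund: $10M fund returning 200x gross.
seed_profit, seed_multiple = profit_and_multiple(10, 2000)

print(f"big fund:  {big_multiple:.0f}x, ${big_profit:,.0f}M profit")
print(f"seed fund: {seed_multiple:.0f}x, ${seed_profit:,.0f}M profit")
# The seed fund's multiple is 50x higher, but the big fund still
# generates several times the aggregate dollars.
```

With these made-up numbers both strategies "work": the seed fund shoots the lights out on percentage return, while the big fund produces the large absolute dollar outcome its LPs are buying.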
they're both adapted to the current reality of the market. There's just two things that fall out of that. One is the death of the middle, which is it just doesn't make sense to have the old fashioned, you know, series A, series B, six GPs, $300 million fund sitting on Sand Hill Road, waiting for people to walk in the door. Like those days are over and those funds are, you know, those funds are shutting down. Like that model is going away. And then the other thing that happens that causes some of the tension is the,
what does a successful seed investor do? He raises more money and wants to become a venture investor. But then you're going from one side of the barbell back to the middle and you're creating that same problem again. And I think that's where the tension is coming from. I also feel like the mechanic that happens a lot of times is when you grow the fund, you raise a huge fund and then you start deploying it into things just because you've got to deploy at some pace. And so the threshold for, "I've got to deploy 400 million this year,"
and I only see $700 million worth of investable things, I'm going to do four-sevenths of them. Versus, you know, presumably if you only had to do one-seventh of it, you would, you know, you'd pick better, hopefully. Yeah. Which I think is a huge mechanic to it. So I think that's part of it, but I think the related thing is your competitive set has changed. Yeah. And what we find with seed investors who migrate up and then regret it later, what we find is that they didn't realize was their competitive...
Right, because now they're going for bigger, more competitive rounds against you and Sequoia. Yeah, all of a sudden, okay, now you're competing for series. $15 million, be good luck. Right, right, exactly. And so it's just like, and look, like market fundamentalist, if you have a better value proposition than Sequoia, you should go off of that. But I just, I would not accidentally end up competing with Sequoia for series A's. Like I would just say that's a bad way to live. Yeah. And I think that is what has happened to a bunch of the seed funds that have gotten larger. Why is it so rare for somebody to break through and get, I mean, you did it.
And that's one that happened in the last 15 years. Maybe there's a couple others, maybe. But why is it as rare as it is? It seems like almost more rare than a new big company in a way. Yeah, that's true. In fact, our analysis actually when we started was there actually hadn't been, I think there had been two firms, Andy Rachleff actually. I mean, Thrive also. So Thrive was, yeah, they were after us. I mean, they've done great. But in the 30 years before us, we think that there were only two new VCs that actually punched through to become top tier firms.
In other words, VCs that were not either firms that were built in the 60s and 70s or firms that were derivations of those firms. Founders Fund? No, no, no, no, no. Founders Fund started at actually around the same time we did. Okay. They were a little bit earlier, but around the same time. But I mean over the preceding like 50 years. Okay. Sevin Rosen.
You won't even know them. This is sort of the thing. You won't even recognize them. I need to read a book or something. So Sevin Rosen was the venture firm that famously funded Compaq Computer, which was the big, big winner. And then they went on to become a successful firm. Ben Rosen, early, early leader in this space. And then there was a firm called Hummer Winblad,
which was a software specialist firm in the late 80s, early 90s. Those are the only two that punched into the top end while they were operating. Wow. Neither one of them, you know, sustained it. But they got there. They got there for a bit. But that was like the success case. Right. So this is a little bit like Elon looking at the history of the car industry and saying, you know, Tucker Automotive in the 1950s. So it's so rare. It's very, very rare. So why is it that rare? Two reasons I think it's rare. So number one, there's the intimate reason for it and then a sort of macro reason for it. Intimate reason for it is just...
Like, you're going to have this incredible, as the founder, you're going to have this incredibly intimate experience, you know, very close trust relationship with whoever you're working with. And it's like, you know, can you reference them? You know, do they have a history and track record of the kinds of behavior that you need and the kinds of insight, you know, that you need? And it's just like, it's very hard to do that. It's very easy for an existing firm that has a long track record of success to prove that. It's very hard if you don't. So that's like the close in reason. But then the other reason goes back to the way the world is changing is that
We always believed the thing that you want from your venture firm is power. The thing as a startup that you want is for them to fill in all the missing pieces that you don't yet have when you're starting a company, the pieces you need to succeed. And so you need power. It means like you need the ability to actually go meet customers and have them take you seriously.
You need the ability to go get publicity in, like, you know, major channels, you know, it used to be media, now it's podcasts, and be able to get taken seriously. You need to be able to be taken seriously by recruits, right? Because there's thousands of startups recruiting for engineers. What makes yours stand out? I sometimes describe it as the venture firm providing a bridge loan of a brand.
Until you have your own brand that's big or bigger, you know, for your own space than the VC, you're borrowing your VC's brand. Exactly. And that has been very effective for a long time. And that was how we looked at it when we were founders. That's why you did media from the beginning. Yeah. Oh, that's one of the reasons. It's one of the reasons, yes, but a very, very, very powerful one. Yeah. Very, very major one. Yeah. And then, by the way, you also need ability to raise downstream money, right? You're going to have to need to raise money again. And
So they either need a lot of money or they need to be connected to a lot of money. Yeah, exactly. Right. Exactly. And so you just better if they just have it. Yeah. It's like being full stack. Well, then, by the way, now you're getting also like, again, you think like tools companies just never got into like, for example, politics. Right. Or just let's just say global affairs, global events. Like what's happening with, you know, like what's happening? How do you navigate the world?
Right? How do you navigate Washington, you know, when the regulators show up and they want to kill you, like, how do you navigate that? Or you get in some, you know, giant fight with the EU, or whatever. So, especially these full stack companies, they're getting involved in like very complicated macro political, geopolitical situations much earlier. And they have to, in some cases, they have to escalate up to like, you know, senior government officials, heads of state.
you know, major heads of sovereign wealth funds, they need to get to, you know, CEOs of major companies, you know, how do you get to the CEOs? You know, you're a new AI company and you're trying to redefine, you know, visual production for movies. How do you get to the studio heads? Yeah, right. The studio heads just don't have time to meet with a thousand startups. So where are they going to meet with you? Right. So basically it's projection of power. And this has been one of our theories, how we built our firm is optimized for maximum amount of power in order to be able to give the startups access to it.
Right. Both the startups that are already in your portfolio, but also the startups that don't even exist yet. Right. And this goes to why the scale thing matters so much. It's just like, all right, there's just there's a scale aspect of power. There's a big difference between being able to get to everybody who matters and not. Why is it rare for people to be able to accumulate power, even if they were like, let's say everybody was trying to do it. It's not like everybody could do it. What's the cause of the rarity to be able to build enough power in that sense?
It started with you have to want to. And so we met with all the GPs of all the top firms, basically, when we were starting out, because we wanted to see who we could be friends with. And it worked very well in some cases and not well in other cases. But one of them told us, this is a GP at a top firm in 2009. And he said, yeah, the venture business is like going to the sushi boat restaurant. All right. And so the sushi boat restaurant is a sushi restaurant where they've got the boats. They've got like a water track. Like a conveyor belt. Conveyor belt, right. And the little sushi boat comes by.
And there's a tuna roll and there's a shrimp roll and there's this or that. And he said, basically, you just sit on Sand Hill Road. And you're like, we're going to crush these guys. And the startups are going to come in. And he said, you know, if you miss one, it doesn't matter because there's another sushi boat coming up right behind it. And he's just like, you just sit and watch the sushi go by. And every once in a while, you reach into the thing and you pluck out a piece of sushi.
And we walked out of that saying, like, what the hell? That's funny. Like, in what industry? 2009 or something? 2009, yeah. Like, that was a very common, this, again, was the mid-sized venture firm mindset. One of the reasons, when I came, like, look, in 1994. I mean, it might have kind of been like that. It was. It was. When I came to Silicon Valley in 1994, I had never heard the term venture capital. I didn't even know the thing existed.
And then my business partner, Jim Clark, explained it to me. And I was like, there are guys like they're just sitting there waiting to give you money. But you see this and you're like, this is going to get eaten alive. Of course, this is absurd. Like the minute anybody takes this seriously, it's all going to change. Right. And so it was this very clubby cartel, you know, basically kind of thing. And again, it was fine as long as the ambitions of the industry were constrained. And then again, look, the tools companies, they didn't need all the power. They needed some of the power. Right. But they didn't need all the power.
You know, they weren't dealing with like governments, right? Or, you know, these sort of big macro issues, you know, at least, you know, in the early years. Well, okay, so here's another thing that's happened is just the world is globalized. Like, so startups 30 years ago, you would spend your first decade just in the US and then you would start to think about Europe and global expansion. And,
And now you just, you have to think about being a global company upfront. Because if you don't, other people are gonna do it. - Yeah. - Right. And so you just, you have to chin up as an entrepreneur. Like the expectations are much higher than they used to be. - Maybe one final question on this topic of fund size. And then I want to go to AI.
What do you think, and I know you've thought about this a lot, what do you think is the limiting factor for the creation of a lot more really big companies? Do you think it's founders? Do you think it's capital? Do you think it's market maturity? Do you think it's underlying tech stuff? Like if you had to pinpoint the one or two things that you think would allow for there to be way more big companies, like what is it?
So there's sort of the holy trinity of venture startups, which is, you know, people, market and technology. And I think the answer is sort of all three. And the way I would describe it is there's some limiting issue with just markets, just how many markets are there? How big are they? How ready is the market to take something new?
Then there's the technology question, which is, you know, when is the technology actually, like for the venture perspective, technology moves in stair steps, right? And so things become possible in the world of smartphones that just weren't possible. You know, you couldn't do Uber when everybody had a laptop. You had to wait until they had phones. Yeah. Right. And so technology moves in a stair step. You get these, you know, paradigm shifts, platform shifts. And those just, they come when they come. Yeah. And until they come, you can't do it. And then the people side, you know, and this is the one that, you know, I say, you know,
vexes me the most, which is like, okay, how do you just get more great, great founders? Yeah. Right. Um, and I think part of that is, you know, I think there is definitely a training thing that is real, and getting people into the right scene in the right way. And like the thing that Y Combinator does or the thing that Thiel Fellows do, like those are real things. Um, and those help a lot. But also, you know, there is an inherent, you know, there are just not an infinite number of people running around who have the
You probably figure there's a lot of people who could have built big companies who haven't, though, and hopefully a lot. Yeah. Yeah, I don't know. Some. I don't know. Some number. But there must be people who are just like in academia or government or education who are just doing something completely different, who if they were attracted to startups would have built a big company.
So yes, but then the other question is like, well, okay, then why didn't they? Why didn't they do the things required to get themselves in that position? Well, it could have been then like 2001. It was just like too many people were too scared to do it or didn't know about it or whatever. But what does that tell you about the people who didn't do it? Yeah. They followed the herd. I can tell you who didn't listen to that, right? It was Mark Zuckerberg.
Are there more good... But let's just press this point harder for a moment, which is like, I always describe this as like, I always call this the test with a capital T, which is like, okay, like if you're not in position to do the thing, it's the fact that you're not in position to do the thing meant that you've already flunked the test. Well, I guess the question would be, is there a subset of people who could build
Facebook, who other than being too scared to do it, would have had all the other ingredients. And so when everybody's not scared, you get more Facebooks. You know, there's a line in the movie. I actually never saw the movie, but there's a line in the movie. If you could have built Facebook, you would have built Facebook. Yeah, yeah, there's a line there. Yeah, yeah, yeah. That's right. That was a good line. Right. And so this is the thing. It's like...
Are there more great founders today, let's say, in net? Like, do you think there are more now than there were 20 years ago? I believe there are. But like, maybe there's, how many more are there, right? Is it five times more? Is it like 50% more? Or is it? Well, so look, the number of wins is increasing. So we used to talk about the 15 a year that matter. That number is probably, if you do the analytics, probably up like 10x. There's like 150 companies. 150 companies a year that like really matter. And the reason is because there's so many more sectors now.
Right. So you get the industry maturation. And so kind of by inference, there kind of have to be like. You're saying the markets are better more than you're saying the founders are better. Well, maybe a little bit of both. Look, also, I think the founders are getting better. Part of the founders getting better is they have. Better training. They're all on the, well, to start with, they're just all online. Yeah. So when I showed up here in 1994, like literally there's like three books in the bookstore. Right. None of which were that great.
Yeah, it's not that the DNA is better. It's that now the ecosystem has matured to teach people better. Yeah, and like people come in and they've watched every video, you know, they've watched every episode of, you know, your podcast. Hopefully. Right, and they just walk in knowing all this stuff. And then, yeah, look, Y Combinator didn't exist and, you know, that definitely helps. And, you know, Thiel Fellows didn't exist and that definitely helps. There's, you know, Brian Eno has this great term, scenius, you know, scene plus genius, right? And so it's just like, you know, for the individual genius on his own, it's always, you know, it's hard to just
get things done. Some people do, but it's difficult. It's more often, more often in a profession where you're seeing creativity happen. There's almost always a scene, you know, as you know, Silicon Valley is definitely a scene in that way. People, people come here and they just, they kind of get, I don't know, they just get better. They just, you know, they meet more people who are like them. They're able to aggregate together. They learn from each other. So yeah. So look, the founders are getting better. There's more of them, but is, is there, does that mean there's now 10,000 as opposed to a thousand?
I don't know. And there's 8 billion people on planet Earth. Why are we debating whether it's 1,000 or 10,000? Yeah. Right. And so that I don't know. I would hope over the next years and decades we'll all figure out a way to go make sure we get everybody who can do it and get them to do it. That's a good segue into AI. Do you feel that we're now at the beginning of what is like the new next important, you know,
paradigm? Like is this cloud but on steroids? Oh, yeah, much, I think much larger. And I'll explain why. So, yeah, I described before, right, you know, the triangle: people, technology, market. The technology is ultimately the driver. For venture, the technological step function changes drive the industry.
and they always have, right? And so if you talk to the LPs, you can see this. It's like when there's a giant new technology platform, it's an opportunity to reinvent a huge number of companies and products that have now become obsolete and create a whole new generation of companies, which generally end up being bigger than the ones that they replaced. And so the venture returns map this, and so they come in waves. And the LPs will tell you it's just like, yeah, there was the PC wave, the internet wave, the mobile wave, the cloud wave. Like that was the thing. And then by the way, in venture, when you get stuck between waves, it's actually very hard.
Right, because you've seen this for the last like five years. Like for the last five years, it's like, how many more SaaS companies are there to...
found? Like, we're just out of ideas, out of categories. Yeah. Right. And so it's when you have a fundamental technology paradigm shift that gives you an opportunity to kind of rethink the entire industry. It would have been very sad, by the way, if the AI breakthrough didn't happen. Like, the state of venture would be sad. I think three years ago, this was, I mean, so when we were talking to our LPs three years ago, we were just basically like, you know, so Chris Dixon has this framing he uses: in venture, you're either in search mode or hill-climbing mode.
And in search mode, you're looking for the hill. And it was search mode. Right. And three years ago, we were all in search mode. And that's how we described it to everybody, which is like, we're in search mode and there's all these candidates for what the things could be. And AI was one of the candidates, right? It was like a known thing, but it hadn't broken out yet in the way that it has now. And so we were in search mode. Now we're in hill climbing mode. Thank goodness, yeah.
Big time. Yeah. And then, you know, look, like I said, on the technology breakthrough itself, I think a year ago, you could have made the argument that like, I don't know if this is really going to work, because LLMs, you know, hallucinations, you know, it's great that they can write Shakespearean poetry and hip hop lyrics. Can they actually do math? Can they write code? And now they obviously can. And this, I think for me, the turning point, the moment of certainty for me, was the release of o1. So o1 from OpenAI, the reasoner, and then DeepSeek R1. And those happened kind of back to back.
And the minute those popped out and you saw what's happening with that, um, and the scaling law that was around that, you're just like, all right, this is going to work because reasoning is going to work. And in fact, that is what's happening. Like it's, it's, it's, you know, and, and I would say, just say every day I'm seeing product capabilities. Yeah. You know, I'm seeing new, new technologies. I never thought I would live to see like really profound, um,
I actually think the analogy isn't to the cloud or to the internet. I think the analogy is to the emergence of the microprocessor. I think this is a new kind of computer. Being a new kind of computer means that essentially everything that computers do can get rebuilt, I think. So we're investing against the thesis that basically all incumbents are going to get nuked.
Yeah. And everything is going to get repurposed. Just across the board. Just across the board. Now, we'll be wrong in a bunch of those cases because some incumbents will adapt. But power law, the things that are right will be super right. Will be super right. Exactly. And then look, the AI makes things possible that were not possible before. And so there's going to be entirely new categories. By the way, is your mindset there that you should just bet on? Like, obviously incumbents are going to win some percentage and startups are going to win some, but it's basically the dominant strategy as a venture capitalist to just
to bet that startups are going to win it all and go for the power law? Yeah, that's right. That's right. And again, the reason is, remember, we have two customer sets. The way the LPs think of us is as complementary to all their other investments. Yeah. And so our LPs all have like major public market stock exposure. Like they don't need us to bet on...
incumbent healthcare, you know, whatever company, right? They need us to fit a role in their portfolio, which is, you know, to try to maximize alpha based on, you know, based on disruption. Yeah. And then again, there's just the basic math of venture, which is you can only lose one X, you can make a thousand X, and you just like slam that forward as hard as you can. So when you have a moment in time worldview like this, do you,
You know, as a firm leader, do you give a directive that's basically like, hey, everybody, we need to deploy in this kind of way right now? Or do you just build a system that's always picking birds out of the flock from like the bottoms up and you just like, well, they're smart. They're going to see that every opportunity is good. Like how much is it like a top down system?
versus, you know, the market's just obviously good all around. Yeah, so we don't do, like I said, we don't do top-down investment decision-making. So Ben and I aren't sitting saying, you know, we need to invest in category X or we need to invest in this company versus that company. And we don't run, we have a legal investment committee, but we don't run a process where they come to us to get approval.
Because you're letting the leader of each group sort of make those changes. Yeah, and often in those groups, it's actually delegated for the individual GP or check writer. And the reason for that is we just think that the knowledge of knowing what's going on and which one's likely to win is going to be focused in the mind of the person who's closest to the specific thing. But do you have like a risk slider? Are you like, hey guys, let's get a nine right now? So this is the funny thing. So venture is the only asset class in which the leaders of the firm are in...
position of trying to get the firm to take more risk, not less risk, on a regular basis. Because for anybody who's in an existing business, there's a natural organizational incentive to try to reduce risk, because you just want to like hold on to what you have and not upset the apple cart. And so Ben and I are generally on the side of like, take more risk.
One of the applications of this is an old Sequoia adage, which is they say, when in doubt, lean in. So, for example, you see this, I'm sure, when you do it. It's just like, okay, there's this thing, there's this company that is potentially very interesting, but there are these issues, right? And it's just like, it's too early and this and that, and this weird guy's got a weird background, and this, that, that, and he's in a, you know, whatever, I don't know, issues, and, you know, there's a hair. Yeah. You know, there's hair on the deal. Yeah.
There's no hair on the GP. That's funny. That's good. But there's hair on the deal. The founders tend to have really good hair. But there's hair on the deal, and it's just like, all right, like, what do you, how do you calibrate that? Right? And again, the history of venture is when you see something that's very promising and there's a lot of hair on it, sometimes when you invest, it's going to go to zero because the hair is going to kill it. And then sometimes when you invest, it's going to be,
But it's like something where you're like, I love that. I hate that. It's much better than, yeah, everything's fine. 100%. And the way we describe this is invest in strength, not in lack of weakness. Or another way to think about it is it's not good versus great. It's very good versus great. Differentiating good from great is very straightforward.
differentiating very good from great is actually very hard. And again, the risk reducing way to try to do that is, as you kind of alluded to, would be kind of the checkbox thing, which is like, very good team, very good market, very good this, very good that. And then you have this other one where it's like, they've got six great things and nine like horrible things, right? Yeah. Okay. Which is the better bet? Totally. Usually. Yeah. Usually it's the thing with the greater strengths. Statistically, by the way, this shows up in the return data from the LPs, which is the top decile firms have a higher loss rate
than everybody else, which in baseball is called the Babe Ruth effect: the home run hitters strike out more often. Yeah. So the top performing venture firms statistically tend to have a higher loss rate than the mediocre firms. Right. And it's for this reason. They're willing to invest in the thing that just looks completely nuts, but has that magic something. Yeah.
And so when Ben and I think about trying to get the team to take more risk, it's almost always, it's basically either that kind of thing, which is like, look, and what you're doing is you're telling the person closest to it, go with your gut. If your gut tells you there's something magical here, like go ahead. It's okay. Cause we're going to have some losses. So it's okay to make the bet. If it breaks because of the hair, that's fine.
But then the other form of risk we try to take, and I do this a lot, is just, you know, I am constantly trying to push the firm to go earlier. Yeah. Right, because again, as we discussed earlier, the natural inclination is to wait, right? And it's like, no, no, no, go earlier. Like we do actually want to make these, you know, we'll make some seed bets, but we definitely want to make like a lot of A bets.
And again, we're going to lose a bunch of those. We're going to screw those up and miss the winner or whatever. But like we have to do that because we have to get into some of these things early. We have to, you know, get the level of ownership percentage you get in the A. Yeah, I mean, I guess there's risk that's of the flavor of like do things that are more asymmetric where there's hair, but also brilliance. There's also the flavor that's just like,
Well, sometimes something I struggle with is the deals where I just barely said yes and just barely passed. I'm like, I don't actually have that much confidence that I can tell the difference between those. There's another flavor of sort of be more aggressive, which would just say, like, just do a higher percentage of those ones where you're like right on the line. Do you give that kind of guidance? Do you think like that, too, where you're like, it's not just do the more out there things and we're swinging for the fences, but it's also like, let's just do a little bit more right now in general. Yeah. So we used to run this process we call the anti portfolio.
The shadow portfolio. And so the shadow portfolio was, we used to track this statistically for like the first five years, exactly on this point, which is every time we pull the trigger on an A round, let's put in the shadow portfolio the other company we were looking at around the same time that we didn't end up pulling the trigger on. And then let's build up, like, the representative Earth 2 portfolio. I'm so curious. How'd it do?
Well, so the good news is it turns out generally that the main portfolio did better than the shadow portfolio. But the shadow portfolio was close. It was a good book. Did really well. Yeah. Right. Exactly the point. And so and then you're OK. So then you're just like, OK, you're not that smart. But you're just like, OK, obviously, what does that mean? It means do them both.
Right. And again, this goes to the thesis of like, how big should these firms get? It's just like, well, if you had the opportunity to do both the portfolio and the shadow portfolio, you should do them both. What the constraint on that is, as we discussed, is complex. Yeah. But generally speaking, you should try to do both. Yeah. And by the way, this is the, I don't know if it was Josh or the other podcast where they were talking about this, but
I saw a reference to like a statistical analysis of like win rate or whatever, percentage returns or whatever, or percentage of wins. It's just like, in venture math, it doesn't matter. The thing that matters is: were you in the next big thing as early as you could get in, and did you buy as much as you could? Like that's the only thing that matters, because if you don't do that, you miss out on the thousand X gain.
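The power-law math Mark is describing can be sketched as a toy calculation. The numbers below are purely illustrative assumptions, not real fund data: the point is only that one outlier multiple swamps any number of 1x or 0x outcomes.

```python
# Toy model of venture power-law math (illustrative numbers only).
# A fund's net multiple is dominated by its biggest winner, so a
# higher loss rate is fine if it buys exposure to the outlier.

def fund_return(multiples, check_size=1.0):
    """Net multiple on a portfolio of equal-sized checks."""
    invested = check_size * len(multiples)
    returned = sum(m * check_size for m in multiples)
    return returned / invested

# "Safe" fund: 20 checks, zero losses, everything returns 2x.
safe = fund_return([2.0] * 20)

# "Swing" fund: 20 checks, 15 go to zero, 4 return their money,
# and one is the 1000x outlier.
swing = fund_return([0.0] * 15 + [1.0] * 4 + [1000.0])

print(safe)   # 2.0
print(swing)  # 50.2 -- the zeros and 1x outcomes wash right out
```

Even with a 75% loss rate, the swing portfolio returns roughly 25 times the safe one, which is the arithmetic behind the "Babe Ruth effect" in the LP return data.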
the one X losses don't matter. They wash right out. Yeah. And so this idea that somehow there's some like virtue to being like, you know, small, you know, we only make a few bets, we have a higher percentage. It doesn't matter. Yeah. I'm glad people think that. I would like to encourage people to think that that's a virtue that they should shoot for. It seems like it's very hard to assemble lots of,
you know, very good productive GPs into the same firm. It's just objectively rare. Yeah, that's right. You've done it, but it's like doesn't happen very often. Yeah. Do you, I guess my first question on this is, do you think of,
just finding greatness and then you can't really teach it much, you know, so you're basically just going to like hire people and see how it goes? Or do you think that it's about creating the system and conditions in which people do great work and you can actually create good investors? Yeah. So I think it only works if there's a point, like if there's a reason why you would have the aggregation of GPs in the first place. And our answer to that is power, right? Our pitch to GPs as to why they should join us as opposed to go to a smaller firm or start their own thing is,
If you come here, you just like plug into this engine that's just like massively powerful. And so everything that you do, the effects of it are going to just be like blown completely out. It would be much more satisfying and you're going to be able to actually help the companies a lot more. And you'll probably see more companies anyway. Yeah. So everything probably gets better. Yeah, that's right. That's right. And by the way, you know, some people want to have colleagues. Some people don't want to have colleagues, but some people do want to have colleagues, and you'll be working with people you like and, you know, who care about the same things you do. So, but there has to be a point to it. And of course it's, you know, it's on us to keep proving that, right? Because, you know, the devil's in the details
of whether they'll actually buy that. But so far, a lot of really great people have. And then, yeah, and then the second part of the question is like, okay, who do you put in those roles? Historically, our old model was basically we only hired established GPs. We were not developing them, and we could go through why that was the case. We changed that like eight years ago. We now develop our own GPs. We've evolved to where I think that's working quite well.
I think the answer to your question, it's a two-part answer, is there's some level of just objective, you know, are they good at doing the job? Here's a big thing we focus on when we evaluate them, which is, you know, it's fine to invest in a category like five years early or like whatever, something goes wrong, like that's fine. What's not fine is you invest in the wrong company when you could have invested in the right company. Yeah. Like at the moment you made the investment, you made the wrong decision in that moment
of which one you should invest in and you could have known. And so it's like, did you do the work to fully address the market? - How do you handle the fact that like you don't know that until like six years later and now you're going back and you're like, hey, you made this mistake six years ago, this isn't gonna work out now. - So it's generally, so that is a giant problem. And I would say that when we started actually, when we talked to our friends in the business, what they said basically was, they said, number one, you don't know if somebody is a good GP for 10 years.
'cause you don't have the return data. And then they said number two is nobody ever wants to admit that they made a mistake. And so they never actually fire anybody. So what they do is they just keep them on the masthead and they just kind of gently like, you know, retire them out. They sit and pollute. One of the guys running one of the big firms 15 years ago told me he said they hired a partner. He said they hired a partner, it's an older firm. So they hired a partner in 1984.
who was like a big deal at the time in the industry, and you know, the LPs were very fired up about it. And he said he then proceeded to just like nearly ruin the firm over the next 20 years. That's crazy. Because he said all of his investments were bad, but then it was even worse: he talked them out of all the other good investments, they recalled. And he said, we couldn't get him out, you know, the reputational damage was too great. So
So this is a long-running problem. And then by the way, a lot of these firms are partnerships. The problem with the partnership is, a partnership sounds good, but you end up with lots of internal dissension and then you can't make decisions. So this is a big issue. I guess what I would say is, like, for example, the thing I talked about, what I just described is a process issue, not an outcome issue, right? Which is like, are you doing the work? Right? Like it's an actual job.
Like, are you doing the work? If you're not doing the work, it's relatively clear you're not doing the work. And you're probably not doing the work, not just on one thing. So you do try to really look at the inputs? Oh, yeah, very much so. We evaluate the inputs just as much as the outputs. What do you do as an investor? I'm sure you've had this at some point where the inputs are not particularly good.
They hit this one outlier thing. The outputs are objectively now good. And so you're looking at that situation or the inverse. So this is the other part of it. The other part of it is I think there's just a subjective criteria for venture, which is just, are you good at it? Yeah. And like, do you have taste? Yeah. Which is unquantifiable. This is one of the nice things about your model too, where like you, somebody gets to make a call versus in these partnerships, I think it would be very hard.
when nobody gets to make calls like this. Because at some point, someone has to just like make a determination on this stuff. Yeah, that's right. And then even, you know, and even who even made the call, you know, gets lost. Yeah, so I think there's a taste thing. And then look, I think there's also just like a network cohort branding thing, which is these startups come in waves. And it's not just new technology, it's also new people. And they, you know, these new scenes form and like, are you in the scene or not? Right? And if you're not in the scene, like...
Yeah. I can't fix that for you. There's also a ton of path dependence, it seems like, where like you make an investment that gets you in the scene. Now other founders want to work with you because you invested in this really cool company. Right. And then it just snowballs and you're like, well, I can't go back and, you know, change history and get you into the snowball. Yeah. Yeah. Like, and again, this is what I was going to call the test with a capital T. So it's just different versions of the complaint. Right. So you brought up the one of the founder who's like, well, I could have done this, but I wasn't in a position to do it. Right. That's your own fault. Yeah.
There's another version of it, which is this is sort of the anti-VC narrative: these VCs are so arrogant, they don't see my unique genius. Right? Right. You know, the critique they always apply against Paul Graham is, you know, he wrote this post on pattern matching and he always gets attacked. It's like, you know, he pattern matches, he's not looking for quality, he's just looking for pattern matching, and, you know, founders don't match the pattern. It's very important for founders to understand: raising money from venture capitalists is the easiest thing you will ever do as a startup founder.
We are sitting here with checkbooks waiting to write checks. We are dying for the next person to walk in the door and be so great that they convince us to write the check. We don't care where they come from. We don't care what country they're from. We don't care. None of it matters. It's just like, do they know what they're doing? Are they going to be able to do it? We're just dying for that. Everybody else they're ever going to deal with, candidates and customers and downstream investors and everybody else is going to be much harder to deal with than we are.
And so if they can't pass the test of raising money, like...
they're not going to be able to do it. And it's the same thing with a GP. Like, if you can't network your way in and make good investments, that's the job. Totally. Okay, on that point, because there's going to be, I completely agree with what you just said about how it's, you know, the easiest part of building a company. There's going to be a lot of, you know, frustrated founders hearing that who are like, why can't I raise, you know, what's going on here? One of the things that I'm really, you know, you've done this for enough time now. When founders, you know, get a pass note,
it's usually about something that's related to the market or the product or whatever. And a lot of times it's what you just said, which is that like, I just want the founder to be great. Right. But nobody says that. Nobody says that. And so they don't get the actual feedback. And so I guess this whole dynamic of like, people aren't giving it, because, you know, what they're saying is not you're not great, but it's I didn't perceive you as great or something like that. Is there some way for there to be a more honest,
useful back and forth around this? Or is it just one of the impossible structural things and founders just have to go around frustrated that people are saying the market's too small or it's too big or whatever. And really what it is, is they're just not landing as great. I mean, it's like, yeah, I mean, I know you think your baby's beautiful, but I think he's really ugly, right? Yeah, yeah.
You know, this kid's going to have a really hard time in life, man. He's really, really unattractive. And it's really hard. It's really difficult. And by the way, you embedded two things in there. One is like, you know, one is do they come across as good, which in theory is fixable. But the other is like, yeah, some people are better than other people at doing this. Definitely. And some people should not be starting companies. Some people should actually just like be on a team. Yeah, sometimes it's a correct assessment. Sometimes it's an incorrect one. There are some people who in the early days can't raise. You know, there's a lot of great people who now we all know are really great, but they couldn't raise a lot of money. So they must have shown up in 60 VC meetings as not great or whatever.
And look, VCs make, and again, yeah, exactly. It's like, we don't know. Yeah. We make lots of mistakes of omission, you know. And like I said, most, even the great VCs most of the time are screwing up. Yeah. And so that's all true. The thing I always tell founders is, Steve Martin was asked this question about becoming a great standup comic, and he wrote this whole book, a great book called Born Standing Up, where he talks about this, and he says the secret is: be so good they can't ignore you.
Yeah, if your business gets good enough and you prove that you're really good, you don't have to show up in the one hour with a VC. It's very impressive. You just proved it on the field. We're dying for people to come in and just be like, wow, right? And just be like, I cannot believe how good this is. I can't believe how good this product is. I can't believe how much the customers love it.
I can't believe how much this person has gotten done on a very small amount of money. So it's the exact same thing if I'm a talent agent. I'm just dying for the young comedian to get up on stage and make me laugh. I also think the founders who like really struggled to raise a round or two and then the business got working, I think there's a real strength that comes out of that. So it's not the worst thing that ever happened.
Yeah. No, no, no. Like having said that, like there's breakage along the way. Like there are. Yeah. Also it sucks. It's like really unpleasant. Yeah. I hated it. It sucks. Yes. Yeah. So, but like, you know, like I just say, like, having been a founder, it's an incredible privilege to be in an industry and in a world and in a country at a time
when you can actually do this. Yeah. Like, so, you know, in most of history, in most places, you just, this kind of thing can't happen. And then, you know, we are genuinely trying to find the anomalies, right? Like, our business is defined by anomalies. It is true. The thing you said about it's like an audience that wants to laugh. It's totally true. So desperate. Yeah. I can't wait for somebody to finally tell a good joke. So on AI, I want to talk about not just the startup side, but maybe like a
Just some of your takes on like the broader lens of AI. I guess my first question is around AI going wrong. And I know this is like a very hard thing, but I'm just sort of for fun, really curious what you think. You know, the downside case that people are very afraid of would be something like AI embodies humanoid robots, and now we have a Terminator situation on our hands. It gets agency. We have a big problem. You know, that's one end of the spectrum. The happy path is that it's just like the
the sickest software that anybody's ever seen. And like, it's a tool that humans use and everything's great. Do you think about this? If so, do you have any opinion on it? Or are you just like, it's going to be what it's going to be? I'd start by saying it's an important new technology. Any important new technology is what they call dual use. It can be used for good things. It can be used for bad things.
The shovel: it can dig a well and save your life, or you can bash somebody over the head with it and kill them. Fire. You know, the computer, the airplane. The airplane can take you on a most marvelous vacation with your new spouse. It can also bomb, you know, Dresden. Right. And so it's just, I mean, atomic power was the big one, because atomic power could be unlimited clean energy for the entire world or it could be nuclear bombs.
Right. As it turns out there, we just got the bombs. We didn't get the unlimited clean energy. And so like that's just like generally true. These things are double edged swords. The question is like, all right, like what are you going to do about that? And are you going to like somehow put it back in the box? Are you going to somehow like try to constrain it and control it?
The nuclear example is really interesting because there was a very big concern around, obviously, nuclear weapons, and then a kind of big moral panic developed around nuclear power. I mean, we kind of messed up with that. We very badly messed up with it. And what happened was the green movement in the 60s and 70s created something called the precautionary principle, which the same kinds of people are now trying to apply to AI,
which basically says, unless you can prove that a technology is definitely going to be harmless, you should not deploy it. And of course, that literally rules out everything, right? That's just like no fire, no shovels, no cars, no planes, no nothing, no electricity. And so, and that is what happened to civilian nuclear power, which is they just, they killed it.
The story I tell on that is President Nixon in 1971, the year I was born, he saw the oil crisis coming in the Middle East. He declared something called Project Independence. He said America needs to build 1,000 civilian nuclear power plants by the year 2000 and go completely clean, carbon zero,
completely electric, cut the entire, you know, they had electric cars 100 years ago. So it's just obvious you just cut over to electric cars at some point. And basically we need to do that. And then we're not entangled in the Middle East and we don't need to go, you know, do all the stuff there. He then created the EPA and the Nuclear Regulatory Commission, which then prevented that from happening.
They absolutely killed the nuclear industry in the U.S., right? And then the Germans are going through the new version of that with Ukraine, which is they keep shutting, you know, Europe ex-France keeps shutting down their nuclear plants, which just makes them more dependent on Russian oil. And so they end up funding the Russian war machine, which invades Ukraine. And then, you know, they're worried now it's going to invade the rest of Europe. And so you
The social engineering, I would say the moral panic and then the social engineering that comes out of this, the history of it has been quite bad, like in terms of its thinking and then in terms of its practical results. Yeah. I think it would be a very, very, very big mistake to do that in AI. To like regulate early. Yeah, yeah, yeah, absolutely, 100%. To try to offset the risks and, in doing so, cut off the benefits. So start with that as number one. Number two, I just say, look, we're not alone in the world. And we knew that before, but especially after DeepSeek, we really know that.
And so there is a two horse race. This is shaping up to be the equivalent of what the Cold War was against the Soviet Union in the last century. It is shaping up to be like that. China does have ambitions to basically imprint the world on their ideas of how society should be organized, how the world should be run. And they obviously intend to fully proliferate their technology, which they're doing in many areas. Yeah.
And the world, you know, 20 years from now, is going to be running on Chinese AI or American AI. Like those are your choices. You think that's how it'll basically play out? Yeah. Yeah. It's going to run on one or the other. How will that play out? Like, let's say it's one or the other. So AI is going to be the control layer for everything. So my view is AI is going to be how you interface with the education system, with the healthcare system, with transportation, with employment, with the government, with law, right? It's going to be AI lawyers, AI doctors, AI teachers. Okay. Yeah.
Do you want your AI teacher, you want your kids to be taught by Chinese AI? Really? Yeah. Like they're really good at teaching you Marxism and Xi Jinping thought. Like, you know, there's another way to put it. It's the culture's in the weights. Yeah. Right. And so like how these things are trained and like who they're trained by like really, really deeply matters. And so, and by the way, this is already an issue in lots of countries because they're like, number one, they may not want Chinese AI, but number two, do they want, you know, super woke Northern California AI? Right.
It's another open question, right? So there are big questions on this. And so I just think like there's no question, like if you had a choice between AI with American values versus the Chinese Communist Party values, I mean, for me, it's just crystal clear where you'd want to go. By the way, there's also going to be direct military. There's a direct military version, national security version of this, which is, okay, do you want to live in a world of all CCP controlled robots and drones?
and airplanes and cars. I mean, is that really what you want? Warfare and defense, I guess, just is going to fully go AI over the next 20 years or something like that. I think that's very much true. And I think this, you know, robots plus AI, basically. There's this, you probably saw, the Ukrainian attack on the Russian airplanes.
So those are autonomous drones, and then they were doing AI targeting of the right structural points to be able to attack the planes and destroy the planes. And so yeah, 100% that's happening. This is a major issue with our defense doctrine with respect, for example, to a potential invasion of Taiwan. Ukraine has been fielding AI-piloted jet skis. So they take a jet ski, put an autonomous pilot on it, and they strap it with explosives. And you could send out 10,000 of those against an aircraft carrier.
Right. And by the way, you could just keep sending them, right? Because there's no loss of life, you just keep sending them until you get through. And so, yeah, so I think the entire supply chain, the entire defense industrial base, all the doctrine of warfare, all changes, you know. The idea of human beings in planes or on submarines just doesn't make any sense.
it's all going to change. And then the symmetry or asymmetry between defense and attack is going to change. You used the word dual use. And obviously with like previous technologies, you know, they got used.
At some point, I'm wondering, does it blend from getting used to being the user? Like if like a business, a benign business example would be if you could tell an AI, hey, I want you to, you know, hey, prompt, I want you to build me a software company, you know, make it roughly do this, serve these users and run that for the next five years and just wire me the money to this bank account. Go.
And if that worked at some point in the middle of those five years, is it doing its own thing? Are you telling them what to do? Does that also happen in a warfare scale? And I guess that's maybe the thrust of, to me, where it turns into something more
particularly when you get into, you know, the embodied version in warfare where it's just like, you know, the prompt is like, hey, just, you know, fight this war for the next year or something like that. Yeah, that's right. So the good news, the domestic version of it is straightforward, I think, which is we have...
You know, U.S. law, Western law, has a concept of responsibility, accountability. If you use a machine to do something, it legally is your fault. That's your problem. But by the way, if the machine goes wrong for reasons not having to do with you, then it's a manufacturing issue, a product liability issue. The manufacturer is liable. But if you use it,
you know, if I buy a shovel and I bash you over the head with it, right? It's my, you know, yeah, the shovel killed you, but like I'm to blame. And so I think that your example of the autonomous corporation, I think legally, the legal system is perfectly prepared to deal with that, which is, yeah, you, that was, it was your bot. You set the whole thing up. It's your fault. And so there's a natural, there's a natural constraint. I think there's a natural constraint on that.
The most obvious version of the military version of the question is autonomous targeting and trigger pulling. Right. And this has been an issue in drone warfare for the last like 15 years, which is, is there a human in the loop on pulling the trigger? Right. So predators flying overhead, sees the bad guy. Okay. How is the decision made for the predator to launch the missile on the bad guy? And by the way, the way that worked for a very long time was it actually had to be an Air Force combat pilot who would actually pull the trigger on the drone.
very specifically. Even if he wasn't otherwise responsible for like operations of the drone, you'd still get somebody whose job it was to make those decisions in the loop. There are a lot of people in the defense field who are like, it's absolutely mandatory that in all cases it is required for the human being to make the kill decision. And maybe that is the correct answer. There's a very powerful argument as to why that should be the case because it's the biggest decision that anybody can make and
even if you don't believe in like the Skynet scenarios, just the idea of a human being not being responsible for that decision sounds ethically, morally very scary. There is a counter argument, which is human beings are really, really bad at making those decisions. Right. And so it's the self-driving cars thing: if it's safer than a human driver, then like, you know, yeah, there will be accidents, but there's fewer. Correct. And so every postmortem
of any combat situation that you read, or any war later on, you discover all these shocking things. So one is friendly fire. Like there's just huge amounts of death caused by friendly fire, people shooting at their own troops, because they're confused. Number two is, you know, fog of war. It's just like, it turns out the commanders have very little idea what's going on. They had some battle plan and it immediately goes sideways.
They literally don't know what's going on. They don't have the information to be able to make decisions. Everything's confusing. Number three, the physiological impact of stress. It's one thing to be on a shooting range making these decisions. It's another thing to be like, you know,
have like a severe leg wound coupled with, you know, adrenaline, you know, overload coupled with two hours of sleep the night before. And like, is the human, is even the highly trained person making the decision right? Yeah. And then there's just like a more basic thing, which I think this is like a World War II retrospective. It's something like in a lot of combat situations, it was estimated only like 25% of the soldiers even fired their rifles. Wow. Like just generally a lot of people just like don't,
act. Right. And so anyway, the more you look at this, you're just like, wow, the human being is actually really bad at this. Yeah. And then there are all these other issues around collateral damage, you know, accidentally shooting civilians. And so, yeah, you're back in the self-driving car situation, which was like, all right, if you knew you could get better outcomes by having the machine make the decision: better, safer, less loss of life, less collateral damage. And so, I would say, I don't believe I have an answer to this, but I think that is a very fundamental question. I guess this kind of actually feeds into the
The next topic, which to me is I think like
tech has now gotten to a place where with the government and politics, like it's sort of now undeniable. It used to kind of be an underdog. But now for reasons like this and a bunch of others, it's just like too important to like not be in the mix at like the national stage now, which I think has really like changed the dynamic even insularly for Silicon Valley. Because now, you know, people are, you know, looking at what people are doing, not just like in tech, but pretty broadly now. Yeah, that's right.
Yeah. So I would say I deeply agree with that. I believe it is mostly our fault. Like the current situation is mostly our fault in tech. There's an old Russian, old Soviet joke, which is: you may not be interested in politics, but politics is interested in you. And so I think we, and I would include myself in this, I think we all got complacent, or a lot of us got complacent, between like 1960 and 2010. We basically just said, we could just sit out here, we can do our thing, we can talk about how important it all is, but like, these are never going to be big social or, you know,
cultural or political issues, and we can just kind of get away with not being engaged. And then, for all the reasons we've discussed. You're saying, and then once it was undeniable, we weren't prepared? And then we weren't prepared. We weren't even remotely prepared. To use a metaphor, we were the dog that caught the bus, and the dog is being dragged behind the bus, tailpipe in its mouth, doesn't know what to do with the bus. And look, geography has a lot to do with this. We're 3,000 miles away. It's just hard to get there. They don't come here very often.
And yeah, so I guess I would say like it worked. Like we actually, we always wanted to build important things. We actually are building important things. There are obvious political, cultural, social consequences to them. If we don't engage, nobody's going to. And then by the way, the other thing I'll say is, you know, it's not like there's unanimity even in the industry on a lot of these issues, right? And so there's, you know, I would say two giant divisions right now, big companies versus small companies.
Yeah, they often do not have aligned incentives right now, and aligned agendas. And then the other is, just on AI, obviously there's a big dispersion of views even within the industry. I guess this probably goes to why it's important for at least some VCs to have relationships with the government, because big tech has the resources to do it themselves, and small tech can't. And so if this is the state of the world, we actually
as an industry need somebody to be doing it on behalf of little tech. Yeah, that's exactly right. That's why we're doing what we're doing. Yeah. On media in particular. I thought it was really interesting. I can't remember how many years ago, but
Balaji, many years ago, started talking about some fracturing, about how the sort of relationship between tech and the media was going downhill. I think this was mostly talking about media inside tech, but probably also at the major publications and at a larger scale. From my read, as often, I think this was right. And from where I sit, it seems like it did kind of continue to
degrade the relationship. What's interesting to me recently is I've seen a little bit of life, you know, in the sort of tech publication stuff, but it's actually been from the inside. And so like Eric, who you just brought on as GP is awesome. And he's been really good at doing this. TBPN is really cool. And I don't think I've seen something like that pop up maybe ever.
inside tech. What's your read, I guess, within our bubble of like the sort of tech media relationship and where it's been? So my background in this is I, you know, I have a weird kind of history because of what happened in the 90s. But, you know, I started dealing with the national press and the tech press, business press in 1993, 1994. And I did an annual press tour to the East Coast, you know, probably a week out of each year.
usually in the spring. And, you know, what that means is you kind of go around and you meet with all the publishers, editors, and reporters, covering everything. And I would say the stretch from '94 to 2016 was generally, I thought, a quite healthy, normal, productive relationship. You know, they would do investigative reporting and they would run stories I didn't like, but generally the major publications in each of those categories were trying to understand what was going on, and were trying to kind of be, you know, honest brokers and trying to
you know, kind of represent what was happening. And so the meetings were super interesting. They always wanted to learn. They always had tons of questions. They were super curious about everything that was happening. That was great until 2016. It was the spring of 2017 that I went on the press tour, and it was like somebody had flipped a light switch. They were, across the board, unbelievably hostile. Like completely, a 100% sweep. Do you know why? Absolute hostility. I think the obvious answer is Trump. Trump got nominated and then got elected, and then they blamed tech for both of those. By the way, there are a bunch of other factors.
There's actually a business side to it, which is: there was the fear in the '90s that the internet was going to eat the news business, and it actually didn't happen. 2015, I think, was the best year in history for newspaper revenues. Yeah. And then after 2015, social networking went big, and their businesses started to collapse, and they started having lots of layoffs. So that didn't help. Yeah. And then, look, they would say, hey, smart guy, that's also when you started doing all these things that actually matter more, right? And so, you know, everything we've been discussing, like
the tech industry changed. And so you're going to get a different level of scrutiny, because you deserve it. You're doing different things now. But the political thing was just a giant swamping factor. And, you know, I don't want to get into the politics per se, but if you just,
You know, this whole thing ran in parallel with everything that's in Jake Tapper's book. So they got locked into a mode of interaction that just became very polarized. Yeah. Very polarized, very lockstep. And from the outside, you read it and you're just like, wow, these people are all really wrapping themselves around an axle. I think one of the other hard things is, as the truth has become more accessible to
other people, you more often see something in the news that you know about, and you're like, wait, that's super backwards. And then somebody posts about how backwards it is. And now you see a clip of some major publication, and here's the truth, and everybody can tell. And it's like, okay, so should we just believe the rest of it or not? And
I think the fact-checking of the truth went way up too with social media. That's right. And I would say there, the cliche has been, and there's some truth to it, that social media is where lies spread. There are a lot of lies that spread on social media. But the other side of it is what you're saying, which I think is right, which is that the truth also spreads on social media. And so the way I describe it is: social media is an x-ray machine.
And exactly to your point, and you see this in any domain of activity right now: anytime there's a thing, and there's evidence that it's just not the way it's being portrayed, it is going to show up. People are going to see it. Yeah. There was this guy, Martin Gurri, who wrote this book called The Revolt of the Public in 2015. He was a CIA analyst who did what's called open source analysis for 30 years, which was
studying basically what was in newspapers and magazines for the purpose of political forecasting. And his prediction in 2015 in his book was that basically social media was going to completely destroy the authority of all incumbent institutions. And the way that it was going to do that was it was going to reveal through this X-ray effect that basically none of them deserve the credibility. Do you think that's kind of happened? I think that's exactly what's happening. Yeah. And I think there's...
statistical evidence that it's happening. Gallup has done an annual poll, now for 50 years, on trust in institutions, covering every major kind of institution, including the press, and all the numbers are collapsing. In light of widespread social media, what would be the correct sort of
function or role of journalism? I mean, look, I'm a believer in the original idea. I'm a romantic. I like what journalism says that it is. I would like it to be like that. I like what the universities say that they are. I would like it to be like that. I like what the government says that it is. I would like it to be like that. Which is, just to name it? Yeah. Well, for journalism, it's just like, all right, number one, tell us correctly and accurately what's happening. Well, actually, there's a conflict at the heart of the journalism question, which is that journalists say two different things.
One is they say, basically, be fair and objective. Right. And then the other thing they say is: hold power to account. Or they'll sometimes use this phrase: comfort the afflicted and afflict the comfortable. And there's an inherent, like, are you an objective truth teller? Well, yeah, I was going to say, that second one has nothing to do with the truth. It's just unrelated to the truth. Exactly. And so there was already a conflict at the heart of the industry. And there's a selection process where the people who go into journalism tend to be
critical by nature, right? They tend to want to be on the outside looking in, to be critical, because if they didn't, they wouldn't be journalists, right? And so there is an issue there. But look, do we need people to tell us the truth? Yes, we do. Do we need people to hold power to account? Yes, we do. I would like them to do that. Do you think they can be for-profit corporations? Because I think another problem is they're getting all their distribution on social media. Eyeballs are what drive the revenue. People want to, you know,
So that also is unrelated to the truth. In fact, it's antithetical to the truth a lot of times. Yeah. So there's two mentalities come out of that. One is, yeah, the profit incentive warps it and you want it to not have a profit incentive so it could be true to itself.
The other argument is: if you don't like for-profits, you're really not going to like nonprofits. Yeah. Because at least for-profits have a market test. At least there's some discipline. A nonprofit just becomes somebody saying: this is my agenda, I'm going to do what I feel like now. They go arbitrarily crazy. They go arbitrarily nuts. It does sound worse. Yes. And they're completely unaccountable.
They're completely unaccountable, right? In fact, it's the opposite of accountability because of the tax break. You are actually paid as a donor to invest in the things that are the most unaccountable. Interesting. Right? And then they can spin into like crazy land. Yeah. And they don't come back. They don't come back. Yeah. There's a history here. Yeah. They don't come back.
And so it's weird, because the citizen journalism thing is a helpful fact check. It's good to have. And sometimes it does feel like it's not quite sufficient to tell the full story on everything all the time. So I do think that there's an important role. It still feels like it's very much in limbo right now. So here is a theory that would be a reason for optimism, which is that the last eight years were basically
the human animal adapting to the existence of social media. You slam eight billion people into a chat room together, and we're not used to it, we weren't wired for it, we're not evolved for it, and, oh my God, everything goes bananas. Yeah. Marshall McLuhan, the great media theorist, actually talked about this. He had this term, the global village, which
is what happens when everybody gets networked together. And actually what people miss about it is he didn't mean in a good way. Because the nature of a village is basically gossip and innuendo and infighting and reputational destruction and civil war. Like that's what happens in a village, right? Which actually functions at a certain size. Yeah, like up to 150 people, you can kind of deal with that. You know, at the size of like New York City, it actually gets quite complicated. At the scale of the world, it's like a disaster. It's a disaster, right? But you could say, look, like we went through this eight-year period where like,
everybody went nuts, let's just say, in like a thousand different ways. But maybe that was just, we had to get used to it, right? Maybe we just had to adapt to it. And if you talk to young Zoomers now, a lot of the time what they'll tell you is: yeah, we don't take any of that stuff seriously. Yeah. Of course you don't believe what you see on, you know, whatever it is.
Yeah, which is wild. It's just all ops. Like, of course it's all ops, whatever, right? They're adapted. I'm glad people know. It's just, like, that's a crazy state of the world. Yeah, right. Yeah, exactly. Probably how people feel about the news too. Well, so this is the thing on the news, which is: was the news ever what we were told it was? And so my favorite example of this is people always cite Walter Cronkite as being the great truth teller. For you young people, he used to be on TV.
I've heard of him. I have not. He was this guy where he would show up on TV and everybody would say, oh my God, he's going to tell you the truth. He was like the voice of the truth. And the way that he built that reputation is he went negative on the Vietnam War in 1968. In 1968, he came out and he said the Vietnam War is unwinnable and we need to pull out of it. And they aired all these reports showing that. And everybody said: he's the guy who told the truth and held power to account. Well, the problem with that is the timing. He went negative on the war in 1968; he was positive on it before that. Right.
Exactly. What did he know the day before he said that, that he wasn't sharing? Yeah. And then, by the way, what else happened in 1968? The White House went from a Democrat to a Republican. The Vietnam War was created by Kennedy and Johnson, and then it was inherited by Nixon in 1968. And isn't it convenient and interesting that he went negative on it when it became Nixon's war, as opposed to being Kennedy's and Johnson's war?
And so then it's like, all right, what was actually going on there? What was happening in the preceding five years, and whose side was he actually on the whole time? And then there's just the reality of it, which is: I grew up in rural Wisconsin. We always thought the press was out to get us. We always thought the press was the coasts passing sneering judgment on the center of the country. We never believed the stuff to start with. Where I grew up, people were super resentful of the stuff in the media and how it portrays them. And so I think there's also a more fundamental underlying issue here, which is that
objective truth is a high bar. Yes. People have agendas. Yeah. Maybe we just need to get all this out on the table. Particularly in politics, objective truth is not really the frame. A lot of people are like, oh, that's a lie. And I'm like, well, it's not a lie. It's an interpretation of a situation that I wouldn't characterize that way, but sure. And these are complicated topics. Ordering a society is a complicated topic. Right. And a functioning economy is a complicated topic. It's just not so easy to understand. Right.
And so I think part of it might, the optimistic view would be humanity adapting to being in the global village is basically just taking on a little bit of a more humble attitude, basically saying, all right, look, there's not gonna be, we're not gonna have a lot of objective truth tellers running around. We're not gonna have, but also at the same time, we don't wanna be in a complete panic about everything all the time. And we need to kind of be able to, you know,
take a deep breath, touch grass, be a little bit more skeptical, be a little bit more open, be a little bit more understanding. Right. And so maybe we're starting... And by the way, I think that's happening. I mentioned that Jake Tapper, without getting into partisan politics, but the Jake Tapper book...
I happened to go to an event that he did this weekend out here. And that book, and the reaction to the book, if you watch the interviews on YouTube and the crowd response, it feels like people are just like: oh, if we just take a step back for a moment from all the intense partisanship of it all, maybe we can get back a little bit. I thought that book was a very positive step toward just a little bit of a calmer approach on these things. And then, by the way, the other book I'd promote on that is
the Ezra Klein book on abundance, which I think is, I think as a, you know, somebody who's supported a lot of Democrats for a long time, I think it's like the most positive, you know, kind of manifesto that's come out basically saying, you know, no, like we need, you know, whether you're on the right or the left, like we need to actually build things. And I think that's also a healthy moment. So sort of related to this topic, a little bit adjacent, but I saw you talking about preference falsification recently. And I think this is like a super interesting topic in general, but particularly in the last couple
I don't know, call it five-ish years, I think a lot of preference falsification became apparent. So I'd be curious first to hear a little bit about what you think happened over the last some number of years where these changes happened. Maybe we can start there, and then I've got a follow-up on it. Yeah, so preference falsification, just to sketch an outline: there are actually two different elements of it. It's when people are required to say something in public that they don't actually believe,
or they are prohibited from saying something in public that they do believe. So again, commission and omission issues. As for the theory of it, there's this great book by Timur Kuran on it. The theory basically is: it's easy to think about what this means in the case of a single person, which is, are you telling the truth? Are your public statements mirroring what you actually think, or not?
The thing that gets complicated is when this happens across a group, or across a society. If there's widespread preference falsification in a society, not only do you have people lying about what they actually think, or hiding it, but everybody also loses the ability to know what the distribution of views actually is. Yeah. Right? And he says, basically, if you look at the history of political revolutions, a political revolution happens when a majority of the country realizes that a majority of the country actually agrees with them, and they didn't realize
it. Right. So whatever system they were in had convinced them that they were in a very small minority. And then at some point there's a catalyst, like the boy who points out that the emperor has no clothes, a catalytic moment, and then basically there's what's called a preference cascade. Right. All of a sudden, the correct prisoner's dilemma box to live in flips, and everybody realizes it at once.
Yes, exactly. And he said you can see this in a crowd with a controversial speaker, where there'll be silence in the crowd, and then one brave person will start clapping. And that person is in severe peril, because if they're the only asshole standing up
clapping, that's it, they might get killed. Yeah. But if it cascades, then a second person starts clapping, and then a third and a fourth and a fifth, and then you get the snowballing effect, and then the entire auditorium is clapping. And that's everybody realizing that they actually are on the side of the majority, which they didn't realize before. Yeah.
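The clapping dynamic described here is essentially Granovetter's classic threshold model of collective behavior. A minimal sketch of that model (an illustration of the idea, not anything from the conversation): each person claps only once enough others already are, so one brave person can tip the whole room, while removing a single link in the chain stalls the cascade.

```python
# Granovetter-style threshold model of the clapping cascade: person i starts
# clapping once at least thresholds[i] others are already clapping.

def cascade(thresholds):
    clapping = set()
    changed = True
    while changed:  # keep sweeping until no one new joins in
        changed = False
        for i, t in enumerate(thresholds):
            if i not in clapping and len(clapping) >= t:
                clapping.add(i)
                changed = True
    return len(clapping)  # how many people end up clapping

# Thresholds 0..9: the one brave person (threshold 0) tips everyone -> 10.
print(cascade(list(range(10))))
# Remove the threshold-1 person and the cascade stalls after one clap -> 1.
print(cascade([0] + list(range(2, 10))))
```

The second call is the interesting one: the overall distribution of opinions barely changes, but the chain of "one more person than before" is broken, so the room never learns it agrees with itself.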
By the way, this is actually why comedy is fun, and what comedy does well: people can't control the involuntary response of laughter. Yeah, exactly. And so you get an entire group of people in a room laughing out loud at something that individually they would all swear is not funny. They can't help it. That's a great point. And then there's the stress relief from that, because they all know that they've rebonded the community, right? You're actually back to being a part of a community. And it's just such an incredibly powerful feeling. Yeah. Yeah. Okay.
So it's very easy to apply this theory to the Soviet Union, right? Or Eastern Europe in the Cold War, or Maoist China. It's a lot trickier to apply this theory to your own current society. I believe that we've lived in an era of intense preference falsification. I think the last five years, probably the last 10 years, were way more intense preference falsification than the preceding 40 at least. Probably going back to, I don't even know.
I mean, you have to go for sure back to the '60s, if not the 1920s or something, to find an analogous period. I think this period was characterized both by people saying things they didn't believe and, critically, by people not saying things they did believe. I think there are many reasons this happened. And look, this has happened many times in history, so a lot of people want to say this is caused by social media. Right. Well, when you phrase it the way that you did, it actually makes a lot of sense: if people are going to be part of this prisoner's dilemma matrix, right,
it actually just gets caused by nothing other than itself. It doesn't really need an outside catalyst for people to get into their own box. That's true. Although, I don't know, it's a good question: does there need to be some kind of oppression? Does there need to be some kind of motivation for the cascade to have started, where people end up in that box? It's social pressure. So, yeah. So,
specifically, I think the thing that happened in the last five years was... I guess it needs to be a high-stakes-enough issue for it to matter. Otherwise it's just, who cares whether you think the clouds are pretty or not. Yeah, that's right. So it at least has to be that. Yeah, and the way I think Timur Kuran would describe it is it needs to have political, social, cultural salience. Yeah. It needs to get at something fundamental about how the community is organized. We call that politics, but this predates even the concept of politics, right? And so...
And by the way, look, you don't even necessarily want to say that all preference falsification is bad, because I don't know that you want everybody out telling the truth about everything. I don't think you do. A lot of social graces come from people saying it's great to meet you when it wasn't actually great to meet them. Your baby, I believe your baby is very attractive. Exactly. So some of it's good. Yeah. But, as you say, you get wedged in this box. And so I think the specific thing that happened... So the good news is,
preference falsification in a lot of totalitarian societies was administered at the point of a gun. You say the wrong thing, they shoot you. Yes. That, for the most part, is not what happens in our society. What happens in our society is the nonviolent version: canceled, ostracized, reputation ruined,
fired, become unhirable, lose all your friends, lose all your family, can't ever work again. Still really bad. Still really bad. So, as you said, it sounds pretty bad. Very bad. And it just turned out, I think, the optimistic view would be, part of adapting to the existence of social media was that social media just turned out to be, among other things, a very effective tool
to destroy people reputationally, right? This is the social media mobbing effect. We're all familiar with it. And you think that helped create basically more false preferences? Yeah, big time. Do you think it also unwound them? Well, so this is the thing. And this is maybe the thing that happened in the 2024 election, right? Which is just like: oh, okay, we don't have to live this way anymore. Certain views become safer to say out loud. Also the censorship regime, like,
We lived under a very specific censorship regime. Even in tech, compare the 2024 election to 2016: regardless of what you think and who you wanted, at least everybody can agree that it was taboo in tech to support Trump in 2016, and it was not taboo in 2024. And so something changed there. Something changed. Peter had this great line in 2016. He said,
because he was one of the only people, maybe the only person in tech, who was actually pro-Trump in 2016. And he said, this is so strange. This is the least controversial contrarian thing I've ever done. Half the country agrees with me. I've never had a point of view on anything else in my entire life where half the country agrees with me. And yet somehow this is such a heresy that I'm the only one. So yeah, that definitely changed. And then, in general, like I said, optimistically you could just say there's a process of adaptation.
Right. Where it's just like: all right, we're just not going to live life by mobbing and scapegoating and personal destruction. Just because somebody is offended by something somebody says doesn't mean it has to destroy that person's life. You don't have to do that.
Do you think it's basically been unwound now, or do you think there are still a lot of falsified preferences? I would say it's radically different than it was two years ago. I would say there are still a lot of falsified preferences. But again, I would say that probably in any healthy society there are lots of falsified preferences. Do you have any guesses for something that is currently falsified that will become unfalsified, or is it too hard to call? Sure. Yeah. Sure. Okay, great. But it's far too dangerous to say. We'll move on. Yeah. Cool.
Gosh. But again, when you ask that, that is a very key question. Here's what I encourage. Break the fourth wall. Yeah. Here's what I would encourage people to do. Here's the thought experiment: write down two lists, at least in the middle of the night with nobody around, doors locked. Write it down on a piece of paper and let's pull it out in 10 years? Write down on a piece of paper two lists. What are the things that I believe that I can't say? And what are the things that I don't believe that I must say? Hmm.
And just write them down. Yeah. And I bet, if you're a reasonably introspective person, the quote-unquote NPCs can't do this, but if you're a reasonably introspective person, most of us probably have 10, 20, 30 things on both sides of that ledger, right? And most of those are things where, I don't know, you don't want anybody to ever see that piece of paper. Maybe five or 10 years from now, we'll be back, and everybody can reopen their papers, and it'll be safe to say whatever people wrote down at that point. Exactly. Okay. A few final topics I wanted to ask you about.
One is you're probably in a spot to be giving just sort of life or career advice to young people a lot now, both in general, but also maybe specifically with like AI and like the current
set of tech changes right now. What do you most often find yourself repeating to a really smart recent grad about what they should be doing with their career, if they get the chance to ask you that? To start with, I never took any advice. So. Advice, yeah. There's something there. But a lot of people do. So maybe. Okay, fair enough. That's like the, you know, if you could have built Facebook thing. Maybe, yeah. Maybe the best people probably shouldn't take any advice. Okay, well, the rest of us. But, um.
I would just say, especially for young people, and again, people are very different. I believe that very deeply. Some people are very happy being in the middle of chaos. Some people are very unhappy being in the middle of chaos, and they will actually get themselves out of a chaotic situation as fast as they can. Other people love chaos so much that if they don't have any, they will create it. Right? And so there's a level of self-understanding here, where not everybody should be in a high-growth,
high-risk tech company, because it might just be too nuts. Yeah. So I don't think there's a one-size-fits-all kind of thing at all. Having said that, let's narrow it to the young person who wants to be in tech. I think a big part of it is: run to the heat. Like the seed thing we were talking about: where are the interesting things happening? And that's a conceptual question. It's also a place question, a community question, a network question. Yeah.
And so run to that as fast as you can. And it doesn't mean running to the fads, but it means trying to identify. Trying to get into those hot networks or ideas or projects, basically. Yeah, yeah, exactly. And look, there's a geographic component to that. I think we all kind of wish it wasn't the case, but there really is. And AI, I think, has very successfully unwound the geographic dispersion of what was happening in tech. In a huge way, yeah. In a huge way. It's kind of slammed everything back into Northern California. I don't think that's good,
really, for a lot of reasons, but I think it just is the case. And so I would say, if you're going to do AI, get here. Yeah. And then the other thing is the Steve Martin thing: be so good they can't ignore you. Time spent on the margin getting better at what you do is almost certainly better than most of the other uses of time. The old adage that you are the average of the five people you spend the most time with is also true. So you want to pick that carefully. And then I guess what I would say is, when I talk to people about what kind of company to go to,
there are certain people who should only be in a raw startup and there's some people who should only be at a big company. I think the general advice is the, it's the high growth companies. It's the companies that we would describe as between like being between like series C and series E probably or something. Yes. Where it's like they've hit product market fit, they've hit the knee on the curve and they're on the way up. On average, that's gonna be the best place to go because you're not gonna have the downside risk of a complete wipeout usually. Yeah.
People who get into that position at those high-growth companies, if you're talented, you can pick up new responsibility very quickly. Yeah. Okay. Next is your Andrew Huberman thing that I see on Twitter. I actually can't completely parse what it is. What's going on with that? So we have a completely fake beef.
We're good friends. We're very good friends. And we're actually neighbors, neighbors in Malibu. And I've been on his podcast and, like, we're very good friends. But you don't follow his protocol. I don't do anything that he says. I don't do a single thing that he says. With one exception we'll talk about, but yeah, I don't do any of it. You know, he says maintain a regular sleep schedule. I...
You're all over the place. I'm all over the place. He says, always get up and, you know, see sunlight as soon as you can. I'm like, no, that's the last thing I want to do when I wake up is see sunlight. Don't drink caffeine for the first two hours of the day. It's like, NFW. It sounds like torture. It sounds like being in a North Korean concentration camp. Like, I can't even imagine.
You drink a lot of coffee? A lot of coffee. Hot plunge, cold plunge thing. The cold plunge is miserable. I'm not doing any of that shit. You think it's good for you though? Oh, I'm sure it's good for you. I'm just not going to do any of it. It all sounds just completely miserable. That's good. The one thing that he says that I do is stop drinking alcohol.
And I would say I am physically much better off as a result, but I'm very bitter and resentful. Towards him specifically. Why'd you do that one? Because it's much better for you physically. Yeah. It really is. It fixes sleep and energy problems. So is it the most tolerable of all of these, and you're like, fine, I'll do one? Well, no, it's completely intolerable. It's horrible. Okay. I don't recommend it. I think it's a horrible way to live. Yeah. I'd much rather be drinking alcohol.
Does he think even, like, a glass of wine at night is bad? He does, yeah. Just all of it. He did one of the great... He's actually had, I think, a big influence on the culture. And in seriousness, this is very positive, I think, at least for health. So he did this big thing on... So what happened is there are all these fake alcohol studies, basically.
It's like, red wine is all heart-protective and all this stuff. And it basically turned out that really sick people either drink a lot or drink nothing, and healthy people tend to drink a little. Yeah. Right. So one thing is that healthy people tend to be very well disciplined. Right. And then I guess, is that correlation or causation? It's all in the sample set. So it turns out there are no health benefits to alcohol. That was all completely fake.
In other words, just because healthier people drink a moderate amount of alcohol does not mean that drinking a moderate amount of alcohol makes you healthy. I see. Michael Crichton called this "wet streets cause rain." Okay, wet streets, rain. Yes. So for some reason, unhealthy people stop drinking alcohol. Unhealthy people stop drinking because they're, like, in the hospital. Because they can't handle it. Yeah, their doctor says if you keep drinking, you're going to die. Or, by the way, they drink a lot, right? And then there's this fundamental thing, which is healthy people tend to be very disciplined, right?
And discipline is a big inherent component of it. Yeah. Right. And so people who are disciplined, who drink moderate amounts of alcohol, also do moderate amounts of exercise, also experience moderate amounts of stress. Also, you know, they go to the doctor on a regular basis. They take the medication they're prescribed. They manage all aspects of their health. I guess it'll take a while to see, but it feels like it should be a good thing that...
Andrew and other people have gotten so many more people interested in health. It's good physically. Right. Yeah. Might not be good mentally. No, and I'll be funny again: it's catastrophic emotionally. It's made me a much less happy person. Do you actually think that? Well, so, alcohol is something people have been using for thousands of years. Number one, it helps you
fundamentally relax. And then there's a very important social-lubricant component to it. And the de-stressing could be healthy. Let's just say maybe it's not an accident that the birth rate is crashing at the same time. I don't think Andrew would argue you should live your life purely maximizing for physical health. That'd be a miserable way to live. I mean, what are you going to do? Just never leave the house, never take the risk of crossing the street?
And so, you know, he certainly doesn't judge people for drinking moderate amounts of alcohol. He just says, "Look, scientifically, you have to understand it is a poison." Now, having said that, speaking of scenes, as you know, the displacement thing that's happening is that people in our world are not drinking alcohol. Instead, they're doing hallucinogens.
Which is not necessarily an improvement. As you, Jack, know very well. Yes, tell us about your latest ayahuasca trip. You're so different than you were the last time I saw you. Your personality has clearly completely changed. Yeah, I do feel different. So the other theory would be that there's a law of conservation of drug use, which is that every society is going to pick some drug and abuse it. And apparently in our case, it's going to be LSD and mushrooms. It seems like a good one. Okay. Okay.
Okay. My last question. When I tweeted out a request for questions, I got almost ratioed by one question. So I'm going to ask this one nearly verbatim. It was from an anon named Signal: if you were frozen for 100 years and you woke back up and you looked around, what would...
be the piece of data that you'd want to know that would tell you whether or not your dominant worldview turned out to be correct in the fullness of time? Yeah. So I will pick a very unfashionable answer to this, and I would say United States GDP. Just straight-out U.S. GDP. Because I would say embedded in that is the question of technological progress, which is: if you have rapid technological progress, you'll have rapid productivity growth, which means you'll have very rapid GDP growth.
If you don't, you won't have rapid GDP growth. So you'll see that in the GDP numbers immediately. You know, number two would be: markets are a great way to organize, and the U.S. is the best market. So, you know, is that going to keep working? And then third is: is the U.S. going to be a great country? And you are long all of this? I am very long all three of those. I am very convicted on all three of those. But, you know, if I'm wrong about something big, it's going to be something in there, and it will show up in that number. Mark, this was amazing. Thank you so much again. Good. Awesome. Thank you, Jack.
Thanks for listening to the A16Z podcast. If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash A16Z. We've got more great conversations coming your way. See you next time.