
BG2 with Bill Gurley & Brad Gerstner | NVDA, Chips, AI Compute Build Out, Impact of AI on Big Tech, Earnings & Macro Set-up | E03

2024/2/22

BG2Pod with Brad Gerstner and Bill Gurley

People
Bill Gurley
Brad Gerstner
Topics
Brad Gerstner: This episode examines AI's impact on big tech companies and the market, along with the related questions of chips, compute capacity, and investment. He argues that AI will reshape every industry, creating enormous opportunities and challenges. He analyzes the competitive strengths and weaknesses of Google, Microsoft, Meta, Apple, and Amazon in AI, predicts where each is headed, and discusses how the macro environment affects the market and which long-term trends investors should watch. Investors, he argues, should focus on long-term trends rather than short-term volatility; he cites Booking.com as a case where many investors missed a long-term opportunity by fixating on short-term data, and notes that while short-term moves in stocks like Nvidia are unpredictable, the long-term trend is what matters. AI will open a huge market opportunity but also brings challenges such as rising costs and falling margins, and the business model for AI applications remains unsettled: subscription, free, or ad-supported. He believes "memory" and the ability to take actions will be step-change improvements for AI assistants, analyzes Apple's AI strategy, and discusses how investors should judge whether valuations for Google, Microsoft, Meta, Apple, and Amazon already reflect future growth and risk. He also covers the difficulty of building AI chip fabs and governments' need to invest in AI: unlike software, chip design and manufacturing are hard; Taiwan's model gives it a durable advantage in chip manufacturing, and the U.S. faces real challenges there. Finally, he argues AI's impact will be long-lasting and broad, and investors should watch long-term trends and use market volatility as opportunity.

Bill Gurley: He approaches AI's market impact from both a macro and a technology angle. On macro, he notes that real rates are at highs, a sign the Fed is working to restrain growth, and that Larry Summers's suggestion that the Fed's next move could be a hike is provocative: it implies the market's soft-landing expectations may be too optimistic. On technology, he echoes that AI offers a huge market opportunity alongside challenges like rising costs and falling margins, and that AI application business models (subscription, free, or ad-supported) remain unclear. He also analyzes the competitive strengths and weaknesses of Google, Microsoft, Meta, Apple, and Amazon in AI and their future directions, the difficulty of building AI chip fabs, why Taiwan holds its advantage in chip manufacturing, and the challenges the U.S. faces. He likewise concludes that AI's impact will be long and broad, and that investors should focus on long-term trends and use market volatility as opportunity.


Key Insights

Why is the cost of providing answers through AI significantly higher than traditional search results?

Providing answers through AI is more expensive because it involves generating detailed responses rather than just listing links. Traditional search results cost about a third of a penny per query, while AI-generated answers can cost up to 4 cents per query, and refining queries can increase costs up to 50 times more than traditional search.
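The arithmetic behind this answer can be sketched directly; the per-query figures below are the episode's estimates, not measured values.

```python
# Back-of-the-envelope comparison using the per-query figures cited above
traditional_cost = 0.01 / 3   # ~a third of a cent per "10 blue links" query
ai_answer_cost = 0.04         # ~4 cents per AI-generated answer (~750 tokens)
refined_multiplier = 50       # query refinement can reach ~50x traditional cost

# 4 cents vs. a third of a cent is ~12x (the episode rounds this to ~10x)
print(f"AI answer vs. traditional: ~{ai_answer_cost / traditional_cost:.0f}x")
print(f"Refined query: ~${traditional_cost * refined_multiplier:.2f} per query")
```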

What impact does AI have on Google's core business model?

AI poses a significant challenge to Google's core search business by increasing costs and potentially reducing revenue. AI-driven answers are more expensive to generate, and users may click on fewer ads, leading to lower revenue per search. Additionally, Google's market share in search is likely to decrease as competitors like ChatGPT gain traction.

How does AI enhance Meta's core business?

AI has significantly improved Meta's core business by enhancing ad targeting and user engagement across platforms like Facebook, Instagram, and WhatsApp. AI-driven tools have allowed Meta to recover from Apple's privacy changes, leading to better monetization and increased user interaction.

What is the significance of NVIDIA's role in the AI compute build-out?

NVIDIA is central to the AI compute build-out, with its GPUs being essential for training and inference workloads. The company has seen a surge in demand as the world shifts towards AI-driven infrastructure, and its data center market share has grown significantly. NVIDIA's advanced chips, like the H100 and upcoming B100, are critical for meeting the increasing compute needs of AI applications.

Why is Taiwan a dominant player in semiconductor manufacturing?

Taiwan's dominance in semiconductor manufacturing is due to its unique labor model, where workers are willing to work long hours and live in dormitories, leading to low churn rates and high productivity. This contrasts with the U.S., where higher churn rates and labor norms make it difficult to operate competitive fab plants. Taiwan's cultural and labor advantages have made it a global leader in chip production.

What is the projected growth in global data center infrastructure to support AI?

The global data center infrastructure is expected to grow from $1 trillion to $2 trillion over the next four to five years, driven by the need for accelerated compute to support AI applications. This growth includes both new data centers and the replacement of existing ones with AI-optimized infrastructure.
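A doubling from $1 trillion to $2 trillion over four to five years implies an annual growth rate you can check directly:

```python
# Implied compound annual growth rate for a doubling over n years
for years in (4, 5):
    cagr = 2 ** (1 / years) - 1
    print(f"double in {years} years -> {cagr:.1%} per year")
```

So the projection amounts to roughly 15-19% annual growth in data center infrastructure spend.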

Transcript


Would you rather have an assistant with the intelligence of like Einstein, but they have no access to the internet and they don't know anything about your history? Or would you like to have an assistant that's just above average intelligence, knows everything about you, and they can use the internet? Okay? You would choose that.

Hey, great to have you here in person. Good to be here. I remember when you were right up this hill. Yeah. You guys were over here. So I had a couple thoughts this morning. First, I got to take a ride in Full Self Driving 12. How was that? It was mind-boggling.

I think this is going to be a bit of a ChatGPT moment for full self-driving. But what it really just reminded me of was the magic of this moment: Tesla rebuilding their models for how they do self-driving around imitation learning, and all this interesting stuff going on over there.

You know, I think they probably made more progress in the last 12 months than in the last seven years in terms of what's going on there. And it's going to be rolled out here. It's already rolled out to, I think, 5,000 people. And so, like, people are going to start experiencing that. And I think we're having more and more of these moments because this substrate we're going to talk a lot about today, AI, and just the compute, like what it unlocks.

The second in prepping for this pod was how bad you make my head hurt. You know, I was thinking about this. You know, what I love about this pod is like it's a forcing function. You know, you and I talk all the time. You're always challenging me. We're always comparing notes. But now with a little bit of structure around it, every couple of weeks, we have to think about some topics. Today, we're going to talk a lot about...

AI and compute and chips and their impact on big businesses. And honestly, I liken it to an athlete, you know. They say, in order to be the best I can possibly be, maybe Kobe, I want to practice against the best. But it's like, listen, the reality is, it's like running 10 miles a day to get ready for a big game. Like, if you're in this business and you're not exhausted,

with the analysis you're doing, the thinking you're doing, particularly at moments like these, to try to gather this data and try to gather edge, then you're probably not going to end up on top of the heap. Yeah, I agree. And I think that having a topic or an idea that you want to fully flesh out and be able to talk about causes you to place a few phone calls, read a few PDFs, and before you know it,

you actually realize you've learned something you didn't know, you know, five days earlier. And to pull the screen back a little bit, I mean, you and I have talked,

You know, we've interacted five, 10 times a day over the course of the last week on these topics. And then we turn over a rock and we find more data, more information. We share that with one another. It leads to another conversation, you know, and the combined networks allow us to ask a lot of the smartest people in the world the questions we need to be asking to try to figure out this moment. So it's been a lot of fun, but it does give me a bit of a headache.

Hopefully a good one. So, we remain kind of in an earnings season. And so what's happened in the past few weeks that you think's super important? Yeah. Um,

Well, we have a few stocks that have run a lot. Meta, Nvidia up 30%, 40%, even with the pullback that we had today. But the truth is the NASDAQ hasn't really moved that much. I mean, I think we're up 3% or 4% through today. If you look at the median stock, I think it's up about 1%. In fact, I think we have a chart here just on the dispersion that we see in the NASDAQ. And so remember last year was like this risk-on moment, a mean reverting moment.

for all of technology. And this year, we're really starting to see the winners and the losers. We have some software companies that reported after the bell tonight that are down a lot because they're not seeing the AI pull forward that may be an Amazon or a Microsoft. So that's my first takeaway. My second takeaway is, you know, against these higher prices for some of these companies, you know, the backdrop looks a little bit more challenging.

So we had a CPI print that came out last week that ran a little hotter than people expected. The 10-year's back up to 4.3. Remember, at the end of the year, I think it had gotten down to 3.5. And then we had what I thought was a really provocative tweet at the end of the week from Larry Summers, where he said the next move by the Fed could be higher. Now, why is this so provocative? Well, the market is betting for sure. The only debate about the soft landing has been when is the Fed going to cut?

Right. And so you have Summers come out and say, hey, I think the next move could be higher. That would be a shock to the market. Was he being provocative or do you think there's real data that suggests that the soft landing isn't a foregone conclusion? Well, listen, Larry was spot on right in 2022. Okay.

I think last year he was a little bit too aggressive as to where he thought rates were going to have to go. At the end of '22, I think he said maybe they could have to go to 6% or 7%. But I'm humble in the face of the fact that the future is unknown and unknowable. We don't know. That's the truth of the matter. So as investors, we have to try to distribute these probabilities. And so if I go back and look at this,

Real rates, right? So real rates, this is the restrictiveness that we have in the economy. So this is effectively the interest rates we have less the expected inflation rate in the future. They're as high as they've been since the fall of 2007. And the last time they were higher than that.

was in the summer, August of 2000. What was going on August of 2000 and the fall of 2007? Well, the economy was on a heater and the Fed was trying to slow it down. So that's the level that the Fed currently has its foot on the brakes. And every month that inflation comes down, if it does, then the restrictiveness goes up.

So if inflation is coming down and the Fed does nothing, its foot goes harder on the brake. So that's why Powell has said we have to cut rates just to stay equally restrictive. So for Larry to be right, we would really have to see a reversal in inflation,

which I don't think many people forecast or see, but I think the important takeaway is this. As investors, I know, and you're already probably saying, God, how did Brad sidetrack me on macro? I don't want to talk about this. I often think about that famous saying, if you don't do macro, macro does you. But when I think about it in this moment, it's just to say stocks have run up a bunch at the start of this year. Okay.

The backdrop has gotten a little less predictable. There's now this tug of war that's going on. So I think we're going to have to see both of those things play out. And then, of course, this week, the monster that comes tomorrow, Bill, is NVIDIA. In fact, CNBC is screaming every day, whichever way NVIDIA goes, so goes the market. Now, I don't think it's quite like that. But one of the things that I was thinking about in regard to this is

because we were making a bet that AI was for real, that training workloads were going to be large, and that these inference workloads were going to kick in. And as investors, we often take what we call this private equity approach to the public markets, which is, let's get the big trends, the phase shifts, the super cycles right. I think about, you had me over to Benchmark. This is years ago.

And you said, Brad, will you come and talk about Booking.com and the case you do at Columbia Business School in the old Graham and Dodd class that I teach with Chris Begg on occasion? And the thing I try to teach the students in that class is why did all the analysts on Wall Street miss Booking.com, right? Miss Priceline. Now, remember, Priceline was a billion-dollar company in the public markets. Today, it's $120 billion.

120 bagger in the public markets. I mean, there aren't many venture capitalists that ever get 120 bagger, let alone a public market investor. And the takeaway in the class is everybody and all the analysts on Wall Street were so focused on how many hotels were they going to add in the quarter, right? And there'd be a lot of volatility around the number of hotels added in the quarter. Nobody really took the time horizon to say in 5, 10, 15 years,

how much more of the offline world was going to book their hotels online and how much bigger that was going to be. So often with the short-term trading, they would get the long-term conviction right, but they would end up trading out of the position. So I look at NVIDIA tomorrow, and the honest-to-God truth is we have no edge on a quarter or on day-to-day trading of these things.

I think we do believe that the amount, and we're going to talk about this later, the amount of compute that's going to have to be built in the world is way bigger than consensus estimates currently forecast. But I think tomorrow it's going to be really interesting. What could really move? I mean, they're sold out, right? And their production's known. So it's just-

Pricing that could be different, correct? Well, I think, so every hedge fund, every long-only person, they track all this data, right? The CoWoS data, the order books, the H100 data. And I think what people are seeing, and there's been some tweets about this, is that the lead times...

on NVIDIA H100s are going down. So what might you think if the lead times are going down? You would say, oh, the demand must be going down or the supply must be going up, catching up with demand. And we've all been trained that every supply constraint is ultimately met with a glut.

Right. So everybody's wall of worry around NVIDIA is: when does the glut come? We've pulled forward all this training demand. It's like dark fiber in the year 2000. I think those things are not accurate. But of course, I have no idea what this means as to tomorrow. So I think there are just tons of questions about AI, chips, inference, how much of it's going to be going on. I know we're going to hit on a bunch of that today. So, you know, we stirred the pot last week,

a little bit, or two weeks ago, by questioning the consensus view on Google, which is that they're going to be a big AI winner. I think we called it the $2 trillion question. I tweeted about it. Yeah. You know, Mark Suster chimed in and said, you know, I'll take the side that they're going to be an AI loser, you know, but...

Why don't we dive in a little bit? You had this good idea: hey, let's look at these large cap tech companies through the lens of, are they a winner or are they a loser from AI? Yeah. So let's start with Google. And I wanted to back up a little bit and borrow a framework from one of my close friends and someone that I think a lot of people have listened to and learned from around investing, Michael Mauboussin. Yeah.

And years ago, shortly after I first met him, he was teaching a class at Columbia. And he and this guy Paul Johnson started talking about an acronym they titled CAP: competitive advantage period. And, uh,

What they would do is they would take a company's market cap, and they'd look at the trends in the company, and they would back into the number of years into the future that Wall Street was telling you this company was going to have a competitive advantage, by basically counting the number of years of free cash flow it would take to build up to the market cap.
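The back-of-the-envelope calculation described here can be sketched roughly as follows; the company numbers are hypothetical, and this simple version ignores discounting.

```python
def implied_cap_years(market_cap: float, fcf: float, growth: float) -> int:
    """Count the years of (undiscounted) free cash flow, growing at a constant
    rate, needed to cumulatively sum to the company's market cap."""
    total, years, cash = 0.0, 0, fcf
    while total < market_cap:
        total += cash
        cash *= 1 + growth
        years += 1
    return years

# Hypothetical company: $100B market cap, $4B free cash flow growing 10%/year
print(implied_cap_years(100e9, 4e9, 0.10))  # -> 14 (years of implied advantage)
```

The market is, in effect, telling you how long it believes the moat lasts: a higher multiple at the same growth rate implies more years.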

The point that he made is that different businesses have different amount of durability. And so Coca-Cola might only have a 3% growth rate, but it might have a 40 or 50 PE because everyone is willing to bet.

that 75 years from now you'll still see Coca-Cola on the shelves. Whereas, you know, you look at a company that all of a sudden faces the innovator's dilemma or faces disruption, and this cap can close super fast and it has dramatic impacts on the market cap of the company. I remember when BlackBerry first got in trouble,

um, the, the, the valuation just retrenched so aggressively that many people got fooled into thinking it was a value play because it was trading at 10 times earnings. Yahoo. Same way. Yeah. And, and, and what was happening is the people in the know were saying this company's competitive advantage just became quickly undurable. Yes. I'm not sure undurable is the word, but you understand the point that I'm making. Yeah. Oftentimes,

Because we have a lot of high growth stocks in Silicon Valley, and so they get assigned big multiples. Multiples are a byproduct of a couple of things, how fast you're growing, because we're trying to forecast those free cash flows into the future. But also to the second point, the durability, because we have to assign a discount rate. What's the probability that we're actually going to be able to collect those annuities sometime in the future?

And so the less confidence we have about the future, the higher the discount rate. And so even though you may have high growth, we have to discount it a lot. And Mike has gone on, I think, to talk about optionality, especially around tech companies. Sometimes you have a platform position that increases the optionality you're going to be able to move into other fields. And therefore, that would also be a positive. But-

Other people have questioned why these tech companies have high multiples at all because they're so susceptible to tech disruption, in which case you could argue the other side. But anyway, the reason I thought this would be an interesting way to talk about some of the large companies and AI, I don't think there's a single person out here that is arguing that AI is not some kind of fundamental technology.

Phase transition, Clay Christensen's disruptive wave, or whatever. And in fact, I think the number one way you could probably commit hara-kiri as a public company would be to, on your earnings call, say: we think AI is

full of shit, like we don't want anything to do with it. And so everyone's forced to have an answer. And I think, you know, one example that's pretty obvious to everyone is Microsoft. I think it became very clear

to a lot of investors, when they learned about an LLM and what it was capable of, and the fact that it could help write code, and it could help you write a paper, you know, and it could help you with creative endeavors. People looked at the Microsoft portfolio of assets, especially where they make money around the Office suite, right? And the developer community, where they also, you know, control a lot of the IDEs that are used to program.

And they said, oh, this is easy. Microsoft will be enhanced by this. And we should also add their adeptness at moving quickly with the open AI relationship. They were in front of this early. Absolutely. And so all three of those things, you go, oh, they're a net winner. And lo and behold, their stock went up and they had multiple expansion. Yeah, I mean, I think that...

The first question we ask as investors is, is this thing real? And what do we mean by real? What I mean by, is the juice worth the squeeze? It costs me something to have AI if I'm a customer, right? And is the productivity gains that I'm getting as a business worth it?

So I was talking about Tesla, start off the conversation. Well, if this model and this compute and all of this capability allows me to develop full self-driving and to win the automotive market because of that, then of course, the juice is worth the squeeze. I think if Copilot allows my engineers to be 30%, 40%, 50% more productive, then I'm replacing human beings with machines. Of course, that's worth the squeeze.

And so I would say that as we sit here in the early part- - Well, I would even say it another way in the Microsoft case, if you're not using their tool and you're programming without it,

you're falling behind. And so it becomes a tool that you have to have to remain competitive. Yeah. And so to me, we really try to look at it through the lens of, is this company, if we look at their existing business, is their existing business enhanced or attacked?

Because of AI. And then what new business opportunities do they have? And if we go back to Google for a second here, because I think it's kind of this iconic study, because the consensus view was that they were going to be a huge winner in AI. And let's step back for a second here. 20 years ago,

The idea that you were going to be able to ask any questions, immediately get information for free, like Google gives us, was just beginning. And the gains to humanity caused by the revolution that Google really led around

information discovery, and how efficiently and quickly they provided it. It really just changed the world in every respect. It moved humanity forward, back to Ridley's idea of ideas having sex, right? Like it just allowed us to have more ideas, collect more information, exchange more information.

I asked the team to do a little analysis. As investors, what do we think the right multiple should be for Google? And what are the things that are inputs into that? So if you think about this, Google does about 10 billion queries a day.

So that's more than a query for every human on the planet. And it has the most efficient system in the world for doing that. I think if you pull up this tweet from Vivek, it'll show that growth in the number of queries asked of Google has slowed down a lot. So they're growing at about 4% a year. Monetization is up a lot, 13% more ads on the page. We've talked about that.

And so you have a 17% CAGR around search. And so the first thing that we just say is that the base engagement, even before AI, had slowed down a lot, because you're already at 10 billion queries a day. So next we took a crack at a chart that says, how many of those queries are going to become ChatGPT-like queries?
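The search CAGR mentioned here is query growth compounding with monetization growth; a quick check using the episode's rough figures:

```python
# Query growth compounds with revenue-per-query growth into revenue growth
query_growth = 0.04         # ~4% more searches per year
monetization_growth = 0.13  # ~13% more revenue per search
revenue_growth = (1 + query_growth) * (1 + monetization_growth) - 1
print(f"implied search revenue CAGR: {revenue_growth:.1%}")  # ~17.5%
```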

And so the black line here is kind of the number of queries that are information retrieval queries on Google. And the blue line is the actual, and the forecast by us, for the ChatGPT-like equivalent queries that are going to occur. Now, where do we get that information? Well, first, we know a little something about the number of queries on ChatGPT.

And we know that OpenAI and Google are working on these search-integrated experiences. I think they call it SGE, where they're going to have answers, like Perplexity, in line with search. I think that Microsoft is now doing this with the rebranded Copilot. So a lot of the information retrieval searches are going to be replaced with these ChatGPT-like searches. And this is where it starts to get interesting.

Because two things kick in, Bill. Number one, it costs a hell of a lot more, right, to provide answers than it did to provide 10 blue links. So if you say, like, what's it cost to provide 10 blue links? It's about a third of a penny or less per query. Now, what does it cost to do that for 750 tokens today? And of course, this will go down over time, but it's 10x more, right? It's 4 cents per query.

And then if you look at the refinement of queries that's really going on, so a lot of times, Bill, what they do is they'll send back these 10 blue links and then they'll use that as their prompt to re-query the engine. This could be up to 50x more expensive to serve an answer.

a high quality answer to the consumer versus 10 blue links. So your cost goes up a lot. Well, what about revenue? So I asked the question on the other side. Well, we know that revenue goes down. Why? Because I'm not clicking on all these ads on the page, right? And so revenue per search likely goes down. I think if you look at that in terms of what the gross margin is to Google, right?

The cost of serving going up, the revenue coming down. I don't know. You may take a 95% margin today on a business where you have 99% share. Your share likely goes down over time because you have people like ChatGPT you have to compete with. But worse yet, your margin on each of those queries goes down over time. I don't know what it nets out at, 50%, 60%. Now, mind you, for Perplexity or Copilot or Meta, et cetera, that's a great business, a 50% margin business. Right.
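The margin squeeze being described can be sketched with simple per-query economics; all the figures below are hypothetical, chosen only to mirror the ranges mentioned in the conversation.

```python
def gross_margin(revenue_per_query: float, cost_per_query: float) -> float:
    """Gross margin as a fraction of per-query revenue."""
    return (revenue_per_query - cost_per_query) / revenue_per_query

# Hypothetical classic search: ~5c of ad revenue against ~1/3c serving cost
print(f"10 blue links: {gross_margin(0.05, 0.01 / 3):.0%}")
# Hypothetical answer engine: fewer ad clicks (~4c) against ~2c serving cost
print(f"answer engine: {gross_margin(0.04, 0.02):.0%}")
```

With numbers in these ranges, margins drop from the low 90s toward 50%, which is the shape of the argument: higher serving cost and lower revenue per query compress the incumbent's economics even if volume holds.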

And, you know, it reminds me what we've all said so many times. Google's margin is their opportunity. So the problem for Google is they have to do this because their competitors are forcing them to do it. Okay.

But it definitely is going to be a lower margin business. And they're unlikely to have the 99% share. So everybody has texted and emailed me, yeah, but they've got YouTube and they've got Gmail and they've got Gemini 1.5 and they've got all this stuff. And I stipulate all of that is true. But the business that today produces the vast majority of profits for the company

What percent do you think? I mean, listen, I think that search and YouTube produce over 100% of the profits because they have a lot of money losing units in the business, but it's over 80% of the profits in the business. And so when you think about that,

Now, listen, they've got great management. They can cut costs. There are lots of things they can do. I'm not saying this is going to occur overnight. But if you and I were talking with Clayton Christensen about the innovator's dilemma and we were analyzing this business, this would really be case exhibit number one. Now, the irony of the innovator's dilemma, Bill, is most of the companies that face it, they know they face it.

They know they face it. So the question is, why don't they do anything about it? I think some don't. But in this case, I think there's no doubt. Right. Of course they know. Of course they know. So the question is, they're trying to thread the needle.

Right. Can we somehow modify this in a way where we continue to grow our quarterly earnings? Because just setting the platform on fire and retrenching in the public markets and doing all of that is very, very hard. And they are advantaged by one thing: the searches that are going away first are Wikipedia-like searches that don't have much monetization. So the revenue doesn't come apart right away,

Even though you might be losing search volume and more importantly, like people start getting addicted to the answer right away, which is very different from 10 Blue Links. I think like the horse is out of the barn on answers.

Right. Like once consumers experience the magic of an answer, they're not going back to hunting and pecking for a roster for an athletic team through 10 blue links. You're just going to use it. And all of this just reminded me, lastly, of a Grantham quote somebody tweeted out this week that I thought was pretty interesting. If you pull up this tweet from Charlie, it says S&P profit margins moved down to 10.6% in Q4 '23, the lowest since Q4 2020. And then there's the quote from Grantham I love.

Profit margins are probably the most mean reverting series in finance. And if profit margins don't mean revert, then something has gone badly wrong with capitalism, right? I mean, what's happening here with Google, it's not that this is anomalous, right? When there are big pools of profits like exist in Google search, capitalism

has a way to redistribute those- Although I think people had given up. I agree. I think Apple and Microsoft had given up- 100%. Prior to this new reality. That's why Satya says, if you're in technology, if you run a company like Microsoft,

All the money is made in the two to three years around a phase shift. You cannot miss a phase shift. If you miss it, then you miss all the value capture for the next decade. And I might argue with AI, it's going to be even bigger value capture and disruption, and it's going to last longer than a decade. One other thing we don't know yet.

that I'd be interested in your thoughts on is: what's the business model for this? Because right now, the premium versions of Perplexity and ChatGPT have a dollar amount. They're a subscription. This kind of looks like the Netflix-type situation. So is it subscription or is it free? Do you want ads around your answers or not? Well, I mean, listen, remember the disruptors,

They don't have to generate a lot of margin on this because they earn no money on it today. So what do they have to do? They have to cover their costs. And these disruptors want to see Google dance, as Satya said. I was surprised when he said that. I was too. But listen, the fire in the belly is exactly what you need. I think Satya has founder-level fire in the belly.

about this moment in time. And I think he has it not just because he wants to see the stock price go up. I think he has it because people like Satya, they're post money. And what they care about right now is moving humanity forward. And they understand that they've worked their entire careers to get to this place where we go from computers acting like calculators that are modestly beneficial

to now computers helping us answer and solve the most perplexing and fundamental questions that we face. And so I suspect that they're going to underprice this. If I were perplexity, I wouldn't have any ads in the thing, right? No need to put ads in it. And I would attack and I would try to get share.

And so it would be surprising to me if we don't see Meta AI and Microsoft and ByteDance and Perplexity and all the others who are providing answer engines, ChatGPT, coming at them below margin. Now, let's answer the question. I'm talking about the shorter term while everybody is fighting to gain share. They'll price it so they cover their costs or maybe not. Maybe Microsoft's willing to eat it here for a while. And I continue...

As of right now, and I've played with all of them, on the consumer side, I don't think anyone has a...

Perplexity Pro came out, it was a lot faster, that was pretty cool. But when I just look at the quality of the answers, I mean, on different searches one might be better than the other, but I don't see anything that's so holistically notable. Like, I remember when Google search came out, I remember trying it versus AltaVista, and yeah, it was like, you could tell, like, oh, this is

better. And I don't see that right now. I did have that feeling of ChatGPT versus a traditional Google search. You're saying among the answer engines. Now you have four or five highly backed companies competing in this consumer

AI space and there I don't see anyone yet. Now, as I said on our last pod, I think if you get this memory thing right, it could change. And since we did that, OpenAI published a release that says, "We're working on it. We're working on it. Here comes memory." But the promises were, I think, pretty thin relative to what people want. But, Bill, explain to people again, because I think this is so important. You and I are in agreement that this could be the next 10X moment.

with GPT-like experiences for consumers. So just double click again on what your understanding is of memory and why you think your sensibilities are that it's so important. - So I think there's two elements to this, and one of them is a user expectation thing, and then the second one is a technical observation. On the latter,

Uh, well, let me get to that in a second. But on the user expectation thing, you know, it's funny, I always go back to the movie Her, which I thought was just incredible. But like, you, I think, want to be able to talk to this thing and have it remember everything that you ever told it. And if

you had one that knew all of your emails, all of your contacts, could remember your to-do list, and when you're about to meet someone could bring up the last four times you met with them and the reminders you left yourself at that point in time... We talk about the programmers' 30 percent productivity increase. This could be a human 30 or 40 percent, if you have this thing in your head that just remembers everything. So that's...

And by the way, and I said this last time, I think most people have just extrapolated AGI into infinity and think it's going to do all these things. But it's not doing it right now. And you know this because you and I have been talking about this for a couple of months now. If you talk to...

the people that are at the tops of these firms and you say, hey, why can't this thing remember everything I want? And they go, ooh, that's a hard problem. Exactly. And it turns out that just because of the way this thing works,

it would literally have to retrain every night on each individual user, and training costs are super expensive right now. It's trained on the internet. It's regurgitating the internet. It's not training on everything in your database. And so there are people, including OpenAI, I think everyone... this is another one of those things where I think they're all aware of this, but they don't know how to do it technically. And, you know,

I invite anyone to come on the show that thinks they know how to do it technically, if they'd like to correct us or whatever. Or if there's a startup that thinks they know how, please come see us. We'd both fund it. And you and I have talked about this. I mean, and listen, I think OpenAI opened the kimono a little bit. I think they're further out in front than they had revealed. But, you know, again, I come back to this idea that I'm really lucky out here. My assistant, Britt, she's been with me for 15 years. She knows me longitudinally. My likes, my dislikes, my family, everything about my kids, everything about hotels I've stayed at, rooms I want to stay in, et cetera. So my expectation of her is that she can offload a lot of that because she has all that prior history, right?
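The per-user memory problem described here, where nightly retraining is prohibitively expensive, is often contrasted with a retrieval-style workaround: store what the user tells you and pull the most relevant notes back into the prompt at query time. Here is a toy sketch of that idea, with a naive keyword-overlap ranking standing in for real embeddings; none of this is any vendor's actual system.

```python
# Toy retrieval-style "memory": instead of retraining the model on each user,
# keep the user's notes and surface the most relevant ones at query time.
# The ranking is naive word overlap; a production system would use embeddings,
# but the shape of the idea is the same. Illustrative only.
from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    notes: list[str] = field(default_factory=list)

    def remember(self, note: str) -> None:
        self.notes.append(note)

    def recall(self, query: str, k: int = 2) -> list[str]:
        # Rank stored notes by how many words they share with the query.
        q = set(query.lower().split())
        ranked = sorted(self.notes,
                        key=lambda n: len(q & set(n.lower().split())),
                        reverse=True)
        return ranked[:k]


memory = MemoryStore()
memory.remember("I prefer a corner room when booking a hotel")
memory.remember("My daughter's birthday is March 12")
context = memory.recall("book me a hotel room", k=1)
# context[0] is the hotel-preference note; it would be prepended to the
# model's prompt so the answer is personalized without any retraining.
```

The hard part Bill points to is doing this at the scale and fidelity of "everything you ever told it," which simple retrieval only approximates.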

And if every human had that, exactly. You think about the productivity unlock for humans if you give that for free to every human in their pocket. And I'm convinced it's going to happen. But one of the things I would suggest... I had a really interesting conversation with some friends about Apple. Okay.

Because this is the giant, right? This is the thing in everybody's pocket. Nobody's talking about them, but they have so much information about me, okay? They have my contact list. They have my emails. They have my texts. They have all these applications. And so one of the things I'll just drop out there, I think a little provocative about what they may be doing.

Because I've read a bunch of stuff on Twitter about how they're building their large language model of their own. My sense is they're not doing that. Okay. My sense is, in fact, that if you think about it in the context of my assistant, right? So here's the metaphor I give to you. Would you rather have an assistant with the intelligence of like Einstein, but they have no access to the internet and they don't know anything about your history?

Or would you like to have an assistant that's just above average intelligence, knows everything about you, and they can use the internet?

Okay. You would choose that, right? And so think about what Apple's going to do: maybe more like a small language model that really understands all the language, really understands everything about me, really understands how I interact with all these applications. And then when I have a deep problem I need to solve, they can sub-agent it out, right? They can send me down the path of ChatGPT, or send me down the path of Gemini, or send me to Meta AI for an answer engine if I want to go down that

path. But I think that there's this layer on the top that's just a different architecture, a different way of thinking about this that's going to be more like my assistant, Britt, that's just steering everybody in the right directions. I think Apple is superbly positioned to do this. But of course, you don't want it to just, you also want to be able to tell it things that

Just to remember this or mark this down or attach this to a note. And we've talked in the past about how an LLM could be a user interface disruption. And so you could imagine a small business starting with a CRM that is only voice.

Right, right. And you say, this customer, this... and you just talk to it and you want it to remember. Right. But that has to be architected that way. Well, think about this. Like, you know, Bret Taylor's new business, Sierra. We're looking at a bunch of businesses in this space. Again, you and I are talking about it in the consumer landscape: remember everything longitudinally about me. But what is a CRM? It's remembering everything longitudinally about your customers. Well, one thing I want to do just to wrap this up, because I think that you and I are analysts and, you know,

I think oftentimes in our business, people talk about it. Is this company good or is this company bad? And I think one of the things you and I think about a lot is distribution of probabilities. And is it reflected in valuation? So if you pull up this chart we did, which is the MANG comparison, it just shows...

the growth rates and the multiples applied to Microsoft, Amazon, NVIDIA, and Google. And here all we did was take consensus numbers. So these are not Altimeter's numbers; our numbers are higher for some and lower for others. But one of the things I just want to point out is at the top: this is the 2023 through 2025 expected growth rates. 14% for Microsoft, 12% for Amazon, 42% for NVIDIA, and 11% for Google. So Google is already expected, out of those four, to be growing at the slowest rate.

But then what's interesting, if you come down here to the price to earnings ratio, right, you'll see that Google is trading at the lowest PE, right? 21 times 24 and 20 or 18 times 2025 expected. So all the things that you and I just talked about, Bill, about growth rate and durability of free cash flows into the future, I would argue a lot of these are already discounted in the stock.

People are already placing those bets. And so one might take the other side of that and say, yeah, Brad, yeah, Bill, I know all those things to be true, but they can cut a lot of costs and do a lot of things, and that could cause the multiple to go up. But if you go to the line under that, the PEG ratio, because this is one a lot of people want to ignore: on a price-to-earnings multiple, for example, NVIDIA is a lot higher. But if you actually look at it on a PEG ratio, this year it's a much less expensive company. If you look at it on 2025 PEG ratios, it's just a little bit different. So there are two ways in which to look at future price-to-earnings multiples. One is growth-adjusted, which tries to take growth out of the equation, and one just looks at it in terms of strict valuation. So my big takeaway from this, Bill, is we're not here to pick on Google. We're just saying this is an important case study to watch

about the innovator's dilemma. And it's clear to me that investors are already discounting it, that they have some of these headwinds coming.
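The growth-adjusted comparison being described can be made concrete. The growth rates below are the consensus figures quoted above (14% Microsoft, 12% Amazon, 42% NVIDIA, 11% Google); the P/E inputs other than Google's roughly 21x 2024 are placeholders for illustration, not the chart's actual numbers.

```python
# PEG = price-to-earnings multiple divided by expected growth rate.
# Growth rates are the consensus figures quoted in the conversation;
# P/Es other than Google's ~21x are illustrative placeholders.

def peg(pe: float, growth_pct: float) -> float:
    return pe / growth_pct

companies = {
    # name: (forward P/E, expected 2023-2025 growth %)
    "Microsoft": (35.0, 14.0),  # P/E assumed for illustration
    "Amazon":    (40.0, 12.0),  # P/E assumed for illustration
    "NVIDIA":    (30.0, 42.0),  # P/E assumed for illustration
    "Google":    (21.0, 11.0),  # ~21x 2024 per the chart discussion
}

for name, (pe, growth) in companies.items():
    print(f"{name}: PEG {peg(pe, growth):.2f}")

# With these inputs, NVIDIA's PEG (~0.71) screens cheaper than Google's
# (~1.91) despite the higher absolute multiple, which is the point above.
```

The illustration shows why a high-multiple fast grower can look less expensive once growth is factored in.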

And I think there may be an opportunity. I said the other day, if they manage to thread this needle, trust me, I'll climb on board that bus because I think there are tremendous costs they can cut out of that business. There's a lot of fitness they can drive into that business. And the real question is, how are they going to drive down the cost of serving these inferences and how are they going to monetize this in a way that-

Google controls the entire Android market, which is a big market. They have a competitor to Microsoft's Office Suite. Now, they have historically not invested a lot in that. It's not a big driver of their revenue. They could. They could all of a sudden triple down.

You know, they were ahead in type-ahead, if you remember. My kids use those products, and I was always on the Microsoft side, and I can remember it was finishing sentences for my kids. Like, what was that? Right, that was inside of Gmail first. Yes. And so they have assets that they could bring to bear. And I think everything you said about Apple is true, like having physical control of the physical device

seems real to me. Like Meta... the notion that my AI would live in my WhatsApp as a person, I... like, that doesn't feel intellectually perfect to me. It being in the phone, yes, feels perfect to me, like this thing's with me all the time. But let's talk about two things in that regard. So you talked about memory being a 10x ChatGPT moment. So you said GPT was one of these 10x moments to you compared to blue links.

If we got memory, that would probably feel 10x like it. By the way, while you're there, I have to say one thing that relates to valuation. One thing that drives durability, going back to our competitive advantage period, is switching costs. If I start relying on one of these things as my memory, and I don't have a way to pull that out and jump to something else, I'm going to have to do it.

I'm stuck. Yes. Like I am hooked, locked, stuck. Right. Which means higher multiple. Very, very positive for the person that gets there. So I think I'm looking for memory as a 10X moment. The other 10X moment I'm looking for here, Bill, is actions, right? Going from answers to actions. And so let's talk about that for a second.

Yeah. You know, this company Rabbit's been making some waves. They have their version one out. I saw, I think, Tony Fadell tweeted the other day he can't wait to get his hands on one. There's a bunch of cool demos online. We've spent some time with the company. Now, the thing that they have, or that they talk about as a huge differentiator, is what they call a large action

model: not a large language model, a large action model. And basically think of it like cursor control, Bill. So if I say... and in fact, we did this demo upstairs when they were visiting. I said, book an Uber going from... and it was able to do it. It literally had trained on the cursor behavior of people using these apps. And it was able to book that without any other intervention by me. So I

took to doing some research and said, could Apple do this? Because Apple knows exactly what pixel I'm using on the screen to hit a book button on Booking.com or on Uber or whatever the case may be. Now, remember, Karpathy talked about this when he went to OpenAI the first time. He worked on a project that he called World of Bits. And World of Bits, the iconic

thing he tried to do there, and I think this was maybe five or six years ago, was to book a hotel. Could he get an AI to book a hotel? And he said at the time it was damn near impossible. He had to write all these very specific algorithms, had to try to figure out what every booking page looked like. And he said on Lex Fridman, maybe a year ago, that if he tried to do it now, using the general capabilities that exist today, it would be a lot easier.

So I think that Apple's working on this. Clearly, startups like Rabbit are working on it. I think that is another 10x moment that's in front of us: we go from answers, where I'm just asking it for information, to actions. And once it can start booking my hotel, reserving my restaurant... and then I just say, same thing, do it again, right, because it has a little bit of memory about my prior action. Those are really powerful. There's an element of this that's just a fancier version of screen scraping. Right. There's a hackiness, yes, to this notion. And I have often

said, you know, why in the world, in the self-driving world, are we writing millions and millions of lines of code to infer the state of a traffic light? Like, why don't we just broadcast the state of the traffic light? It would be three orders of magnitude less code. But guess what? I think we literally are going to bypass that. I think if we had done that, that also would have been intensive,

right, because then we'd have had to wire everything up. Well, here's where I think the world's going.

We met with these robotics companies. We meet with Tesla, et cetera. Imitation learning, okay? They're not even going to know what the stop sign is or the traffic light is or the dog in the street. They're not going to write C++ for every one of those specific incidents. They're literally going to watch the behavior of the five-star human drivers for enough hours, and they're going to imitate it. All right, but you're missing my point back on the internet side, which is...

Having the AI move my cursor around and click and fill out things is not...

The most efficient way to do this, you would have APIs with these different services and a way to interact. And that's going to be an interesting evolution. And there's a number of startups working on this, too, on different ways to try and drive action. And some of them will sit on top of browsers and do that, or some of them might try and sit on top of your phone. Of course, Google and Apple will stop them from doing that.
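The two routes being debated here, replaying recorded UI behavior versus calling a structured API, can be contrasted in a toy sketch. Everything below is hypothetical scaffolding, not Rabbit's, Apple's, or anyone's actual implementation.

```python
# Route 1: "large action model" flavor. Replay UI steps recorded from a
# human demonstration, filling in slots from the new request. This is the
# screen-scraping-adjacent approach discussed above. Illustrative only.
recorded_demo = [
    ("tap", "app_icon:Uber"),
    ("type", "destination:{destination}"),
    ("tap", "button:Confirm"),
]

def replay(demo, **slots):
    """Re-run recorded steps with the user's parameters substituted in."""
    return [(action, target.format(**slots)) for action, target in demo]

steps = replay(recorded_demo, destination="SFO")
# steps == [("tap", "app_icon:Uber"), ("type", "destination:SFO"),
#           ("tap", "button:Confirm")]

# Route 2: the cleaner road. A structured API the agent calls directly,
# with no cursor movement at all (hypothetical endpoint).
def book_ride(destination: str) -> dict:
    return {"status": "requested", "destination": destination}
```

Route 1 breaks whenever the UI changes; Route 2 requires the kind of integration agreements the conversation turns to a moment later.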

I totally agree with that. It's funny. I was asking our analysts, right? 10 billion queries a day on Google today. I say, does the number of queries in the future go up or go down? And I had somebody... I'd say Aravind at Perplexity said to me, well, the number of queries probably goes down, because you don't have to ask it so many times. It'll just give you an answer. And I said, what about the positive reflexivity? Once I get the answer, I've got more questions.

Right. Like as long as it's fast and it's producing that information, I actually think actions and memory will unlock more interactions because it's so much more valuable to me. I'll start using it more and more for these future things. And I don't know. It'd be interesting to see there, you know, for a while we've had.

the Alexas of the world or whatever, you know, do integrations, right? And so maybe the possibility exists that if I'm an Uber customer and an OpenTable customer, that eventually I will tell them my favorite

LLM front end, and they'll come to some agreement there so that they can pass my registration information through and that that all happens seamlessly. But there's a lot of work to do to make all that happen. Right. I mean, I think there's a lot of agent to agent interaction that will go on, an AI agent representing both of these parties. But what's interesting about the action model, the hackiness that you talk about, right?

I imagine this will get solved by startups in some pretty hacky ways to begin, but then it will ultimately likely be solved at scale in more elegant ways, whether it's APIs or agent-to-agent interactions, et cetera. But we're starting to see real experimentation, and I've seen some of the early prototypes of actions actually coming to pass. And I

that feels to me like one of the next two big breakthroughs, along with this memory. And by the way, I said this last time, and it's a subtle point, but I don't think Google... you know, it's another issue in the disruption. I don't think Google has treated its partners well in the search ecosystem. And so there's a lot of angst there and a lot of mistrust.

And so if OpenAI or Perplexity came along and said, would you integrate and pass tokens, they might say yes. I think they're going to be more reluctant to do that with Google. I mean, at a minimum, we know they would probably like more competitors in the game of sending them leads, right? So, I mean, I think just the fact that you're a smaller player, that you can be another source of competition, and they're not so dependent on Google for upstream traffic,

is probably an advantage to you. Well, I know we're going to want to move to the topic of chips here in a second, but before we get there, we touched on Microsoft, we touched on Google, we touched on Apple, just by way of comparison. And people have heard me talk about meta a little bit in this regard. So again, the way we approach the analysis for all large cap tech

We said: does their existing business get better or worse because of AI? And then, do their new business opportunities get bigger? Okay.

So in the case of Meta, unlike Google, Google has this massive, super profitable business that's under assault by answers and actions. In the case of Meta, we've seen their core business get better as a result of AI. Why? Because you're targeting videos now on Reels. You're targeting- And that happened pre-LLM. That was already happening. It was starting to happen. But the big difference really, I think, between ByteDance and Meta was that Yiming at ByteDance adopted-

an approach around AI and GPUs before Meta did. And I think Mark really made that transition about three years ago. You can see it in their CapEx spend. But the big question was, obviously, he had to spend the money before he got the results. And so investors like us, we're kind of holding our breath and we're saying, would this lead to better engagement?

Well, now we know. It's led to massively better engagement. And I'm not talking just on Reels. This is on the core big blue Facebook product. This is on WhatsApp. This is on Instagram. So they have these big platforms that are benefiting from both more engagement and the targeting of ads. Remember, this stock was at $90 and everybody said Facebook's dead forever.

Because Apple pushed through their changes that disabled their ability to really track people. And basically, because of AI, they've been able to backfill that monetization completely. Nobody thought, no investors thought that was going to be the case 18 months ago. So their core business got a lot stronger.

Now, as we look ahead, think about the new business opportunities in front of them. And I'm just talking about the things that Mark's talked about on the call. Number one, they've got tens of millions of business customers, and now they're literally creating AI customer service agents for every single WhatsApp business. Now, we don't see that as much here in the US, even though WhatsApp is the fastest-growing messaging platform in the US. But if you go to a place like India,

or Brazil, people are transacting. Some of the biggest AI engines, right? AI bot companies are being built on WhatsApp as a platform in Brazil and in India, where they have tens of millions of customers already using them. So these have become platform companies that are enabling vertical and horizontal bots, and they're going to build their own. They're going to build it for celebrities. They're going to build shopping agents that assist me buying things on Instagram. I always see all these things I like on Instagram, but it's a

pain to actually buy the things on Instagram. The one click never got that easy. Now I think you're going to see shopping agents that assist in doing that. And then just think about this, content creation, Bill. Whether you're an advertiser, just think what we saw this week with Sora, text to video. I mean, now think of this in the context of an advertiser trying to drive emotions. Or a creator. Or a creator, right? So my sons are creators on these platforms.

This is going to unleash monster amounts of creation in the world at lower costs. And so all of that, I think, benefits their core business. You have these new businesses that they get to move into that I just mentioned. Then, of course, I thought another interesting thing from Morning Brew, I think the pod that Mark was on last week, he talked about the meta AI glasses that all my analysts have, right?

He said, you know, most people, they looked at Mark taking the video, reviewing the Vision Pro from his couch. And, you know, that got a lot of clamor on Twitter. But the fact of the matter is Mark said the way you ought to think of VR and AR really is as your desktop or your laptop. But the meta AI glasses, he said, think of as your phone.

right? Because I'm going to be able to text, I'm going to be able to call, I'm going to be able to listen to music, I'm going to be able to order my Uber, I'm going to be able to do all these things from those glasses, and I don't have to pull out this rectangular thing, or I keep it in my pocket or whatever. I think that's why you're seeing such incredible demand for those. And of course, the form factors will change, and it'll evolve over time. But that's an entirely new line of business. So this is a company that's spending $20 billion a year on these other businesses that haven't been generating a lot of return. And

I think now the market's starting to assign some value to those businesses. But we should be fair, right? Because YouTube benefits from those same dynamics that you talked about. And if you're talking about being the phone, Apple and Google already have... 100%. A huge install base, like what, four orders of magnitude the number of Ray-Ban glasses? Yeah, no, no, for sure. But I think the question is where you go from where you are today, right? And so I'll stipulate YouTube will be a better business in the future. Content targeting will be better, ad targeting will be better, right? And

As long as Google is able to backfill the core of search like we just discussed, then it's going to be worth more in the future. There's no doubt about it. And of course, in terms of just their basic research and development around AI, what they did with Gemini 1.5, et cetera, I mean, they have incredible talent and resources. The only liability is they have an incumbent business that is a monopoly business with monopoly profits. So that we can move on, let's do a fast drive-by. I'm going to do one on Apple, and then you do Amazon. - As a reminder to everybody, just our opinions, not investment advice. - So for me, Apple, you could argue they have the best asset in the world in this phone. And if you look at the user base of this compared to the Android user base, it's just perfect.

Right. And they've been doing Siri for a while. And so you connect those two things. You say, shit, if they put an LLM on top of this, they could get to all the data. Well, I'm sure you could give it permission to read your email, so you could literally get to all the data. So that's a massive positive. Now, the negative is they haven't been known for internet services. You look at Spotify relative to Amazon Music. Siri's kind of been...

It's been terrible. Not evolving, right? It's very much like it was the day it came out. And so it would almost require a pivot of, like Mark did on cost, you'd almost have to see Tim come out and say, we're making a massive pivot. Right.

like 180 degrees, we're going to be all in, like, we're putting our best engineers on this. And until that happens, I think I'm in the doubters' camp. Yeah. Well, I mean, clearly it's been an underperformer this year, mostly related to the China market. Yeah, you have the China market, but you have all these concerns in the market as to what the durability of revenue is going to be in the future. You clearly have... but they have the assets. Imagine if you took, like,

five of the top AI people at one of these companies and they were there, there the way Tony Fadell was there early on for the iPod. If you had that... Listen, for the first time, they have real challengers, whether it's a Humane, a Rabbit, the Meta AI glasses, these other things. I'm just saying the door has been cracked on the ecosystem.

Siri's not evolved. Right. So I think they're the first to acknowledge that. I think they are going to try to disrupt themselves on that. I'm not sure whether we'll see a big breakthrough moment this year. I think we'll definitely see announcements this year about AI on the edge, running on the phone, and all these other things. And they'll start to crack the door on this.

To me, the big breakthrough on Apple is if they can run a 5 to 10 billion parameter model on the phone, on the edge, without consuming all of my battery, which there's a lot of talk that they're going to be able to do. If they can maintain some memory about me, and then show me the early part of actions on this device, it will unlock a huge new device cycle.

Okay. And that's what drives this stuff. I was the one supposed to do Amazon, but I'm going to ask you a quick question. So on the e-commerce side of the business, does AI help or hurt Amazon? Yeah. Okay. And how much? Yeah, I think on AI, when I think about retail e-commerce, I think about it from two directions.

First, Amazon has been in the business of AI from a merchandising perspective, just like Alibaba has been, for a long time. Think about the largest retailer in the world. Think about the way Macy's used to work: there was somebody at the store who would say, we're going to show black t-shirts today at the front of the store. At Amazon today, nobody knows why they are targeting Brad Gerstner with certain things. It's a black box. So they're using it.

But here's the thing I think is happening a bit to them on that front. And by the way, Andy Jassy is getting fit. They are tightening the screws on costs and all the other things. But look at a company like Temu, okay, which Pinduoduo in China owns, that quickly became the largest advertiser on Facebook. Its e-commerce sales are through the roof. Now, what they're doing, I mean, it's so incredibly clever. It's full-stack AI.

So they don't even have inventory or merchandise. They literally go out and they collect data from customers about what they think they will want. They can assess how many of those things to build.

And they literally are building it for themselves. So they vertically integrated this AI e-commerce business, and then they're pushing it out the other end. And so I think there have been a lot of people in the U.S. who have been dismissive, but they've been shocked how big that business has become in such a short period of time. We're starting to see this out of TikTok as well.

where they're turning into an e-commerce business. I think this opportunity sits in front of Meta as well. So I think there's some orthogonal challenges, but in terms of the core, I think their core continues to get better because of better targeting and AI reducing costs. Think about their customer care costs. We do have to move on. Hit AWS.

as quickly as possible. Yeah, so I would say AWS, at the end of the day, these companies are in the business of renting AI services to end enterprises, right? And as much as we talk about Azure and Microsoft running the table today, here's the truth. We've seen almost no share shift from AWS to Azure

as a result of OpenAI. And if you would have asked me 12 months ago, I would have said the jury's out as to whether or not that's going to happen. It didn't happen. Amazon responded quickly enough. And here's the other thing, and Slootman's talked a lot about this term, data gravity, right? It turns out all my data's in AWS. So long as they have a decent AI solution, I'm going to stick with them because I don't have to move anything. And I think they delivered that solution. I listened to a podcast where Jassy was talking about the fact that they have proprietary chips

for both training and inference. And obviously, as the AWS stack grew up, they moved into networking chips, they moved into a lot of technologies people wouldn't have thought about Amazon owning or designing. Do you give them, and this would be a good transition, do you give them any chance of being competitive from an AI silicon perspective?

So I think the right way to think about it is not, will they build a better GPU than NVIDIA?

The right way to think about it is can they supplant part of the supercomputer, right? The entire system. Are there pieces that they can pull out and plug in or workloads, specific workloads that they can serve with a lower cost infrastructure because they're doing hyper-targeting silicon all the way to model? And I think the answer to that is yes. But I still think they'll be one of the largest buyers of NVIDIA GPUs in the world because it doesn't replace...

that for a lot of really important workloads. One thing he also said that I think would be good for the listeners to hear...

and, and, you know, I, I don't want to overstate, you know, where Alexa is. And we talked about Siri earlier, but he said that as Alexa got bigger, that the, the training costs are tiny compared to the inference costs. And he suggested, and maybe this is me interpolating that, that the inference market over time is going to be much more susceptible to, uh,

you know, things that are lower power, lower cost, all the things that aren't just performance from a silicon perspective. And I totally believe that to be true. And with that, we can move to the biggest headlines of the week. We finally got here.

which is this debate over the future of the compute build-out needed to support AI. To your earlier point about valuation, how unique is this revenue? How long does it last? And so we have a couple of charts or tweets to bang through here to kind of contextualize this. First, NVIDIA stocks up a lot.

But it's because the revenue and the profits have greatly exceeded expectations. So this chart just shows what their data center market share has grown to in the year, right? The world is shifting toward AI as a compute infrastructure and they benefited. One of the areas I think I tweeted about this that I think has been greatly underestimated is this idea of sovereign demand.

And I tweeted this week, you know, I think Jensen was over in the Middle East meeting with several of the GCC countries. And I think what people still don't appreciate is there are probably dozens of sovereigns who are trying to get into the NVIDIA order book, and they view it as one of their top three national priorities to build out AI capabilities. And I think the size you're talking about for some of these GCC countries is going to be competitive with the hyperscalers themselves. So in that context, right, when Sam Altman suggested, and blew everybody's mind, that he was going to raise $7 trillion to, you know, build chips... I don't know if he ever said it. It was inferred and repeated over and over again. Okay, so he threw out a big number.

But I do think that we're talking trillions of dollars over the course of the next four to five years as we rebuild the world's compute infrastructure. And then finally, Masa did not want to be left out. And so he came out and said that he's going to raise $100 billion to build fabs and chips to compete with NVIDIA as well. You've watched this.

the semi-industry for a long time. Okay. And, and more importantly, just the dynamics of supply and demand. So just step back for a second, right? What do you make of all of this? Well,

I have some cynicism, but that comes naturally to me. The first thing I would say is they're all talking about raising money from the exact same people. So if I were those people, I would just be a little careful because I think to a certain extent, there's a smell of loose money. That's how I would interpret it. Because they're not saying they want to raise this money

Absolutely. They're saying they want to raise it from a very specific group in the Middle East. So that's one thing. Two, I was struck when I read about sovereign server stacks. There needs to be a reason, right? It would have to be about wanting to have control over certain amounts of information. It could be proprietary information to your country. It could be wanting to control how LLMs operate in that country.

servers typically depreciate a

bit like fish, you know, and that's been true of DRAM and storage and all of those. Hold on, you said fish, right? Fish, yes. They last a day. Yeah, well, I mean, I'm being provocative, obviously, but people have talked about that with those: you wouldn't want to hoard any, because what happens is the next generation comes along and the value goes down quickly. So I would just, you know, there was a time at which, you know,

Microsoft was trying to convince the world that we'd all need an NT server for every employee. And, you know, when I heard that the first time, I was like trying to twist my head. So I don't know. I don't know if countries need server stacks. Maybe, like I said, it'd have to have those particular frames in mind. The second thing that just struck me, and this gets more to the Altman and the Masa quote is, you know,

The idea that we're just going to go compete with NVIDIA is pretty radical. There are already people competing with NVIDIA. AMD is competing with NVIDIA. There are other people that have somewhat of a head start. Like decades. Yeah. So you're just going to go do it? That's bold. It's not like chip

design... Oh, yeah, Intel, obviously. But it's not like chip design bends to disruption the way software does. This is hard stuff. And then some of them, and once again, I don't know that there was an exact quote, but the idea that you're going to build a fab and compete, you're going to compete with TSMC and NVIDIA at the same time. Like,

No chance. Yeah. Like no chance. Cause like, let's say, let's say you got it. I mean, we all know how the math works, but say you got a 20% chance of competing with either of them. Right. Then you're down to four percent odds of pulling this off. And by the way, the timescale that you're going to need... like, I mean, we'll get into it in a minute. Cause we'll talk about, like,

what it means to have a competitive fab around the world. But TSMC is far, far ahead of the competition. And one of the reasons AMD has a higher market cap than Intel today is specifically because they got out of the fab business and bet on TSMC. I think it's really important to tease out those two things, right? There's chip design.

Right. And obviously, NVIDIA is already designing for two to three generations ahead. I mean, the B100 is already taking orders in the order book, likely to launch in Q3 of this year, and is an order of magnitude better than the H100s that are out there today. And then they're already designing what comes after Blackwell.

And so it's not as though they're standing still. And anybody who knows Jensen, he's an animal, and that company is playing for the future. And then if you look at TSMC, and I shared with you a video, maybe just pull up a little bit of the findings from that. But this is from the founder of TSMC and the CEO for decades, Morris Chang, talking about the competitive advantages. And Bill, this really gets to the fab. Like, why are all the world's fabs in Taiwan? Okay.

And why aren't the fabs in Texas anymore with Texas Instruments? Or why aren't the fabs in other parts of the world? And what does that mean for the future of this build-out? And I think the implication of these releases is that we're going to start building a bunch of fabs

in the Middle East. I think we know we're trying to build fabs in Arizona. There's some talk about building fabs in Mexico. But maybe let's just deconstruct that one: what does it mean to build a fab outside of Taiwan to make next-generation AI chips? You shared this link with me, and it's a talk that Morris Chang gave at MIT, I think in November. Right, yeah, very recently. He's over 90. And

It's like an hour-long talk, and the first 60% is a history lesson. But the last 40%, I would encourage everyone to go watch. Like everyone, including every politician in the United States of America. Because-

Morris makes the point that the reason Taiwan is competitive has to do with the labor model that exists there and the type of work people are willing to do and your ability to retain them. And he walks through his history of hiring in Texas and other parts of the U.S. Yeah, interestingly enough, he ran a fab plant for Texas Instruments

in Texas. And he explains why Texas could never possibly compete with a fab plant in Taiwan. And he admits that the U.S. had manufacturing prowess in the 50s and 60s, but the social...

requirements that we put on labor at that time are different than they are today. And so whether you look at TSMC, and it turns out the same thing's true of a Foxconn factory in Mexico, you have people working longer hours, sometimes living in dormitories, and he mentions that in his talk. And

he says the countries most likely to disrupt Taiwan would be Vietnam or India. Yes. Not an advanced culture. And to think you're going to re-onshore a fab and ignore Morris Chang is just kind of crazy to me. Right. And I look at the

requirements, once again, that we put on companies around labor, and say to myself, it's not going to happen here. And

people will quickly react to that and say, oh, you're in favor of forced labor or, like, super hard labor. But the people that are choosing that, at that point in time, are choosing a better life. Right. And to deny them that opportunity... like, the individual that lives in Juarez that's commuting to this Foxconn plant is getting a better life. That's right. Even if he's away from his family

four nights a week in the dorm. And to deny him that and insist on our

circa-2023 social norms on that country is unfair, from my point of view. And it denies them their chance at disruption. So when you unpack that message, and we'll put the link to the video here, it's really that Taiwan thrived because these operators and technicians would spend their life working in the same fab,

on this, getting better and better at the same thing. And he talked about a 12 percent churn rate, I think, when he was at the fab in Texas. And he said the problem is, the second a better job came along, they would leave for the better job. And that was 12 percent during a recession, implying that it was much higher when times were good; it went to 25 percent when the economy was good. And he said you just can't run a fab plant

With 25% churn among the operators, you produce really bad product. And I think the point is, it's not just a better life. It's also kind of these cultural norms. And so that's why he said, in Vietnam and India, they have cultural norms, he believes, that are more

consistent with loyalty and staying with something longer. And on top of that, that it would be an improvement in the quality of life for the people who would take these jobs. And therefore, the incentive is there for them to stay in those positions. And neither right nor wrong. Provocative. Yeah, it's provocative because it says that if America is really worried about the concentration in Taiwan, they should probably be trying to help build

semiconductor plants in Mexico or Vietnam or that kind of thing, versus bringing them here, because the odds of operating them here in a competitive way, a globally competitive way, are low. So I guess, does that make you skeptical of the CHIPS Act? I mean, I see that Intel is back in Washington looking for another $10 billion to subsidize the work that they're doing. I mean, I get the U.S. national security concern, particularly considering that 100%, or virtually 100%, of these advanced chips

are being manufactured in a place that has risk, political risk, associated with it. Bill, let me ask you this. By the way, I am somewhat skeptical of the CHIPS Act. And then the other thing I would say to you is,

China's probably in a really good place. They're really smart. They have all the intellectual capability of being competitive, and they probably still have this opportunity over time, in terms of the willingness of a certain part of their population to work in those types of situations. I'll take probably the under on that. I think that...

the opportunity... Now there is a global imperative to diversify the source of manufacturing. And I think Morris Chang was having this conversation at MIT recently because he understands the global imperative. I think you are going to see plants that get built in places like Vietnam and Japan and

I think you are going to see them get built in India. You're probably going to see some attempts in the Middle East. Obviously, we're trying to do some of this stuff here. I think from a United States interest, both in terms of wanting to maintain leadership in AI and wanting to diversify our political risk associated with Taiwan.

It's not so important that everything is produced in the U.S., right? That shouldn't be our goal or objective for all the reasons that Morris Chang points out. But I do think it would be better if we had four or five places around the world that were load balancing the manufacturing of these chips. That's a fair point. And I think the whole re-onshoring –

argument conflates the national security interest with an interest in American jobs and that kind of thing. Going back quickly to these new chip companies that are going to miraculously compete with NVIDIA, I would tell you, and this is maybe, you know, an older venture capitalist talking, and one who's watched different

partners sit on the boards of startup semiconductor companies: it ain't easy, you know. The first silicon that comes back doesn't always work. Yeah. And you might be $50 million to first silicon; you might be $100 million to first silicon. Right. You've got to get in line at TSMC. How are you going to break that door down? How are you going to out-compete NVIDIA for TSMC's time? Right. Like, how are you going to get on that

place. And it's hard, you know? And by the way, once you do get working silicon, your yields are probably crappy. Yeah. Like, that's what happens. This is physical materials-science-type stuff. It's not software. Right. And you're going up against

Again, two companies that are run in pretty exceptional ways by exceptional founders. And in the case of Jensen, he's been there for three decades. He's devoted his life to this. TSMC seemingly has similar types of leadership. But one of the things I wanted to pivot to on this, Bill, because it begs the question: why is Sam throwing out this really big number?

right? Why is Masa throwing out this really big number? And I think the answer, like one of the things I want to talk about, is this, which is just: what is the size of the market opportunity that we're talking about here? And so I have a slide. When Jensen was in the Middle East, he mentioned, and the quote was, and this was just from Feb 24th, he said, there's about a trillion dollars worth of installed base

of data centers around the world. And over the course of the next four to five years, we'll have 2 trillion of data centers powering software around the world, and it will all be accelerated compute.

OK, and so I asked my team to break that down a little bit. Like, what does that look like per year to get to this number? So, of course, I'm doing this from the outside in. We take a swag at it, and it pulls up this next slide, a bar chart. So this is the AI data center build-out.

In blue is the new accelerated compute, right? In green is the replacement data center that we think will go to accelerated. And then in gray is the replacement that's non-accelerated compute. So this would be more like, you know, x86.

And again, I'm certain this is wrong specifically, but that's what we're in the business of doing, trying to build a forecast based upon folks who are providing this information. The line running through the center that starts at 55% and goes down to 26%, that is NVIDIA's share of that global compute build-out based on current consensus numbers for NVIDIA.

Okay, so the consensus forecast that has the stock at $700 a share assumes, if you believe this TAM to be accurate, that their share will go from 55% today to 26% in 2028. Now, I think if you just step back and you say, okay, do we think we're going to go from a trillion of data centers to 2 trillion of data centers? Just ask that at a high level, okay?
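The claim in that consensus forecast can be sanity-checked with simple arithmetic. The sketch below uses the figures quoted here (an installed base going from roughly $1T to $2T over five years, and NVIDIA's share of the build-out gliding from 55% to 26%); the linear glide path and the focus on net-new build only are my own simplifying assumptions, not the team's actual model.

```python
# Back-of-envelope on the build-out described above. Figures quoted in the
# conversation: installed base ~$1T -> ~$2T over five years, NVIDIA's share
# of the build-out falling from 55% to 26%. Linear paths are an assumption.
years = list(range(2024, 2029))
new_build_per_year = (2.0 - 1.0) / len(years)          # $T of net-new build per year
shares = [0.55 + (0.26 - 0.55) * i / (len(years) - 1)  # linear 55% -> 26%
          for i in range(len(years))]

for y, s in zip(years, shares):
    nvda_slice_bn = new_build_per_year * s * 1000      # NVIDIA's slice, in $B
    print(f"{y}: share {s:.0%}, ~${nvda_slice_bn:.0f}B of the net-new build")
```

Even on this rough cut, a declining share of a doubling pie still leaves a very large absolute number, which is the tension the chart is meant to capture.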

Well, you and I just spent an hour plus talking about how 10 billion queries a day are likely to move from information retrieval, right, to inference as we as humans expect to get answers rather than a card catalog.
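That shift from retrieval to inference can be roughed out in dollars. The per-query costs below are the approximate figures discussed earlier in the episode (about a third of a cent for a traditional result versus roughly four cents for a generated answer); treat the whole thing as an illustrative order-of-magnitude sketch, not a real cost model.

```python
# Order-of-magnitude cost of moving search queries to generated answers.
# ~10B queries/day is the figure from the conversation; per-query costs are
# the approximate numbers discussed earlier in the episode.
queries_per_day = 10e9
retrieval_cost = 0.0033   # ~1/3 of a cent per traditional query (assumed)
inference_cost = 0.04     # ~4 cents per generated answer (assumed)

daily_delta = queries_per_day * (inference_cost - retrieval_cost)
print(f"Extra daily cost: ${daily_delta / 1e6:,.0f}M")
print(f"Annualized: ${daily_delta * 365 / 1e9:,.0f}B")
```

The delta lands in the hundreds of millions of dollars per day, which is why the economics of serving answers, rather than links, dominate so much of this conversation.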

We talked about enterprises, whether their engineers are doing code generation, or whether it's customer service centers, or whether it's Tesla and full self-driving, or whether it's sovereigns who are taking on national security issues,

you know, drone fleets or whether it's proteins and life sciences or material sciences, there isn't going to be a single industry on the planet that's not employing accelerated compute in order to solve the problems of their enterprise. So if you said to me with that as the backdrop, a year ago, I think the big question, Bill, was, is there going to be enough productivity gains in the world to justify this compute build-out? Remember,

People thought, oh, we're pulling forward all the demand for NVIDIA. This is like dark fiber in 2000. We're going to be way oversupplied. We're going to have a glut. I think the evidence on the field is that that was wrong.

I think the evidence on the field is that, in fact, just like we talked about on the last pod, we tend to underestimate the size of these super cycles because when you have these phase shifts, everything changes. You have positive reflexivity in the world. More begets more because it's better, right? And so I think the bigger question when I look at this chart and what I push my team on

What I'm certain of, you know, from the rumored pricing of H100s to B100s, is that B100s are going to cost even more than the H100s. And so I say to my team, like, these margins have to get competed down, right? But the feedback, and something I think that's really important, is that although the B100 is more expensive, it's so much more powerful, right?
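The price-versus-performance point reduces to cost per unit of work. The dollar figures and the 10x multiple below are the hypotheticals used in the conversation, not actual H100/B100 specs.

```python
# Cost per unit of work: a chip that costs 2x but does 10x the work is
# 5x cheaper per unit of output. Numbers are the conversation's hypotheticals.
def cost_per_unit_work(price: float, work_units: float) -> float:
    """Dollars paid per unit of output."""
    return price / work_units

baseline = cost_per_unit_work(100_000, 1)    # $100k for 1x output
upgraded = cost_per_unit_work(200_000, 10)   # $200k for 10x output

print(baseline / upgraded)  # the pricier chip is 5x cheaper per unit of work
```

As long as that performance gap holds, the premium pricing can persist, which is the argument being made here for NVIDIA's margins.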

It's like this. If you had an employee, Bill, and you were paying him $100,000, and I said, hey, you ought to hire this other guy. He costs $200,000. You'd say, well, why would I hire him if he costs $200,000? I would say he does 10x the work of the guy who you pay $100,000 to. You would pay him $200,000 in a second, right? And I think that's why NVIDIA today is getting those margins. In the future, I expect

There's going to be more competition, whether it comes from the custom chips that you're talking about, whether it's from other competitors like AMD, whether it's from new startups from Masa or Sam, et cetera. But you and I just talked about it: the challenge to build a fab and the challenge to design those chips is non-trivial, and the probabilities are somewhat low. And so it's going to take time to get there. I mean, if you're starting today, when would you have an impact? But one question I would have for you on this is,

If you're right about this, I wonder about TSMC's capacity. Right. And that is a limiter to this. Right. So you're looking at the chart and saying, how do we get to $2 trillion of replacement and new build if TSMC is gated in their ability to produce these? Now, Jensen gave this number.

He's intimately familiar with TSMC and their ability to produce. So I think he has a sense in his head about what they can get done. I think that the other limiting factor we're going to run up against here pretty quick, it's not going to be capital.

It may or may not be TSMC, but it will be power consumption. So even for the B100, the data center designs, like, you're talking about liquid cooling, custom-designed data centers; they're going to consume massive amounts of power. And I think part of the reason you're hearing about this sovereign demand from the Middle East, Bill, is they understand that their chief competitive advantage is low-cost energy, right?

And I don't think they're talking about building all of this just to service the needs of their country. I think they're smart enough to understand they're trying to equitize all their petrochemical wealth into technology wealth in the future. And if I was running one of those countries, I would look at this phase shift as an opportunity to become the supplier to the world of computer-aided intelligence, right? Right.

And if they can do that because they have lower cost energy and because they can recruit the likes of Sam Altman, they can recruit the likes of TSMC to their countries to set up shop there, to design chips there. It's not all that different than digging wells.

Right. Think about digging a well. You have to spend a lot of money, a lot of time, a lot of research. You're hoping you get your payback five, 10, 15 years into the future. And so I think this is a rational decision by them to build out this capability. But to your point, it's a non-zero and non-trivial undertaking to try to get it done. Now, if they do that, it's going to put them in competition with some of the hyperscalers.

right? From CoreWeave to AWS to others who are in the business of renting AI capabilities out. But I think it's good for the world because what we want to see is a lot of competition,

We want to see the price of AI compute fall over time. That's going to lead to a lot more consumption. And because of the productivity gains from the end applications, again, self-driving cars or coming up with vaccines or solving complex problems or just allowing consumers to have answers instead of 10 blue links, we need the cost to come down on all this stuff. And maybe this would be a good way to wrap, but if you bring that

attitude to the table... I mean, it sounds like, and I don't want to put words in your mouth, but there's no reasonable end in sight from where you sit, like, on this wave. We're at the very beginning, and it's going to go for a long time. I said last week, and I think we had a little video that went out, and I said, we are going to hit a zone of disillusionment.

I don't know whether it's this quarter; maybe it starts tomorrow with NVIDIA, right? We're going to hit a zone of disillusionment where you have a mismatch between supply and demand. And then all the skeptics are going to say, I told you so. The internet's a fad. AI's a fad. Mobile's a fad. It happens in every super cycle. We're going to use that as a buying opportunity because we are absolutely convinced

that the runway is longer and wider, and the impact on humanity is going to be greater, because the end applications that use AI to assist them in everything they do are so compelling. But that's really where the tug of war is in the world today, Bill. And that's what makes a market, right? You're going to have those people who are skeptics about that demand. That's what creates a wall of worry. Why does NVIDIA trade at 20 to 30 times earnings?

I would say because there's a lot of skeptics, and there's a lot of worry about whether or not this free cash flow is durable into the future. I bet you the worry is more on pricing than volume. Right. And by the way, it's unknowable. I don't know. Nobody who follows this knows. So you have to assign some probability to that future outcome. But I think you're right. It is a good place to leave it. I wish you could be here. I know you're anchored down there in Texas, speaking of Texas fabs.

But this is fun to have you here in person. Good to see you. Like old times. And I think we're going to be talking and debating this for as long as we're doing this pod, for sure. No doubt. No doubt. Take care. Take care.