
Nvidia’s New Chips, with a Side of Valuation

2025/3/19
Motley Fool Money

People
Asit Sharma: Financial analyst focused on market trends and analysis of company performance.
Dylan Lewis: Financial podcast host and analyst focused on interpreting market trends and investment strategy.
Jason Moser: As a senior Motley Fool analyst, Jason Moser focuses on in-depth financial analysis and investment advice.
Topics
Asit Sharma: I think Nvidia's Vera Rubin GPUs represent a major leap in compute. They build on the Blackwell architecture with more compute power and faster NVLink scaling, which should keep them competitive for the next several years. That said, we should stay cautious about Nvidia's growth prospects, because competition is now intensifying at every level; many companies are trying to challenge Nvidia's position, with Meta, Amazon, and Google all developing their own AI chips to reduce their reliance on Nvidia. Even so, Nvidia keeps innovating, for example with a CUDA library built on the concepts of Python's Pandas library, which suggests its technology still has broad future applications and could change our assumptions about how we use big data and how fast we can process it. Its partnerships with companies like GM, Disney, and Yum Brands also open new avenues for growth.

Mary Long: Jensen Huang's forecast of a trillion dollars of AI infrastructure buildout is grand, but not entirely unfounded. We should be cautious about Nvidia's growth prospects, but we shouldn't dismiss its potential either. The market's growth expectations for Nvidia seem overly pessimistic, and its current valuation may be understated.

Jason Moser: When evaluating a company's stock, the customer service experience should be one of the factors considered, but each investor has to weigh its importance for themselves. As for PayPal, despite shortcomings in its customer service, I remain positive on the company's leadership in digital payments and its future growth potential.

Karl Thiel: AI is being applied to healthcare in a multitude of ways, notably in drug design, where tools like AlphaFold let researchers visualize drug targets in three dimensions and design molecules against targets once considered undruggable.

Dylan Lewis: For newer investors, it's better to start with small investments across many companies rather than concentrating in a few; you learn more about different businesses and can gradually concentrate into the better performers over time. The show's "radar stocks" segment is meant to discuss companies worth watching, ones that may soon be in the news, or that analysts think investors should keep an eye on, for good or ill.

Deep Dive

Chapters
This chapter discusses NVIDIA's new generation of GPUs, Vera Rubin, its improvements over the Blackwell architecture, and the increased competition from hyperscalers. It explores the challenges faced by companies trying to rival NVIDIA's technology and the potential for future growth in AI infrastructure.
  • NVIDIA's Vera Rubin GPUs offer significant improvements over the Blackwell architecture, including doubled NVLink scaling and almost 10 times the aggregate compute power.
  • Hyperscalers like Meta, Amazon, and Google are developing their own AI chips to reduce reliance on NVIDIA.
  • NVIDIA's strategy focuses on replenishment cycles and upgrades to existing data centers to maintain its position in the market.

Shownotes Transcript

There's a new chip on the block. You're listening to Motley Fool Money. I'm Mary Long, joined on this fine Wednesday morning with a Mr. Asit Sharma. Asit, thanks for being here. Mary, thank you for having me.

Of course, of course. We've got some big news happening after we record, but before the show publishes this morning, and that is, of course, that a Mr. Jerome Powell will be making some announcements later today. We're not going to hit that on today's show. Ricky and Nick Sciple will cover that tomorrow. Instead, Asit, you and I have another big name dropping another announcement, or multiple announcements. That big name is Jensen Huang. NVIDIA had its

GTC, GPU Technology Conference. It's running throughout this week. And yesterday, Mr. Huang delivered the keynote speech.

Asit, this is an event that used to be kind of predominantly catered towards academics, and now it's turned into what the New York Times dubs the Super Bowl of AI. We got a number of things coming out of that keynote that you and I will hit on on today's show, but we'll start with this. In late 2026, NVIDIA will release its next generation of GPUs.

They're calling this generation Vera Rubin. So what do investors need to know about this upcoming generation of chips and how it's different from perhaps, should we call it its predecessor, the Grace? Mary, I'm going to call it Vera Rubin, just for some fun shift in pronunciation. It's probably Vera, but anyway.

What is Vera Rubin? Vera Rubin is an improvement on the Blackwell architecture. That's NVIDIA's current biggest and baddest GPU accelerator complex. Vera Rubin

does improve on Grace. You're referring to the CPU, the chip that goes in the Blackwell. Blackwell has a GPU unit and a CPU unit. So Vera Rubin replaces that Grace chip with something called New Grace, which is supposed to have two times the performance. And overall, this GPU system has a lot more compute power than Blackwell.

For example, it's got what is called NVLink scaling. This is when GPUs in a system communicate with each other, sending information back and forth. That ability to scale between GPUs doubles over Blackwell. It also has something called HBM4.

You probably heard us, if you've been listening to Mary and I or myself and Ricky over the last year or two, talk about something called HBM3, which is a type of memory within GPU structures. This is the latest version of that. It just has much more compute power, almost 10 times the

aggregate compute power of the Blackwell platform. So in many ways, Vera Rubin just represents a really big leap. And then it's going to be followed up, Mary, by something called the Vera Rubin Ultra, which is a little bit funny. Okay. I'm digging here a little bit at something that shouldn't be criticized too much. I mean, it's going to be

an advancement over an advancement, the next-next generation of the company's platform. But come on, this reminds me of "new and improved" from the grocery store when we were growing up, Mary. After a while, how do you communicate that something's even better? Not Vera Rubin Plus, but Vera Rubin Ultra, sounds about right. I want to lay a criticism here on NVIDIA, which is a company I admire. The CPU of this Vera Rubin system, which they call New Grace,

is complemented by the new GPU, Graphics Processing Unit, in the system, and that's called the CX9.

Again, NVIDIA may be running out of inspiration here when you start naming your GPUs after Mazda SUVs. Asit, you and I have a tendency to really harp on what's in a name and why a company decides to name a certain project X, Y, or Z. And so we were talking a bit before the show about it's interesting to see NVIDIA in particular, we'll pick on them right now, to see NVIDIA in particular give these

these names to all these different products, but they can be very hard. I appreciate why they're taking the time to name something, name a GPU, the Vera Rubin, rather than HBM or a series of numbers and letters. I do appreciate that.

But it still can be very hard for the layman to differentiate between all these different names, between CUDA, which is more of a software platform, and the Grace, the Rubin, and Blackwell and how these pieces fit together. So thank you for kind of giving us the lay of the land a little bit.

and giving us an insight into what all these new announcements actually mean. When it comes to what all these new announcements actually mean, some news that came out last week was that Meta is testing its own in-house AI training chip. Amazon and Google are already in the process of doing this. All of these hyperscalers are designing their own AI chips in the hopes that they can ultimately reduce their reliance on NVIDIA.

So help us understand what NVIDIA's head start looks like, because they are kind of the go, not kind of, they are the go-to player in this game right now. Realistically, what do Meta, Amazon, Google, or anyone else interested in building their own AI chips actually have to do to build in-house chips that genuinely rival what NVIDIA is putting out there?

Mary, one thing they have to do is to make simpler chips that have custom purposes. So when you train a model, or when you get inference out of that model, not all of it has to be done on the latest and greatest GPUs. And so to the extent that a company like Amazon, with its Trainium chip, which is now approaching its third generation,

A company like Meta, which is getting into the game, or even Microsoft, which is later to the game with a chip called Maya, what they're trying to do is to cut costs dramatically. So when you or I use a large language model on their cloud platforms, it costs less for us, the end user. We're paying some way. And enterprise businesses are paying direct to these companies. So they have a lot to gain because they really don't need the overkill of

awesome NVIDIA GPUs for many of these use cases, and that's why they're investing in this. They're working with great design companies, they are taking models, sending them over to TSMC, a great foundry, getting back prototypes, and then gradually -- I think Amazon is furthest along in this race -- just putting them into the data centers and showing a cost savings. On Amazon's last conference call, CEO Andy Jassy said, "Look, we're reducing compute costs by 30% in some cases."

Now, NVIDIA is looking at this and laughing. Why is NVIDIA smirking? Because, it says, we want ever more powerful compute. And the world is moving in the direction of reasoning models, where we're asking the models to think in steps, to go out and research on the web, and come back to us with answers. And really, the chips that are best designed for that are the GPU/CPU complexes that NVIDIA builds. So it believes it still has a really major role to play in this world going forward.

I think maybe the true story is somewhere in the middle, and time is going to tell how it really shakes out. Patrick Moorhead, the founder of tech research firm Moor Insights & Strategy, told the New York Times, in an article about this conference, the Super Bowl of AI, that, quote, "the gravy train comes to a screeching halt if cloud companies stop spending," end quote. So that's the gravy train that NVIDIA is currently reaping the benefits of.

What can NVIDIA do today to freeze some of that gravy for the future if there is a potential hibernation winter period ahead? Well, one thing they can do is to focus on the replenishment cycles. So even though data centers may get crimped at the margins, the data center buildouts, you have all these existing data centers that need to be upgraded in terms of their capacity, server capacity, memory capacity, etc.

et cetera, to meet the demands of our compute. Just because AI might become less expensive doesn't mean we're going to stop using it. So NVIDIA can really think forward in terms of where they can start going to these customers and working on upgrades to existing data centers if customers pull back on new buildouts. And I don't think that'll necessarily happen as sort of a cliff moment, which some project. If you look back before what happened with generative AI, we had a really slow, steady, but almost inexorable march to build data centers just to handle the migration of

enterprises taking their stuff off premises into the cloud. And those percentages, if you look at them, still have a long way to go. People have sort of forgotten that this movement is still in motion. So over time, I think what we're going to see is, yeah, there could be some valleys, some unexpected slowdowns in data center builds, but the direction of the future, until we can make these things

much more efficient, both from a consumption standpoint and an energy standpoint, is to keep building. And NVIDIA stands to gain from that.

Oh, and the direction of the future that Jensen Huang sees is a trillion dollars in AI infrastructure build out. That was a number that really stuck out to me. He mentioned that the company has already built out about $150 billion of AI infrastructure. But again, that he sees a roadmap to a trillion dollars sooner rather than later. What did you make of that number, that statement, that vague timeline?

Not much. It's not too different from a number that he dropped three years ago. It's just that no one could see that future. He was talking about hundreds of billions within a few years, three years ago. And it just seemed like, well, I guess if you're talking about extremes and everything falls into place and everyone would have to be using this stuff, right? I mean, everyone would have to be using ChatGPT for that to happen. I guess so, Jensen.

So if we go back and extrapolate his original projections, guess what? Right on the mark, a trillion bucks. So that's interesting context, too, because we've seen NVIDIA stock go down about 15% thus far this year. It was down over 3% yesterday alone, in spite of all these big advancements that Jensen Huang is touting. NVIDIA's PEG ratio today is just under 0.3. So despite all of the advancements that they're rolling out, that they're talking about and hyping up,

The market seems to think that this company's growth is going to stall out. Did you hear or see anything during yesterday's keynote, kind of throughout the span of this conference, that made you think, oh, maybe the market's got it right? Or do you think this is another situation where we're kind of where we were three years ago, where Jensen Huang is saying, this is the future, and folks just aren't buying it because they can't see it?
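To see why a PEG ratio under 0.3 reads as pessimism, here is a minimal sketch of the arithmetic. The price, EPS, and growth figures below are invented for illustration, not NVIDIA's actual numbers; only the ratio definitions are standard.

```python
# Illustrative PEG calculation. All inputs are made-up figures.
def pe_ratio(price: float, eps: float) -> float:
    """Price-to-earnings ratio: what the market pays per dollar of earnings."""
    return price / eps

def peg_ratio(pe: float, growth_pct: float) -> float:
    """PEG = P/E divided by expected annual earnings growth (in percent).
    A PEG well below 1 suggests the price embeds little of that growth."""
    return pe / growth_pct

pe = pe_ratio(price=120.0, eps=4.0)    # P/E of 30
peg = peg_ratio(pe, growth_pct=100.0)  # 100% expected growth gives a PEG of 0.3
print(pe, peg)  # → 30.0 0.3
```

The takeaway: a low PEG can mean either that the market doubts the growth forecast or that the stock is cheap relative to it, which is exactly the debate in this conversation.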

I think we should bring a lot of skepticism to bear to NVIDIA's growth thesis, just because now everyone wants to disrupt them on every level. And companies that didn't have to compete with NVIDIA yesterday, let's look at Arista Networks in the networking space, suddenly find it as a competitor and they have to shift their models. Arista Networks is an amazing company, by the way, so it's not like they're going to completely reinvent their business, but now they're paying attention to what could be some competition for them.

So, I think when you think about this, you think about companies that are trying to figure out ways to make compute more efficient. You have small competitors that are questioning even the way we build GPUs. I think we really have to be skeptical on the growth thesis. But, I mean, you bring up something interesting, Mary. What you're basically saying is, hey, you know that PEG ratio, what if that means that it really is undervalued in relation to its potential to earn out in the future? And yeah, I saw lots of stuff in that

presentation that told me this could happen. Just take one thing. So you mentioned CUDA, which is a collection of software acceleration libraries. We've talked about this before, Mary, and I've pointed out that, look, CUDA is lots of different libraries. There's one for high-precision math. There's one for aeronautics. There's almost one for every really cool use case you can think of across so many industries, right?

And yet we see another one that was introduced yesterday, which is a version of something those who know the language Python, programming language Python, are very familiar with. There's a library called Pandas and it has to do with something called data frames. It's essentially how we manipulate data. What NVIDIA is doing now is taking the concepts behind data frames and extrapolating that into massively parallel computing use cases.

which just points to a future in which the assumptions we had about how we use big data and how fast we can wring value out of it, maybe those assumptions are off. Because if we take the same concepts from Python but throw those into these great acceleration libraries, which are running

on GPU sets that are much more powerful than what is on the ground today, then it means that lots of fun things that we do with computation today we can do in a much shorter time and there will be potentially some very interesting advancements in science,

in big data, etc. That's just one little thing I saw that indicated to me that they keep innovating and creating a future use for their technology. So, I wouldn't bet against the company. I think right now it's sort of like, be skeptical, be very skeptical, but don't discount the possibility that three to five years from today, you and I could be talking about some ho-hum, amazing double-digit growth rate that Nvidia holds.
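For listeners who haven't used Pandas, the "data frame" concept being ported to GPUs is just a table of named columns with column-wise operations like filter and group-by. The sketch below illustrates that concept in plain Python with invented revenue figures; libraries like Pandas, and the GPU-accelerated versions discussed here, do the same kind of operation at vastly larger scale.

```python
from collections import defaultdict

# A "data frame" is conceptually a table of named columns.
frame = {
    "ticker": ["NVDA", "NVDA", "AMZN", "AMZN"],
    "revenue": [26.0, 35.1, 143.3, 158.9],  # made-up quarterly figures, $B
}

def group_mean(frame, key_col, val_col):
    """Group rows by key_col and average val_col, akin to df.groupby().mean()."""
    sums, counts = defaultdict(float), defaultdict(int)
    for key, val in zip(frame[key_col], frame[val_col]):
        sums[key] += val
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

print(group_mean(frame, "ticker", "revenue"))
```

The point of the GTC announcement is that this style of computation, run across massively parallel GPU hardware, can shrink hours of big-data work into minutes.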

One of the central themes of yesterday's keynote was that AI has use cases in every industry and that it's NVIDIA that's really laying that foundation across all those industries.

So as a part of that thesis, we had many announcements about partnerships between NVIDIA and different companies. So I'm going to call out three partnerships that stuck out to me, and then you let me know which one's most interesting to you, and we can go from there and give a few more details. So we've got GM and NVIDIA teaming up to build out GM's fleet of fully self-driving vehicles.

We've got Walt Disney, Google's DeepMind, and NVIDIA building a platform that will, quote, supercharge humanoid robot development. We've also got NVIDIA partnering with Yum Brands. That's the parent company of Taco Bell, KFC, and Pizza Hut. And they're going to roll out AI order taking. Which one of these do you want to take, Asit? Let's do them lightning round, all three, because they're all fun. Let's start with the drive-thru.

Okay, so we'll start with the drive-thru. First of all, my question here is, do you want AI ordering? I've seen a ton of fast food companies roll this type of stuff out. McDonald's rolled this out and then also rolled it back. McDonald's this past summer ended their partnership with IBM when it came to AI ordering.

I think that this idea of efficiency at all costs at the drive-thru, I understand why companies are pursuing that. But it also runs counter to a story that we've seen play out at Starbucks where, hey, you lean too hard into efficiency and that actually pushes some consumers away. So what do you think the future of this Yum! Brands-NVIDIA partnership could look like? The AI is getting better and better.

At the end of the day, what I want is real human interaction. But you know what I'll take, Mary? An AI that's smart enough to get something wrong in the order, so I have to correct it, just to fool me. "And you wanted large fries with that?" "No, ma'am, I said medium fries." That's the AI interjecting something to make me think it's human. So I'll take it in that instance.

So then we've also got this self-driving feature. What's interesting to me is that GM has struggled in its attempts to build fully autonomous vehicles. Last year, it pulled funding for its Cruise robotaxi unit. Were you at all surprised to see GM seemingly giving full self-driving another go?

I guess I was, but obviously I wasn't paying enough attention because GM sort of signaled this. We talked about it, Mary. We're not going to focus on trying to have a full fleet of autonomous vehicles. We're going to focus on the software side, getting that into our vehicles, working with simulations so that our software is second to none, the software that's in the vehicle for your driving experience. So it's assisted driver technology. That's interesting.

Okay, last but not least, we've got this robot situation. I mentioned a partnership of three between Walt Disney, Google's DeepMind, and NVIDIA. I get Google. I get NVIDIA. Walk me through how Walt Disney fits into that trio.

Walt Disney actually has a long history with robotics. Think of the animatronics at their theme parks; they're very good at robotic motion and making it human-like. They also have research labs out near Burbank, near their studios, which have been working for a long time on robotics and the science, the physics,

of how these robots work and the expression of that. And then you think of their animation and motion expertise that divisions like Pixar bring. And you actually have a company that makes sense when you put that puzzle together with those gurus from DeepMind and NVIDIA's just very strong research into the robotics field and its forte in simulation and virtualization. I actually was surprised but thought to myself that

That makes sense. We got Quantum Day coming up tomorrow. I know you said that you're really excited for that. Anything in particular that you'll be keeping an eye or ear out for? NVIDIA already said that it's working on some CUDA libraries for Quantum.

And there are some really interesting problems that quantum computing needs to solve before it becomes big time. One of those is correcting for errors in the computation. So without getting into any weeds, because we haven't got time, I'd be very interested to hear about how their software might help on the quantum level, cut down the error rates when we ask these particles to do their thing and measure them and try to get a mathematical result out of that. So that's what I'll be looking for tomorrow.

Asit Sharma, always a pleasure. Thanks so much for talking NVIDIA with me today and for breaking down these often complex topics. Hot stuff, Mary. Thank you so much. You got questions, we got answers. Up next, we continue with Mailbag Week and turn to a number of Motley Fool analysts to answer your questions about fundamental analysis, AI in healthcare, portfolio management, and how customer service experiences ought to affect your investment decisions.

No problem.

Get 24-7 professional answers and live help and access support by phone, email, and in-platform chat. That's how Schwab is here for you to help you trade brilliantly. Learn more at schwab.com slash trading.

Every now and then, we'd like to turn to our listener mailbag to see what kind of questions are on your mind. We noticed that there were a lot of questions that were roughly about how to get started investing. So for today's show, we rounded up a few analysts to answer your questions that are geared towards the beginner or intermediate investor.

To kick us off, we got listener Cody King, who wrote in, "Fools, for new investors, what are some fundamentals to look at when considering investing in a stock? Is it P/E ratio? Most recent quarter's earnings? News headlines? Are any metrics overrated? What is the key info to dive into to determine if a stock is a winning investment?" For the answer, we turned to none other than Fool analyst, Asit Sharma. Cody, I love your question.

For me, there are two things that go on when I'm thinking about stocks or selecting stocks. One is to screen for new ideas or turn over stones, if you will. In that sense, different metrics like the P/E ratio can be very useful in just comparing companies and seeing what might be a really high P/E. Sometimes that's a signal for a good stock, because you're looking at a company that must be growing its revenues or has the potential to do so.
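The "turning over stones" step Asit describes is essentially a screen: compute one metric across a universe of companies and flag the outliers worth a closer look. Here is a toy version with invented tickers and P/E figures; real screens use live data and more metrics, but the mechanic is the same.

```python
# A toy stock screen: flag companies whose P/E is far above the sample
# median. All tickers and P/E figures are invented for illustration.
from statistics import median

universe = [
    ("AAA", 12.0),
    ("BBB", 45.0),
    ("CCC", 28.0),
    ("DDD", 95.0),
]

med = median(pe for _, pe in universe)               # sample median P/E
outliers = [t for t, pe in universe if pe > 2 * med]  # unusually expensive
print(med, outliers)
```

A high P/E flagged this way is not a verdict; as Asit says, it is a prompt to go read the story behind the number.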

But let's take a step back, because your question really asks, what is the key info to dive into to determine if a stock is a winning investment? And that's quite a different thing. For me, the most important thing is the story, the narrative behind that company. If you can grasp that, then you can put an overlay of everything else. You can crunch the financials, you can look at ratios, you can compare it to other companies.

One of the places that I like to start is the MD&A (Management's Discussion and Analysis) section of a quarterly or annual report. You can go to EDGAR, at sec.gov, and search these reports. They come labeled as 10-K for annual reports and 10-Q for quarterly ones.

When you go to this section, what you're looking at is a required explanation of what a company does and how its recent past has played out. It's a required disclosure from the SEC, part of regular reporting. That's a great place to understand exactly what management thinks about its company, how it presents its products or services, and where it sees the business going. That section discusses

recent results. I like to start from there. If I can grasp that and really feel that I'm starting to understand what a company does and what its potential might be, then the other things that you mentioned, like listening to earnings transcripts, looking at the most recent quarter's earnings, thinking about the news headlines,

Those are really key at that point to getting a bead on whether this is a company you want to put into your portfolio or take a pass on. We talk about a lot of stocks on the show. Some of them are genuine investment ideas. Others are just public companies that we find interesting or newsworthy, but don't necessarily think of as strong stock picks. One listener, Sumit Maru, asked about how listeners ought to think about what we call radar stocks. They write,

I listen to the show religiously every day. One of the questions I have is about the radar stocks. Every Friday, the team picks two radar stocks. I love what Jamo, Mattie A., and Emily bring to the section, but I'm not sure what everyday investors like me should do with those picks.

For the answer, we turn to the host of our Friday show, Dylan Lewis. I'm glad someone asked, because I love the radar stock segment, and I think sometimes it does go on without explanation, Mary. So, for me, I think the radar stock segment puts us in a position to talk a little bit about stuff that is going on in the market that our analysts really want to discuss or feel like will be popping up in the news at some point in the next week or two. Very often, you'll have our analysts previewing a company that's reporting earnings,

Sometimes they'll be talking about a company that they're not particularly excited about, but they feel like it's something that they should bring up for listeners of the show if it's a company that's in their portfolio. And so I joke sometimes that people bring radar stocks on for good reasons, because they're excited and they're following this business. Maybe it's a future watch list stock for them. Sometimes it's a radar stock for a bad reason, and it's more of a moment for us to give people a check and make sure they're paying attention to some of the things that could affect companies in their portfolio.

Speaking of investment ideas you get from the news, it's no secret that excitement about artificial intelligence is everywhere. Dana in Cincinnati wonders how healthcare companies in particular are using AI and whether she might find any intriguing stock picks herself from looking at the intersection of those interests.

In a multitude of ways. And that's probably why it's something we can talk about at greater length. But just to pick an interesting area has to do with how you design drugs. And some people may know that the Nobel Prize in Chemistry in 2024 went to

the developers of AlphaFold, which is a program that can predict protein structure just from the amino acid sequence. That's actually not a brand new thing, but using some AI tools, they were able to make it better than it's been in the past.

And the upshot of that is that you can actually visualize in three dimensions what target you're designing a drug against with a lot more ease than you could do in the past. Once you get to that point, you can start to use digital tools to help make molecules that sort of fit like a hand in glove into those targets that you're looking for.

So these are all approaches that have been used for years. It's often called rational drug design. It's something that most companies do to some extent, but you're seeing it step up in its complexity. And so it remains to be seen how much this bends the curve when it comes to speeding anything up or making anything better than we have in the past. But you're seeing some interesting programs advance into clinical trials.

Particularly interesting are companies that are going after targets that were once considered, quote unquote, undruggable. So a target where the pocket you're trying to bind into is very hard to fit without making basically a key that opens everything. If you make a key that opens everything, that's called having a ton of toxicity.

You need it to bind to just that target. And so if you can do that and show that some of these work, I think you're really going to see even more emphasis on this area. You can learn a lot about companies not just through the news, but also through your own experience with them as a consumer. Yuan Liu wrote in with a question for Fool analyst Jason Moser, which we'll paraphrase here. "I know you like PayPal and want to ask, why is its customer service aggressively mediocre? It used to be that you could just pick up the phone and talk to someone,

Or leave a private message on a message board and someone would get back to you. In the past few years, however, all I can do is initiate a chat with absolutely no ticket ID, no queue name, nothing to allow the next agent who happens to be on the thread to understand what my question was or what information has already been exchanged. How much weight do I put on this experience?

when it comes to making a decision about buying or selling any kind of stock? I think that's a good question. One of the reasons is because there's not one definitive answer. This is something that each investor has to answer for themselves.

So, for example, I'll say one of the reasons why I like PayPal is just because it always works. I honestly don't think I've ever had to actually contact PayPal customer service. So, I think, based on the question, I'm considering myself lucky at this point, because that stinks when you get bad customer service. And so, I suspect if I had to interact with a company and continually got

bad customer service, that would absolutely make me question whether or not this was a business that I really ultimately wanted to own. Now, you see these stories play out all over the market. I think Comcast stands out as an example as a company that really owns their space in a lot of ways. I mean, certainly is one of just a handful of very big players in what it does.

And I think they've perpetually had a reputation for just awful customer service. Now, again, I'm not a Comcast subscriber. We don't have it here, so I don't use it; I don't have that experience. But based on everything I've heard, that starts to make you wonder: do I feel good about owning this company? And so I think every investor has to be able to weigh that a little bit, asking, well, is this a place where the company could improve,

versus are there a lot of positive qualities that this company already has? None of these investments is perfect, and I think that's always something to remember. We try to identify the areas of weakness where they can improve, and try to identify the areas of strength,

And then you kind of have to weigh those against each other. And I guess for me with PayPal, I mean, I still own shares in the company and I've owned them for many, many years. It just strikes me as one of the companies that's really leading the way in this digital movement of money. Are there things that they could be doing better aside from customer service?

Absolutely. And I'm hopeful that new leadership here in Alex Chriss is really spearheading that. He seems like he has some really neat strategies, and I think the market is starting to take note of that. But ultimately, yes, this is a question that each investor has to answer for themselves. I think looking at the customer experience, that's something that you definitely need to weigh. That's a valid concern that every investor should take into consideration. Once you've found the stocks you want to buy, you'll have to figure out how big a role you want each of those positions to play in your portfolio.

Over on X, Jorso asked us, "For a beginner, middle-experienced investor, are you better off buying small shares of multiple companies or saving up and buying a larger share amount in fewer companies?" For the answer, we look to Asit again. Jorso, I think for beginner, middle, and even advanced or very experienced investors, we're all better off buying small shares of multiple companies at the start of the game. Now,

Yes. I'm being partly facetious here, because if you are an extremely experienced investor, you may have this actually already in practice, but in a different way, because you're building up a lot of conviction and then putting your money in and putting serious capital to work. But the principle is that

We are taking small stakes at the beginning to learn about businesses as they grow, and we're going to keep adding money to the businesses we come to understand better and that keep performing as time goes on. That's a really great way to make money. It's actually

probably a better probabilistic strategy versus identifying at the outset a few companies where we're going to allocate a lot of capital to. That implies that you've got a really good edge on the house in this game. Ricky Mulvey and I had a great conversation likening investing a bit to probability and playing cards.

For most investors, the opposite is the strategy to use, which is, you're going to find out over time where you're going to concentrate your capital. Don't do it at the beginning. I will say, over time, even though Warren Buffett is associated with taking big and concentrated bets, we've seen him in some businesses scale up over time. Of course, his billions

have a lot more impact than maybe our hundreds or thousands. Nonetheless, this is a really fun strategy to employ because it allows you also to learn about a lot of companies and to extend your investing chops. I'm all for going with more companies, smaller positions, taking your time, and then concentrating further investments on those winners, and also adding new ideas to your portfolio as the world changes.
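The "small stakes first, then water the winners" idea can be sketched as a deterministic toy simulation. The return figures are invented, and real markets are not this tidy; the point is only the mechanic of adding new money to positions that keep performing.

```python
# Toy model of "start small, then concentrate on winners".
# Annual returns per hypothetical company are invented and held constant.
annual_returns = {"A": 0.30, "B": 0.05, "C": -0.10, "D": 0.12}

portfolio = {name: 100.0 for name in annual_returns}  # equal $100 stakes
for year in range(3):
    # Let every position compound for a year.
    for name, r in annual_returns.items():
        portfolio[name] *= 1 + r
    # Add $100 of new money to the best performer so far.
    best = max(portfolio, key=portfolio.get)
    portfolio[best] += 100.0

print(max(portfolio, key=portfolio.get))  # → A
```

Because capital keeps flowing to the position that has earned conviction, the winner ends up dominating the portfolio without ever requiring a confident up-front bet.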

We love getting listener questions. So if you've got a question that you'd like to hear an answer to, write to us on X or shoot us an email at podcasts@fool.com. That's podcasts, with an S, at fool.com. We'll be sharing a few more of these with you on tomorrow's show. See you then, fools. As always, people on the program may have interest in the stocks they talk about, and The Motley Fool may have formal recommendations for or against. So don't buy or sell stocks based solely on what you hear.

For the Motley Fool team, I'm Mary Long. Thanks for listening, fools. We'll see you tomorrow.