
NVIDIA's Josh Parker on How AI and Accelerated Computing Drive Sustainability - Ep. 234

2024/10/2

The AI Podcast

People: Joshua Parker, Noah Kravitz
Topics
Noah Kravitz notes that AI and accelerated computing play a key role in improving energy efficiency and meeting the challenges of climate change. As AI adoption spreads, attention to AI's energy consumption is growing, which prompts the question of how AI and accelerated computing can themselves be used to solve challenges related to sustainability and energy efficiency.

Joshua Parker digs into the question of AI's energy use. He explains that it is a complex question that must account for AI's rapid growth, fast-evolving hardware, and the benefits AI delivers, and he stresses that accurately assessing AI's energy consumption requires complex, nuanced analysis to avoid inaccurate or even alarmist conclusions. He describes how accelerated computing platforms improve energy efficiency through highly efficient GPU computation, compares the energy efficiency of CPU-only and accelerated systems, and notes that the energy efficiency of AI inference has improved dramatically over the past eight years. He also discusses how the ratio of training runs to inference runs, and the durability of models, affect energy consumption.

Parker also addresses concerns about AI's impact on local power grids. He points out that AI still accounts for only a small fraction of global energy consumption, and that AI model training is flexible about location, so training can be sited wherever it is most energy-efficient. Large AI data centers are typically located where energy is abundant and sustainable, avoiding strain on local grids. He also highlights AI's positive contributions: improving weather and climate models, speeding up simulation and making it more energy efficient, reducing emissions from wildfires, and improving the efficiency of carbon capture and storage technologies.

In addition, Parker outlines NVIDIA's work on more efficient AI training and inference and on more efficient models, including efficiency gains from quantization, water-cooled data centers, and improved GPU design. He gives an example of AI helping improve a factory's energy efficiency and explains the role of data center design in improving energy efficiency. Finally, he looks ahead to AI's potential in optimizing grid operations, integrating renewable energy, and advancing drug discovery and materials science, and emphasizes the role of government and industry in promoting best practices for sustainable deployment. He closes on an optimistic note: AI can become the best tool we have for achieving sustainability, provided we use AI and accelerated computing platforms to drive efficiency and sustainable deployment.

Noah Kravitz underscores the importance of the AI energy question and the key role AI and accelerated computing play in solving energy and sustainability challenges. He raises questions about AI's energy consumption and prompts Joshua Parker to elaborate. In the discussion, he voices concerns about AI's impact on local grids and expresses surprise at the scale of AI's energy-efficiency gains. He also notes AI's positive role in tackling climate change and optimizing energy use, and praises the progress in AI energy efficiency and performance. He closes by summarizing AI's importance to sustainability and encouraging listeners to learn more about AI and sustainability.

Deep Dive

Chapters
The episode begins by addressing the crucial question of AI's energy consumption and its environmental impact. The answer is complex due to rapid AI growth and the evolution of hardware, particularly the energy efficiency gains in accelerated computing platforms.
  • AI's energy use is a legitimate concern given its rapid growth and potential impact on energy-related challenges.
  • Predicting future AI energy consumption is difficult due to rapid technological advancements and varying applications.
  • Accelerated computing platforms, particularly GPUs, offer significant energy efficiency improvements compared to CPU-only systems.

Transcript


Hello, and welcome to the NVIDIA AI Podcast. I'm your host, Noah Kravitz. The emergence of generative AI into our collective consciousness has led to increased scrutiny around how AI works.

particularly when it comes to energy consumption. The interest and focus on how much energy AI actually uses is, of course, important. Our planet faces energy-related challenges ranging from grid infrastructure needs to the impact of climate change. But AI and accelerated computing have a big part to play in helping to solve these challenges and others related to sustainability and energy efficiency.

Joining us today to talk about all of this is Joshua Parker, the Senior Director of Corporate Sustainability at NVIDIA. Josh brings a wealth of experience as a sustainability professional and engineer. Before his current role, he led Western Digital's corporate sustainability function and managed ethics and compliance across the Asia-Pacific region. At NVIDIA, Josh is at the forefront of driving sustainable practices and leveraging AI to enhance energy efficiency and reduce environmental impact.

Josh, welcome and thanks so much for taking the time to join the AI podcast. Thanks, Noah. Long time listener, first time caller. It's good to be here. Love it. I always dreamt of hosting an AM radio call-in show, so you're inspiring me. I'm going to just open this up broadly to you to get us started. I alluded a little bit to it in the intro, but computing uses energy. Everybody is talking about AI, obviously, and there's, with good reason,

interest, scrutiny, focus on, well, how much energy is AI using? And if we start using more and more AI going forward, what's the impact going to be on, you know, all of these energy-related things that we deal with on our planet? So let me ask you, to start: how much energy does it really use? Is this a warranted discussion? What are the things that we should be

thinking about and talking about and working on when it comes to energy and sustainability and not just AI, but accelerated computing and the other advanced technology that goes with it? It's definitely a reasonable question. And as someone who's been in sustainability for a while, it's something that we always talk about, you know,

climate and energy and emissions, those are very big urgent topics that we're all thinking about in every context. So when you see something like AI that really bursts onto the scene, especially so rapidly, it's a very legitimate question to ask, okay, what is this going to do to energy? And what is this going to do to emissions associated with that energy? So it's the right question to ask.

The answer, though, turns out to be pretty complicated because, number one, we're in a period of rapid, rapid growth. And it's hard to predict where we're going to be in just a couple of years in terms of the expansion of AI. Where is it going to be used? How is it going to be used?

What benefits do we get from it? And there are lots of nuances to that as well, including things like the hardware that it is being built on. The accelerated computing platform itself is rapidly, rapidly evolving in ways that actually support sustainability. So the energy efficiency gains that are being developed in that accelerated computing platform are really, really dramatic. So if you want to paint a really accurate

picture, as accurate as we can get, in terms of where we're going with AI energy consumption and the emissions associated with it, you need a really complex, nuanced analysis to avoid coming to very inaccurate and potentially alarming conclusions.

So let's dig into that a little bit within the context of a half hour podcast. Let's talk about some of those nuances. And you mentioned the hardware. And so obviously GPUs, a big part of that. How is accelerated computing sustainable?

Accelerated computing is a very tailored form of computing for the type of work required for AI. The accelerated computing platform on which modern AI is built takes the math that was previously being done on CPUs in a sequential order and basically uses these very, very efficient, very purpose-built GPUs to do it in parallel.

So you do many, many more operations and these GPUs are optimized to do that math, the matrix math that's required for AI really, really effectively and efficiently. And that's what's driven both the huge gains in performance and also the huge gains in

efficiency in AI. And it's really what has enabled AI to boom the way it has. The traditional CPU paradigm, the CPU-only paradigm, for trying to run this math just wasn't scaling. And so we really needed GPUs to unlock this exponential growth, really, in performance and efficiency. So if you compare CPU-only systems to

accelerated computing systems, which have a mix of GPUs and CPUs, we're seeing roughly a 20 times improvement in energy efficiency

between CPU-only and accelerated computing platforms. So very dramatic, and that's across a mix of workloads. So it's a very dramatic improvement in efficiency. And if you look just over time at accelerated computing itself, so compare accelerated computing platforms from a few years ago to ones that we have today.

That change in efficiency is even more dramatic. So if you compare the energy efficiency for AI inference from just eight years ago until today, we're 45,000 times more energy efficient for that inference step of AI, where you're actually engaging with the models, right? And it's really hard to understand that type of figure: one 45,000th

of the energy required just eight years ago is what we're using today. So building that type of energy efficiency gain into your models for how much energy we will be using for AI in a couple of years is really, really critical, because it's such a dramatic change. Yeah, that's a huge number. I don't mean this as a joke, but the best way I can think of to ask it is: are the workloads now 45,000 times more

numerous, or more energy intensive, than they were eight years ago? Or is the efficiency really outpacing all of this new attention on AI? So that ends up being a very complex question as well, because you have to get into the realm of figuring out how many times do I need to train a model versus how many times can I reuse it with inference. So big models like Claude 3.5,

ChatGPT-4o, and so forth: they're trained, and it takes a lot of time to train them, but the inferencing, when we're actually engaging with the model, if the model ends up being durable, then that inference step is very, very efficient. So because we're still in this inflection point where things are moving very rapidly, it's hard to see how the inference

compute requirements are scaling versus the energy efficiency. Certainly, they're scaling. We continue to see bigger and bigger models being used and trained because companies are seeing huge benefits in doing that. But yeah, this is what makes it complicated is that the energy efficiency is ramping up very dramatically at the same time. Right. So along those lines, there's been a tension in

Well, there's been attention on the stability and durability of power grids, national, regional, local, for as long as they've existed, but certainly over the past five years, 10 years or so. But since...

AI has come into the public consciousness, there have been news stories and what have you about kind of the localized effects of, oh, this data center was built in, you know, wherever it was, and it had this huge impact on the local power grid or people are concerned it might.

Can you talk a little bit about the common concerns around AI's energy consumption, particularly when it comes to the impact on local power grids, whether it's in the area where a data center might be or other places where people are concerned that AI is impacting the local energy situation?

The first thing to look at, when you're trying to put this in context and figure out what the local constraints might be on the grid, is the fact that AI still accounts for a tiny, tiny fraction of overall energy consumption generally. If you look at it globally first, and then we'll get to the local issue, the International Energy Agency

estimates that all of data centers, so not just AI, all of data centers account for about 2% of global energy consumption. And AI is a small fraction of that 2% so far. So we're looking at much less than 1% of total energy consumption currently used for AI-focused data centers. Now, that is absolutely growing, and we expect it to grow, but ultimately this is still a very small piece of the pie compared to everything else that we're looking at. And the second thing to consider is the fact that AI

is mobile, especially when you think about AI training. So when you're working to train these models for months at a time, potentially very large models, you need a lot of compute power. That training doesn't have to happen near the internet edge. It doesn't have to happen in a particular location. So there is more mobility

built into how AI works than in traditional data centers, because you could potentially train your model in Siberia, if it were more efficient to do that, or in Norway or Dubai. Wherever you have access to reliable, clean energy, it would be easy to do your training there. And some companies, including some of our partners, have built business models around that, locating AI data centers and accelerated computing data centers

close to where there is excess energy and renewable energy. So to get back to your original question, in terms of what we see with local constraints and problems with the grid, I think for the most part we're able to avoid that for those reasons: AI is still relatively small, and the companies who are deploying large AI data centers

already know where there is excess energy and where there are potentially constraints. And of course, they're trying to find locations for the AI data centers where it's not going to become a problem and where they're going to be able to have good access to clean, reliable energy. So is the sort of pessimistic, or, it sounds like, overblown, if data centers only comprise 2% of global energy usage and AI-specific data centers are less than 1%,

Is the sort of proliferation of pessimistic stories around AI's impact on the energy grid, is that just kind of the dark side of a hype cycle that we're sort of used to, and this is how it's coming up with AI? I don't want to say that those concerns are misplaced. Certainly, if you're living in a community and you see an AI data center going up, you may have questions about what this is going to do to your local grid. And

We are in this period of very, very rapid and, to some extent, unexpected deployment of AI, because ChatGPT really took the world by storm and by surprise two years ago. So there is some churn right now, which you would expect when you have a new

technology, a new industrial revolution that's bursting on the world, there's going to be a little bit of time where resources are not perfectly allocated. But what we're seeing is we're already working through that phase. And the companies who are deploying the big AI data centers are finding ways to do that that are sustainable and that won't threaten the

local grids, even if in the near term there are some constraints that we all need to work through. In the long term, even in the medium term, we're very optimistic that these are solvable issues. So, sort of to look at the bright side of things relative to AI, as with so many industries and so many problems that people are trying to solve in all walks of life,

AI can be a help when it comes to optimizing grids and energy use and even perhaps trying to solve some of these climate challenges that we're all facing. Can you talk a little bit about that and about how AI can or perhaps already is making a positive impact on our energy situation? Sure.

Two examples that I'll focus on speak to different aspects of sustainability. The first one is helping us adapt to and mitigate the worst impacts of climate change. AI and accelerated computing in general are game changers when it comes to weather and climate

modeling and simulation. NVIDIA has a platform called Earth-2, and we partner closely with national labs, non-governmental organizations, and other organizations to develop systems where we can much, much more accurately forecast weather, model weather,

and help mitigate the worst impacts of near-term weather and also look longer-term at climate so that we've got a better understanding of where we're going with climate and can better prepare for and plan for that. And then the other piece of the puzzle is that accelerated computing and AI, both of them, have real-world applications that directly reduce energy and emissions. And one example of that is we've...

transitioned the pandas library in Python. It's one of the most well-used libraries for simulation and for high-performance computing. We've taken that, basically, and written libraries that will translate that code, code that's written for that library, onto the accelerated computing GPU platform. And in doing that, we've basically opened up for

the world of researchers a way to run their simulations in that library without any code changes, at 150 times the speed and many, many times more energy efficiently. So the application of this itself is going to end up reducing energy consumption and also reducing the associated emissions.
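The zero-code-change acceleration Josh describes matches what RAPIDS documents as the cuDF pandas accelerator. A minimal sketch of the kind of workload it targets; the DataFrame contents here are made-up illustration data, and the speedup for any particular script is workload-dependent, not a guaranteed 150x:

```python
# An ordinary pandas workload. With RAPIDS cuDF installed, the same file can
# be run unchanged on an NVIDIA GPU via the documented accelerator mode:
#     python -m cudf.pandas energy_report.py
import pandas as pd

df = pd.DataFrame({
    "site": ["a", "a", "b", "b"],    # hypothetical facility IDs
    "kwh":  [10.0, 12.0, 7.5, 8.5],  # hypothetical meter readings
})

# A typical aggregation that cuDF can dispatch to the GPU transparently.
per_site = df.groupby("site")["kwh"].sum()
print(per_site.to_dict())  # {'a': 22.0, 'b': 16.0}
```

The point of the design is that the pandas API stays the source of truth: the accelerator intercepts calls and falls back to CPU pandas for anything it cannot run on the GPU.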

Right. No, it sounds like a virtuous cycle. So to kind of dig into that for a second, you know, people familiar with NVIDIA, longtime listeners to the podcast, perhaps understand that NVIDIA is not just a hardware company. It's hardware, it's software, it's all of the tools and everything in the stack, right?

that leverage the GPUs in all of these different systems, accelerated computing, AI, and what have you. Can you talk a little bit more, you mentioned earlier some of the efficiency gains, but a little bit more about some of the efficiency improvements in AI training and inference?

And then also NVIDIA's role in developing more efficient models. You mentioned Earth 2 just a second ago, but some of NVIDIA's other work in increasing the overall efficiency of the hardware, the software, and then these models themselves. Sure.

If you start with inference, one data point that I like to share is that just in one generation of improvement, so if you look at one generation of NVIDIA hardware, our Ampere platform, or sorry, our Hopper platform, which is the one that we're shipping in the highest volume right now, and you compare that,

to the Blackwell platform, which we're releasing next; it will come out in the next several months. The Blackwell platform is 25 times more energy efficient for AI inference than Hopper was. So just in the space of one change, one generation of NVIDIA hardware and software, it's using 1/25th of the energy. That's a 96% reduction in energy used. And there are performance gains associated as well. That's right.

Yeah, significant performance gains. Yeah, I understated it, but performance gains are amazing. But that's incredible. So a 96% efficiency gain while also getting these enormous next generation performance gains as well.
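The arithmetic behind those two figures, the 25x generational gain and the 45,000x eight-year gain, is worth making explicit. The numbers below are just the ones quoted in the conversation:

```python
# Energy-efficiency arithmetic using the figures quoted in the episode.

# A 25x efficiency gain means each inference uses 1/25 of the energy:
hopper_to_blackwell = 25
remaining_fraction = 1 / hopper_to_blackwell      # 0.04
reduction_pct = (1 - remaining_fraction) * 100    # 96.0
print(f"25x more efficient -> {reduction_pct:.0f}% less energy per inference")

# The quoted 45,000x gain over ~8 years implies a sustained annual rate of
# 45,000^(1/8), i.e. efficiency more than tripling every year on average:
years = 8
annual_factor = 45_000 ** (1 / years)
print(f"45,000x over {years} years is about {annual_factor:.1f}x per year")
```

This also shows why the host's question is the right one: whether total energy rises or falls depends on whether workload growth outpaces that compounding efficiency curve.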

That's right. And you asked a little bit about how we're getting there. That 25x improvement is through innovation in many spaces. It includes things like quantization, where we're using lower precision math, basically finding ways to optimize the model on training and inference in ways that allow us to do it even more in parallel and do it more efficiently.

It includes things like water cooling our data centers so that we're using less water and significantly less energy to keep the data centers cool. Of course, it includes things like

like better GPU design, and all of these levers we're pulling at the same time to drive those energy efficiency improvements. And we expect that to continue because energy efficiency is something that we care about and our customers care about, and it really helps enable more performance. When we're able to take out waste and to be able to do things more efficiently, it enhances our ability to drive more performance and to make the AI even more valuable than it was.
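Quantization, the first lever Josh mentions, is easy to illustrate. Below is a toy post-training sketch using symmetric linear INT8 quantization; production stacks such as TensorRT use calibrated, often per-channel schemes, so treat this only as the core idea of lower-precision math:

```python
import numpy as np

# Toy FP32 "weights" standing in for a trained model layer.
rng = np.random.default_rng(0)
weights_fp32 = rng.normal(0.0, 0.1, size=1000).astype(np.float32)

# Symmetric linear quantization: map [-max|w|, +max|w|] onto the INT8 range.
scale = float(np.abs(weights_fp32).max()) / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

# Dequantize to measure the approximation error introduced by rounding.
restored = weights_int8.astype(np.float32) * scale
max_err = float(np.abs(weights_fp32 - restored).max())

print(f"storage: {weights_fp32.nbytes} bytes -> {weights_int8.nbytes} bytes")
print(f"max rounding error: {max_err:.6f} (at most scale/2 = {scale / 2:.6f})")
```

The energy story follows from the storage story: INT8 math moves a quarter of the bytes and uses far simpler arithmetic units than FP32, at the cost of a bounded rounding error per weight.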

Our guest today is Joshua Parker. Josh is the Senior Director of Corporate Sustainability at NVIDIA. And we've been talking a little bit about

energy, energy in the AI arena, climate change, sustainability, all these incredibly important things that kind of form the basis of our ability to live on Earth, and how these things are being affected by all of these rapid advances in technology, obviously being fueled by the interest in AI. AI has been around for a while, as you well know, listening to the show, but over the past couple of years it's really ramped up in intensity.

Josh, you mentioned just a second ago customers. Maybe we can dig in a little bit to some case studies, customer examples, real-world applications of AI improving energy efficiency. Sure.

One that I love to talk about is with a partner of ours called Wistron. It's a Taiwan-based electronics company that does a lot of manufacturing. Many people may not have heard of it, but it's a large, sophisticated company. They took our Omniverse platform, which is a 3D modeling platform, and they modeled

one of their buildings in Omniverse. And then they used AI to run simulations on that digital twin that they created in our Omniverse platform and were looking for ways to improve energy efficiency. And after doing that, after using that digital twin, applying AI to run some simulations, they were able to increase the energy efficiency of that facility by 10%.

Which is a dramatic change just based on the digital twin. And in this case, it resulted in savings of 120,000 kilowatt hours per year. Fantastic. A word that I hear a lot that I'm not really sure what it means, I've got kind of a working understanding, is decarbonization. And my understanding is that NVIDIA has been involved in some work

optimizing processes for decarbonization in industry. I think you know a little bit about that, and I think it's relevant to what we're talking about. So could you dig into that a little bit as well?

Decarbonization is a really broad term, and it makes sense to have a nuanced appreciation for everything it encompasses. But basically, I think the best understanding is that it describes our efforts to reduce greenhouse gas emissions in an effort to try to mitigate the climate change that we're seeing.

It can apply to a lot of things, including things like carbon capture and storage, where we're actually pulling carbon out of processes or out of the atmosphere and finding ways to store it. That's an area, actually, where we have some good work being done and some partnerships, between NVIDIA and Shell, for example, where we're finding ways to use AI to greatly enhance

carbon capture and storage technologies. So that's one example. Other example of decarbonization, and this goes directly to emissions, is we've also partnered

with California, I believe in the San Diego area, to help firefighters there use AI to monitor weather risks that could lead to wildfires. And in doing so, we've been able to improve the responsiveness of their firefighting efforts, not only to potentially save lives and property, but also to significantly reduce

emissions associated with those wildfires. So that's another example of decarbonization that we're seeing. And then the third example I'll give is NVIDIA itself. We are trying to decarbonize our own operations by transitioning the energy that we use from standard

energy to renewable. And this year we're going to be 100% renewable for our own operations. So I'm very excited that we're making that transition. Quick show note to listeners: if you're interested, we did an episode previously about the use of AI in firefighting and combating wildfires in California. I would imagine it's the same organization. Forgive me, it might not be, but it's definitely worth a listen. It was a great episode.

Josh, you mentioned a little while ago data centers. We talked about being a citizen and seeing a data center come up. And of course, it's good to have questions and concerns about how's that going to impact things. The design of the data centers themselves obviously plays a big part in how efficiently they do or don't operate. You mentioned a little bit earlier water cooling as a technique that's been effective in

increasing energy efficiency, I should say. Can you talk a little bit more about data centers, how data centers relate to sustainability kind of broadly, and some of the innovations that have helped in that regard? Yes. The first thing, again, is to put data centers in context, because it's easy to think about data centers being really impactful

in terms of sustainability. You see them, they're large, you hear about them using all this energy, all this water and so forth. So it's a legitimate question to ask. But ultimately, again, the IEA estimates that all of data centers only account for 2% of global energy consumption right now.

So it's much, much smaller than most of the other sectors. Not to interrupt, but, I mean, I've obviously been working in this arena for a while now, and that figure really kind of blew my mind. I was expecting something a little bit bigger than 2%. Yeah, and that makes sense, because there's so much attention on this. Right.

AI is very much in the zeitgeist right now. We're talking about it. We see the rapid expansion. And yeah, so that's one of the things where it's important to put it in context. But yeah, the innovation in data center design is one of those levers that I mentioned that we're all pursuing to try to improve

energy efficiency. And as we're transitioning to this new generation of products at NVIDIA, to Blackwell, our reference design, our recommended design for the data centers for our new B200 chip, is focused all on direct-to-chip liquid cooling, which is much more efficient. It really unlocks

better energy efficiency, of course, but also unlocks better performance because the cooling is more effective. So we're able to run the chips more optimally in ways that lead to better performance as well as to better energy efficiency. Just to paint the picture, sorry to interrupt you again. When you talk about direct-to-chip cooling, what is that replacing? What's the thing that it's more efficient than? So that's in comparison to

air cooling, where you have heat sinks and you're using airflow. Yeah, with direct-to-chip liquid cooling, we're able to get liquid in closer to the silicon and to more effectively get heat away from it. And one of the reasons why this is so effective and helpful with accelerated computing is that the compute density is so high. If you look at a modern AI data center and you see a rack,

for example, in a modern AI data center, there's as much compute power in that rack as there was in several racks, many racks, of a traditional computing data center. So the compute density is so high that it makes more sense to invest in the cooling, because you're getting so much more compute for that same

single direct-to-chip cooling element that you're using. So obviously, all of this is a work in progress, so to speak, and advances in AI aren't slowing down anytime soon. And you're talking about the increases in efficiency and the increases in performance and all of that from generation to generation. As you said, this is early days of the world leveraging AI, to put it that way.

So as we'd like to do it on the podcast, as we move towards wrapping up, let's talk about the future a little bit. And, you know, not to put you on the spot to make crystal ball predictions, but what are some of the things that are being explored as future directions for AI and energy efficiency and

And really, you know, kind of supporting this growing energy demand, not just from AI and data centers, as you rightfully pointed out, but just in general. You know, how is AI being explored to meet future energy demands kind of across the globe? Yeah.

There are many ways, and most of them are yet to be discovered and talked about, because we're in such early days that it's hard to know what opportunities are still in front of us. But some examples: grid operations, for example,

updates to the grid in ways that will enable more renewable energy. So when you have more and more renewable energy coming online, that is more cyclical, right, than traditional energy. If you're burning coal 24/7, it's a steady stream of energy. If you've got wind or solar, it's more variable. And

Also with things like residential solar, you have times when you may be wanting to allow those residential solar panels to send

energy onto the grid instead of pulling energy for the house off the grid. All of those types of things benefit from modernization, and modernization in a way where AI can play a significant role in helping to avoid waste and to create ways for that energy flow to be optimized. So that's one very near-term area where we're seeing progress. And a partner of ours, Portland General Electric, is using

AI and using some of our products to do just that, to put smart meters around to help them manage the growth in renewable energy. There certainly is a perfect opportunity right now for us to do this, to engage in grid modernization because we have so much value to potentially unlock with AI. We've got these big companies who are really good at developing infrastructure, who are motivated to help us modernize the grid,

and introduce more renewable energy and do that in a responsible way. So it's a perfect time for us to be focusing on this. There are also fantastic other sustainability-related benefits from AI in terms of drug discovery for human welfare and materials discovery for things like electric vehicle batteries.

And batteries more generally. We've heard reports from Microsoft as well as from Google about discoveries they've made in material science that could potentially lead to much more efficient batteries in the future, which of course would not only save resources, but also save energy as well. Right.

It's funny listening to you talk about sustainability, and it's obviously related, but it makes me think about, or when you mentioned about residential solar and being able to send solar power back to the grid and, you know, thinking about the virtuous implications of that, it made me think about recycling for whatever reason.

And recycling being one of those things that, you know, individually, if we all do it, it sort of forms a collective. And then obviously, if we're moving from residential to industry and large corporations and, you know, factories and what have you, the importance of recycling, you know, is sort of bigger, obviously, in these spots in a factory than in an individual house.

I'm wondering about the importance of collective action when it comes to all of these things that you've been talking about with AI and sustainability and energy demand. And then sort of dovetailing from that, what about the role of industry or even of governments?

in driving these kinds of new and emerging best practices that will support sustainability for all of us? I think there's a great role for governments and policymakers to play here in terms of, number one, setting an example of how accelerated computing and AI can be used as tools for good.

And we're seeing some great work by regulators, especially in the United States, but also in Europe and elsewhere, where there's a real appreciation of the potential benefits to society and to sustainability in adopting both accelerated computing and AI and using that

for the public good. So I think that's the first way in which I think there's an opportunity here: accelerating all of these workloads, transitioning them over to a sustainable platform, and then using AI to try to benefit society in an environmental way and in a social way as well. And then also to encourage that type of sustainable deployment in industry as well, and make sure that we're, again, modernizing the grid

and creating the environment where we have a clear path towards sustainable deployment of AI. Because ultimately, you know, this is moving very, very quickly. This fourth industrial revolution, we're living through it. It's very exciting. And we don't want anything to undermine

our ability to capture the benefit from this so that we can try to mitigate climate change. We can develop new drugs. We can see all of the potential benefits from sustainability. And we can have that at the same time with sustainable deployment of AI data centers if we're careful. Absolutely. Josh, before we wrap, I just want to mention the NVIDIA Deep Learning Institute is actually sponsoring this episode. So I want to give them a shout out. And listeners out there who are interested in

obviously in AI, but in AI and sustainability in particular, head over to the NVIDIA Deep Learning Institute. There are DLI courses on AI and sustainability. You can learn so much more about what we've been talking about and go out and make an impact of your own, which we obviously encourage everybody to do. Josh, kind of as we wrap up here and along those lines,

For listeners who would like to learn more about the work that you're leading, the work that NVIDIA is doing on sustainability, energy efficiency, everything we've been talking about, maybe even some of the work that NVIDIA is doing with partners along these fronts, where is a good place for a listener to go online to kind of start digging in deeper to this? We are publishing more and more content about the connection between accelerated computing and AI.

and sustainability on our website. We have a sustainable computing subpage on our public webpage. We have some corporate blogs there, some white papers, and so forth. Those are all really interesting and very readable, I think, in terms of giving you examples

of how NVIDIA and our partners actually are doing so much good work in sustainability. We also, of course, publish an annual sustainability report if you're interested in the corporate level view, what we're doing in terms of energy efficiency in our systems and our own corporate commitments. That's in our annual sustainability report, which is on our website as well. Fantastic. Closing thoughts, anything you want to leave the listeners to take with them or ponder when it comes to

The future of AI, the future of sustainability, made better by AI, anything else we've touched on? I'd just offer a healthy dose of optimism. We've heard, I think, an unhealthy dose of pessimism and skepticism about AI, specifically in the realm of sustainability. And again, those are all legitimate questions. But AI, I firmly believe, is going to be the best tool that we've ever seen to help us

achieve more sustainability and more sustainable outcomes. If we capture this, if we capture the moment and use AI for good, and if we use this new accelerated computing platform to drive better efficiencies, then we're going to see really dramatic and positive results over time as we do that more and more.

That's what we want. And let's end on the optimistic note. Josh, thank you so much for taking the time to come on and talk with us. And it goes without saying, but all the best of luck to you and your teams on the work you're doing. It couldn't be more important. Thanks, Noah. Really appreciate it. Thank you.