
AI and the Environment

2025/3/30

Consider This from NPR

People
Benjamin Lee
David Craig
Jeff Brumfield
Kevin Miller
Sasha Luccioni
Topics
Sasha Luccioni: I initially felt growing anxiety about the AI field's environmental impact, which conflicted with my values. I realized I could use my AI expertise to contribute to environmental protection, so I quit my job and devoted myself to AI sustainability. I believe we need an industry standard for rating the energy efficiency of AI models, but tech companies are currently wary of that kind of transparency. Benjamin Lee: The rise of generative AI has made it much harder for data center operators to reach their net-zero goals. Mixture-of-experts models can reduce energy use by relying on several smaller models; this is one path to greater efficiency. Jeff Brumfield: The engineering cultures of the AI industry and the nuclear industry are vastly different, which makes collaboration challenging. Nuclear power is low-carbon, but plants take a long time to build, are costly, and carry safety risks. David Craig: We have only one planet and should care for it. Liquid cooling can improve data center energy efficiency, extend equipment lifetimes, and reduce wasted energy, but it is currently expensive and hard to deploy at scale. Kevin Miller: Demand for data centers and cloud computing is growing, driven by the broader shift toward a digital economy. We need to find a balance that meets society's needs while minimizing environmental impact.


Chapters
The rapid growth of AI has raised significant environmental concerns, particularly regarding energy and water consumption. Data centers, crucial for AI operations, are projected to consume a substantial portion of the nation's electricity by 2028. This has led to a growing movement advocating for more sustainable AI practices.
  • AI boom caused a surge in energy consumption
  • Data centers could consume up to 12% of the nation's electricity by 2028
  • AI is leading to increased water consumption
  • Big tech companies have climate goals but lack regulations

Shownotes Transcript

In 2018, Sasha Luccioni started a new job as an AI researcher for Morgan Stanley. She was excited to learn something new in the field of AI, but she couldn't shake this worry. I essentially was getting more and more climate anxiety. I was really feeling this profound disconnect between my job and my values and the things that I cared about.

And so essentially I was like, oh, I should quit my job and go plant trees. I should, you know, I should do something that's really making a difference in the world. And yeah.

Then my partner was like, well, you have a PhD in AI, maybe you can use that to make a difference in the world. So Luccioni quit her job and joined a growing movement to make AI more sustainable. Since 2022, AI has boomed, and it's caused a surge in energy consumption. Tech companies are racing to build data centers to keep up: huge buildings filled with hundreds of thousands of computers that require a lot of energy.

By 2028, Lawrence Berkeley National Laboratory forecasts that data centers could consume as much as 12% of the nation's electricity. And AI is also driving a surge in water consumption.

It's a concern echoed all over social media. The amount of water that AI uses is astonishing. AI needs water. People are saying that every time you use ChatGPT... ChatGPT uses this much water for a hundred word email. Where will that water come from? And the four big data center operators with a growing water and carbon footprint are Google, Microsoft, Amazon, and Meta. And to be clear, all four of those are among NPR's financial supporters and pay to distribute some of our content.

Before generative AI came along in late 2022, there was hope among these data center operators that they could get to net zero. Benjamin Lee studies computer architecture at the University of Pennsylvania. Generative AI refers to the AI that uses large language models. I don't see how, under current infrastructure investment plans, you could possibly achieve those net zero goals. And data center construction is only going to increase.

On January 21st, the day after his second inauguration, President Trump announced a private joint venture to build 20 large data centers across the country, as heard here on NBC. A new American company that will invest $500 billion, at least, in AI infrastructure in the United States, and very quickly, moving very rapidly.

This new project, known as Stargate, would together consume 15 gigawatts of power. That would be like 15 new Philadelphia-sized cities consuming energy. Consider this: as much as big tech says it wants to get to net zero, there are no regulations forcing it to do so. So how is the industry thinking about its future and its environmental footprint?

From NPR, I'm Emily Kwong.

This message comes from Doctors Without Borders. Over 80% of their staff are from the countries they work in. Support their local teams and make a global impact. Learn how to donate at doctorswithoutborders.org slash NPR.

Let's consider this from NPR.

Okay, so the four cloud giants, Google, Meta, Microsoft, and Amazon, all have climate goals: goals for hitting net zero carbon emissions, most by 2030, Amazon by 2040. And there are a few ways they can get there. Let's start with a very popular energy source for big tech: nuclear.

Amazon, Meta, and Alphabet, which runs Google, just signed an agreement, along with other companies, that supports tripling the global nuclear supply by 2050. And along with Microsoft, these four companies have signed agreements to purchase nuclear energy, an industry that has been stagnant for years. Microsoft has committed to buying power from an old nuclear plant on Three Mile Island in Pennsylvania.

You may remember that was the site of a partial nuclear meltdown in 1979. And NPR's Nina Totenberg talked to kids in the Harrisburg area right after. You know what evacuation is? That everybody has to go. Do you know why? Because of radioactivity. While some radioactive gas was released, thankfully, it wasn't enough to cause serious health effects. And Microsoft now wants to bring this nuclear site back. In a way, AI companies are turning into energy brokers.

But my science desk colleague, Jeff Brumfield, sees a discrepancy in this between the AI people and the nuclear energy people. These are just two super different engineering cultures. You know, and the way I've come to think about it is Silicon Valley loves to go fast and break things. The nuclear industry has to move very, very, very slowly because nothing can ever break. Because of accidents like Three Mile Island. Jeff says that nothing in the nuclear industry ever happens quickly. It's also extremely expensive.

And while solar and wind energy combined with batteries are quicker to build and less expensive than nuclear or gas power plants, they still take time to build. And there are problems hooking up new energy sources to the grid. So in the meantime, many data centers will continue to use fossil fuels.

But there's another solution here, and that's to make data centers themselves more efficient through better hardware, better chips, and more efficient cooling systems. One of the most innovative methods on the rise is liquid cooling. Basically, running a synthetic fluid through the hottest parts of the server to take the heat away or immersing whole servers in a cool bath.

It's the same idea as running coolant through your car engine and a much faster way to cool off a hot computer. Here's Benjamin Lee again at UPenn. And as you can imagine, it's much more efficient because now you're just cooling the surface of whatever the cold plate is covering rather than just blowing air through the entire machine.

One of the biggest providers of liquid cooling is Iceotope. David Craig is their recently retired CEO, based in the UK. I definitely come from the point of view that, you know, we literally have just one planet and

I cannot understand why anybody would want to do anything other than care for it. Craig says that the older way of cooling data centers, basically there's lots of methods, but it's a daisy chain of moving heat with air and water, is consumptive. With liquid cooling, a lot of the heat stays in the system and computers don't have these massive swings in temperature. It's not got a constant thermal shock. It's got less vibration from fans and stuff like that. So things last longer.

And then what we're doing is we're capturing that heat in a closed water loop. Liquid cooling, however, is expensive, which makes it hard to scale. But Iceotope has announced public partnerships with Hewlett Packard and Intel. And a spokesperson at Meta told me they anticipate some of the company's liquid-cooling-enabled data centers will be up and running by 2026.

Throughout my many emails and seven hours of phone conversations with spokespeople at Amazon, Google, and Microsoft too, there was one innovation they were kind of quiet about. And it's the one that scientists and engineers outside of big tech were most excited about: smaller AI models, ones good enough to complete a lot of the tasks we care about, but in a much less energy-intensive way.

Basically, a third and final solution to AI's climate problem is using less AI. One major disruptor in this space is DeepSeek, the chatbot out of a company in China claiming to use less energy. We reached out to them for comment, but they did not reply.

You see, large language models like ChatGPT are often trained using large datasets, say, by feeding the model over a million hours of YouTube content. But DeepSeek was trained on data from other language models. Benjamin Lee at UPenn says this is called a mixture of experts. The whole idea behind a mixture of experts is

You don't need a single huge model with a trillion parameters to answer every possible question under the sun. Rather, you would like to have a collection of experts

smaller models, and then you just sort of route the request to the right expert. And because each expert is so much smaller, it's going to cost less energy to invoke. Even though DeepSeek was trained more efficiently this way, other scientists I spoke to pointed out it's still a big model. And Sasha Luccioni at Hugging Face wants to walk away from those entirely. Since ChatGPT came out, people were like, oh, we want general purpose models. We want models that

What Sasha is talking about are small language models, which have far fewer parameters and are trained for a specific task. And some tech companies are experimenting with this.
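The routing idea Lee describes can be sketched as a toy dispatcher: each request goes to one small specialist model, so only that specialist's cost is paid per query. Everything below is illustrative, not DeepSeek's or any company's actual architecture; the keyword-based router stands in for the learned gating network a real mixture-of-experts model uses.

```python
# Toy sketch of mixture-of-experts routing (illustrative only).
# Each "expert" is a small stand-in for a smaller specialized model.

def math_expert(prompt: str) -> str:
    return "math answer"

def code_expert(prompt: str) -> str:
    return "code answer"

def general_expert(prompt: str) -> str:
    return "general answer"

EXPERTS = {
    "math": math_expert,
    "code": code_expert,
    "general": general_expert,
}

def route(prompt: str) -> str:
    """Send the request to one small expert. Only that expert runs,
    so cost scales with the expert's size, not the whole ensemble."""
    if any(tok in prompt for tok in ("integral", "solve", "sum")):
        key = "math"
    elif any(tok in prompt for tok in ("def ", "bug", "compile")):
        key = "code"
    else:
        key = "general"
    return EXPERTS[key](prompt)
```

In a real model the router is itself a small learned network and the experts are sub-networks inside one model, but the energy argument is the same: per request, only a fraction of the parameters are active.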

Last year, Meta announced smaller, quantized versions of some of its models. Microsoft announced a family of small models called Phi-3. A spokesperson for Amazon said they're open to considering a number of models that can meet their customers' needs. And a spokesperson for Google said they did not have a comment about small language models at this time. So meanwhile, the race to build infrastructure for large language models

is very much underway. Here's Kevin Miller, who runs global infrastructure at Amazon Web Services. I think you have to look at the world around us and say, we're moving towards a more digital economy overall. And that is ultimately kind of the biggest driver for the need for data centers and cloud computing. If that is the level of computing we're headed for, Luccioni has one last idea:

an industry-wide score for AI models, just like Energy Star became a widely recognized program for ranking the energy efficiency of appliances. She says that tech companies, however, are far from embracing something similar. So we're having a lot of trouble getting buy-in from companies. There's like

such a blanket ban on any kind of transparency, because it could either, like, make you look bad, open you up for whatever legal action, or just kind of give people a sneak peek behind the curtain. So as a science reporter for NPR, my main question is: do we really need all of this computing power when we know it could imperil climate goals?
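As a hypothetical sketch of what Luccioni's Energy Star-style label might look like: measure a model's energy per query and bucket it into a grade, the way appliance labels bucket annual kilowatt-hours. The thresholds and watt-hour figures below are invented for illustration; no such industry standard exists yet.

```python
# Hypothetical AI efficiency label (invented thresholds, for illustration).

def efficiency_grade(wh_per_query: float) -> str:
    """Map measured energy per query (watt-hours) to a letter grade."""
    if wh_per_query < 0.1:
        return "A"   # e.g. a small task-specific model
    if wh_per_query < 1.0:
        return "B"
    if wh_per_query < 10.0:
        return "C"
    return "D"       # e.g. a very large general-purpose model
```

The hard part, as the episode notes, isn't the scoring scheme; it's getting companies to disclose the measurements in the first place.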

And David Craig, the recently retired CEO of Iceotope, chuckled when I asked this. He said, Emily, you know, human nature is against us. We are always that kid who does touch the very hot ring on the cooker when our mum said don't, you know. We are always the people who touch the wet paint sign and stuff, right? That's human beings.

And the truth is with data, you know, this stuff has just grown up in the background. People just haven't known about it. But here's something I think we can all think about. The AI revolution is still fairly new. Google CEO Sundar Pichai compared AI to the discovery of electricity. Except unlike the people during the industrial revolution, we know AI has a big climate cost. And there's still time to adjust how and how much of it we use.

This episode was produced by Avery Keatley and Megan Lim, with audio engineering by Ted Mebane. It was edited by Adam Raney, Sarah Robbins, and Rebecca Ramirez. Our executive producer is Sami Yenigun. It's Consider This from NPR. I'm Emily Kwong. You can hear more science reporting like this on the science podcast I co-host every week, Short Wave. Check it out.

This message comes from Charles Schwab. When it comes to managing your wealth, Schwab gives you more choices, like full-service wealth management and advice when you need it. You can also invest on your own and trade on Thinkorswim. Visit Schwab.com to learn more. Support for NPR and the following message come from Washington Wise.

Decisions made in Washington can affect your portfolio every day. Washington Wise from Charles Schwab is an original podcast that unpacks the stories making news and how they may affect your finances and portfolio. Host Mike Townsend and his guests explore policy initiatives for retirement savings, taxes, trade, and more. Download the latest episode and follow at schwab.com slash Washington Wise or wherever you listen.

This message comes from Bombas. Their slippers are designed with cushioning, so every step feels marshmallowy soft. Plus, for every item purchased, Bombas donates to someone in need. Go to bombas.com slash NPR and use code NPR for 20% off your first order.