
Is AI destroying the planet?

7 April 2025

LSE: Public lectures and events

Transcript

So I'm opening up ChatGPT on my laptop and it's asking me what can I help with. I'm going to type how much water does ChatGPT use. Okay, it says it's thinking, it's searching the web.

Here's what it says: "For each ChatGPT interaction, estimates suggest that it could use around 500ml of water." Oh wow, so I basically just poured away the equivalent of a small bottle of water to find out the answer to that one question. And if I was to use ChatGPT just once a week for a whole year, that would be like pouring away 27 litres of water. That's the same as filling your kitchen sink right to the top and then pulling the plug.
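For the curious, the host's back-of-the-envelope sum is easy to check. Taking the quoted 500ml-per-interaction estimate at face value (such per-query figures are rough and contested, varying widely between studies), one query a week works out to roughly the sink-full she describes:

```python
# Rough annual water cost of weekly ChatGPT use, taking the quoted
# 500 ml-per-interaction estimate at face value (estimates vary widely).
ML_PER_QUERY = 500      # millilitres per interaction (the quoted estimate)
QUERIES_PER_YEAR = 52   # one query a week

litres_per_year = ML_PER_QUERY * QUERIES_PER_YEAR / 1000
print(f"{litres_per_year:.0f} litres per year")  # → 26 litres, close to the ~27 quoted
```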

Artificial intelligence is transforming the world around us, offering increased productivity and promising to help tackle difficult problems like global warming. But when we're being warned that climate change is exacerbating water scarcity and told that we all need to reduce our carbon footprint, the rapid growth of AI poses a big challenge to the environment. It's an AI sustainability paradox. Welcome to LSE iQ, the podcast where we ask social scientists and other experts to answer one intelligent question.

I'm Anna Bevan from the iQ team. We work with academics to bring you their latest research and ideas, and talk to people affected by the issues we explore. In this episode I'm asking: Is AI destroying the planet? I find out just how much water data centres use, discuss what an Irish goddess has to do with energy conservation, and learn how AI can help catch poachers in the Serengeti.

But first I travel to Slough, about 20 miles outside of central London, to find out exactly how data centres work. So here I am in Slough. It's not the most glamorous of locations. Once famed for providing the backdrop to Ricky Gervais' sitcom The Office, it's now home to Europe's largest data centre hub.

These are the hubs that power huge chunks of the internet. So there's a good chance that pretty much everything on your phone or your computer is stored here on Slough Trading Estate. We have the big kind of cloud providers with us, so your Googles and your Microsoft and your Amazons. That's Raj Ubi. He works at Equinix, one of the biggest data centre companies in the world.

It has 268 data centres across the globe and 97% of all internet traffic flows through one of its data centres. So it's huge. Raj is giving me a tour of one of the buildings on Equinix's Slough campus called LD6. And what type of connections are happening right now?

Everything and anything you could think of. So, you know, even simple stuff like your internet connection that could possibly run through here, most likely does. Mine does, I only live around the corner, so mine definitely does. All the way up to how Tesla drivers connect to their cars, how your fridge connects to the internet, if you've got one of those fancy ones, how your TV works, your banks, anything you can think of, the Ministry of Defence, you know, it's...

all running through data centres like ours. And we're the biggest, so we have the most people going through our data centres. And how come you're based in Slough? So LD4 is our longest standing data centre here. And one of the main reasons is there is an undersea cable going to, I believe it's New York in America.

So it runs along the M4 and this data centre is placed right there. So basically all of the trading platforms and banks huddled into that area. That's why we had to make LD4.2, because we were basically running out of space and power in that one building. Yeah, sure.

The data centre is built like a sandwich. So you have the power generators, the power generation system, I should call it, on the bottom floor. The middle floor is all the data halls. And the top is the cooling, because, with heat rising, it's easier to expel it through the roof. We're in a long metal corridor here, in this windowless warehouse, peering through the door into the generator room. It's pretty much the engine of the whole operation,

the bottom slice of bread of that data centre sandwich that Raj described. There's a mass of very neatly positioned wires, hundreds or maybe even thousands of wires, big buttons, generators, pipes and fans. It's kind of like looking at the engine in the front of your car, if your car was the size of a giant warehouse. Security is obviously number one. All of the windows, I believe, are all kind of bulletproof. You've done fingerprint entrance to get in. Yes. So...

Yeah, so at the first door you have to get security to let you in. Fingerprint into the building to begin with. You have to get your pass and that will only let you in certain places. So a normal customer wouldn't be able to even come down here to begin with. And how come the security is so tight?

It's just to stop people coming in and you should only be going where you need to be going and the customers expect that. If you tell a customer and they're a banking customer, for example, and you say, oh yeah, there was someone walking around in your cage, I think they would just leave immediately. And it's mostly because...

Raj talks a lot about cages. They're basically floor-to-ceiling wire mesh enclosures that customers hire to house all of their servers, their hard drives, their networking equipment, and the zillions of cables they use to ensure that data flows seamlessly between devices. So this is what we call the UPS system on the right-hand side. That's the uninterruptible power supply. Equinix's customers are requiring more and more power because of AI.

Tools like Gemini or ChatGPT rely on hardware housed in data centres like this one. Just one AI query uses huge amounts of energy, which generates a lot of heat. So then that hardware needs to be cooled down and fast. That's where all the water comes in. Because AI, at the end of the day, is just raw brute force that

is running through these devices. So that's what the customers need, and the main challenge is: does it need to be liquid cooled? And on a basic level, when we're talking about things needing to be cooled, why is that? It's all fundamentals of how physics works, right? So, you know, if you rub your hands together, it creates heat. So when these devices are working, you can kind of hear it right now. They all need air because they're generating that power, doing the transfers and whatever workloads.

they need air to cool them down. If you scale it from your laptop doing a Word document to an AI-based workload doing whatever type of solution it is or translation or whatever it is, just think of the scale in terms of what workload it's doing. And the heat and the type of energy used is scaled with that as well. So

A lot of these connections that data centres house are essential, like ensuring hospitals have uninterrupted access to patient information or that banks can process financial transactions. But it's also incredible to me that when I'm sat down at my laptop asking ChatGPT a question like "how much water does AI use?", these huge machines are still working behind the scenes, using a lot of power and a lot of water to come up with the answer to that question.

And recently I've noticed that sometimes when I use a search engine, I don't even get the option of whether I want to use AI or not. It just defaults to giving an AI overview. Professor Nick Couldry is a sociologist at LSE. He says we need to question the direction we're going in when it comes to AI, for many reasons. Some people calculate, it's obviously a complex calculation,

that when you use ChatGPT to find something out that broadly you could have found out through a Google search as well, slightly different format, you're probably going to use six times as much electricity to do that. And that's because a Google search uses a very complex program to find websites that give you a door that you can open and go in and find the answer yourself by clicking on the link.

Whereas AI doesn't work that way. It processes everything across vast numbers of data sources, trained on a large language model that enables it to predict what's the next most likely word in the sentence that you've just uttered, which was your question. And it predicts that the most likely next word will be whatever it is.
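The prediction step described here can be caricatured in a few lines. This toy bigram model is nothing like a real large language model, which weighs billions of parameters per token, but it shows the basic idea of "predict the most likely next word" from counted examples (the sample sentence is invented for illustration):

```python
from collections import Counter, defaultdict

# Toy "next most likely word" model: count which word follows which
# in a tiny corpus, then predict the most frequent follower.
corpus = "the centre uses water . the centre uses power . the grid uses power".split()

followers = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    followers[word][nxt] += 1

def predict_next(word):
    # Most common continuation seen in training; real LLMs do this
    # probabilistically over a vocabulary of tens of thousands of tokens.
    return followers[word].most_common(1)[0][0]

print(predict_next("uses"))    # "power" (seen twice, vs "water" once)
print(predict_next("centre"))  # "uses"
```

A real model repeats a vastly more expensive version of this step for every word it emits, which is where the extra computation, and hence electricity, comes from.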

That is an awful lot of calculation that previously wasn't necessary in the same way for a simple Google search. But we're now more and more relying on it instead of a Google search. And sometimes you don't even get that option as to what you're using, because if you type a question into Google now, it's coming up with the AI overview options.

Yes, and this raises questions about whether this is giving us the most useful answers and all sorts of other questions for the industries that rely on being linked to through a Google search, such as journalism. But there are also questions about the energy usage. And now you normally get an AI summary before you see the actual links. In fact, you have to look hard to get the links.

But that means there's a default option now, which is using more electricity, using more water. So we have to ask these questions about the sustainability of the general direction we're going in. In the early days, businesses tried to put data centres into far northern territories, where they relied on the external temperature to cool down the data centre.

But in most places, that's not a factor. That's not adequate. You need water. So fresh water is often used. And it turns out that in West London, Thames Water is extremely worried about water shortages for many reasons, probably. But one of them is data centers using water.

And it's quite interesting that, according to some estimates, when you build a really big data centre, exactly the ones that are going to be needed to run our large-scale AI, they can be using 1.5 to 2 million litres of fresh water a day.

We only use about 5 million litres of fresh water a day as human beings in a country. So that's an awful lot of fresh water. And have we started to feel the effects of that yet, through that vast consumption of water? Well, it's...

hitting some parts of the world, for example, where another aspect of AI comes home, which is the devices that we need to use in order to get the results of AI or to do anything on social media or on the web, such as our smartphones. They rely on batteries. The batteries are made of lithium. There are only two ways of extracting lithium. One of them involves sucking water up to the surface and letting it evaporate so the lithium gets left behind.

that's done in places like the Atacama Desert in Chile, which is obviously a desert. So it's desperately short of water. And the local people, the indigenous peoples, have become extremely angry about that because they're short of water and all this water is being wasted, as they see it, to produce lithium for batteries in computer devices. Similar disputes are going on in Argentina as well. Elon Musk, incidentally, has a very close relationship with the president of Argentina because he wants the lithium.

Well, President Milei has a gift for me. Javier Milei from Argentina, you guys know who that is, right?

We think of data and AI as very ethereal, somehow up in the cloud or those sorts of metaphors. But of course, it's just a vast number of chips which are communicating with each other so they can do huge numbers of simultaneous parallel calculations. And sometimes it could be as many as a billion or even trillions of calculations being run simultaneously, because that's how many variables there can be in the very, very large

AI models. But each chip itself has to be designed very carefully, mainly silicon but also a lot of quite rare metals like gallium and palladium, which are in short supply. And so there are a lot of supply issues in the making of chips. So AI isn't just up in the cloud. It's very much down on the ground in data centres, using large amounts of water, electricity and even rare metals, placing a strain on already depleted natural resources.

And while many governments around the world are actively looking to grow the AI industry, environmental campaigners are fighting to stop it. "One, two, three, four. Data centres no more. Five, six, seven. Action network, it's too late." I spoke to Cara Carney, an actor and green activist from Ireland, who recently dressed up as the goddess Ériu to protest against the number of data centres being built in her home country.

I've always been aware of the climate crisis since I was a kid and really cared about nature, but I only kind of got into activism in 2019 with the Green Wave when Extinction Rebellion kind of grew really big and the Fridays for Future movement. And they had a really lovely open door policy of anyone can join our movement. There's people with all sorts of backgrounds and skills that kind of come together to create these various protests.

There was one that we did in 2023 that was outside a data conference in the RDS, where I dressed as Ériu. So that's the Irish name for the goddess who is the personification of Ireland. It was basically her saying that you've come and you're leeching our resources: they're going to data centres as opposed to the people, to communities. And what is it about data centres in particular that frustrates you or concerns you? So...

Oh, Anna, the big thing is Ireland is not energy secure. We import a lot of our energy and obviously us and so many other countries in the EU are trying to meet legally binding emission reduction targets by 2030. Data centres are using one fifth of our energy at the moment.

That is going to grow. We have over 80 data centres working right now in Ireland and we have another 20 more in construction or with planning permission granted, and tens, tens more in the process of getting planning permission. They're coming to Ireland because of Ireland's tax loophole, for one thing. Like, it's no secret that Ireland is the European centre for so many businesses like Amazon and Meta and Microsoft, because they get tax breaks.

And on top of that, we have a cool country temperature-wise. So they would have a much harder job keeping data centres cool in Las Vegas than they would here. As far as I know, the projection on paper right now is that we're going to have 130 data centres here in the next five years.

I don't know what will stop these machines, these big giant computer machines being built until we literally run out of physical space because they're huge. Like that is the thing that I think will stop them before the government unless people come together and really pressure the representatives to kind of switch priorities here from tax breaks and keeping corporations happy

to actually preserving a livable future that is thriving. These data centres do not give back to the society at all. They are just leeches on our energy system and communities will pay, not just in Ireland but perhaps abroad as well. We're not saying in Ireland even to knock down the data centres here, we need a moratorium, we need to build no more until we have renewable energy, until

they can redesign how they're set up to give back a little bit. Like, we're not saying let's all erase data centres and stop sending emails and just send letters. We know we use data, we need it, but what it's being used for, a lot of it is just not important. Both Cara and Nick are concerned about the impact AI is having on the environment. But would it help if data centres were finding ways to give back to the communities that house them?

Last year Equinix, the company that gave me the tour of its Slough campus, used some of the excess heat from one of its data centres in Paris to heat the Olympic swimming pool during the 2024 Summer Olympics. Eugenie Dugoua is an environmental economist at LSE who says there are many ways data centres can be designed to reduce their impact on the environment.

One way of designing data centres in a smarter way is, instead of letting the heat dissipate, to reuse the heat for things where we need heat, and we need heat in particular for heating houses.

In some cities like Stockholm or Helsinki, the excess heat is redistributed into the district heating network. So these two systems are basically coupled, which is a good thing because, you know, that excess heat from the data centres is substituting away some heat that would have otherwise been generated by more fossil-intensive generation.

You can, in this case, find a way of coupling, of integrating the data centres into the architecture of the cities. Those sorts of initiatives should be encouraged and incentivised in some way. And is it realistic that all data centres would be able to do that, sort of offset their water usage or their electricity consumption?

No, and there's not one silver bullet, right? There are several ideas of how you can reduce energy consumption on one hand or water consumption on the other. And depending on, you know, the geography where you are, the technological solutions are also going to be different. So one idea is about where you locate the data centre. If you want to reduce their carbon footprint, well, locate them near to where renewable energy is abundant, so close to a place that has

good solar or wind potential, or geothermal potential. Iceland is a great example there because it has a lot of geothermal potential. There's no lack of ideas for how data centres could be designed slightly differently to make them less hungry and less thirsty and so on. It's estimated that data centres account for 1 to 1.5 per cent of global electricity use. That's according to the International Energy Agency. I asked Eugenie if we should be worried about this figure.

1% is not nothing. It's also not a huge fraction, right? There are a lot of other important industrial sectors that emit a lot more emissions to date. But of course, the reason it is important to start talking carefully about this sector is that the pace of growth has been outstanding.

Mostly this is because of AI and the rise, the adoption and the takeoff of large language models, which are really increasing the demand for electricity and therefore the possibility of much higher carbon emissions in the future. At the moment, it's still a small fraction, as I said, but...

The fact that it's really growing very fast is a very important reason to pay close attention and really start thinking carefully about how this particular sector can or should be regulated. So there are ways that data centres can be designed differently to ensure they're more environmentally responsible. But few countries are currently demanding this.

Last year, the European Union moved one step closer towards regulating the industry's energy efficiency by imposing mandatory disclosure of the energy performance of data centres in the EU. Here's Eugenie again. So to be clear, they're not mandating anything else, right? They are not saying that they should increase...

the fraction of renewable energy that they're using in their electricity, and they're not setting standards on how energy efficient these data centres are. But it's at least the first step. The EU will be able to provide a sort of sustainability rating of each data centre, which can provide some sort of incentive

to make progress in the right direction by a sort of name-and-shame mechanism, right? If you can point fingers at the data centres that have the worst energy rating and exert some sort of stakeholder influence on that tech company, you know, that may help move things in the right direction.

And I urge countries to speed up their net zero timelines. According to the United Nations, more than 100 countries have committed to becoming net zero by 2050. That means balancing the amount of greenhouse gases going into the atmosphere. But data centres pose a significant challenge to those pledges. And in recent months, we've seen many governments and companies backpedalling on their green promises. So is there any real motivation for data centres to become more efficient?

It is very expensive to run these data centres, so companies are going to have incentives to make these data centres more energy efficient, right? Because the less energy, the less electricity they need to provide a certain number of compute requests, the cheaper it's going to be for them. The demand for AI is going to scale up, but the demand for electricity is not going to scale up proportionally, because these data centres are becoming more energy efficient

by the day. There are some fundamental mechanisms that are working on our side. For example, these centres have become more and more energy efficient over time. The improvement in energy efficiency truly has been astounding. Some numbers that I remember off the top of my head are things like: over the last maybe 10 years, every year has brought an improvement of something like 20% in energy efficiency.
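Taking that recollection at face value, the arithmetic shows why this mechanism matters: a 20% efficiency gain every year compounds, so the same workload needs only about a tenth of the energy after a decade. A quick sketch (the 20%-per-year figure is the speaker's rough recollection, not a precise statistic):

```python
# If energy per unit of compute falls 20% a year, how much energy does
# the same workload need after a decade? (Illustrative only: the 20%
# figure is the speaker's rough recollection.)
rate = 0.20
years = 10
remaining = (1 - rate) ** years
print(f"{remaining:.1%} of the original energy")    # ~10.7%
print(f"~{1 / remaining:.0f}x more compute per kWh")  # ~9x
```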

So maybe reducing the costs involved in running these data centres will be the real motivation to improve their energy efficiency. Earlier this year, the Chinese AI startup DeepSeek made headlines. The rise of Chinese AI chatbot DeepSeek has taken the world by storm.

You're seeing markets being absolutely eviscerated on news from a company most people have never even heard of. It's not DeepFake, it's DeepSeek. Let's talk about DeepSeek, because it is mind-blowing and it is shaking this entire industry to its core. The latest version of its app promised to outperform competitors at a fraction of the cost, using a fraction of the energy, providing hope for that AI sustainability paradox. But Nick Couldry says it's not just the environmental impact that we should be concerned about.

We're increasingly moving to a situation where businesses are encouraging us to use AI for almost everything: to write our letters, to write a letter to a friend to say, I can't come to your party and, I'm embarrassed, how do I put it? You can ask ChatGPT to write your answer for you. So the question to ask is whether that's the right thing for the environment and also whether it's the right thing for the social environment, our social world.

Whether it's right for us to be dependent on AI in that way. Is that a good thing for us to do as a society? And what I want to do is raise the question whether this is the right deal we're doing with AI.

There's no question that there are some uses of AI which are really sensible, such as looking at an x-ray in the NHS. We're short of staff, we're short of radiologists to do that scanning. Maybe it does make sense to use AI to come up at least with first approximations of where the patterns might be, whether the patterns in these x-rays are a sign of something worrying or not.

I'm all in favour of that. But that's a situation where the answer is something that a human expert can then say: that's a hallucination, that's way off; or no, that could be right, I need to look at that more closely. So we can immediately see good uses of AI and this vast computational power, which fulfil a social goal and where we might think the energy cost is absolutely justified.

But then something else, just finding an answer to a question that you didn't really need to ask anyway, or you could have found out some other way, but using six times as much electricity: the calculus is completely different when we think about the environmental costs. So sustainability for me has to be at the core of how we build the right sort of social contract around AI. There's also the social side. Supposing you lose the habit

of answering questions that a few years ago you would easily answer. The risk is that we end up de-skilling ourselves.

So I think we need to look really carefully at the social contract we seem to be building today around the use of AI, because it may have costs that at this stage we can't see, but once we enter down that path, it will be too late to correct. We could easily get this wrong. We need to have the debate now, and that does mean pausing temporarily the headlong rush to move towards AI at all possible costs.

"The largest AI infrastructure project by far in history, and it's all taking place right here in America. $500 billion at least." Yes, there is a geopolitical race because the US and China want to lead the world in AI,

Every other country wants to be part of the AI race and not to be left behind. But that's not necessarily the best basis on which to have a balanced debate about the risks and benefits of AI that truly benefits citizens and societies and the environment. That's what we need to do. And I think that's the role of academics and social scientists to contribute to that better debate than we're having at the moment, either in the UK or anywhere else.

Eugenie Dugoua also wonders if the AI path we're currently travelling down is the right one. I think what is really striking in how AI is being developed at the moment is that it's mostly a couple of very large tech firms that own the technology. So it's mostly private sector

entities. And we have to ask ourselves, what are the interests of the private sector, and to what extent are they aligned with the public interest and, maybe more generally, with public goods? I'm pretty sure some interests may be aligned, but not all of them. And I think it would be in the public's interest to maybe even invest in a publicly owned, publicly funded AI initiative, so that AI algorithms for the public good can be developed

and made available. We've heard about the amount of water that AI uses, the amount of electricity it consumes, and the amount of rare earth metals that are needed for data centres to function. But Eugenie also tells me that AI could actually help us deal with some of the effects of climate change.

On adaptation, there are really cool examples there on early warning systems, for example for floods or wildfires. There's this system that was launched by Google, Google's Flood Forecasting AI, which has already been in place and has sent out text messages to millions of people when the system detected that there could be a flood. So floods are very difficult to predict, right? You need to have a lot of information about how much rainfall

there has been. You need to understand the geography of the place you're at, the levels of the river and so on. So, with the satellite information that we have and the new sensors that are capturing information in real time, these algorithms can now predict in real time whether a flood event is likely to happen in your neighbourhood or not.

And how can AI be used to monitor the Earth and protect biodiversity? I think this is a really interesting area. So far, you know, we've been very limited in how we could protect Earth's natural ecosystems. There's an example in Africa, in the Serengeti, where a technology called TrailGuard AI is basically used to detect potential poachers.

So this is a very large area, right? And you only have, I think, about 100 or so rangers that are basically tasked with roaming around and trying to catch poachers if they find some. So it's obviously something that's very difficult to do because the area is very large. But now this system that combines some cameras

potentially drone images as well, with a sort of recognition algorithm. If a potential poacher is detected, within 30 seconds it sends a message to the rangers, who can then go and intervene. And that has actually led to the arrest of dozens of potential poachers before they killed the big animal, which is really the important point here, because

Of course, the goal here is to protect endangered species. AI is neither a hero nor a villain at this point. I think AI is like a tool. We can make great things out of it and we could make potentially very bad things out of it. So humans have a major role to play here in terms of shaping the direction of this new technology. That's why it's really important to talk about these things and think about what

what are the incentives already at play and how could we change incentives to make sure that future developments are better aligned with human welfare? And that includes, you know, aspects on labor and environment, but it could also be much broader than that, right? So in a way, AI is a little bit like fire. It can warm you up or it can burn you. It depends how you use it and maybe if you know how to use it well.

This episode of LSE IQ was produced and edited by me, Anna Bevan, with script development from Sophie Mallett and on-location sound recording from Oliver Johnson. We'll be taking a break over the summer, but we'll be back with a new season of IQ in September. In the meantime, why not attend the LSE Festival, either in person or online, from the 16th to the 21st of June?

Our world-leading speakers will be exploring the threats and opportunities of the near and distant future and discussing what a better world could look like. For more information, visit lse.ac.uk forward slash festival. And if you enjoyed this episode on AI and sustainability, check out LSE's AI, Technology and Society series, where we're exploring AI and technology's potential to do good and how to limit its potential to do harm through short films, events, blogs and podcasts.