The risk that people are most concerned about over the short term is conflict. State-based armed conflict is the name of the risk. In the last 30 years, people have talked about us living in this golden era of turning swords into plowshares, to use the biblical phrase. And now we're on the eve of a world which is turning plowshares back into swords. And that's got real consequences for the world and for our economies and for our governments.
Welcome to Radio Davos with the annual meeting just days away. On this episode, we're taking a look at the biggest risks facing the world. In the medium term, the two-year timeframe, misinformation and disinformation tops the ranking. The changing media landscape, the changing technology landscape creates these new vulnerabilities which can get exploited by
actors with a range of different ambitions. You don't have to go to kinetic hot wars in order to be adversarial and be aggressive. And now I'm joined by a co-host, a colleague who's put together the latest edition of the World Economic Forum's Global Risks Report. Welcome, Mark Elsner. Mark, how are you? Thank you very much, Robin. Now, you're head of the Global Risks Initiative at the World Economic Forum.
What is that and what is the Global Risks Report? So let me start with the report. So this is one of our flagship reports at the World Economic Forum. We're very proud that this is the 20th year of the Global Risks Report. It's a report in which we conduct a survey of almost 1,000 decision makers and leaders all over the world,
from the public sector, the private sector, international organizations, civil society and academia. The Global Risks Report captures the perceptions of this group of experts and decision makers from around the world. And what it allows us to do is to rank a set of, in this case, 33 global risks
according to their expected severity. And we do this over three time frames: the immediate term, so 2025,
the short to medium term, so 2027, and then the long term, 2035. It's such an interesting report and every year it's different and every year it comes out just ahead of Davos. And so really it's food for thought. And those near-term risks in particular tend to be the things everyone's talking about in Davos. Can you tell us some of the highlights from it? What can we expect will be the big risks?
Yeah, well, you may not be surprised to hear that the risk that people are most concerned about over the short term is conflict. Specifically, state-based armed conflict is the name of the risk. Is that a euphemism? You mean war, don't you?
Yes, yes. And interestingly, this is ranked top by almost one quarter of respondents this year. And compare that to just two years ago, when it wasn't even in the top 10 of two-year risks. This just shows how, unfortunately, the world has changed over the last couple of years. Are there any other, looking at the medium and long term, any other things that
You'd put as headlines? Absolutely. So in the medium term, the two-year timeframe, misinformation and disinformation tops the ranking, the same as it did last year. We look forward to getting into that in a little bit more detail with our experts. Cool. Yeah. Okay. Well, let's bring in our experts there. They're here on the screen for us.
We have Ngaire Woods, who's the Dean of the Blavatnik School of Government at the University of Oxford. Ngaire, hi, how are you? Hi there. Great to be with you. What is the Blavatnik School of Government at the University of Oxford?
So we're a school of government. My colleagues and I built this school 15 years ago to think about how to help improve government, how to attract great people to government, how to give them a training that will help them do better in government. So it's a professional school.
Right. And a lot of people in government will be, well, they'll be included in the survey for the risks report, but also they'll be very interested in reading what's in this report. Our second expert witness is Azeem Azhar, who's Chief Executive Officer of Exponential View. Hi, Azeem. How are you? Very, very well, Robin. Thank you. Great to see you. Tell us something about Exponential View. Exponential View helps people understand how the world is changing as it
intersects with these rapidly advancing technologies like artificial intelligence, and we do that through publishing and events and podcasts as well. I know indeed you've been on my podcast a couple of times. That's right. Let's talk about technology then, the first kind of theme, if you like, of this interview. There are
two technology risks that we'll look at. And the first one of them, I mean, is this a technology risk, the mis- and disinformation that Mark just mentioned? I remember looking at this report last year and that one had rushed to the top of the list from pretty much nowhere, I think. And particularly because 2024 was going to be this big election year around the world and people were using generative AI in a way they had never been before,
it was suddenly seen as this massive risk to societies and to stability and to social coherence, perhaps. I wonder, was it overstated, do you think? How do you see the risk of mis- and disinformation?
I think it was really critical that you asked whether this is really a technology risk or not. And I think of the risks that we look at, this is one which is much more closely interconnected to questions of geopolitics, geoeconomics and crime on the one hand, and on the other hand, the changing nature of the media business model and where consumers want to get their news from and their information from. And I think that those sets of trends
collide to create a space where trusted sources are not as trusted as they used to be, where because people get their news and their information from social media platforms, trusted sources don't necessarily surface as much as they might have.
At the same time, adversaries are getting better and better at using these types of techniques. And let's clarify the difference between misinformation and disinformation. Disinformation is the deliberate use of information,
what we might have considered propaganda back in the day. It's the deliberate use of communications messages across platforms to confuse or change opinions. Misinformation is where
things are arising because of a mistake, but that distinction is not as clear as we would like. I think we've observed what's happened in the US over the last couple of years with Twitter becoming X, and its algorithmic editorial policy, the rules by which content gets surfaced or hidden, has really, really changed. And so,
is that a deliberate decision? It probably is. So is that really misinformation or is it moving towards disinformation? But I think that the key point here is that the changing media landscape, the changing technology landscape creates these new vulnerabilities which can get exploited by
actors with a range of different ambitions. Some of them are adversarial, geopolitical, and some are just for the sake of making a little bit of money through the advertising funnel that is the internet. But I think the other big risk, just to build on what Azeem said, is to democracy. Because if you're going to give citizens a voice over how they're governed, they need information and they need shared information.
And so democracies over the last several hundred years have had to always think about that. When the printing press was invented, they had to censor the kinds of pamphlets that people would print in an anonymous way. When radio was invented, when television was invented, the United States brought in a fairness doctrine telling all broadcasters that they had to present fair arguments on either side of any issue,
legislation that President Reagan dismantled. It's constantly there. If citizens are going to have a voice in how they're governed, they actually need access to information. So I think this technology revolution, like every one preceding it, will
require regulation. And I've said democracies, but of course other governments are threatened by it as well, because even in an autocracy, a government needs a modicum of trust from its citizens. And if the information ecosystem is one which systematically erodes trust among citizens as well as in their government, that's going to make it very difficult to govern.
But I think, sorry to add on to that, I mean, I think it's also the case that in some of the autocracies, they do have more robust and robustly enforced policies about how information can flow. And we might
feel in some places that that's preventing freedom of speech. But to Ngaire's point, they've recognized the need to moderate the flow of material. I think one of the challenges that democracies face is that they are vulnerable,
by dint of their open architecture towards ideas, to misinformation, but more importantly, disinformation. What are the solutions to that? Because after the printing press, there were lists of banned books that stood for hundreds of years, which I think with the benefit of hindsight doesn't look like a wonderful thing.
for humanity. And if we were to clamp down on freedom of speech, say, and this is what supporters of kind of absolutist free speech would say, well, who's setting the rules here? You know, why would it be any of us to set the rules? I mean, is there a way of squaring that circle?
Well, of course, following the printing press, you didn't just have the banning of books. That's not the regulation I was thinking about, actually. It's that when people could print pamphlets anonymously, the censors began to require the printing houses to hold the name and full address of the person who was printing the pamphlet.
And then you could hold people to account for what they said. So the idea is, yes, you can have free speech, but people also have to take responsibility for what they say. Free speech has to have that correlate of responsibility. And I think it's much more that than the censorship and banning of books that's important to our era. It's basically dismantling Section 230. You can't give those who publish
a free right to publish absolutely anything, whether it's hate speech, defamation, or content likely to engender really negative forces across society. That's very different to censorship, and societies have always had to wind their way around that. What is Section 230?
So that's the United States regulation which basically shields social media companies, as platforms, from any responsibility over what is on those platforms. Which meant one thing before you had algorithms. But now,
platforms every single day, whether it's Meta, Facebook, you know, Google, are actually making decisions using algorithms about what to pump out and how much of what to give people. And as such, on any legal definition of publication, they are publishers, but they're not regulated as publishers. So they can defame people in a way that no other publisher can.
We have to move on from this. I mean, we could talk for hours on this subject, but I'm thinking it's not just algorithms, is it? People self-select sources of information, where if you're listening to a certain podcast and the guests are coming on and saying, this science, you know, eating more meat and fewer vegetables will cure your cancer, for example. There's an example of this recently on a very popular podcast.
That's not an algorithm doing that, although I suppose that clip will be amplified by an algorithm. Those are millions of people choosing to listen to what is actually a very, very good podcast when it's not saying things like that. I think one of the difficulties here is defining trusted voices, where the general public don't feel they're being lectured at
from on high, but they feel they're being spoken to by trusted people who are basing their information on facts and evidence. I mean, I'm not sure we ever did live in that world, but if we did, how do we get back to it? Or how do we go forward to it?
Before we move away from disinformation, I think, again, that insight you have, which is did we ever actually live there, is quite an important one. And there are topics that you can follow in the mainstream media or traditional sources of excellence where the mainstream media is clearly spreading disinformation,
often behind where the science and the empirical evidence is. A very good example would be, in Europe, the effectiveness or not of heat pumps in cold countries. The right-wing press in the UK, where I'm based,
is quite unscientific in its assessment of this, when essentially these things work in cold countries, but it's not in their interest, for whatever reason, to express that. I think that's why this ends up being quite a multi-factor problem. Without doubt, there is more that
the technology platforms can do, and without doubt things like Section 230 need to somehow be brought up to date, but it's not clear what will happen in the US over the next four years given sort of the politics there.
But there are other dimensions, I think, that have played into getting to a point where there is this lack of trust with, you know, authority. And so when I think about prescriptions here, I do try to think about more than just one particular bullet, which might be getting the platforms to do more; you know, what else needs to take place?
I think the other, the real problem, is that the platforms have worked out that what really sticks, the clickbait that really makes the money, is emotional. It's fear. It's playing to people's anger, fear, etc. And those are very sticky emotions. So you can have the most trusted scientists in the world saying this vaccine is safe.
And then you have one fearmonger saying, my neighbour took it and turned into a monkey or something. And that will stick. And that will be the section that people then send around to all their friends. And it will cause hesitancy. So I think that's the issue. It's not about reason. It's reason versus emotion. Right. And I think, as Azeem alluded to there, that's not a new thing. That predates it. Because if you think about the tabloid press in the UK and their campaign against
the MMR vaccine was, I'm sure, done because it got people buying those newspapers because they were scared of what this thing was doing to their children. Turns out it wasn't doing anything to their children if you listen to the actual science with facts behind it. I wanted to ask one more question before we move on to the environment, but on technology, Mark, so...
This is the immediate-term risk, 2025. The number one risk, right, is mis- and disinformation, in the two-year one. So in the next two years, that's the number one risk. Then we have another two or three risks, and I think at number five, the risk of cyber espionage and cyber warfare. Has that become more of a consideration than it was before?
Well, yeah, I think it's tied up with the fact that we've got conflict coming out at number one for the very short term, right? So cyber espionage and warfare are a huge, big part of that. And I think, you know, we're seeing it around the world, right? We've seen it in Lebanon, the link between cyber and physical warfare really sort of
coming out into the open, let's put it that way. You know, the military has been working on it for a long time, but this was perhaps the first time that we, you know, the world in general, has seen just how effective that can be. So I think, yes, it's in fifth position. But the biggest story perhaps in the Global Risks Report is just that conflict is top of mind. Right.
Is war just so much more technological now? Azeem, you're a technology expert. Is it exponentially more militarized, this technology? Well, I think what we've seen over the last 10 years has been a hybridization of how
countries represent their interests, and you don't have to go to kinetic hot war in order to be adversarial and be aggressive. And of course, as the tools have got better, people have done this more and more. I mean, many disinformation campaigns that take place within democracies can be traced back to autocratic adversaries. And the same is true for cybersecurity
and cyber attacks. Amazon recently disclosed that they get nearly 1 billion cyber incidents per day, which is an astonishingly large number. And the only way you can defend against those is by using AI. A lot of those will be coming from criminal gangs who are tacitly or sometimes explicitly supported by adversarial nations. And it is part of an overarching picture of
applying pressure to represent your interests. And I would say that, you know, it's not just about robocalls or phishing. What we've seen in the Baltic Sea recently has been, you know, physical infrastructure that supports a digital economy, in terms of cables between Finland and Germany and so on, being destroyed, likely deliberately,
you know, by third parties. And I think that this picture is closely connected to the first risk that the experts were concerned with in the short term, which was state-based armed conflict. I would consider this as state-based conflict in a way, against which there's a sort of a ladder of different initiatives and different actions. The issue I think here is
the nation state's ability to secure every part of what's called the attack surface, which is, you know, not just your airspace, but it's the cables that are owned by third parties that bring the internet and the digital economy to your nation. It's
the websites and the payment systems that your economy, the firms in your economy, rely on. Nation states are not on their own able to provide the protection that we, you know, used to ask of them 50 or 100 years ago, and are reliant on the very large private actors who run the digital infrastructure as well. So it's quite a complex, quite an interesting space, which also brings to the fore new methods of governance and reporting as well.
I think the other huge thing about the era of conflict, and the WEF's risks report is right in saying this, is that it's both people's perception, but it's also driving whole new policies. If we look at what's happening to defence spending across every category of country in the world, we see countries that are fiscally constrained, so governments are trying to cut expenditure, but they're massively increasing their defence budgets.
In the last 30 years, people have talked about us living in this golden era of turning swords into plowshares, to use the biblical phrase. And now we're on the eve of a world which is turning plowshares back into swords. And that's got real consequences for the world and for our economies and for our governments. Let's leave that one there and move on from technology to society and, in fact, demographics.
Mark, super-aging societies are identified as a big risk. Tell us, what does that mean in fact, super-aging societies? Yeah, so when we talk about super-aging or super-aged societies, what we're talking about is societies in which at least 20% of the population is aged 65 and over.
So Japan is already in that category, two or three European countries, Spain, Italy, Germany are pretty much over that line already and many more are going to come into that category over the next decade, not just in Europe but also others like South Korea for example.
This is this massive underlying sort of structural trend that brings along with it numerous risks of both a societal and economic nature. Some of those, for example, include the pensions crisis and how that is going to unfold over the next 10 years. And in our risks report, we go into this topic in a little bit of detail. But also, labor shortages in these countries
in several sectors, but one that jumps out in particular is the long-term care sector, where unless
policies change and directions change a bit, we're simply unlikely to have enough people to take care of those who need taking care of, or will need taking care of, in the years ahead. So there are big risks and I think it's worth discussing these and unpacking them a bit more. Well, Ngaire, do you want to jump in? Why is this demographic trend such a big risk around the world?
Well, yes. So for the super-aging populations, I mean, it's exactly as Mark said. You get, you know, fewer and fewer workers, higher and higher healthcare costs because, of course, your aging population has access to ever more wonder drugs that will keep them alive even longer but cost even more. And then pensions: pension systems designed for people to die younger, when actually they're living longer and costing more as they live.
And of course, all of that can breed some nasty politics, some intergenerational friction and resentment between the young who feel they're having to work twice as hard to support the elderly.
but also among the aging populations, all the issues of loneliness and how they socialize and how we keep those communities together. So it's a pretty big risk. Now, the problem for most societies is that one of the solutions is to have more babies. And it turns out that
all the societies, think from Singapore to Russia, who have tried to increase the birth rate have always failed, no matter what they've tried: money, social programs, etc. And the other alternative is more immigration. And we're living in a period where most countries, if anything, are opposed to greater immigration.
So the problem looks quite intractable. Now, Azeem and I were actually co-chairs of one of the WEF's Global Future Councils on Complex Risks, and we spent a lot of time looking at this. And of course, we observed that if you've got really strong economic growth, you're fine, whether you're populous or super ageing.
The problem is if you're stagnating, which is what a lot of countries are, then it becomes a real problem. And what adds to the problem is that aging societies can find it difficult to keep their smartest, most innovative workforce. So they can suffer brain drain and then the death of innovation within them. And once you get into that cycle, you really do start cycling downwards. Azeem, is technology going to help us, take us out of this hole?
Well, I would want to add a little detail as well, which is that, you know, in the macro perspective, if a country's population is coming down, it says something about that country's, that nation's, purpose, right, internally to itself. I mean, nationhood has come from people knowing a national anthem and, you know, sort of rallying around a symbol and finding that shared history. And I'm really quite curious about what it means for that idea of nationhood.
I think it adds a little bit of fuel to the fire of difficult politics that Ngaire referred to. You know, when that one number, which is how many people are choosing to live here, is coming down, it doesn't send a very positive message in the long term. And that's why there's a sort of intractability issue here. Now, where technology can help, I mean, I think technology can help at some point with some of the labour shortages. And the good news about demographics is that demographics moves quite slowly, you know, one year at a time, essentially. And so we're starting to see in countries where the demographics are not particularly helpful a rise in the use of industrial robots, for example, in an attempt to make those workers who are staying in the workforce more productive. And, you know, at some point we may be able to find that we can scale out social care by automating certain parts of where carers spend their time. I think that carers, when they're caring for people, spend a certain amount of time with the people they're caring for, which is the most valuable, and a large part of the time ends up being in administration and paperwork. And one might hope that technology could reduce that administration burden to allow for more person-to-person care and contact time as well. But that doesn't really alleviate the overarching trend, which is that if populations are getting older, if the
population pyramid is inverting, you still have fewer workers to do the work, making less money to support many people who aren't working. And I think one of the hardest questions that nations will have to grapple with is what were the assumptions about retirement age and pensionability when these rules were created? And do those still make sense in 2025 and beyond? We're on to the environment.
Yeah, absolutely. 2024 is on track to be the first year with average global temperatures more than 1.5 degrees Celsius above pre-industrial levels. So that's farewell to the Paris Agreement target for keeping it underneath that.
Now, I've covered this report for a few years, and always, on the 10-year timeline, climate change and the related risks to do with climate change are basically the top five, every year. And there it is again. But if it's always 10 years away, isn't that the problem here? We put war and demographics and disinformation in the
near term, and climate change, you know, it's always the biggest one and it's way out there. It's a problem of perception, isn't it? You're right. Over many, many years, environmental risks have topped the 10-year risk outlook. But this year, we have two that have come into the top 10 in the shorter-term outlook as well: extreme weather events and pollution.
And so I think this is perhaps the sign that the message is getting through that this has become a short-term crisis and we see it with the kind of weather events all over the world. We've seen it in 2024 with the floods, for example, in Brazil or in Europe as well. So it is here, but...
It's still to some degree this kind of long-term issue. Ngaire, do you think it is taken seriously enough, climate change? Are these extreme weather events really hitting home now in terms of people's attitudes?
Yeah, look, I think people believe that climate change is real. I think that that has taken hold in large measure. I think the current debate is who pays for the mitigation and who pays for the adaptation. And I think that's the concern. I think there are
youth movements across Europe that we used to rely upon to push the climate agenda who are now saying, hold on, is this just an elite issue? Are we being forced to pay for something that others are not paying for?
So when Germany tried to do it through the green boiler subsidies and France, you know, by reducing fuel subsidies, they quickly created a backlash. And the backlash was that climate is a kind of elite preoccupation, but that other parts of the population have more urgent problems. So it's not that people don't believe it's happening, and they're certainly seeing it happening. It's that they want to know that others are also, you know,
taking action to prevent it. Climate is genuinely a collective action problem. Each of us
will act to prevent it if and only if we believe that others are going to act as well. And the hard thing about the climate change negotiations that just took place in Baku is that the really big countries' leaders were not there. And of course, if you're a very small country facing terrible climate effects and you don't believe that the United States,
certainly as of January 2025, is going to be committed to taking action to get to a net zero position, then the prospects of your actions having an impact,
you know, become very small. So I'm hoping that 2025 will be a year when people find a better politics to support net zero policies, which are going to affect rich and poor across the world. And some of the poorest and most vulnerable island states will literally disappear if action isn't taken. Azeem? Well,
I mean, Ngaire is absolutely right. There's a horrible collective action problem here. But I'll venture something
that may give us some sort of cause for a little bit of optimism. So a few years ago, China had really terrible air pollution problems in its cities, and it doesn't now. And it was able to do that through a combination of policy and regulation, but critically through technology, and four major measures. The first was the shutting down of
some outdated industrial systems that were using old technologies. The second was the shift towards electric vehicles, first with their bus fleets and then with their cars, and now 50% of all cars sold in China are electric vehicles. The third was a real drive towards renewables, and China puts up
unending amounts of renewable capacity each year. And the final one was using digitally enabled and internet-enabled monitoring systems across the city so they had a better sense of how well things were going. And in a few years, what seemed to be intractable problems, because these are some of the biggest cities in the world, have been addressed. And I think the lesson that comes out of that is that technology creates policy space, because what technology creates
is things getting cheaper. And so when we look at
American politics from 2025 and the things that have been said about some of this debate, ultimately the pocketbook will speak. And if it's cheaper to bring on board gigawatts of power through solar and batteries than through coal or natural gas, which it is in many parts of the world and soon will be everywhere, then the collective action starts to get addressed gently by the price mechanism. So the one
tailwind that I think we have here is that some of the key technologies of the energy system are now cheaper in the form of renewables than in the form of fossil fuels. And that wasn't the case five years ago. It's not sufficient by any means, but it's helpful. And it's not just that the technologies are getting cheaper. It's that they're actually more secure for most countries. You don't have to import
hydroelectricity. You don't have to import solar and wind power. You can do it on your own shores and stay in control of it yourself.
The other thing that gives me optimism is that some of the largest oil and gas exporters in the world, in the Gulf, are starting genuinely to plan how they're going to celebrate exporting their last barrel of oil. And I don't think five years ago that that was in the World Economic Forum's future look, that there would be such a shift among the oil and gas exporters themselves.
Collective political action is increasingly difficult. You've got a returning Republican presidency, and we've seen many times in the past, when it switches from Democratic to Republican, that the backing for the international, the United Nations, climate action recedes. We can expect that to happen, I would assume, this year and for the next four years. Does that mean that kind of multilateral process
can take four years off, and that we'll rely on the profit motive and these technologies getting cheaper to make the progress? Or is there still a role in this year, in 2025, and for the next four years, for that multilateral approach?
Definitely still a role for the multilateral approach. And as I said in my previous answer, to me, the bigger risk is about the domestic politics of this. It's politicians explaining better to their populations that
this isn't going to fall on the working class in all countries, who are already feeling that they're paying a huge burden for other effects of globalization; that this is a price that's going to be picked up by everyone in society and by other countries as well. And as for the new administration in the United States, you know, the president has selected as his current main advisor, you know, one of the world's largest producers of electric vehicles.
Let's go with that. Let's see the positive upside. Let's look at the Inflation Reduction Act effects, which are very powerful across Republican states, where the subsidies are really creating a little bit of a manufacturing renaissance. Some of this could hold. I wouldn't give up on the United States
staying, to some degree, on a cleaner pathway. And just to ask a follow-up question, and I know this is something we discussed in the past: to what extent could the energy transition be derailed by critical resource shortages? Because this actually comes up as number four on our 10-year risk horizon. I'm talking about things like copper and
lithium, nickel or whatever else may be needed, cobalt, for the energy transition. Do you think we'll find ways around that soon enough?
I think the private sector has moved incredibly quickly to start exploring where they can extract critical minerals and metals. And what's become apparent to me is how many of those critical minerals and metals are actually available in a lot of different places. It's just that they're not commercial to extract because quite small quantities are needed. So no matter which side of the, you know,
fragmenting world you're on, I think there will be a supply. It's much more a question of cost and who pays the cost of extraction than it is about the rareness
of supply. I would agree with that. I think that this story of some critical resource limitations is the commodity traders' version of clickbait, because it drives clicks on the business websites. And we saw this three years ago, where people were petrified about what was going to happen with lithium pricing. And
there was a slight wobble where lithium prices sort of didn't drop. Lithium-ion battery prices didn't drop for a year, and they're down 25% now. And in one of the annual meetings, I had a number of conversations with bosses out of that industry. And they said, yes, we're going to have a couple of sticky quarters, but then of course incentives, innovation, commercial partnerships come to bear and supply will expand once again. And so, yeah,
there is the old adage that the cure for high prices is high prices. And the moment we need these materials, the earth is a big place. There are lots of places which haven't been explored. As Ngaire points out, our technology is much better now than it was 20 or 30 years ago. A high price signal and a looming shortage create incentives and innovations for alternative approaches. And I wouldn't count out that
combination of incentive and innovation being able to meet the forthcoming demand. We're running towards the end of the time we've got you both for. I just want to ask you for a couple of closing remarks, really. This report, the Global Risks Report, has been going for 20 years. The risks have changed over that time. Looking ahead
20 years, if you can, which goes well beyond the timeline of this report. I mean, where do you see it? Are all the risks already here? Are there some known unknowns that will be hitting us 20 years down the line? Who wants to go first?
I'm going to go first because I'm going to give Ngaire a couple of minutes of thinking time. So my perspective is that the fragmentation that we see is coming from a number of different forces. And I find it hard to believe that we'll get to a stable equilibrium in a five or 10 year period.
I mean, certainly in 20 years' time, the risks that relate to the energy system will be done and dusted. We'll be in a
predominantly solar-based energy system providing all of that national sovereignty and security to every nation that Ngaire alluded to earlier. But I think the reorientation of the international system, which is now not just about nations, it's about technology standards as well, may still be playing out even that far out.
You'll remember the Hollywood movie WALL-E, where human beings, having discovered that technology can fulfill every burdensome task they might need to fulfill, become just blobs that sit on these little automated airbeds drinking Slurpees. And I think that one of the big challenges for this decade is for human beings to really work out
how important human agency is to meaningful lives. And, you know, think of an alien coming down to Earth and watching the way the world's millionaires manically exercise every day, even though they have no need to go out and hunt their own boar for lunch or, you know, catch their own water, all the things for which human strength was essential.
You know, my hope is that in a decade human beings will be doing the same in all kinds of ways to sustain their agency. I think the technologists are trying to make our lives more and more convenient, and in many ways that's great, but it is at terrible cost if humans simply become unthinking blobs. So to me that's one of the big risks. The other is governability.
There are some problems that human beings can't solve just left to their own devices. You can end up with the Lord of the Flies. And that's why we have government, right? We have government in order to let human beings
bring out their positive side, cooperate and work together to do much more than they could do individually. And I think the biggest challenge for the next decade is to rebuild governability among human beings, to rebuild how it is that human beings come to trust each other and trust the people to whom they give authority, so that they can actually work as a society positively.
Those are two fantastic answers. I was going to ask you to end on an optimistic note, but you both have. I'll come to you in a moment, Mark. But I mean, just Azeem, the energy problem will be solved in 20 years. I mean, that's the kind of optimism. I couldn't have asked for anything greater. I'll come back to you in 20 years and we'll see how that worked out.
And then Ngaire, this human agency thing is so important. I think for me, the advent of generative AI, where you can create a song or a piece of music
in a moment without ever studying music, there's something wonderful about that. But for me it makes that human thing of meeting up with someone else and singing together, the communion of humanity which is totally missing from so much of the technology, feel all the more important. I think it's made a lot of us realize how important that was, something maybe we weren't thinking about a couple of years ago. Mark, what do you say about the 20-year horizon?
Well, to sort of bring it back to how we frame things in our report, we have four structural forces that we look at that are essentially driving a whole range of risks, and particularly when these four structural forces interact with each other. So...
These are demographic bifurcation, which we've touched on already; tech acceleration, which we've talked about; climate change; and geostrategic shifts. So, building a little bit on some of Azeem's points, the question is, you know, will these four structural forces still be
structural forces in 20 years? Or will some of them, maybe climate change perhaps, no longer be the driving structural forces? And then the question is, which new structural forces might come along? And to that, I don't have an answer just yet. We'll see. Well, people can find out all the information on our website from the Global Risks Report 2025, link in the show notes.
It just remains for me to thank our guests, Ngaire Woods from the Blavatnik School of Government in Oxford and Azeem Azhar of Exponential View. Thanks so much for joining us on Radio Davos. Thank you. Great to see you. Thanks very much for being with us. And thanks to you for watching or listening to Radio Davos. A reminder, you can find that report, the Global Risks Report 2025, on our website. Radio Davos will be back soon. Thanks very much.