
AI and Nuclear Power and ‘Wizard of Oz,’ Oh My!

2025/4/21

WSJ Tech News Briefing

People
  • Belle Lin
  • Isabel Busquets
Topics
Isabel Busquets: I took a deep look at how Google used AI to restore the classic film The Wizard of Oz so that it could be shown in ultra-high resolution on the screen of the Las Vegas Sphere. The technology not only raised the film's resolution but also used an "outpainting" technique to expand scenes, revealing more of each frame and giving audiences a more immersive viewing experience. The project has also drawn controversy: many viewers and industry insiders worry that AI will change the original character of classic works, and they have concerns about the copyright and future use of creative works. Although Google worked hard to preserve the film's original look, the use of AI still raises worries about where creative work is headed. The technology has sparked broad discussion in the entertainment industry and has far-reaching implications for filmmaking: AI makes restoration and re-creation possible, but it also brings new challenges, such as how to balance technological innovation with artistic creation and how to protect creators' rights.

Belle Lin: I reported on ProAID, an AI tool developed by a U.S. Department of Energy national laboratory and designed to assist humans in operating nuclear power plants. ProAID is not meant to fully replace human operators; rather, it acts as an assistant, helping operators with repetitive tasks and troubleshooting, improving efficiency and easing their workload. Given how dated the equipment at many nuclear plants is, ProAID could meaningfully improve those plants' efficiency and safety. Its development also reflects AI's growing role in the energy sector: as the technology matures, AI could play a larger part in optimizing energy production, improving energy efficiency, and supporting the energy transition, though its use in the sector also faces challenges, such as data security, algorithm reliability, and ethics, that deserve attention.

Deep Dive

Chapters
Google employed AI to enhance the resolution and expand scenes of the classic film for an immersive experience at the Las Vegas Sphere. This massive undertaking involved generating new pixels and expanding scenes to include previously unseen characters and backgrounds. However, this use of AI also sparked concerns about the alteration of a beloved film and the potential impact on creative intellectual property.
  • AI used to enhance resolution of the 1930s film for a massive screen
  • Generative AI created new pixels, a technique beyond simple copy-paste
  • Outpainting technique expanded scenes to include off-screen characters and backgrounds
  • The project generated excitement but also anxiety about altering a classic film and the impact on creative intellectual property

Transcript

You're the owner of a small business, which means you're also the tech guy and HR and personal assistant and head honcho and intern. You could use another pair of hands like the experts you'll find at Verizon Small Business Days, April 21st through 27th. Get a free tech check, special deals and more. Call 1-800-483-4428 or visit verizon.com slash small business to book your appointment. Verizon Business.

Welcome to Tech News Briefing. It's Monday, April 21st. I'm Victoria Craig for The Wall Street Journal. Artificial intelligence goes retro. We've got a show all about how today's hottest tech is upgrading innovations of yesteryear. We'll take you to Las Vegas, where the Wizard of Oz is getting a 21st century makeover, and then look at how an AI-based tool can help operate decades-old nuclear reactors. But first, to the wonderful world of Oz. Toto?

"I have a feeling we're not in Kansas anymore." If Dorothy were shocked when her world went from black and white to Technicolor in 1939, she'd be amazed at how AI has reimagined it. "We must be over the rainbow!"

The Las Vegas Sphere is preparing to give audiences an immersive viewing experience of The Wizard of Oz this summer. But adapting the 86-year-old film for the ultra-big screen presented quite a challenge. WSJ CIO Journal reporter Isabel Busquets has an exclusive look at how the Sphere worked with Google to find the solution. So, Isabel, just walk us through the technology first, because...

This wasn't just a few tweaks here and there to the original Wizard of Oz film. This was a project really on a massive scale. So just describe that to us. Yeah, I mean, when you think about the fact that this movie, it was shot in the 1930s on a 35 millimeter camera, and they're trying to essentially put it on one of the highest-resolution, biggest screens in the world. It's 160,000 square feet. It's curved. The challenge of doing this is essentially a big one, such that even the engineers at Google DeepMind were initially like, oh my God, where do we even start? But essentially what they did was two things, both involving AI. On one hand,

They just used AI to enhance the resolution. This is a slightly different technique than traditional resolution enhancement, which would essentially involve taking the existing pixels and doing a copy-paste. They essentially went really deep and used generative AI to generate new pixels. And that was a better way of enhancing the resolution there. And then the other thing they did was expand scenes. Essentially, in the original movie, the screen ended at certain boundaries, but you know other characters are off screen. The Sphere version is now giving you a much wider view into what's happening in that scene. You can

see characters that were off screen in the past. You can see a much broader background, maybe parts of the poppy field or parts of the Emerald City, crowds essentially that were not in the original shots. That's a technique that they called outpainting. There's also going to be some other sensory elements they're working on. They didn't want to talk about that a lot yet. Some of those elements are still under wraps, but they're really leaning into this is not just

going to the movies. This is really experiencing this film. And this use of AI has really generated a lot of excitement for some in the entertainment industry, but not everyone, including a lot of people in the comments section of this story, is thrilled with the technology really seeping into, I guess, more traditional, can we call it sacred aspects of our memories? This film is so beloved. So when Google says AI has touched more than 90% of it, it gives a lot of people anxiety, and they feel like something that was sacred is maybe now gone, being reinvented in a way that maybe not everyone is happy with.

Google did put in a lot of work to sort of stay true to the core of the film. They match it shot for shot, so it's not like they're inventing new scenes or anything. But people were still pretty upset. They looked at the AI version and they were like,

I don't like her movements. This is not the way it was intended. And like, what are they going to do next? Within Hollywood as well, this is also stoking a lot of fears. There's a lot of anxiety in the acting community, in the writing community over how their creative intellectual property is going to be used and reused by generative AI. So it's stoking a lot of those anxieties and unanswered questions.

That was WSJ CIO Journal reporter Isabel Busquets. Coming up, the Energy Department is hoping AI will take a starring role in advising humans at the helm of nuclear power's big tech-fueled revival. That story, after the break.

McDonald's meets the Minecraft universe with one of six collectibles and your choice of a Big Mac or 10-piece McNuggets with spicy Netherflame sauce. Now available with a Minecraft movie meal. At participating McDonald's for a limited time. A Minecraft movie only in theaters.

We've talked a lot on the show about how nuclear energy is seeing a revival thanks to big tech's data centers and their ravenous appetite for power, especially when it comes to AI. But to add an inception-like layer here, the relationship between nuclear and AI doesn't end there. Artificial intelligence could be used to help humans run those nuclear plants, too. Belle Lin covers AI and technology for The Wall Street Journal.

Belle, there's a lab in Illinois that's run by the Energy Department, and it's come up with an AI tool that can not only help design but also run the nuclear plants. Just walk us through that tool and what it could be used for and how it would be used.

This tool is called ProAID and it was developed by Argonne National Laboratory in Lemont, Illinois. And it predates this current AI boom, but it really kind of dovetails quite nicely, because as AI is driving this immense need for power, and particularly nuclear power, it's also arriving as sort of a way to help make these nuclear plants more efficient.

And the reason why nuclear plants really need this kind of AI boost is because they sort of belong in the era of nuclear's heyday, which was really the 1970s through the 1990s. And since then, the utility providers that run them really haven't updated the technology. And so if you kind of jokingly imagine Homer Simpson sitting in his nuclear power plant with all the monitoring switches and the analog buttons,

That's actually not too dissimilar from what a lot of these nuclear power plants actually look like in their monitoring rooms. And so there's a great need for technology like this ProAID tool from Argonne to come in and replace some of the more manual tasks that a lot of nuclear operators have to do on a regular basis. What are some of those tasks? I mean, what really, for people who can't conceptualize this, I'm one of those people. What kind of things can AI take over from humans in this way?

It's essentially acting as an assistant. So it's not really taking over a lot of the work of a utility plant operator. It's really meant to aid the work of an operator in a lot of the tasks of making sure that the flips and the switches are running green rather than running red. You talked about how long nuclear power has been around and how old some of these plants are. So what

Walk us through the process to integrate this newer technology. Is it easier to do with newer build plants? Is that an option? Or are they really looking to sort of like retrofit all of these older ones?

Yeah, it's a bit of a combination of both, but it's a lot easier to add new technology to newer builds, to the sort of newer generation of small modular reactors and companies that are hoping to bring nuclear power online rather than updating existing nuclear power plants, some of which may be aging out of commission.

Argonne, for instance, is looking to these companies like TerraPower and Oklo, which are backed by big tech and big tech personalities. But there's also a hope that the existing nuclear plant providers will be wanting to upgrade their plants as their lifetimes become extended because of the greater need for nuclear power.

And one of the things that really struck me from your story was a quote from a senior nuclear engineer at that Illinois lab. He said, if we can hand off some of those lower-level capabilities to a machine when someone retires, you don't need to replace him or her. And to me, when I read that, I think, oh, my gosh, humans are going to be completely out of this nuclear power game. But you alluded to this before. That's not the case. It's not that AI is going to completely take over operating and running these plants. It's just going to be helping assist in certain ways here and there.

Yeah, I think that's a good way to describe it. There's just a lot of that kind of more tedious work that ProAID, and tools like it (though ProAID is kind of the only one really harnessing LLMs, for instance), can assist with. And a lot of the work in the plants has to do with troubleshooting. And so that's something that LLMs are good at in terms of

communicating with plant operators in natural language. So you can sort of interrogate in plain English the way that you and I are chatting with the tool and it can return a response as if you were chatting to a person who really knows a lot about the nuclear plant that you're operating. So if it

notices that maybe something is overheating, it can alert an engineer or someone at the plant and then they can go from there. Then the human basically takes over. Yeah, yeah, that's right. And how far along is this tool in its development? Is it already being implemented in some places? The tool is ready, but because it's been developed with taxpayer dollars, essentially as part of Argonne National Laboratory, it needs to be commercialized by the private sector. So they're looking for partners to work with them on making it widely available commercially.

That was WSJ reporter Belle Lin. And that's it for Tech News Briefing. Today's show was produced by Ariana Aspuru with Deputy Editor Chris Zinsli. Additional production support from Julie Chang and Pierre Bien-Aimé. I'm Victoria Craig for The Wall Street Journal. We'll be back this afternoon with TNB Tech Minute. Thanks for listening.