AI Guided Diagnostic-Quality Lung Ultrasound


2025/2/21

JAMA Medical News

People
Cristiana Baloescu
Yulin Xun
Topics
Yulin Xun: Artificial intelligence has applications in both the acquisition and the interpretation of ultrasound images; so far, AI has focused mainly on image interpretation, with less work on image acquisition. Cristiana Baloescu: AI-guided lung ultrasound can help clinicians identify the cause of a patient's shortness of breath (for example, pulmonary edema or chronic obstructive pulmonary disease) earlier, so that appropriate treatment can start sooner. A multicenter validation trial showed that AI can effectively guide healthcare professionals in acquiring high-quality lung ultrasound images; even inexperienced operators achieved image quality comparable to experts. In real-world use, AI-guided lung ultrasound faces many challenges, such as operator variability, differences in patient body habitus, equipment differences, and variation in how the AI is used, all of which can affect image quality. The technology has substantial potential in resource-limited settings, for example enabling community health workers to screen for lung disease, and it shows promise across a range of scenarios, such as mobile clinics, pandemic response, and home-based care.

Chapters
This chapter explores the complexities of lung ultrasound, highlighting the need for AI in improving image acquisition. It emphasizes the technical skill required for obtaining diagnostic-quality images and the potential of AI to overcome limitations in access to expertise.
  • AI has applications in both image interpretation and image acquisition, though most work so far has focused on interpretation.
  • Image acquisition requires specialized training and practice.
  • AI can assist medics in point-of-care ultrasound, leading to faster and more appropriate treatment for patients with shortness of breath.

Transcript


I'm Yulin Xun, Associate Editor of JAMA and JAMA Plus AI, and you're listening to JAMA Plus AI Conversations. In today's episode, we'll explore the application of AI in providing assessments of point-of-care ultrasound. Joining me is Dr. Cristiana Baloescu, an Assistant Professor of Emergency Medicine at Yale University School of Medicine.

Dr. Baloescu's research spans a diverse range of topics, including leveraging machine learning for advanced ultrasound interpretation in the clinical setting. In today's episode, we'll explore the applications of AI in providing assessments for ultrasound. Welcome. Thank you. Very nice to be here.

Can you tell us a little bit about your research and your background? Yes. So part of my academic focus within emergency medicine has actually been in research, specifically in ultrasound and point-of-care ultrasound, lung ultrasound being an example of this. And I've particularly worked a lot on the development and validation of artificial intelligence for both cardiac and lung ultrasound, both for detection and quantification of pathology

but also for image acquisition, which this study particularly focuses on. What are the issues with this area? Why do we need AI and machine learning?

Yeah, ultrasound as an imaging modality kind of has two aspects to it. There's an image acquisition aspect, and then there's an image interpretation aspect. And so it can be quite complex. So far, a lot of the artificial intelligence has focused on the interpretation aspect, so looking for specific artifacts that show up on the ultrasound images that point to different pathologies,

which, again, you need specialized training for. Not a lot has been done on the acquisition aspect, that is, obtaining the actual images. And acquisition is a very technical skill that requires education, repetition, and a lot of practice. And so that takes time and expertise that may not always be available wherever this technology could be used. Can you give an example of when you've seen this, and why this study is important?

Yes, I'm an emergency physician, so we end up seeing a lot of patients who present with shortness of breath, and a lot of patients come to us by ambulance. One potential use of AI in this setting is that a medic would perform a point-of-care ultrasound, such as a lung ultrasound.

The images would go to an expert who would interpret them, the medic would get feedback, and they could initiate a treatment based on what is seen on the lung ultrasound. Specifically, in the population we see a lot with shortness of breath, many of our patients may have heart failure with pulmonary edema but may also have a history of chronic obstructive pulmonary disease, or COPD. And those

two diseases or pathologies present differently on lung ultrasound. With heart failure with pulmonary edema, you usually get artifacts called B-lines, which are hyperechoic, vertical artifacts that show up on the lung ultrasound. And you usually don't get those with a simple chronic obstructive pulmonary disease exacerbation. And the treatments for those two pathologies are different. And if you know

what problem the patient is presenting with earlier, then you could give them the appropriate treatment earlier. And so you would, at a minimum, prevent their condition from getting worse, and definitely get them better sooner.
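
To make the diagnostic reasoning above concrete, here is a toy sketch of the kind of rule of thumb described: B-lines across multiple lung zones favor pulmonary edema, while their absence is more consistent with a COPD exacerbation. The zone labels, counts, and thresholds below are hypothetical illustrations only, not a validated clinical rule and not the study's algorithm.

from dataclasses import dataclass

@dataclass
class ZoneFinding:
    """One lung-ultrasound zone reading (hypothetical structure for illustration)."""
    zone: str          # e.g., "R1" ... "L4" in an eight-zone exam
    b_line_count: int  # vertical hyperechoic artifacts seen in the clip

def suggest_workup(findings, b_lines_per_zone=3, positive_zones_needed=2):
    """Toy heuristic only, not a clinical rule: B-lines in several zones favor
    pulmonary edema; their absence is more consistent with a COPD exacerbation."""
    positive = sum(1 for f in findings if f.b_line_count >= b_lines_per_zone)
    if positive >= positive_zones_needed:
        return "B-lines in multiple zones: pulmonary edema more likely"
    return "No significant B-lines: COPD exacerbation more consistent"

# Example eight-zone exam (all values are made up)
exam = [ZoneFinding(z, n) for z, n in
        [("R1", 4), ("R2", 5), ("R3", 0), ("R4", 1),
         ("L1", 3), ("L2", 6), ("L3", 0), ("L4", 2)]]
print(suggest_workup(exam))  # -> "B-lines in multiple zones: pulmonary edema more likely"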

Artificial Intelligence-Guided Lung Ultrasound by Non-Experts, in JAMA Cardiology. So can you tell me a little bit about this study? Yeah, so this was a multi-center validation trial that evaluated the ability of artificial intelligence to guide and facilitate the acquisition of lung ultrasound clips of

diagnostic quality by a group of healthcare professionals. The majority of them had no significant lung ultrasound experience. Subjects in the study underwent two lung ultrasound examinations: one performed by a lung ultrasound expert without using AI, and a second done by a trained healthcare professional, or THCP, the abbreviation we use, with the use of AI.

The AI itself has a set of algorithms that guide image acquisition, automatically capture clips once a threshold of image quality is achieved, and also annotate for the presence of B-lines. And then we had an independent panel of blinded expert readers, different experts from the ones who obtained the images in the study, who remotely reviewed the clips. They decided whether the clips acquired by the two sets of operators were of sufficient quality to make a clinical assessment. The primary endpoint was the proportion of studies that hit that diagnostic-quality threshold in both groups. And so 98% of the studies acquired by the trained healthcare professionals using AI guidance were of high diagnostic quality, and there was no significant difference from the proportion of diagnostic-quality studies obtained by the lung ultrasound experts without AI assistance.

So where do you think we go from here then? I think where we go from here is seeing how well it performs, not in a clinical trial, but in the real world, and maybe integrating some of the image interpretation artificial intelligence. Can you tell me what happens in the real world?
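
As a rough, non-authoritative illustration of the acquisition workflow Dr. Baloescu describes above, coaching the operator until image quality is sufficient, automatically capturing a clip once a quality threshold is sustained, and then annotating the clip for B-lines, here is a minimal sketch. The quality-scoring and B-line models, the threshold, the clip length, and all function names are assumed placeholders, not the software evaluated in the trial.

QUALITY_THRESHOLD = 0.8   # hypothetical 0-1 image-quality cutoff
CLIP_LENGTH = 60          # hypothetical number of frames per captured clip

def acquire_clip(frame_stream, score_quality, detect_b_lines, give_guidance):
    """Sketch of threshold-based automatic clip capture.

    frame_stream   -- iterable of incoming ultrasound frames
    score_quality  -- assumed model returning a 0-1 quality score per frame
    detect_b_lines -- assumed model annotating B-lines in a captured clip
    give_guidance  -- assumed callback prompting probe adjustments to the operator
    """
    buffer = []
    for frame in frame_stream:
        quality = score_quality(frame)
        if quality < QUALITY_THRESHOLD:
            # Quality not yet sufficient: coach the operator and start over.
            give_guidance(frame, quality)
            buffer.clear()
            continue
        buffer.append(frame)
        if len(buffer) == CLIP_LENGTH:
            # Quality sustained for a full clip: auto-capture and annotate it.
            return buffer, detect_b_lines(buffer)
    return None, None  # stream ended before a diagnostic-quality clip was captured

Note that in the trial itself, whether a captured clip was of diagnostic quality was judged by the independent blinded expert panel, not by the software.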

I think there's a combination of factors; it's not a simple answer, and it probably depends a lot on the situation. Ultrasound is complex in the sense that there's a lot of operator variability, which AI is partly trying to move us away from. There's also patient variability in terms of body habitus and anatomy. Sometimes landmarks are a little bit different.

Sometimes it's harder to move an arm out of the way, or positioning is difficult. And then there are also equipment differences. This particular AI software can be paired with different ultrasound machines, but not all of them. So those are at least some examples of complexity in the real world that are due to the ultrasound technology itself. We always have these really amazing trials with such great results, and then you put it in the real world and it just disappears.

So I would say there's a lot of complexity in the real world that a study will never replicate, no matter how well designed it is. In terms of lung ultrasound, or ultrasound generally, there's variability in

how the operator uses the AI itself. All of the trained healthcare professionals had training in how they interacted with the AI, and they had the AI presented to them in a very specific way, which may not be how someone would use it once this is on the market. There are also differences in patient

characteristics that would influence how good a lung ultrasound can be obtained, even with the help of AI, things like body habitus and the ease of repositioning the patient in order to get the full eight-zone exam. There are also times when you don't need a full eight-zone exam, or when you would benefit from more zones than just the eight. There are also practical considerations

on the situation in which the software is used. And that's where it's a little bit difficult to think about all of these situations in advance, which is why the real world is difficult.

We had a discussion on one of our other podcasts about ultrasound in high-altitude areas or in lower-resource countries, for instance. How can this potentially help those areas? It's a little bit of a paradox sometimes, in that I think these areas can

utilize technology such as this. There are a lot of potential use-case scenarios, but there are also practical barriers, sometimes electricity or Wi-Fi. I do think that AI that guides lung ultrasound acquisition could be very useful in the hands of novices, in environments where experts are scarce. For instance, I can think of cases where community health workers could use

AI-guided lung ultrasound to screen for lung conditions, pneumonia, pulmonary edema, pleural effusions, particularly in areas where there's a high rate of HIV/AIDS and TB co-infection. These captured clips could then be remotely reviewed by a centralized physician, or reviewed at the end of the day from the device's memory.

This approach is particularly impactful when it's paired with a portable ultrasound device, of which we have many nowadays. And I think that technology is itself continuing to adapt and improve. And so maybe it would be even more feasible in the next five years or 10 years.

I think, as other examples of scenarios where this could prove useful, you could have this deployed in mobile clinics to provide diagnostic support in areas affected by natural disasters, or you could have rapid screening and monitoring of respiratory illness during a pandemic response.

We have seen some increased research looking at patient-performed ultrasound. And so I think with further testing, with obviously the relevant approvals, this technology could pave the way for remote monitoring of patients in home-based care settings. And this is not only in

resource-limited settings, but also in higher-resource settings, where perhaps a more cost-effective way of taking care of patients is in their homes. There are home hospital programs that have gotten started in several areas of the U.S. that I think would benefit from this. Well, thank you so much for being here. I really appreciated this conversation, and there's lots to learn about advancing ultrasound interpretation.

Thank you very much. It was really nice to talk to you about this. I'm Yulin Xun, Associate Editor at JAMA and JAMA Plus AI, and I've been speaking with Dr. Cristiana Baloescu about AI-guided ultrasound. You can find a link to the article in this episode's description. And for more content like this, please visit our new JAMA Plus AI channel at jamaai.org.

To follow this and other JAMA Network podcasts, please visit us online at jamanetworkaudio.com or search for JAMA Network wherever you get your podcasts. This episode was produced by Shelley Steffens at JAMA Network. Thanks for listening. This content is protected by copyright by the American Medical Association with all rights reserved, including those for text and data mining, AI training, and similar technologies.