Content provided by the Royal Australasian College of Physicians. All podcast content, including episodes, graphics and podcast descriptions, is uploaded and delivered directly by the Royal Australasian College of Physicians or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.

Ep99: When AI goes wrong

39:55
 

This is the fourth part in a series on artificial intelligence in medicine, in which we try to unpick the causes and consequences of adverse events resulting from this technology. Our guest David Lyell is a research fellow at the Australian Institute of Health Innovation (Macquarie University) who has published a first-of-its-kind audit of adverse events reported to the US regulator, the Food and Drug Administration. He breaks down the events caused by errors in the machine learning algorithm, by other aspects of a device, and by user error.
We also discuss where these errors fit into the four stages of human information processing, and whether this can inform determinations about liability. Uncertainty around the medicolegal aspects of AI-assisted care is one of the main reasons practitioners report discomfort with the use of this technology. It's a question that hasn't yet been well tested in the courts, though according to academic lawyer Rita Matulionyte, AI-enhanced devices don't change the scope of care that has always been expected of practitioners.

Guests
Rita Matulionyte PhD (Macquarie Law School, Macquarie University; ARC Centre of Excellence for Automated Decision Making and Society; MQ Research Centre for Agency, Values and Ethics)
David Lyell PhD (Australian Institute of Health Innovation, Macquarie University; owner, Future Echoes Business Solutions)
Production
Produced by Mic Cavazzini DPhil. Music licensed from Epidemic Sound includes ‘Kryptonite’ by Blue Steel and ‘Illusory Motion’ by Gavin Luke. Music courtesy of Free Music Archive includes ‘Impulsing’ by Borrtex. Image by EMS-Forster-Productions licensed from Getty Images.

Editorial feedback kindly provided by physicians David Arroyo, Stephen Bacchi, Aidan Tan, Ronaldo Piovezan and Rahul Barmanray, and by RACP staff member Natasa Lazarevic PhD.

Key References
More than algorithms: an analysis of safety events involving ML-enabled medical devices reported to the FDA [Lyell, J Am Med Inform Assoc. 2023]
How machine learning is embedded to support clinician decision making: an analysis of FDA-approved medical devices [Lyell, BMJ Health Care Inform. 2021]
Should AI-enabled medical devices be explainable? [Matulionyte, Int J Law Inform Tech. 2022]

Please visit the Pomegranate Health web page for a transcript and supporting references. Log in to MyCPD to record listening and reading as a prefilled learning activity. Subscribe to new episode email alerts or search for ‘Pomegranate Health’ in Apple Podcasts, Spotify, Castbox or any podcasting app.


