
AI transcription tool Whisper faces scrutiny over accuracy

28 October 2024


You have a bad case of Scrutter's Cephaloanal Inversion

Hospitals are starting to use a transcription tool powered by a hallucination-prone OpenAI model.

ABC News cited boffins, who found that OpenAI’s Whisper, which many hospitals use, has been making things up.

Whisper is employed by a company called Nabla for a medical transcription tool that has reportedly transcribed 7 million medical conversations. According to ABC News, more than 30,000 clinicians and 40 health systems use it. Nabla is aware that Whisper can hallucinate and is "addressing the problem."

A study by researchers from Cornell University, the University of Washington, and others found that Whisper hallucinated in about one per cent of transcriptions, fabricating entire sentences, sometimes with violent sentiments or nonsensical phrases, during silences in recordings.

The boffins, who gathered audio samples from TalkBank’s AphasiaBank, noted that silences are particularly common when someone with the language disorder aphasia is speaking.

One of the researchers, Allison Koenecke of Cornell University, posted examples from the study showing that the hallucinations also included invented medical conditions and phrases you might expect from a YouTube video, such as "Thank you for watching" (OpenAI reportedly used more than a million hours of YouTube videos to train GPT-4).

The study was presented in June at the Association for Computing Machinery's FAccT conference in Brazil. It's not clear whether it has been peer-reviewed.

OpenAI spokesperson Taya Christianson told The Verge: "We take this issue seriously and are continually working to improve, including reducing hallucinations. For Whisper use on our API platform, our usage policies prohibit use in certain high-stakes decision-making contexts, and our model card for open-source use includes recommendations against use in high-risk domains. We thank researchers for sharing their findings."

 
