OpenAI’s Transcription Tool Hallucinates. Hospitals Are Using It Anyway

By News Room · 30 October 2024 · 3 min read

On Saturday, an Associated Press investigation revealed that OpenAI’s Whisper transcription tool creates fabricated text in medical and business settings despite warnings against such use. The AP interviewed more than 12 software engineers, developers, and researchers who found the model regularly invents text that speakers never said, a phenomenon often called a “confabulation” or “hallucination” in the AI field.

Upon its release in 2022, OpenAI claimed that Whisper approached “human level robustness” in audio transcription accuracy. However, a University of Michigan researcher told the AP that Whisper created false text in 80 percent of public meeting transcripts examined. Another developer, unnamed in the AP report, claimed to have found invented content in almost all of his 26,000 test transcriptions.

The fabrications pose particular risks in health care settings. Despite OpenAI’s warnings against using Whisper for “high-risk domains,” over 30,000 medical workers now use Whisper-based tools to transcribe patient visits, according to the AP report. The Mankato Clinic in Minnesota and Children’s Hospital Los Angeles are among 40 health systems using a Whisper-powered AI copilot service from medical tech company Nabla that is fine-tuned on medical terminology.

Nabla acknowledges that Whisper can confabulate, but it also reportedly erases the original audio recordings “for data safety reasons.” That compounds the problem: doctors cannot verify a transcript against the source material, and deaf or hard-of-hearing patients are especially at risk because they have no way to check whether a transcript reflects what was actually said.

The potential problems with Whisper extend beyond health care. Researchers from Cornell University and the University of Virginia studied thousands of audio samples and found Whisper adding nonexistent violent content and racial commentary to neutral speech. They found that 1 percent of samples included “entire hallucinated phrases or sentences which did not exist in any form in the underlying audio” and that 38 percent of those included “explicit harms such as perpetuating violence, making up inaccurate associations, or implying false authority.”

In one case from the study cited by AP, when a speaker described “two other girls and one lady,” Whisper added fictional text specifying that they “were Black.” In another, the audio said, “He, the boy, was going to, I’m not sure exactly, take the umbrella.” Whisper transcribed it to, “He took a big piece of a cross, a teeny, small piece … I’m sure he didn’t have a terror knife so he killed a number of people.”

An OpenAI spokesperson told the AP that the company appreciates the researchers’ findings and that it actively studies how to reduce fabrications and incorporates feedback in updates to the model.

Why Whisper Confabulates

Whisper’s unsuitability for high-risk domains stems from its propensity to confabulate: to generate plausible but inaccurate output. The AP report says, “Researchers aren’t certain why Whisper and similar tools hallucinate,” but that isn’t true. We know exactly why Transformer-based AI models like Whisper behave this way.

Whisper is based on technology that is designed to predict the next most likely token (chunk of data) that should appear after a sequence of tokens provided by a user. In the case of ChatGPT, the input tokens come in the form of a text prompt. In the case of Whisper, the input is tokenized audio data.
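To make that mechanism concrete, here is a minimal toy sketch of an autoregressive decoding loop. It is not Whisper’s actual code; the vocabulary, weights, and function names are invented for illustration. The one property it preserves is the crucial one: the decoder always returns a probability distribution over tokens, even when the audio contains no speech, so it emits the most plausible-looking continuation regardless of whether anything was actually said.

```python
import random

random.seed(0)

# Toy vocabulary; a real model has tens of thousands of tokens.
VOCAB = ["he", "took", "the", "umbrella", "<eot>"]

def next_token_probs(audio_features, prefix):
    """Stand-in for one Transformer decoder step.

    A real decoder conditions on the encoded audio plus the tokens
    emitted so far. The key property this sketch preserves: it ALWAYS
    returns a weight for every token in the vocabulary. There is no
    "no speech here" output, only more-or-less likely continuations.
    """
    weights = {tok: 1.0 for tok in VOCAB}
    if not audio_features:
        # Silence: nothing in the audio anchors the prediction, but
        # fluent-looking continuations still outweigh stopping early.
        weights["<eot>"] = 0.5
    return weights

def transcribe(audio_features, max_tokens=10):
    """Greedy-ish autoregressive loop: sample tokens until <eot>."""
    tokens = []
    while len(tokens) < max_tokens:
        weights = next_token_probs(audio_features, tokens)
        tok = random.choices(list(weights), weights=list(weights.values()))[0]
        if tok == "<eot>":
            break
        tokens.append(tok)
    return tokens

# Even with empty (silent) audio, the loop can emit plausible text:
# a toy version of a hallucinated transcript.
print(transcribe([]))
```

The takeaway is architectural, not a bug in any one model: because the decoder’s job is to rank continuations rather than to abstain, low-information audio (silence, noise, crosstalk) still produces confident-looking text.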
