this post was submitted on 04 Nov 2024
95 points (99.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 9 months ago

OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.

top 10 comments
[–] [email protected] 16 points 1 month ago

Regular transcription software is finally respectable (the early days of Dragon NaturallySpeaking were dark indeed). Who thought tossing AI into the mix was a good idea?

[–] [email protected] 14 points 1 month ago (1 children)

I work in judicial tech and have heard questions about using AI transcription tools. I don't believe AI should be used in this kind of high-risk area. The ones asking whether AI is a good fit for court transcripts can be forgiven, because all they see is the hype, but if the ones responding greenlight a project like that, there will be some incredibly embarrassing moments.

My other concern is that the court would have to run the service locally. There are situations where a victim's name or other information is redacted. That information should not be on an OpenAI server, and it should not be regurgitated back out when the AI misbehaves.

[–] [email protected] 3 points 1 month ago (1 children)

Don't court stenographers basically use tailored voice models and voice-to-text transcription already?

[–] [email protected] 2 points 1 month ago

I don't get too technical with the court reporter software; they have their own license and receive direct support from their vendor. What I have seen is an interpreting layer between the steno machine and the software, literally called "magic" by the vendor, that works a bit like predictive text. In that setup, the stenographer is actively recording and interpreting the results.

[–] [email protected] 10 points 1 month ago* (last edited 1 month ago)

Private hospitals care about only one thing: profit. These error-ridden tools serve that purpose.

[–] [email protected] 10 points 1 month ago (1 children)

God, I hope this isn't the AI plan the NHS adopts.

[–] [email protected] 9 points 1 month ago* (last edited 1 month ago)

This is the AI plan every healthcare entity worldwide will adopt.

No joke. They are desperate for shit like this.

[–] ShareMySims 5 points 1 month ago

Errors and hallucinations are definitely serious concerns, but my biggest concern would be privacy. If my GP is using AI, I can no longer consider my medical information private, and that is unacceptable.

[–] FigMcLargeHuge 4 points 1 month ago

If anyone needs to know the state of AI transcription, just turn on closed captioning for your local TV channel. It's atrocious, and I'm sorry that people who need closed captioning are subjected to it.

[–] [email protected] 1 points 1 month ago

Years ago, I worked in a tech role at a medical transcription company. It hadn't occurred to me that AI would render those jobs irrelevant. This used to be an area where women in particular could make decent money after a bit of training, with opportunities to advance into medical coding or even hospital administration.

I worked with some good people. Hope they landed on their feet.