this post was submitted on 26 Oct 2024
269 points (98.9% liked)

News

35749 readers
2362 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Argue in good faith only; accusing another user of being a bot or paid actor is uncivil, as is trolling, and either is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts must contain exactly one link, to a source (URL) that is as reliable and unbiased as possible.


Obviously biased sources will be removed at the mods’ discretion. Supporting links can be added in comments or posted separately, but not in the post body. Sources may be checked for reliability using Wikipedia, MBFC, AdFontes, GroundNews, etc.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the article used as source. Clickbait titles may be removed.


Posts whose titles don’t match the source may be removed. If the site has changed its headline, we may ask you to update the post title. Clickbait titles use hyperbolic language and do not accurately describe the article content. When necessary, post titles may be edited; edits must be clearly marked with [brackets] and may never be used to editorialize or comment on the content.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, videos, blogs, press releases, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis. Mods may use discretion to pre-approve videos or press releases from highly credible sources that provide unique, newsworthy content not available or possible in another format.


7. No duplicate posts.


If an article has already been posted, it will be removed. Different articles reporting on the same subject are permitted. If the post that matches your post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel your post was removed in error, provide credible sources to appeal.


9. No link shorteners or news aggregators.


All posts must link to original article sources. You may include archival links in the post description. News aggregators such as Yahoo, Google, Hacker News, etc. should be avoided in favor of the original source link. Newswire services such as AP, Reuters, or AFP are frequently republished and may be shared from other credible sources.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 2 years ago
MODERATORS

Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy.”

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

top 19 comments
[–] funkless_eck@sh.itjust.works 50 points 1 year ago

once again AI is a cool trick your uncle learned in the army and not actual Gandalf level magic

[–] antifa 39 points 1 year ago (1 children)

How is anyone surprised by this? Making shit up is literally all that LLMs do

[–] itslilith@lemmy.blahaj.zone 3 points 1 year ago

This isn't an LLM, but a speech-to-text tool. In my experience it's really stable while people are talking, but makes things up during periods of silence. A better pipeline might make things better, but I would never use it within a medical context
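The "better pipeline" idea above can be sketched as a voice-activity gate that drops silent stretches before the audio ever reaches a transcription model such as Whisper, so the model has no silence to hallucinate over. This is a minimal, hypothetical sketch using pure NumPy with an energy threshold; the function name, frame size, and threshold are illustrative assumptions, not part of any Whisper API.

```python
import numpy as np

def trim_silence(audio: np.ndarray, sr: int, frame_ms: int = 30,
                 threshold: float = 0.01) -> np.ndarray:
    """Drop frames whose RMS energy falls below `threshold`.

    A crude energy-based voice-activity gate: any frame quieter than
    the threshold is treated as silence and removed before the audio
    is handed to a speech-to-text model.
    """
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    kept = []
    for i in range(n_frames):
        frame = audio[i * frame_len:(i + 1) * frame_len]
        rms = np.sqrt(np.mean(frame ** 2))
        if rms >= threshold:
            kept.append(frame)
    if not kept:
        return np.empty(0, dtype=audio.dtype)
    return np.concatenate(kept)

# Example: one second of silence followed by one second of "speech"
# (a 220 Hz tone standing in for voiced audio).
sr = 16000
silence = np.zeros(sr)
speech = 0.5 * np.sin(2 * np.pi * 220 * np.arange(sr) / sr)
audio = np.concatenate([silence, speech]).astype(np.float32)

trimmed = trim_silence(audio, sr)
print(len(audio), len(trimmed))  # the silent half is removed
```

In practice a dedicated VAD (e.g., WebRTC VAD or Silero VAD) would replace this energy threshold, but the shape of the fix is the same: gate out silence upstream rather than trusting the model not to invent text for it.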

[–] Cenotaph@mander.xyz 33 points 1 year ago (1 children)

The automated voicemail transcriptions my work uses (medical office) recently switched to AI. In some cases it works really well, but if there are long periods of silence it will start to make things up. I've ended up with some pretty loopy messages when someone leaves voicemail with long silence at the end.

[–] spankmonkey@lemmy.world 27 points 1 year ago (2 children)

Seems like an obvious defect that should have come up during negative testing.

[–] catloaf@lemm.ee 35 points 1 year ago (1 children)

Bold of you to assume there was any testing process involved beyond "does it run? ship it"

[–] _____@lemm.ee 4 points 1 year ago (1 children)

I don't understand how AI keeps getting away with delivering software that does not meet obvious specifications

[–] catloaf@lemm.ee 4 points 1 year ago

Same way regular software does.

[–] Cenotaph@mander.xyz 14 points 1 year ago

You'd think. If I was the one paying for it, I would be changing providers but you know how that goes. I just work here, man.

[–] N0body@lemmy.dbzer0.com 33 points 1 year ago

“I don’t care what the patient said before he went under. They say all kinds of loopy things. The chart clearly says to amputate the right leg. Hand me the bone saw.”

[–] Dozzi92@lemmy.world 17 points 1 year ago

I'm a stenographer. People ask me if I am concerned with AI taking my job and I say notttt yettt.

[–] Mac@mander.xyz 16 points 1 year ago

This is why i don't trust AI summaries.

[–] A_A@lemmy.world 13 points 1 year ago

Fantasy in, fantasy out. Don't use it if you don't know this or if you can't repair its mistakes.

[–] BrokenGlepnir@lemmy.world 13 points 1 year ago

Violent rhetoric? "Doctor Johnson recommends punching the patient in his fucking face until he squeals! Now to make pizza get a bottle of Elmer's glue..."

This is when it invents stuff based on stereotypes, perpetuating their harm.

[–] Snapz@lemmy.world 6 points 1 year ago

Yes... you already said it was an AI tool?

Digital lying box... That lies to you.

[–] Bishma@discuss.tchncs.de 6 points 1 year ago (1 children)

I'm shocked. Shocked!

Well, not that shocked.

[–] NarrativeBear@lemmy.world 2 points 1 year ago

Politicians take interest.