this post was submitted on 26 May 2024
382 points (97.0% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!


cross-posted from: https://zerobytes.monster/post/1072393

The original post: /r/nottheonion by /u/The_Ethics_Officer on 2024-05-25 00:48:15.
all 23 comments
[–] gravitas_deficiency@sh.itjust.works 39 points 1 year ago (4 children)

Oh, it’s worse than that.

Google’s “AI” results feed you things from 10-year-old Reddit posts that are subtle (but sometimes not so subtle) bullshit.

Whatever they’re using to curate training data is evidently pretty awful at detecting shitposts.

[–] Carighan@lemmy.world 15 points 1 year ago

Bold of you to assume they're curating their training data.

[–] Diplomjodler3@lemmy.world 8 points 1 year ago

Those underpaid Indians probably aren't very good at picking up irony, even if they give a shit.

[–] CosmoNova@lemmy.world 4 points 1 year ago

Most of the curation or fine-tuning is done in low-income African countries, so this is hardly surprising. They're cheap labour, but you can't expect them to reliably detect sarcasm or notice mistakes in specialized fields. They basically give a thumbs up whenever the AI sounds convincing. Of course that includes instances where it's confidently wrong, and that appears to be most of the time with this model.

[–] Even_Adder@lemmy.dbzer0.com 2 points 1 year ago (1 children)

It's not a training data issue; look up Retrieval Augmented Generation. It's basically serving up stuff from the web and taking it as gospel.
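
Roughly, a minimal sketch of the idea (the names, retrieval scoring, and toy corpus here are made up for illustration, not Google's actual pipeline): retrieved snippets get pasted into the prompt verbatim, so a satirical source carries the same weight as a real one.

```python
# Toy RAG sketch: retrieve "relevant" snippets, stuff them into the prompt,
# and let the model treat them as fact. No credibility check anywhere.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query (stand-in for a real search index)."""
    q_words = set(query.lower().split())
    return sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(query: str, snippets: list[str]) -> str:
    """Paste whatever was retrieved into the model's context, unfiltered."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Answer using these sources:\n{context}\n\nQuestion: {query}"

corpus = [
    "The Onion: America's Finest News Source.",              # satire, retrieved anyway
    "Reddit shitpost from 2014 recommending glue on pizza.",  # shitpost, retrieved anyway
    "Encyclopedia entry: satire is not factual reporting.",
]

query = "Is The Onion a real news source?"
print(build_prompt(query, retrieve(query, corpus)))
```

The point: if the retrieval step pulls in satire or shitposts, the generation step happily repeats them.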

[–] Melvin_Ferd@lemmy.world -1 points 1 year ago

That's bullshit. Why can't it just think for itself?

[–] aeronmelon@lemmy.world 30 points 1 year ago

c/literallytheonion

[–] Hotzilla@sopuli.xyz 22 points 1 year ago* (last edited 1 year ago)

Even though GPT is far from perfect, this does show how far behind Google still is on AI.

Edit: GPT-4o's answer to exactly the same question, which also explains why Google failed:

"The Onion" humorously refers to itself as "America's Finest News Source." It's a satirical news organization known for its parody articles on international, national, and local news, mocking traditional news media and public figures. Despite its comedic nature, it has gained a reputation for its sharp wit and clever commentary on contemporary issues.

[–] riodoro1@lemmy.world 7 points 1 year ago

There are so many fake screenshots of this that I'm not believing another one.

[–] public_image_ltd@lemmy.world 6 points 1 year ago

I did not know about avclub until now, so thanks for letting me know.