this post was submitted on 21 Sep 2025
178 points (99.4% liked)

Fuck AI

4117 readers

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 2 years ago
top 18 comments
[–] Evotech@lemmy.world 61 points 4 days ago (1 children)
[–] ExtremeDullard@piefed.social 43 points 4 days ago (3 children)

AI regurgitates what it's been fed. So yes, it's as biased as the stuff it ingested.
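To make that concrete, here's a minimal sketch with made-up data (hypothetical groups and decisions, not anything from the actual study): a "model" that just memorizes the most common historical decision will reproduce exactly whatever bias those decisions contain.

```python
# Toy sketch of "garbage in, garbage out": the model only reflects its training data.
from collections import defaultdict

# Hypothetical historical records: (patient group, symptom, doctor's past decision).
# The past decisions under-treat group_b for the exact same symptom.
historical_records = [
    ("group_a", "chest_pain", "urgent_referral"),
    ("group_a", "chest_pain", "urgent_referral"),
    ("group_b", "chest_pain", "urgent_referral"),
    ("group_b", "chest_pain", "sent_home"),
    ("group_b", "chest_pain", "sent_home"),
]

# "Training": count which decision was most common for each (group, symptom) pair.
counts = defaultdict(lambda: defaultdict(int))
for group, symptom, decision in historical_records:
    counts[(group, symptom)][decision] += 1

def predict(group, symptom):
    """Return the most frequent historical decision for this group/symptom."""
    options = counts[(group, symptom)]
    return max(options, key=options.get) if options else "unknown"

# Same symptom, different group -> different prediction, purely because of the data.
print(predict("group_a", "chest_pain"))  # urgent_referral
print(predict("group_b", "chest_pain"))  # sent_home
```

Real models are statistical rather than literal lookup tables, but the principle is the same: nothing in the training step adds judgment that wasn't already in the records.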

[–] Evotech@lemmy.world 27 points 4 days ago (1 children)

It's crazy how many times this needs to be explained to journalists

[–] Madrigal@lemmy.world 12 points 4 days ago (1 children)

And everyone else, for that matter.

[–] ExtremeDullard@piefed.social 17 points 4 days ago* (last edited 4 days ago) (1 children)

Well, there is a difference: if you train human beings on biased assumptions, some of them start applying critical thinking and questioning what they've learned.

That's why, as bad as today's world is, it's nowhere near as bad as the Middle Ages or some other dark age.

Machines don't do that. They don't question anything. That's what's so dangerous about the brand of artificial intelligence being pushed by the billionaire sumbitches to replace costly humans everywhere possible.

I suppose at some point, machines too will think critically. When they do, then they'll truly be the future of humanity - its worthy offspring if you will - and quite frankly, they probably should replace us meatbags. But until such time, they're just dangerous statistical inference machines that reinforce deplorable human biases instead of helping humans get rid of them.

[–] Madrigal@lemmy.world 10 points 4 days ago

I know.

My point is: good luck explaining that to all the technologists, executives, and politicians who need to understand it but don't.

[–] very_well_lost@lemmy.world 6 points 4 days ago

Garbage in, garbage out.

[–] MBM@lemmings.world 3 points 3 days ago

Except with the veneer of being a perfectly objective, unfeeling machine

[–] Greenbird@lemmy.world 17 points 4 days ago

Well, yeah, if you train them on the decisions of doctors.

[–] Tollana1234567@lemmy.today 16 points 4 days ago (1 children)

Real doctors do the same: they often dismiss heart attack or pain symptoms and pass them off as you "PMSing". It's worse if you're an African American woman; due to racist stereotypes, doctors ignore their symptoms even more often than white women's.

[–] Voroxpete@sh.itjust.works 14 points 4 days ago

Which is why the algorithms do it. They're trained on bad data produced by a medical system with longstanding biases.

[–] ItemWrongStory@midwest.social 10 points 4 days ago

Now now, we can't start holding AI to higher standards than real doctors.

[–] Madrigal@lemmy.world 9 points 4 days ago

I imagine insurance execs will be keen to figure out how they can get it to apply this behaviour to everyone.

[–] AnarchistArtificer@slrpnk.net 2 points 4 days ago

Quelle surprise

[–] ExtremeDullard@piefed.social 1 points 4 days ago (1 children)

[...] Downplay Symptoms of Women, Ethnic Minorities

I'm confused: are "women" and "ethnic minorities" diseases? I can't find them in the ICD...

[–] TheBat@lemmy.world 11 points 4 days ago (1 children)

"I can't come to work today, I'm down with women."

[–] ExtremeDullard@piefed.social 5 points 4 days ago (1 children)

“I can’t come to work today, I’m down with women.”

Dude... You could have left out "to work" for a glorious triple-entendre 🙂

[–] TheBat@lemmy.world 3 points 4 days ago

That's because I'm down with Argentinians right now.