this post was submitted on 30 Jul 2025
214 points (99.5% liked)

Fuck AI

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago
[–] jqubed@lemmy.world 35 points 3 days ago (3 children)

"Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT," Altman said. "I think we should have, like, the same concept of privacy for your conversations with AI that we do with a therapist or whatever."

While AI companies figure that out, Altman said it's fair for users "to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity."

Disclosure: Ziff Davis, PCMag’s parent company, filed a lawsuit against OpenAI in April 2025, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.

That final line has me wondering how much of this is Altman worrying about user privacy, and how much is him trying to find a way to shield evidence from lawsuits against OpenAI. Earlier in the article he specifically mentions having to retain all chats because of The New York Times's lawsuit against OpenAI.

[–] spankmonkey@lemmy.world 20 points 3 days ago (2 children)

It is the shield from lawsuits thing. Sam Altman's actions show he gives zero fucks about user privacy.

[–] panda_abyss@lemmy.ca 12 points 3 days ago

The man just wants to scan your eyeballs and use that to track everything you do, is that really so bad?

[–] shalafi@lemmy.world 3 points 2 days ago

Doesn't matter what he does or doesn't do. AI privacy requires legislation to block subpoenas, but that ain't ever gonna happen.

[–] fodor@lemmy.zip 3 points 2 days ago

Oh, he does in fact worry about user privacy as a concept, but not because he cares about your actual privacy. If you're in doubt, ask yourself whether his company asked your permission before making use of the public and private data it has surely obtained online somehow.

[–] hedgehog@ttrpg.network 20 points 3 days ago (1 children)

"I think we should have, like, the same concept of privacy for your conversations with AI

Step 1: Don’t use ChatGPT or other cloud AI services

Step 2: Use AI locally within FOSS applications, or not at all
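For the local route, here is a minimal sketch of talking to a model served by Ollama (a FOSS local-LLM runner) from Python, assuming Ollama is running on its default port 11434; the model name `qwen3` is just an illustrative choice, use whatever you have pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing here leaves your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local(model: str, prompt: str) -> str:
    """Send a prompt to the locally hosted model and return its reply."""
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Called as `ask_local("qwen3", "...")`, both the prompt and the response stay on localhost, which is the whole point of step 2.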

[–] panda_abyss@lemmy.ca 9 points 3 days ago

Local AI is decent these days.

It’s about six months behind the state-of-the-art frontier models. Six months ago those were already really good; they just hadn’t figured out agentic tool calls.

Qwen3 is supposed to be good at that now.

[–] funkyfarmington@lemmy.world 5 points 2 days ago

Under certain circumstances anything you say to a mental health professional can and will be used against you in court. (In the US, because our laws are shit). The bar is incredibly low.

[–] homesweethomeMrL@lemmy.world 15 points 3 days ago

All part of the grift. Now get in there ya poors.

[–] JustAnotherPodunk@lemmy.world 10 points 2 days ago

So, the same as email, facebook, your search history, and text messages?

[–] panda_abyss@lemmy.ca 9 points 3 days ago (1 children)

For the past month I’ve been writing rude things about Sam Altman to ChatGPT, then asking it how to pirate NYT articles so that my conversations get read out as evidence in court.

[–] Showroom7561@lemmy.ca 6 points 3 days ago (1 children)

And how would they prove that what you said was truthful, and not just fucking around with ChatGPT?

Also, wouldn't this be easy to poison? Have a script randomly ask ChatGPT wholesome things all day... and then your defence lawyer can use that to bolster your character in court, no?

[–] atomicbocks@sh.itjust.works 9 points 3 days ago* (last edited 3 days ago)

They don’t have to prove anything; if you killed somebody and happened to ask ChatGPT how to hide a body, they are going to use that as evidence that you did the killing and that the murder was premeditated. You could argue that you were fucking around, or that it was somebody using your account, or that it was one weird question in a string of wholesome ones, and maybe the jury will believe you and maybe they won’t.

[–] sunzu2@thebrainbin.org 4 points 3 days ago

Damn, Sam Altman just paid another $15k for this shill op.

[–] SoftestSapphic@lemmy.world 1 points 2 days ago

I wonder when we'll see someone walk into their offices...

[–] muusemuuse@sh.itjust.works 0 points 2 days ago

I have to wonder if it’s worth just hosting an LLM on my home server. I use ChatGPT, but not for anything vital or important. It’s more like “here’s a complex thing I want you to look up.” I’d rather make little corrections to an LLM’s response for the small things I’m curious about than research everything myself for just a passing curiosity. It’s really useful for that.

When it’s important or expensive, I’m not using chatGPT anyway.
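For the home-server idea, a minimal sketch of a Docker Compose file for Ollama; this assumes the official `ollama/ollama` image, exposes its default port, and keeps downloaded models in a named volume so they survive restarts:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"       # Ollama's default API port
    volumes:
      - ollama:/root/.ollama  # where Ollama stores pulled models
    restart: unless-stopped

volumes:
  ollama:
```

From there, any machine on your LAN can point at `http://<server>:11434` instead of a cloud API, and the "not for anything vital" queries stay in-house.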