this post was submitted on 14 Jun 2025

ChatGPT


Over the last few months, it had become rare for my model to just make things up.

But now it searches for almost every query, even when asked not to, and it still makes up nonsense.

For instance, I asked it about small details in video games and it told me "the music box stops playing when Sarah dies." There is no music box. This is nonsense.

[–] Lembot_0003@lemmy.zip 1 points 2 months ago

Yes, and the system can figure out the correct answer as soon as you point out that the hallucination is wrong. Somehow ChatGPT has become even more unwilling to say "no" or "I don't know" lately.