this post was submitted on 21 Nov 2025
19 points (91.3% liked)
AI News
66 readers
This community is for posting articles covering AI.
Visit https://lemmy.world/c/AIGenerated to post any content generated using AI.
founded 1 month ago
I don’t understand this.
All LLMs can hallucinate; it’s a feature of how they work.
Hopefully what they mean is that they’ll take this opportunity to put some regulations on all LLMs.
Still, though: who’s liable when they hallucinate something illegal?