Like, how does that even happen?
Poorly designed journalism bot, it sounds like. Ethics of not writing this yourself aside, it should be trivial to check that whatever is in quotes is at least a substring of the source text. If the LLM is the top layer, and any research it does is a tool call made purely at its discretion, it's going to end up failing silently like this.
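A minimal sketch of that substring check in Python — the regex quote extraction and the whitespace/case normalization are my own simplifications; real articles would need smarter handling of curly quotes, partial quotes, and spliced quotes:

```python
import re

def unverified_quotes(article: str, source: str) -> list[str]:
    """Return quoted passages in `article` that don't appear in `source`."""
    # Normalize whitespace and case so line breaks or capitalization
    # differences in the source don't produce false alarms.
    def norm(s: str) -> str:
        return " ".join(s.split()).casefold()

    source_norm = norm(source)
    # Grab text between straight or curly double quotes (simple heuristic).
    quotes = re.findall(r'[“"]([^”"]+)[”"]', article)
    return [q for q in quotes if norm(q) not in source_norm]

source = "In the press conference, the mayor said we will fix the bridge by June."
ok = 'The mayor said, "We will fix the bridge by June."'
fabricated = 'She added, "The budget is fully funded."'
print(unverified_quotes(ok, source))          # quote found in source: nothing flagged
print(unverified_quotes(fabricated, source))  # fabricated quote gets flagged
```

Anything this returns non-empty for gets kicked back for human review instead of published. It wouldn't catch a quote lifted from a different source, but it would catch the silent-fabrication failure mode outright.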
