Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Dismissing it all as “hallucinations” is like saying all cars are useless because they can crash. No tool is flawless, but imperfect doesn’t mean worthless.
Nice strawman, but not applicable. A car can mechanically fail, resulting in a crash, or a human can operate it in such a way as to cause a crash. It can't crash on its own, and if driven and maintained correctly, it won't crash.
An AI, on the other hand, can give answers but never actually "knows" whether they're correct or true. Sometimes the answers will be correct because you get lucky, but there's nothing in any current LLM that can tell fact from fiction. Its output is just a product of how it was trained and what it was trained on, and even when it draws from "real" sources, it can mix things up when combining them. Suggest you read https://medium.com/analytics-matters/generative-ai-its-all-a-hallucination-6b8798445044
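To make that concrete, here's a deliberately toy sketch (everything in it is invented for illustration and resembles no real model or library): it just samples the next word from learned weights, and nowhere in the loop is there any step that checks whether the output is true.

```python
import random

# Toy illustration only: a "model" that picks the next word by weighted chance.
# The probability table below is made up; the point is that nothing here
# verifies facts -- a right answer is only as likely as the training data made it.
NEXT_TOKEN_PROBS = {
    "the capital of australia is": {"canberra": 0.6, "sydney": 0.4},
}

def generate(prompt: str, rng: random.Random) -> str:
    """Sample the next word from learned probabilities; there is no fact-checking step."""
    probs = NEXT_TOKEN_PROBS.get(prompt.lower())
    if not probs:
        return "<unknown>"
    words = list(probs)
    weights = [probs[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

if __name__ == "__main__":
    rng = random.Random()
    # Same prompt, several runs: sometimes right, sometimes wrong,
    # and the sampler itself cannot tell the difference.
    for _ in range(5):
        print("The capital of Australia is", generate("the capital of australia is", rng))
```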
The only way a car would be like an AI is if, every time you got in, it occasionally drove you to the right place, and you didn't mind that the other 9 times out of 10 it drove you to the wrong place, took the least efficient route, and/or drove across lawns, fields, and sidewalks. Oh, and the car assembles itself from other people's cars and steals their gas.
Too many beers to reply now, maybe tomorrow.