this post was submitted on 31 Jul 2025
381 points (96.8% liked)
Comic Strips
It is funny watching people claim AGI is just around the corner, so we need to be safe with LLMs...
...when LLMs can't keep track of what's being talked about, and their main risks are covering the internet with slop and propaganda and contributing to climate change. Both of which are more about how we use LLMs.
The risk of LLMs isn't what they might do. They aren't smart enough to find ways to harm us. The risk stems from what stupid people will let them do.
If you put a bunch of nuclear buttons in front of a child, a monkey, a dog, whatever, it can destroy the world. That seems to be where the LLM problem is heading: people are using LLMs to do things they can't actually do, and trusting them because AI has been hyped so heavily for so long.
LLMs are already deleting whole production databases because "stupid" people are convinced they can vibe-code everything.
Even programmers I (used to) respect are getting convinced LLMs are "essential". 😞
They are useful for replacing Stack Overflow searches.
I've not found them useful even for that. I often just get "lied to" about any technical or tricky issue.
They are just text generators. Even the dumbest Stack Overflow answers show more coherence. (Though they are certainly wrong in other ways.)
True, but Stack Overflow frequently lies to me as well.