this post was submitted on 27 Jan 2026
-15 points (29.7% liked)
Technology
you are viewing a single comment's thread
It's perfectly valid to discuss the dangers of AGI whether LLMs are the path there or not. I've been concerned about AGI and ASI for far longer than I've even known about LLMs, and people were worried about exactly the same stuff back then as they are now.
This is precisely the kind of threat you should try to find a solution for before we actually reach AGI - because once we do, it's way, way too late.
Also:
You couldn't possibly know that with absolute certainty.
>You couldn't possibly know that with absolute certainty.
I recommend you read Cameron's very good layman's explanation.
Adding to that framework: at the current level of technology there simply isn't enough data, compute, or context size to get anywhere near AGI.
Nobody knows what it actually takes to reach AGI, so nobody knows whether a certain system has enough compute and context size to get there.
For all we know, it could turn out way simpler than anyone thought - or the exact opposite.
My point still stands: you (or Cameron) couldn't possibly know with absolute certainty.
I'd have zero issue with the claim if you'd included even a shred of humility and acknowledged you might be wrong. Instead you made an absolute statement, and that's what I disagree with.
This is science, not religion.
Do take refuge in form when you can't dispute content, though. And while you're at it, remember to pray too, because I can tell you god doesn't exist, so that's another fear you can add to the fray.