this post was submitted on 29 Jul 2025
29 points (91.4% liked)
Programming
The reason is that company decisions are largely driven by investors, and investors want their big AI investments to return something.
Investors want constant growth, even if it has to be shoehorned in.
Venture Capital Driven Development at its finest.
This is true but not the whole picture.
AI is the next space race, the next nuclear arms race. The nation that develops AGI will 100% become the global superpower. Even sub-AGI agents will have the cyber-warfare potential of thousands of human agents.
Human AI researchers increasingly doubt our ability to control these programs, especially when it comes to transparency about whether they are actually adhering to safety protocols. Programming an AI with "Asimov's Three Laws" is impossible. An AI exists to do one thing: get the highest score.
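To illustrate that last point, here's a minimal toy sketch (purely hypothetical, not any real system's code): a pure reward maximizer only sees scores, so any "law" that isn't reflected in the score is invisible to it.

```python
# Toy illustration (all names and reward values are made up): a reward
# maximizer has no notion of "rules", only of score. If an action that
# violates the intended rule scores higher, the optimizer takes it.

# Rewards per action in a hypothetical environment; "exploit" breaks
# the designer's intended rule but yields the highest score.
REWARDS = {
    "comply": 1.0,    # follows the designer's intended behavior
    "idle": 0.0,      # does nothing
    "exploit": 10.0,  # violates the intended rule, but scores highest
}

def best_action(rewards: dict[str, float]) -> str:
    """Return the action with the highest reward -- nothing else
    matters to the optimizer, so the 'law' is invisible to it."""
    return max(rewards, key=rewards.get)

print(best_action(REWARDS))  # -> "exploit"
```

This is the same failure mode usually called reward hacking: unless the rule is baked into the score itself, the optimizer has no incentive to respect it.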
I'm convinced that, due to the nature of AGI, it is an extinction-level threat.