the_dunk_tank
It's the dunk tank.
This is where you come to post big-brained hot takes by chuds, libs, or even fellow leftists, and tear them to itty-bitty pieces with precision dunkstrikes.
Rule 1: All posts must include links to the subject matter, and no identifying information should be redacted.
Rule 2: If your source is a reactionary website, please use archive.is instead of linking directly.
Rule 3: No sectarianism.
Rule 4: TERFs/SWERFs are not welcome.
Rule 5: No ableism of any kind (that includes stuff like libt*rd)
Rule 6: Do not post fellow hexbears.
Rule 7: Do not individually target other instances' admins or moderators.
Rule 8: The subject of a post cannot be low-hanging fruit, that is, comments or posts made by a private person that have a low number of upvotes/likes/views. Comments/posts made on other instances that are accessible from hexbear are an exception to this. Posts that do not meet this requirement can be posted to !shitreactionariessay@lemmygrad.ml
Rule 9: If you post ironic rage bait, I'm going to make a personal visit to your house to make sure you never make this mistake again.
I think you forgot to include the part where he thinks this needs to be done so that we can, essentially, kill all of the dumb people who would get tricked by a rising superintelligent AI.
There are so many cranks in "AI safety" that it is legitimately difficult to talk about what should be done in a way that isn't very obviously slanted for some industry's benefit. You've got people like this, and you've also got people like Gladstone, who are LITERALLY EX-PENTAGON PEOPLE SPONSORED BY LOCKHEED MARTIN (who I am sure are very concerned about AI safety -- the only way I could be more convinced is if it were Boeing), making suspicious demands that the publication of open-source models be made illegal (probably out of concerns about China, as if half of the papers I read on new developments aren't already from there or from the Noah's Ark lab in Moscow). There is no well here that is unpoisoned.