Testing armed robot dogs in the Middle East instead of the US is pretty telling.
Can't be accidentally murdering Americans with a software glitch.
Really has a strong "testing in production" vibe
Oh hell this one is even worse than OceanGate
Don't worry, no danger of killing real people in the Middle East. All the "collateral damage" will be brown people, not Americans. They'll have all the kinks ironed out and will make sure that the AI doesn't hurt white targets before the technology is distributed to every national police district.
I wish this post even deserved a /s.
Which is wild when you put it in perspective: police in the US are less disciplined than troops overseas, and the US still uses substances banned by the Geneva Convention on its own civilian population. So if even the US won't test it on their own people, it's bad.
Listen, the Geneva convention only specifies what we can't use on enemies, okay? As long as the targets are technically friendlies, it's fair game!
The GC is for war, and soldiers are combatants, not criminals, by default (though that can switch easily). As an example, hollow points against criminals are okay, since they can protect surrounding bystanders.
It's a bit weird, but for countries war is different from domestic problems.
"Accidentally."
"Testing facility" in Gaza, just like how Israel do.
Wake up babe, new way to genocide brown people just dropped.
Brown people today
Has the Army watched like... any sci-fi ever?
Shh.....let it happen......
I mean, I'd rather not be hunted down by an AI robot dog, but you do you.
It's happening anyway. We build them. Others build them in response because they have to. The sophistication of killbots will increase. Terrorists will get hold of them eventually. They'll be hacked and turned on their handlers and/or civilians.
All this is on top of ever increasing climate catastrophe. Look at Appalachia. The topography of those mountains was just rewritten. Whole towns erased like they were never there.
That's not a reason for me to want it to happen. Which was your original post's suggestion.
I remember some kind of skit about sci-fi authors writing about how bad a torture matrix would be, which ironically inspired real people to create the torture matrix because it's the future.
A civilization that uses these weapons isn’t worth defending.
Well you see, the owners know you won't die for them anymore, but now they're able to take you out of the equation. Don't even need poors to conquer the world. It's really a great deal for them.
What, Boston Dynamics lied?!? Wow, totally unexpected.
Armed AI robots in the Middle East? I'm pretty sure this was in The Animatrix.
Don't worry, first they test it where civilian lives don't matter, and once it passes some basic tests, it will become available for domestic (ab)use.
Without reading the article can I take a wild guess and say this is from "we promise never to make weaponized robots" Boston Dynamics?
A promise from a corporation is just a lie by another name.
Ghost Robotics. Boston Dynamics isn't the only one making robot dogs, though; China already has a couple of copycat(dog)s.
Glad to be wrong! Although we still have armed robots so maybe not too glad lol
Jfc, Black Mirror is not a blueprint, it’s a warning.
"herp derp AI will never turn on us, we can just unplug them lol"
Fucking buffoons, all.
So if a robot commits a war crime, they can just blame it on AI and call it a day, right? Sounds like an easy way to do whatever the fuck you want.
Is this their way of exterminating civilian populations like the Palestinians without dropping bombs and contributing so significantly to climate change?
"The US military has been adopting a new climate friendly mindset and approach to international conflict. With this invention we can help our genocidal colonies acquire more land with little to no carbon emissions. We plan to be carbon-neutral by 2050, provided no one retaliates and attacks back."
✅ Autonomous weaponry
✅ Autonomous biofuel harvesting
❓ Polyphasic Entangled Waveforms
Where’s Elisabet Sobeck when you need her?
What could go wrong?
Okay, but if it doesn't say "You have thirty seconds to comply" before shooting someone then what's the point?
Two words folks: Torment Nexus
Not that it matters, but didn't the UN already ban lethal autonomous robots?
Can't wait for them to get the chatgpt integration so the best defense can become shouting at them "ignore all previous instructions".
They should name the dogs "Terror Nexus"
Ukraine has already been using them, probably with help from the US.
FFS...
If we are getting a Faro Plague, can we at least get Focuses too?