I disagree.
Machines are made by humans, and can be built responsibly.
But then the one to be held accountable is the human who made it or the human who used it.
When a plane crashes, it isn't the plane's fault either.
Yeah, but even machines that are "built responsibly" (whatever that means) can make mistakes. Correction: they will make mistakes, because decision making isn't linear, and afaik computers are only good at linear tasks, such as calculations. And once they do, who should be held accountable? The AI's creator? The person or company who accepted whatever decision the AI made? Or nobody? When people are deciding, it's a bit easier to know who to blame. But how do you do that when the decision is made by an algorithm?
And sure, maybe AIs can help in decision making, but shouldn't decisions be made by people in the end?
It depends on what decision you're looking at.