this post was submitted on 19 Aug 2023
157 points (99.4% liked)
I’ve spent my entire career working in industrial automation, and I see the value AI and automation bring to the world.
I do not see the value in allowing private companies to playtest autonomous driving with human lives as potential collateral damage.
The argument keeps getting made, "how many humans make that same mistake daily?", but it's a false equivalence; if autonomous vehicles cannot reach 100% safety and accuracy, they should not be allowed to risk human lives.
Don't let perfection be the enemy of good. I'm not suggesting we shouldn't have a really high bar, but 100% is just unreasonable.
It's also the only acceptable level.
Planes are mostly on autopilot these days, and most accidents are actually due to pilot error. Will you never go on a single flight for the rest of your life unless it's somehow 100% (not 99%, not 99.9%, but 100%) safe?
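To make the 99% vs. 100% point concrete, here's a quick sketch of how a per-flight safety rate compounds over many flights. The numbers are purely illustrative, not real aviation statistics, and the independence assumption is a simplification:

```python
def risk_over_flights(per_flight_safety: float, flights: int) -> float:
    """Probability of at least one incident across `flights` flights,
    assuming each flight is independent with the given safety rate.
    Hypothetical model for illustration only."""
    return 1 - per_flight_safety ** flights

# Even a 99.9%-safe flight, repeated 500 times, accumulates real risk:
print(round(risk_over_flights(0.999, 500), 3))  # ~0.394
# Only a literal 100% rate keeps cumulative risk at zero:
print(round(risk_over_flights(1.0, 500), 3))    # 0.0
```

This cuts both ways in the debate: small per-trip risks do compound, but by the same math, demanding exactly 100% is demanding a number no real system (human or machine) achieves.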