this post was submitted on 01 Nov 2025
492 points (97.5% liked)
Not The Onion
Not that odd. Death by car is easily accepted by society. They are "accidents" and a "necessary evil" for society to function.
Around a million people die from cars every year, and we just shrug and normalize it. Human driver or not, we just have to have cars, and "accidents" are just that.
Nobody cares about cars killing people and animals. So she's probably right.
More so when you take her actual statement in context: that they're actually reducing deaths by being safer. The comments on lemmy are turning out to be just as biased and ungrounded in reality as they were on Reddit.
However, I'm pretty sure that a standard transit system, one not made up of single cars that can only transport one or two people at a time and spy on them, is also much safer.
I agree with you, public transport is the best option. However, let's not let perfect be the enemy of good.
Wow, you think the "company's data" is a trustworthy source? Where are your critical thinking skills?
They released actual data in line with the NHTSA regulations. If the data were falsified, that'd be illegal. Do you have a reason to think otherwise?
https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting
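For what it's worth, any "safer than humans" claim drawn from that data ultimately reduces to a per-mile comparison. A minimal sketch of that calculation (the numbers and figures below are invented for illustration only, not taken from the actual SGO filings):

```python
# Hypothetical example: comparing crash rates per million vehicle miles.
# All figures below are made up for illustration; a real analysis would
# pull them from the NHTSA Standing General Order crash data.

def crashes_per_million_miles(crashes: int, miles: int) -> float:
    """Crash rate normalized per million vehicle miles traveled."""
    return crashes / (miles / 1_000_000)

# Invented numbers, not real data:
av_rate = crashes_per_million_miles(crashes=30, miles=50_000_000)
human_rate = crashes_per_million_miles(crashes=190, miles=100_000_000)

print(av_rate)     # 0.6 crashes per million miles
print(human_rate)  # 1.9 crashes per million miles
```

Note that even this simple ratio depends entirely on who counts the crashes and who reports the miles, which is exactly the trust question being argued here.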
Oh no! It would be illegal!
And what would the punishment be if it turned out they falsified their data? A fine amounting to hundreds of thousands of dollars, set against their tens of millions of dollars in profits?
Yes, they are directly incentivized to either push their data in a biased direction or outright falsify their numbers, in order to facilitate the marketing strategy of these taxis being a "safe" technology, and increase their profit margin.
Fuck... have we learned nothing from the tobacco industry?!
The difference is accountability. If a human kills another human because of a car accident, they are liable, even criminally liable, given the right circumstances. If a driverless car kills another human because of a car accident, you're presented with a lose-lose scenario, depending on the legal implementation:
If the car manufacturer says that somebody must be behind the wheel, even though the car is doing all of the driving, that person is suddenly liable for the accident. They are expected to just sit there and watch for a potential accident, but what the AI model will do is undefined. Is the model going to stop for that pedestrian as expected? How long do they wait before taking back control? It's not like cruise control, a feature that only controls part of the car, where they know exactly how it behaves and when to take over. It's the equivalent of asking a person to watch a panel with a single red light for an hour and push a button as fast as possible when it blinks for half a second.
If the model is truly driverless (like these taxis), then NOBODY is liable for the accident. The company behind it might get sued, or might end up in a class-action lawsuit, but there is no criminal liability, and none of these lawsuits will result in enough financial impact to facilitate change. The companies have no incentive to fix their software, and will continue to parrot this shitty line about how it's somehow better than humans at driving, despite these easily hackable scenarios and zero accountability.
Humans have an incentive to not kill people, since nobody wants to have that on their conscience, and nobody wants to go to prison over it.
Corporations don't. In fact, they have an incentive to kill people over profits, if the choice presents itself!
I think that's overstating it a bit. Of course many people care, and we have people who are responsible for setting safety standards.
Just because accidents are unavoidable doesn't mean we aren't trying to minimize them and avoid fatalities.
Mandatory seat belts are an example of this. Beyond that, there are actual scientific studies into road safety, and even city-wide implementations of their findings. At least in Europe there are, but I'm guessing the USA has them too.
Just because traffic accidents happen, and we obviously need "traffic" to be able to move around, doesn't mean nobody cares.
As an anecdotal example, here (Denmark) the speed limit was increased from 110 to 130 on our equivalent to Autobahn, which may seem like accepting more accidents for convenience or efficiency. But in reality it was to divert more traffic to the safer "Autobahn" to actually reduce the number of accidents on smaller roads.
Traffic safety is as much about psychology as it is about making safer systems.
PS: Regarding animals, here we just have warnings about deer, and some places have small tunnels built for frogs.
And there are warning signs where deer tend to cross in almost any country that has them.
Self-driving cars will have far fewer accidents and deaths than human-driven cars. But being killed by human error is an idea we accept, while the idea of a machine fucking up and killing us is terrifying, even though one self-driving accident can produce algorithm fixes that avoid that same incident on all cars, whereas human error can happen over and over in the same situation.