That's a tough one. Yeah, they sell it as Autopilot, but anyone who sees a steering wheel and pedals should reasonably assume they're there to override the autopilot. Expecting the car to protect him from his own mistake isn't something an autopilot does. Tesla has done plenty wrong, but this case isn't much of an example of that.
There are other cars on the market with technology that will literally override your input if they detect that a crash is imminent. Even those cars don't claim to have an autopilot, and Tesla still hasn't changed its branding or wording, which is a lot of the problem here.
I can't say for sure whether they're responsible in this case, because I don't know what the driver assumed. But if they assumed the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove they knew their own input would override those features, then I'm not sure the court is wrong here. The fact that Tesla hasn't changed the wording or branding of Autopilot (particularly calling it that) is kind of damning.
In planes, autopilot maintains speed, altitude, and heading or flight path. But the average person doesn't know or understand that. Tesla has been trading on the pop-culture understanding of what autopilot is, and that's a lot of the problem. Other cars show warnings about what their "assisted driving" systems actually do, and those warnings pop up every time you engage them, before you can change any settings. But those other manufacturers also don't claim the car can drive itself.
Just a small correction: traditional cruise control in cars only maintains speed, whereas autopilot in planes does maintain speed, altitude, and heading, which is exactly why Tesla calling its system "Autopilot" is such dangerous marketing that creates unrealistic expectations for drivers.
I'm not sure what you're correcting. Tesla's Autopilot feature includes adaptive cruise control, lane keeping assist, and auto steering.
Adaptive cruise control will brake to keep a following distance from the vehicle ahead but otherwise maintain the set speed; lane keeping assist keeps the vehicle in its lane and prevents it from drifting, and combined with auto steering it keeps the car centered in the lane.
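For anyone curious, the core decision is simple enough to sketch. Here's a minimal illustration in Python; all names, units, and thresholds are made up for the example, not anything from Tesla's actual software:

```python
from typing import Optional

# Minimal sketch of adaptive cruise control's decision logic.
# Everything here (function name, 2-second gap, units) is an
# illustrative assumption, not any manufacturer's implementation.

def acc_target_speed(set_speed: float,
                     own_speed: float,
                     lead_distance: Optional[float],
                     time_gap: float = 2.0) -> float:
    """Return the speed the car should aim for, in m/s.

    set_speed     -- driver-selected cruise speed (m/s)
    own_speed     -- current speed (m/s)
    lead_distance -- distance to the vehicle ahead (m), None if the lane is clear
    time_gap      -- desired following gap (seconds)
    """
    if lead_distance is None:
        # No vehicle ahead: just hold the driver's set speed.
        return set_speed

    # Distance we'd cover in `time_gap` seconds at our current speed.
    safe_distance = own_speed * time_gap

    if lead_distance < safe_distance:
        # Too close: scale the target speed down in proportion to how far
        # inside the safe gap we are (never below a full stop).
        return max(0.0, own_speed * lead_distance / safe_distance)

    # Enough room: resume toward the set speed.
    return set_speed
```

Note there's nothing in that loop about obstacles to the side, weather, or anything else: it tracks a gap and a set speed, and that's roughly the entire job.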
I specifically explained that a plane's autopilot does those things (maintains speed, altitude, and heading) and that people don't know that's all it does. It doesn't avoid obstacles or account for weather on its own; it would fly right into another plane occupying that airspace. It won't react to weather events like wind shear (which can cause a plane to lose altitude extremely quickly) or a hurricane. If an engine loses power, it won't attempt a restart. It doesn't brake. It can't land a plane.
But Musk has made claims that Tesla's Autopilot would drive the vehicle for you without human intervention, and people assume that autopilot (in the pop-culture sense) does a lot more than it actually does. That's what I'm trying to point out.