this post was submitted on 01 Aug 2025
1008 points (99.0% liked)

Technology


A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

So, you admit that the company’s marketing has continued to lie for the past six years?

(page 2) 50 comments
[–] Yavandril@programming.dev 228 points 1 day ago (2 children)

Surprisingly great outcome, and what a spot-on summary from the lead attorney:

"Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans," said Brett Schreiber, lead attorney for the plaintiffs. "Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way. Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives," Schreiber said.

[–] BrianTheeBiscuiteer@lemmy.world 91 points 1 day ago (2 children)

Holding them accountable would be jail time. I'm fine with even putting the salesman in jail for this. Who's gonna sell your vehicles when they know there's a decent chance of them taking the blame for your shitty tech?

[–] AngryRobot@lemmy.world 71 points 1 day ago

Don't you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI...

[–] viking 17 points 1 day ago

You'd have to prove that the salesman said exactly that, and without a record it's at best a he said / she said situation.

I'd be happy to see Musk jailed though, he's definitely touted self-driving as fully functional.

[–] Modern_medicine_isnt@lemmy.world 33 points 1 day ago (16 children)

That's a tough one. Yeah, they sell it as Autopilot, but anyone seeing a steering wheel and pedals should reasonably assume those are there to override the autopilot. Expecting the car to protect him from his own mistake isn't something an autopilot is meant to do. Tesla has done plenty wrong, but this case isn't much of an example of that.

[–] HK65@sopuli.xyz 2 points 15 hours ago

Yeah, the problem is that the US has no real consumer protections, and this court is trying to make up for that, but it shouldn't happen in cases like this, where the driver was clearly not fit to drive a car.

[–] NotMyOldRedditName@lemmy.world 36 points 1 day ago* (last edited 1 day ago) (13 children)

This is gonna get overturned on appeal.

The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

Pressing your foot on the accelerator overrides any braking; the car even tells you it won't brake while you're doing it. That's how it should be: the driver should always be able to override these things in an emergency.

Maybe if he hadn't done that (edit: held the accelerator down), the verdict would stick.
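
To make the override behavior concrete, here's a minimal sketch (purely illustrative, not Tesla's actual logic; the function and parameter names are made up) of driver accelerator input taking precedence over automatic braking, with a warning that the car won't brake:

```python
# Illustrative sketch of "driver accelerator input overrides automatic braking":
# if the driver is pressing the accelerator, the system suppresses its own
# braking and warns that it will not brake. Not real vehicle code.
def resolve_braking(system_wants_to_brake: bool,
                    driver_accelerator_pressed: bool) -> tuple[bool, str | None]:
    """Return (apply_automatic_braking, warning_message)."""
    if driver_accelerator_pressed:
        # Driver input takes precedence, so automatic braking is suppressed.
        return False, "Accelerator pressed: automatic braking will not engage"
    return system_wants_to_brake, None


# Example: the system wants to brake, but the driver's foot is on the pedal.
print(resolve_braking(system_wants_to_brake=True, driver_accelerator_pressed=True))
# -> (False, 'Accelerator pressed: automatic braking will not engage')
```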

[–] crandlecan@mander.xyz 104 points 1 day ago (7 children)

Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

[–] N0t_5ure@lemmy.world 77 points 1 day ago (5 children)

"Some of you will die, but that's a risk I'm willing to take."

[–] iAmTheTot@sh.itjust.works 43 points 1 day ago (2 children)

I mean, that's probably strictly true.

[–] Thorry84@feddit.nl 41 points 1 day ago (12 children)

I don't know, most experimental technologies aren't allowed to be tested in public until they're good and ready. This whole move-fast-and-break-things approach seems like a REALLY bad idea for something like cars on public roads.

[–] BreadstickNinja@lemmy.world 28 points 1 day ago* (last edited 1 day ago) (4 children)

Well, the Obama administration published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did after taking office for his first term. I was working in the AV industry at the time.

That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back and deployed extremely conservative versions of their software. If you look at news articles from that time, there's a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than for running over people.

But not Tesla. While other companies dialed back their ambitions, Tesla was ripping radar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as "Autopilot" and later as "Full Self Driving" - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 automation in the SAE framework, is science fiction at this point: the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limits to operating on certain types of road infrastructure.

Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn't. But it was Trump who initially trashed the safety framework, and Tesla that concealed and misrepresented the limitations of its technology.

[–] Barbarian@sh.itjust.works 7 points 1 day ago* (last edited 1 day ago) (2 children)

You got me interested, so I searched around and found this chart of the SAE driving automation levels:

So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

I'm trying to imagine what other type of geographic difference there might be between 4 and 5 and I'm drawing a blank.

[–] BreadstickNinja@lemmy.world 9 points 23 hours ago* (last edited 23 hours ago)

Yes, that's it. A lot of AV systems depend on high-resolution 3D maps of an area so they can precisely locate themselves in space. They may perform relatively well in that defined space but would not be able to do so outside it.

Level 5 is functionally a human driver. You as a human could be driving off-road, in an environment you've never been in before. Maybe it's raining and muddy. Maybe there are unknown hazards within this novel geography: flooding, fallen trees, etc.

A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it's science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances the same way a human driver would in that scenario, or better. It's really not defined much better than that end goal: because it's not possible with current technology, it doesn't correspond to a specific set of sensors or a specific software system. It's a performance-based, long-term goal.

This is why it's so irresponsible for Tesla to continue to market their system as "Full Self Driving." It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is at least a decade away from such a system.
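
To make the Level 4 vs. Level 5 distinction concrete, here's a minimal sketch (purely illustrative; the class and field names are made up, not from the SAE standard): Level 4 only engages inside a defined operational design domain, while Level 5 has no such restriction.

```python
# Illustrative sketch of the SAE Level 4 vs. Level 5 distinction discussed
# above; class and field names are invented for this example.
from dataclasses import dataclass, field

@dataclass
class OperationalDesignDomain:
    """Conditions under which a Level 4 system is allowed to operate."""
    mapped_regions: set[str] = field(default_factory=set)  # e.g. HD-mapped cities
    road_types: set[str] = field(default_factory=set)      # e.g. {"highway", "urban"}

@dataclass
class AutomatedDrivingSystem:
    sae_level: int                               # 0..5
    odd: OperationalDesignDomain | None = None   # None only makes sense for Level 5

    def can_engage(self, region: str, road_type: str) -> bool:
        if self.sae_level >= 5:
            return True   # Level 5: anywhere a human could drive
        if self.sae_level == 4 and self.odd is not None:
            return (region in self.odd.mapped_regions
                    and road_type in self.odd.road_types)
        return False      # Levels 0-3: a human driver remains responsible

# A geofenced Level 4 system works inside its domain but nowhere else.
level4 = AutomatedDrivingSystem(
    sae_level=4,
    odd=OperationalDesignDomain(mapped_regions={"Phoenix"}, road_types={"urban"}),
)
print(level4.can_engage("Phoenix", "urban"))       # True: inside its ODD
print(level4.can_engage("rural Montana", "dirt"))  # False: outside its ODD
```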

[–] slaacaa@lemmy.world 4 points 20 hours ago* (last edited 20 hours ago)

I think this chart overcomplicates it a bit. Almost a decade ago, I worked on a very short project that touched on this topic. One expert explained to me that the difference between Level 4 and Level 5 is that you don't need a steering wheel or pedals anymore: L5 can drive anywhere, anytime, in all situations.

[–] fluxion@lemmy.world 42 points 1 day ago (1 children)

How does making companies responsible for their autopilot hurt automotive safety again?

[–] CannedYeet@lemmy.world 9 points 1 day ago (5 children)

There's actually a backfire effect here. It could make companies too cautious about rolling out self-driving. The status quo is people driving poorly. If you delay the rollout of self-driving beyond the point where it's better than human drivers, then more people will die.

[–] mrgoosmoos@lemmy.ca 2 points 16 hours ago

It's hard to prove that point, though. Rolling out self-driving may just make car usage go up, so lower crash rates get negated by the increase in overall driving.

[–] phoenixz@lemmy.ca 24 points 1 day ago (1 children)

"Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's"

Good!

"... and the entire industry"

Even better!

[–] boonhet@sopuli.xyz 10 points 23 hours ago (2 children)

Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?

I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.

[–] Auli@lemmy.ca 5 points 16 hours ago (1 children)

The problem is how Musk and Tesla have sold their self-driving and Full Self Driving and whatever name they call the next one.

[–] boonhet@sopuli.xyz 1 points 14 hours ago

Should be a class-action lawsuit by Tesla owners, with damages in the tens of billions rather than millions, tbh. I'm just saying that this particular case can't be seen as Tesla's fault by anyone being objective.

[–] rimu@piefed.social 17 points 21 hours ago (1 children)

On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.
