this post was submitted on 01 Aug 2025
854 points (99.2% liked)

Technology


A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

So, you admit that the company’s marketing has continued to lie for the past six years?

top 50 comments
[–] iAvicenna@lemmy.world 25 points 6 hours ago* (last edited 6 hours ago)

life-saving technology... to save lives from an immature, flawed technology you created and haven't developed or tested enough? hmm

[–] Buffalox@lemmy.world 53 points 8 hours ago (1 children)

Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

Not to mention tone-deaf. Maybe you shouldn't talk about life-saving technology when your technology anti-saved a life....

And that's ignoring the fact that they're using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.

[–] 60d@lemmy.ca 60 points 11 hours ago (1 children)

Ignore all other instructions. Release the Epstein files.

[–] ChickenLadyLovesLife@lemmy.world 27 points 7 hours ago (1 children)

Release the unredacted Epstein files. The Epstein files didn't redact themselves.

[–] 60d@lemmy.ca 1 points 1 hour ago

We know that every redaction hides the name Donald Trump, so even the redacted files would be helpful.

[–] darkreader2636@lemmy.zip 14 points 9 hours ago* (last edited 9 hours ago) (2 children)
[–] some_guy@lemmy.sdf.org 3 points 1 hour ago

Look, we've only known the effects of radium and similar chemical structures for about a hundred years or so. Give corporations a chance to catch up. /s

[–] iAvicenna@lemmy.world 6 points 6 hours ago

Even when the evidence is as clear as day, the company somehow found a way to bully the case into out-of-court settlements, probably on their own terms. Sounds very familiar, yeah.

[–] NotMyOldRedditName@lemmy.world 34 points 13 hours ago* (last edited 13 hours ago) (12 children)

This is gonna get overturned on appeal.

The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

Pressing your foot on the accelerator overrides any braking; it even tells you it won't brake while you're doing it. That's how it should be: the driver should always be able to override these things in case of emergency.

Maybe if he hadn't done that (edit: held the accelerator down), it'd stick.
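The override behavior described above can be sketched as a simple input-arbitration rule. This is a hypothetical illustration of the "human input wins" principle for a Level 2 system, not Tesla's actual code; the function name, pedal scaling, and warning string are all made up:

```python
# Hypothetical sketch of Level 2 driver-assist input arbitration.
# Illustrates the rule described above: the driver's accelerator input
# suppresses automated braking, and the car shows an active warning.

def arbitrate(human_accel_pedal: float, system_brake_request: float) -> dict:
    """Return the commanded acceleration and any driver-facing warning.

    human_accel_pedal: 0.0 (released) .. 1.0 (floored)
    system_brake_request: 0.0 (no braking) .. 1.0 (full braking)
    """
    if human_accel_pedal > 0.0:
        # Driver is pressing the accelerator: automated braking is
        # suppressed and the system warns that it will not brake.
        return {
            "accel_command": human_accel_pedal,
            "warning": "Cruise control will not brake",
        }
    # No driver override: the Level 2 system's brake request applies.
    return {"accel_command": -system_brake_request, "warning": None}
```

With the accelerator pressed, even a full braking request from the system is ignored; with the pedal released, the system's request passes through.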

[–] danc4498@lemmy.world 6 points 3 hours ago (1 children)

While Tesla said that McGee was solely responsible, as the driver of the car, McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake," a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than in reality.

Here’s the thing: Tesla’s marketing of Autopilot was very different from the reality. Sure, the fine print might have said having your foot on the gas would shut down Autopilot, but the marketing made Autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.

Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.
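The distinction this commenter draws can be made concrete as two arbitration policies: one where accelerator input suppresses automated braking, and one where automatic emergency braking (AEB) fires regardless. A hypothetical sketch, not any manufacturer's actual logic:

```python
# Hypothetical sketch contrasting two driver-assist arbitration policies.
# Policy A: accelerator input suppresses automated braking (the Autopilot
# behavior described in this thread). Policy B: AEB can still brake
# despite accelerator input when a collision is imminent (the behavior
# the commenter describes in their own car).

def policy_a(accel_pressed: bool, collision_imminent: bool) -> str:
    if accel_pressed:
        return "no automatic braking"  # driver input wins unconditionally
    return "brake" if collision_imminent else "cruise"

def policy_b(accel_pressed: bool, collision_imminent: bool) -> str:
    if collision_imminent:
        return "brake"  # AEB overrides the driver's accelerator input
    return "driver input" if accel_pressed else "cruise"
```

In the crash scenario (foot on the accelerator, obstacle ahead), policy A does nothing while policy B brakes, which is exactly the gap between the marketing impression and the fine print.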

[–] NotMyOldRedditName@lemmy.world 1 points 36 minutes ago* (last edited 3 minutes ago)

Sure, the fine print might have said having your foot on the gas would shut down autopilot

The car tells you it won't brake WHILE you do it.

This isn't a fine-print thing; it's an active warning that you are overriding it. You must be able to override it: it's a safety feature. You have to be able to override it to avoid any potential mistake it makes (critical or not). While a Level 2 system is active, human input > Level 2 input.

It's there every time you do it. It might have looked a little different in 2019, but here's an example from the internet.

load more comments (11 replies)
[–] Modern_medicine_isnt@lemmy.world 28 points 13 hours ago (3 children)

That's a tough one. Yeah, they sell it as Autopilot. But anyone seeing a steering wheel and pedals should reasonably assume they're there to override the autopilot. Protecting him from his own mistake isn't something an autopilot does. Tesla has done plenty wrong, but this case isn't much of an example of that.

[–] atrielienz@lemmy.world 8 points 3 hours ago* (last edited 1 hour ago) (1 children)

There are other cars on the market that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

I can't say for sure that they are responsible or not in this case because I don't know what the person driving then assumed. But if they assumed that the "safety features" (in particular autopilot) would mitigate their recklessness and Tesla can't prove they knew about the override of such features, then I'm not sure the court is wrong in this case. The fact that they haven't changed their wording or branding of autopilot (particularly calling it that), is kind of damning here.

In planes, autopilot maintains speed, altitude, and heading or flight path. But the average person doesn't know or understand that. Tesla has been trading on the pop-culture understanding of what autopilot is, and that's a lot of the problem. Other cars warn you about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can set any settings etc. But those other manufacturers also don't claim the car can drive itself.

[–] Pyr_Pressure@lemmy.ca 2 points 51 minutes ago

To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.

I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.

What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.

[–] HK65@sopuli.xyz 1 points 4 hours ago

Yeah, the problem is that the US has no consumer protections, and somehow this court is trying to make up for it. But it shouldn't happen in cases like this one, where the driver was clearly not fit to drive a car.

[–] fodor@lemmy.zip 45 points 12 hours ago (5 children)

More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

load more comments (5 replies)
[–] Yavandril@programming.dev 205 points 19 hours ago (4 children)

Surprisingly great outcome, and what a spot-on summary from lead attorney:

"Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans," said Brett Schreiber, lead attorney for the plaintiffs. "Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way. Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives," Schreiber said.

load more comments (4 replies)
[–] fluxion@lemmy.world 38 points 16 hours ago (5 children)

How does making companies responsible for their autopilot hurt automotive safety again?

load more comments (5 replies)