[–] Trouble@lemmy.blahaj.zone 9 points 4 days ago (23 children)

The technology is NOT DOING WHAT IT'S MEANT TO DO - it is IDENTIFYING DAMAGE WHERE THERE IS NONE - the TECHNOLOGY is NOT working as it should

[–] papertowels@mander.xyz 0 points 4 days ago* (last edited 4 days ago) (9 children)

Do you hold everything to such a standard?

Stop lights are meant to direct traffic. If someone runs a red light, is the technology not working as it should?

The technology here, using computer vision to automatically flag potential damage, needed to be implemented alongside human supervision - an employee should be able to walk by the car, see that the flagged damage doesn't actually exist, and override the algorithm.
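Something like this is all I mean - a rough sketch, where the model, the flag object, and the review step are all made up for illustration, not anything Hertz actually runs:

```python
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    CONFIRMED = "confirmed"      # employee agrees the damage is real
    OVERRIDDEN = "overridden"    # employee says the flag is a false positive


@dataclass
class DamageFlag:
    rental_id: str
    image_path: str
    confidence: float            # model's confidence that damage is present


def should_charge(flag: DamageFlag, employee_verdict: Verdict) -> bool:
    """The model can raise a flag, but a person who actually walked by the
    car has the final say. A false positive dies here instead of becoming
    a bill."""
    return employee_verdict is Verdict.CONFIRMED


# The scanner flags "damage" that isn't there; the employee overrides it.
flag = DamageFlag(rental_id="R-1001", image_path="scan_042.jpg", confidence=0.81)
print(should_charge(flag, Verdict.OVERRIDDEN))   # False -> no charge
```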

The technology itself isn't bad; it's how Hertz is using it that is.

I believe the unfortunate miscommunication here is that when @Ulrich@feddit.org said the solution was brilliant, they were referring to the technology as the "solution", while others are referring to the implementation as a whole as the "solution".

[–] Clent@lemmy.dbzer0.com 2 points 4 days ago (2 children)

For the stop light analogy to hold, the stop light itself would have to be doing something wrong, not the human element.

There is no human element to this implementation; it is the technology itself malfunctioning. There was no damage, but the system thinks there is.

[–] papertowels@mander.xyz 0 points 4 days ago* (last edited 4 days ago)

There is no human element to this implementation; it is the technology itself malfunctioning. There was no damage, but the system thinks there is.

Let's make sure we're building up from the same foundation. My assumptions are:

  1. Algorithms will make mistakes.
  2. There's an acceptable level of error for all algorithms.
  3. If an algorithm is making too many mistakes, that can be mitigated with human supervision and overrides.

Let me know if you disagree with any of these assumptions.

In this case, the lack of human override discussed in assumption 3 is, itself, a human-made decision that I am claiming is an error in implementing this technology. That is the human element. As management, you can either go on a snipe hunt trying to find an algorithm that is perfect, or you can make sure that trained employees can verify and correct the algorithm when needed. Instead, Hertz management chose a third option: run an imperfect algorithm with absolutely zero employee oversight. THAT is where they fucked up. THAT is where the human element screwed a potentially useful technology.
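To put some rough numbers on it (all of these are made-up assumptions, not Hertz's actual error rates), here's a quick back-of-the-envelope comparison of the two approaches:

```python
import random

random.seed(0)

N_RENTALS = 10_000
FALSE_POSITIVE_RATE = 0.02   # assumed: model wrongly flags 2% of undamaged cars
REVIEW_CATCH_RATE = 0.95     # assumed: employees catch 95% of bogus flags they review

# Bogus flags raised on cars with no damage at all
bogus_flags = sum(random.random() < FALSE_POSITIVE_RATE for _ in range(N_RENTALS))

# Option A: no human in the loop -- every flag becomes a charge
wrongly_charged_no_review = bogus_flags

# Option B: every flag goes past an employee before anyone is billed
wrongly_charged_with_review = sum(
    random.random() > REVIEW_CATCH_RATE for _ in range(bogus_flags)
)

print(f"bogus flags raised:           {bogus_flags}")
print(f"wrong charges, no oversight:  {wrongly_charged_no_review}")
print(f"wrong charges, with review:   {wrongly_charged_with_review}")
```

Same imperfect algorithm in both cases; the difference in wrongly charged customers comes entirely from whether a human gets to override it.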

I work with machine learning algorithms. You will not, ever, find a practical machine learning algorithm that gets something right 100% of the time and is never wrong. But we don't say "the technology is malfunctioning" every time it gets something wrong; otherwise there's a ton of invisible technology we all rely on in our day-to-day lives that is "malfunctioning".
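For what it's worth, this is roughly how that error gets measured in practice - a toy example with fake labels, just to show that the false positive rate is something you measure and manage, not something you ever get to zero:

```python
# Toy evaluation: compare model predictions against ground-truth labels
# to measure the false positive rate -- it will basically never be 0.
ground_truth = [0, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # 1 = car actually damaged
predictions  = [0, 1, 0, 1, 0, 1, 0, 1, 1, 0]   # 1 = model flagged damage

false_positives = sum(p == 1 and t == 0 for p, t in zip(predictions, ground_truth))
true_negatives  = sum(p == 0 and t == 0 for p, t in zip(predictions, ground_truth))

fpr = false_positives / (false_positives + true_negatives)
print(f"false positive rate: {fpr:.2f}")   # 2 bogus flags out of 7 clean cars ~= 0.29
```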
