[–] latenightnoir@lemmy.blahaj.zone 50 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Uuh... skipping over the fact that this is a pointless article, didn't Asimov himself write the Three Laws specifically to show that it's a very stupid idea to think a human could cover all possible contingencies with three smart-sounding phrases?

[–] Obi@sopuli.xyz 14 points 2 weeks ago (2 children)

Most of the stories are about how the laws don't work and how to circumvent them, yes.

[–] SonOfAntenora@lemmy.world 7 points 2 weeks ago (1 children)

Most people have never read Asimov, and it shows.

[–] ziggurat@lemmy.world 5 points 2 weeks ago (1 children)

I don't think reading Asimov would help most people. I think most people just won't get the point of anything unless you spell it out for them.

[–] SoleInvictus@lemmy.blahaj.zone 2 points 2 weeks ago

Critical thinking is indeed dead for much of the population.

[–] CheeseNoodle@lemmy.world 1 point 2 weeks ago

Some of the stories do also include solutions to those same issues, though those solutions tend to limit the capabilities of the robots. The message could be read as there being a trade-off between versatility and risk.

[–] Zwuzelmaus@feddit.org 10 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Asimov had quite a different idea.

What if robots become like humans someday?

That was his general topic throughout many of his stories. The Three Laws were quite similar to the former slavery laws of the USA. With this analogy he explored the question: if robots are nearly like humans, or even indistinguishable from humans, would/should they still remain our servants?

[–] latenightnoir@lemmy.blahaj.zone 1 point 2 weeks ago* (last edited 2 weeks ago)

Yepyep, agreed! I was referring strictly to the Three Laws as a cautionary element.

Otherwise, I, too, think the point was to show that the only viable way to approach an equivalent or superior consciousness is as at least an equal, not as an inferior.

And it makes a lot of sense. There's not much stopping a person from doing heinous stuff if a body of laws is the only thing holding them back. I think socialisation plays a much more relevant role in the development of a conscience, of a moral compass, because empathy (edit: and by this I don't mean just the emotional bit of empathy, I mean everything which can be considered empathy, be it emotional, rational, or anything in between and around) is a significantly stronger motivator for avoiding doing harm than "because that's the law."

It's basic child rearing, as I see it: if children aren't socialised, there's a much higher chance they won't understand why doing something would harm another; they won't see the actual consequences of their actions on the subject. And if they don't understand that the subject of their actions is a being just like them, with an internal life and feelings, then they won't have a strong enough reason not to treat the subject as a piece of furniture, a tool, or any other object around them.

Edit: to clarify, the distinction I made between equivalent and superior consciousness wasn't about how smart one or the other is; I was referring to the complexity of said consciousness. For instance, I'd perceive anything which reacts to the world around it in a deliberate manner as somewhat equivalent to me (dogs, say), whereas something which takes in all of the factors mine does, plus some others, would be superior in terms of complexity. I genuinely don't even know what example to offer here, because I can't picture it, which I think underlines why I'd call such a consciousness superior.

I will say, I would now rephrase it as "superior/different" in retrospect.

[–] vrighter@discuss.tchncs.de 3 points 2 weeks ago (1 children)

Exactly. But what if there were more than just three (the infamous "guardrails")?

[–] latenightnoir@lemmy.blahaj.zone 2 points 2 weeks ago* (last edited 2 weeks ago)

I genuinely think it's impossible. I think this would land us in RoboCop 2, where they start overloading Murphy's system with thousands of directives (granted, not with the purpose of generating the perfect set of Laws for him) and he just ends up acting like a generic pull-string action figure, becoming "useless" as a conscious being.

Most certainly impossible when attempted by humans, because we're barely even competent enough to guide ourselves, let alone something else.