• Echo Dot@feddit.uk · 2 days ago

    Technically the laws of robotics already have that.

    Law 2: a robot must obey any order given to it by a human, as long as such an order does not conflict with the first law.

    Of course that’s little help, because the laws of robotics are intentionally designed not to work.

    • Evil_incarnate@lemm.ee · 2 days ago

      Wouldn’t be much of a short story if they did.

      I liked the one where the robot could sense people’s emotional pain, and went crazy when it had to deliver bad news.

      • Nikelui@lemmy.world · 2 days ago

        Yup, and later Asimov expanded this short story into a saga that led to the birth of Law Zero:

        A robot may not harm humanity, or, by inaction, allow humanity to come to harm.