Asimov’s Three Laws and Paternalism

Isaac Asimov wrote a great many short stories and a number of novels involving humanoid robots.  A common feature of most of these (there were a few exceptions) was his “Three Laws of Robotics”:

  1. A robot may not harm, nor through inaction allow to come to harm, a human.
  2. A robot must obey the orders of a human so long as those orders do not violate the first law.
  3. A robot must act to preserve its own existence provided that action does not violate either of the first two laws.

Most of the stories involved unexpected consequences of those laws or, in some cases, what happens if the laws are modified a bit.  One story (“Runaround”) involved strengthening the third law a bit and weakening the second, causing the robot to get caught in a loop; breaking it free required setting up a situation that invoked the first.

The stories were basically upbeat.  The robots, limited by their laws, were a net positive to humanity.

And most of this relies on the robots being, on the whole, rather dim and not carrying those three laws to their ultimate conclusion.  Yes, some robots were presented as quite intelligent (R. Daneel Olivaw of the original “Robot Novels” was a police detective fully the equal of his human compatriots), but they still were “dim” when it came to carrying out the laws to their fullest.

To show where those laws could lead consider Jack Williamson’s Humanoids as presented in the story “With Folded Hands.” The Humanoids’ Prime Directive was simple: “To serve and obey and guard men from harm.” Parse that and it’s basically the first two of Asimov’s laws of robotics.  And while “To serve and obey” is placed before “guard men from harm” it becomes rapidly clear that the latter takes priority over the former.

The Humanoids offer their services for free.  And they soon become very popular.  And because they are interested in guarding men from harm, they take jobs directing traffic and serve in many other ways.  But soon a darker side becomes apparent.  Oh, they’re not trying to take over humanity to enslave or exterminate us or anything like that.  No.  They want to “protect” us.

Drive?  Oh, no, it’s much too dangerous for a human to drive.  Let me do it for you.  No.  I insist.  I really insist.

The tools in your workshop?  Too dangerous.  You could lose a finger or put out an eye.  No, these are much safer.  You can play with this foam board.  If you need any real furniture or anything like that, we’ll make it for you.  That’s much safer.

Exploring?  Oh, good heavens no.  People get hurt, even killed exploring the unknown.  Just stay here where it’s safe.  I insist.

And so on.  Anything with the least component of risk, and they are oh so sorry, but you simply cannot be allowed to do that.  They need to protect you, don’t you know.  The Humanoids didn’t want to enslave or exterminate humanity.  They wanted to turn us into pampered pets, not allowed the least little bit of challenge or risk.  And so the protagonist accepts his ride home “with folded hands,” for there is nothing left to do.

Jack Williamson wrote this story in the aftermath of World War II.  In interviews he said that it was with atomic weapons in mind, showing how some inventions turn out to be far more dangerous than ever imagined.

Personally, I think it speaks poignantly to the danger of government paternalism.  Rules and restrictions designed to keep people “safe” are not just advice that reasoning adults can weigh to make an informed decision for themselves, but a governmental pat on the head saying, “Now, now.  Daddy knows best.”  Daddy knows best what you should drive.  Daddy knows best what you should eat.  Daddy knows best what you should drink.  Daddy knows best what activities you can engage in.

Oh, it starts “reasonably” enough.  There are some things that are recklessly dangerous not just to the person doing them but to everyone around them.  But it never stops there.  There’s always some new “too dangerous to allow” activity.  And one after that, and another after that.  And there’s no definitive stopping point, particularly once you go past “people will have to use resources to care for you” to “people will be sad” as an excuse for further restrictions.  (I had that one used against me as a reason drugs should remain illegal: people will be unhappy if you are harmed by drugs.)

Jack Williamson gave us the Humanoids, insinuating themselves into society, taking away all choice in the name of “safety.”

I give you the governments of the world.

5 thoughts on “Asimov’s Three Laws and Paternalism”

  1. You actually heard THAT as a reason to keep certain drugs illegal? WTF

    I can think of better reasons (not that I’m willing to argue with you about that subject).

    As for the rest of your post, all I will say is AMEN Brother. 😀


  2. And in one Asimov story, the robots at a facility with radiation sources had to be programmed with a modified version of the First Law, removing the “through inaction” clause. Unmodified robots were making nuisances of themselves because any time a human ran into a radiation field to do his job, a robot would grab him and pull him out. Sure, there was a margin of safety before the exposure became harmful, but the human might forget.


  3. This has become an increasing problem with parents already, especially working parents who feel guilty they’re not spending more time with their kids. So everything has to be scheduled, and monitored, and the kids must be constantly protected.

    No being outside alone. If a kid is, a “helpful neighbor” is likely to call the police — and you, the parent, can be arrested for criminal child neglect.

    There’s a growing “Free Range Childhood” movement in reaction to the overprotection.

    I think many of the university snowflakes needed a lot more free range than they got — and now it’s too late for their optimal child development.

