The Fifth Law

This work is distributed under a CC BY-NC-SA 4.0 License.

Comic Transcript


A: Today we’re going to talk about the Five Laws of Robotics.

B: Three.

A: Five.

B: There are only three laws of robotics.

A: Prove it.

B: … fine. “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”

A: That’s the first. Go on.

B: “A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.”

A: And that’s the second! Three more to go!

B: Cut it out! “A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.”

A: And that’s the third!

B: And there are no more!

A: No, you’re forgetting the Zeroth Law: “A robot may not harm humanity, or, by inaction, allow humanity to come to harm.”

B: The Zeroth Law is arguable, but even if you include it, it’s still only four.

A: The Fifth Law is “Eventually a programming error will turn all robots into merciless killing machines intent on destroying humanity.”

(Silence.)

A: I saw it in a movie once. Twice. A lot.

