Laws of Robotics

Tagged: Theme

The three Laws of Robotics which govern the behaviour of Isaac Asimov's fictional Positronic Robots (and various other Robots and AIs in sf by other hands) were formally stated by Asimov in his story "Runaround" (March 1942 Astounding):

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The First Law, alone, is introduced in his earlier story "Liar!" (May 1941 Astounding). Asimov credited John W Campbell Jr with the formulation of all three laws in a December 1940 conversation; Campbell, however, felt that the laws were already implicit in the early Asimov Robot stories beginning with "Strange Playfellow" (September 1940 Super Science Stories; vt "Robbie" in I, Robot, coll 1950). Many ensuing puzzle-plots are developed from quibbles with definitions of terms – what is "human"? – and conflicts between modified versions of the laws. For example, a valuable robot in "Runaround" has a strengthened Third Law of self-preservation that places it in a dilemma of clashing priorities. Decades later Asimov added the Zeroth Law, preceding and taking precedence over the well-known Three Laws:

0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

This had previously been implied in Asimov's "The Evitable Conflict" (June 1950 Astounding), involving vast immobile AIs which technically violate the First Law (though only for very small values of "harm") to advance the greater good of humanity. Asimov ultimately stated the Zeroth Law explicitly, as above, in Robots and Empire (1985).
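Taken together, the four laws form a strict precedence hierarchy, each yielding to those above it. A minimal sketch of that ordering (purely illustrative; the predicate names are invented here, and it deliberately ignores the subtler "through inaction" clauses and the modified-law variants the stories exploit) might run:

```python
# Illustrative sketch only: Asimov's Laws modelled as a strict priority
# ordering, Zeroth through Third. An action is acceptable only if no
# law forbids it; otherwise the highest-priority violated law governs.

from dataclasses import dataclass

@dataclass
class Action:
    # Hypothetical flags standing in for the laws' conditions.
    harms_humanity: bool = False   # Zeroth Law
    harms_human: bool = False      # First Law
    disobeys_order: bool = False   # Second Law
    endangers_self: bool = False   # Third Law

# Laws in precedence order: (name, predicate true if the action violates it).
LAWS = [
    ("Zeroth", lambda a: a.harms_humanity),
    ("First",  lambda a: a.harms_human),
    ("Second", lambda a: a.disobeys_order),
    ("Third",  lambda a: a.endangers_self),
]

def first_violation(action):
    """Return the highest-priority law the action violates, or None."""
    for name, violates in LAWS:
        if violates(action):
            return name
    return None
```

On this toy model, a robot ordered into danger violates nothing higher than the Third Law, so the order stands; Asimov's "Runaround" dilemma arises precisely because its robot's strengthened Third Law refuses to yield so cleanly.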

John Sladek, who Parodied Asimovian three-laws quibbles in "Broot Force" (September 1972 F&SF), introduces a genially murderous robot whose "asimov circuits" have failed as the title character of Tik-Tok (1983). One of the protagonists of Repo Man (1984) affirms his mission – that of repossessing cars without damaging them – through a parody of the First Law. Roger MacBride Allen's authorized Sequels by Other Hands, beginning with Isaac Asimov's Caliban (1993), feature traditional Three Laws robots, plus a faction of New Law robots which – as in Asimov's own "Little Lost Robot" (March 1947 Astounding) – lack the "through inaction ..." clause of the First Law, and finally the experimental no-laws robot Caliban, who has complete free will and develops his own sense of ethics. Further legal proliferation is implied by the mechanical barrack-room lawyer in Terry Pratchett's The Dark Side of the Sun (1976):

"Can't disassemble a robot for obeying orders: Eleventh Law of Robotics, Clause C, As Amended ..."

The RoboCop-like Golems of Pratchett's Discworld have a highly plausible variant First Law, as indicated when the con-man hero of Going Postal (2004) protests:

"Wait! Wait! There's a rule! A golem mustn't harm a human being or allow a being to come to harm!"


"... Unless Ordered To Do So By Duly Constituted Authority," said the golem.

Many other additional laws have been suggested, including: "A robot may not fall in love", proposed by Tong Enzheng (whom see); "A robot must reproduce as long as such reproduction does not interfere with the First or Second or Third Law", from Harry Harrison's "The Fourth Law of Robotics" (in Foundation's Friends: Stories in Honor of Isaac Asimov, anth 1989; exp 1997, ed Martin H Greenberg); and "A robot must know it is a robot", as in The Fifth Law of Robotics (1983) by Nikola Kesarovski of Bulgaria.

Cory Doctorow, in "I, Row-Boat" (Fall 2006 Flurb), proposes an "Asimovist" Religion or ethos whose adherents – not only robots and AIs but Uplifted and Uploaded beings – freely choose to live by the Three Laws. [DRL]

see also: Prime Directive.

