Asimov's Three Laws of Robotics
-kepher: December 14, 2001
Ironically, Asimov wrote these laws as science fiction back in the 1940s, before robots even existed, yet they are already being violated not only in spirit but in practice. Destructive technologies like "smart" cruise missiles (which can reasonably be considered robots) are in direct contradiction with Asimov's laws. The laws are as follows:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the first law.
3. A robot must protect its own existence, as long as this does not conflict with the first two laws.
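The laws form a strict precedence hierarchy: each later law applies only where it does not conflict with the ones before it. A minimal sketch of that ordering in Python, using entirely hypothetical fields (the `Action` class and its attributes are illustrative assumptions, not anything from Asimov):

```python
from dataclasses import dataclass

# Hypothetical description of a candidate action; these fields are
# illustrative assumptions, not part of Asimov's original formulation.
@dataclass
class Action:
    harms_human: bool = False
    allows_harm_by_inaction: bool = False
    ordered_by_human: bool = False
    self_destructive: bool = False

def permitted(action: Action) -> bool:
    # First Law: never injure a human, nor allow harm through inaction.
    if action.harms_human or action.allows_harm_by_inaction:
        return False
    # Second Law: obey human orders, already screened by the First Law.
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, subordinate to the first two laws.
    return not action.self_destructive
```

Note how a cruise missile's mission (an action ordered by humans that harms humans) fails the very first check, which is the contradiction the post points out.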