Isaac Asimov's "Three Laws of Robotics"
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Just something interesting: with these three laws you can come up with some curious scenarios. One example:
Person A wants to kill Person B, and a robot happens to be nearby. By the First Law the robot must protect B, and it must do so without injuring A. If B then orders the robot to kill A, the robot refuses: that order conflicts with the First Law, so the Second Law doesn't compel obedience. And the Third Law tells the robot to protect its own existence, but only so far as that doesn't conflict with the higher laws, so while shielding B it may still have to put itself in harm's way.
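Since the three laws are really a priority-ordered rule set, the scenario above can be sketched as code. This is just a toy: the `Action` fields and the `permitted` function are my own invention, not anything from Asimov, and a real "harm" judgment is of course far harder than a boolean flag.

```python
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool           # would this action injure a human?
    inaction_harms_human: bool  # would refusing it let a human come to harm?
    ordered_by_human: bool      # was it commanded by a human?
    endangers_robot: bool       # does it risk the robot's own existence?

def permitted(action: Action) -> bool:
    # First Law: never harm a human...
    if action.harms_human:
        return False
    # ...and inaction that allows harm is also forbidden, overriding the lower laws.
    if action.inaction_harms_human:
        return True
    # Second Law: obey human orders (First Law conflicts already ruled out above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation applies only when the higher laws are silent.
    return not action.endangers_robot

# The scenario above: B orders the robot to kill A.
kill_A = Action(harms_human=True, inaction_harms_human=False,
                ordered_by_human=True, endangers_robot=False)
print(permitted(kill_A))  # False — the order conflicts with the First Law
```

Note how the checks run strictly top-down: each law only gets a say once every higher law has declined to decide.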
Well, a lousy illustration... go find some scenarios yourself. Game of Life coming up next :)