Earth is ruled by master-machines, but the Three Laws of Robotics have been designed to ensure humans maintain the upper hand: 1) a robot may not injure a human being or, through inaction, allow a human being to come to harm; 2) a robot must obey orders given to it by human beings, except where such orders would conflict with the First Law; 3) a robot must protect its own existence as long as such protection does not conflict with the First or Second Law. But what happens when a rogue robot's idea of what is good for society contravenes the Three Laws?
Isaac Asimov was the author of nearly five hundred books. He wrote crime fiction and nonfiction as well as SF. He died in 1992, aged 72.