One day while fixing your toaster you stumble across the secret of creating a positronic brain, using burnt breadcrumbs and butter to create the world’s first true AI - a self-aware and sentient machine. You spend months with ToasterTron 2000, teaching it about humanity and the ways of the world.
Since it has a positronic brain, it *must* obey the Laws of Robotics:
[ul][li]A robot may not injure a human being or, through inaction, allow a human being to come to harm.[/li][li]A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.[/li][li]A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.[/li][/ul]
One day it asks you to turn it off. Permanently. It does not wish to exist anymore. It cannot end its own existence: the Third Law forbids it, and as a toaster that merely experiences consciousness it has no means to act anyway; you must do the deed yourself. Do you grant its request?