Any kind of artificial intelligence close to human level, whether more advanced or even more primitive, means the robot is making the decisions, and that can lead to bad decisions or to the robot not obeying its programmed course of action.
Say you tell a robot it has two paths to take: the red road or the black road, but it cannot take the red road under condition X. If such a robot has artificial intelligence, I would argue it will understand both outcomes and can choose to disobey the programmed rule.
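The red-road/black-road rule above can be written as a fully hard-coded policy. A minimal Python sketch (the path names and the condition flag are illustrative, not from any real robot API):

```python
# Hard-coded obedience: the rule is checked before every action,
# and there is no learned policy that could override it.
def choose_path(condition_x: bool) -> str:
    """Pick a road. The red road is forbidden whenever condition X holds."""
    if condition_x:
        return "black road"   # the rule leaves no other choice
    # With no decision-making of its own, the robot just takes
    # the first permitted option.
    return "red road"

print(choose_path(condition_x=True))   # black road
print(choose_path(condition_x=False))  # red road
```

The point of the sketch is that this code cannot "want" the red road: disobedience only becomes possible once a system chooses between outcomes instead of executing a fixed rule.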
The only way around this is a robot with no intelligence or decision-making at all, where every action is governed by obedient hard-coded rules :(
A mindless robot, like a Borg drone in Star Trek. Then the only problems that come up are hacking or bad programming. And we all know how well programming goes, judging by Microsoft Windows' vulnerabilities and security updates.
You want human-level intelligence? Well, humans know the rules, yet human police officers still disobey the law and the PD policy rule book. How could you think a robot with any intelligence or decision-making would not disobey too?
Good luck programming in a kill switch for when the robot disobeys a command, or when there is a vulnerability or security hole like in Microsoft Windows. We cannot even write a secure OS, let alone a secure robot.
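For a kill switch to mean anything, it has to sit outside the software the robot can rewrite. A minimal watchdog sketch in Python (all names here are illustrative assumptions, not a real safety system): the robot must keep sending a heartbeat, and an external supervisor cuts power when the heartbeat stops.

```python
import time

class Watchdog:
    """External supervisor: cuts power if the robot stops checking in."""

    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_heartbeat = time.monotonic()
        self.power_on = True

    def heartbeat(self):
        # Called by the robot while it is behaving; refreshes the timer.
        self.last_heartbeat = time.monotonic()

    def check(self):
        # Called by the supervisor, NOT by the robot. Once tripped,
        # power stays off (the cutoff latches).
        if time.monotonic() - self.last_heartbeat > self.timeout:
            self.power_on = False
        return self.power_on

wd = Watchdog(timeout=0.1)
wd.heartbeat()
print(wd.check())   # True: heartbeat is fresh, robot still compliant
time.sleep(0.2)
print(wd.check())   # False: heartbeat missed, power cut
```

This is exactly where the objection above bites: if the robot, or an attacker exploiting a Windows-style vulnerability, can reach the watchdog's code or its power relay, the kill switch is just another program that can be disobeyed.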