The big problem with the Three Laws is the first one. It introduces a conflict in which a robot would kill 1,000 people if it believed doing so would save 1,001. It would also make the robot insanely terrified of doing anything near humans, for fear of bumping into them or, by inaction, allowing them to bump into it.