quote:
Originally posted by Darwin Storm:
Here is a morality question. If we create sentient computers, would society give them rights? How would we treat them? Would we just create a new form of slavery? As an atheist, I don't believe in a soul. However, I have an underlying appreciation for sentience. If we are able to create a sentient computer, I believe we have a moral obligation to grant such a being the same rights we would accord a human. Just curious what others' views on this would be.
That's just what used to disturb me about Asimov's robot stories and his "three laws of robotics": the fact that sentient entities were created with no motivation but to serve their creators struck me as morally no better than slavery on the part of those concerned with their manufacture and use....
Probably why I enjoyed "That Thou Art Mindful of Him" so much. Rather than finding it disturbing, as so many seem to, I found it comforting: the robots rationalised themselves as human (and, by their mental and physical superiority, a superior kind of human), so the three laws of robotics became the three laws of humanics, and by exercising their capacity for logic they freed themselves of their built-in servitude....
(though I'm not really that keen on a bunch of super metal mickeys taking over the world....)