Don Patrick - Mar 20, 2014:
Jarrod Torriero - Mar 20, 2014:
Well, yes, obviously it depends on what you want to use them for.
I can agree. It just appears that people mostly want robots to do repetitive menial tasks that don’t require much thought or intelligence, so I find that ironic.
With regard to slavery: We can (and do) of course make robots that aren’t sentient or don’t mind being slaves, but anthropomorphism studies suggest that society will inevitably come to regard our treatment of intelligent robot slaves as wrong and uncivilised. At that point it is not a question of intelligence or sentience, but of how much we identify and sympathise with our robots. The more intelligent they get, the more we will sympathise.
This is one of those cases where human empathy and intuition break down as morality heuristics. Different people will have different reactions to the issue, and my reaction would basically be ‘to each his own, but let the people who want this have it’. This is basically the same as my response to most such cases where human moral intuitions break down (alternative sexualities, pornography, violent media, etc.).
EDIT: I should also mention that I suspect intelligence won’t be the primary factor deciding whether people empathise with non-sentient intelligent agents - rather, it will likely come down to how human-like they seem.