Try asking the question in a way that has the person define the qualities that grant rights. E.g., list Warren's five criteria for moral consideration as options A-E and tell the person to select all that are relevant. Alternatively, describe a hypothetical intelligence capable of some of those traits and ask whether it deserves full human rights.
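If it helps to make that format concrete, here is a minimal sketch of the multi-select item as a plain Python survey script (no particular survey library assumed). The wording paraphrases Warren's 1973 personhood criteria; the labels and the function name are made up for illustration.

```python
# A minimal sketch of the multi-select item. The criteria wording
# paraphrases Warren's 1973 personhood criteria; labels A-E and the
# function name are hypothetical, not from any survey package.
WARREN_CRITERIA = {
    "A": "Consciousness, in particular the capacity to feel pain",
    "B": "Reasoning (solving new and relatively complex problems)",
    "C": "Self-motivated activity",
    "D": "Capacity to communicate messages of indefinitely many types",
    "E": "Self-awareness (having a concept of oneself)",
}

def ask_criteria():
    """Show the item and return the set of labels the respondent selects."""
    print("Which of the following are relevant to granting an entity rights?")
    print("(Select all that apply.)")
    for label, text in WARREN_CRITERIA.items():
        print(f"  {label}. {text}")
    raw = input("Your choices, e.g. 'A C E': ")
    return {c.upper() for c in raw.split() if c.upper() in WARREN_CRITERIA}
```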
kc8tbe wrote:
Thanks to everyone for your helpful feedback!

I want to add an additional question, but I'm having trouble formulating it in a way that isn't totally hypothetical. The question is basically this: if you could simulate a human brain on a computer down to the molecular level, would the resultant computer program be "alive"? Or, in other words, if we eventually develop sentient AI, should the AI have rights? I expect materialists will answer yes, while spiritualists will mostly answer no.
Can anyone think of a good way to ask this? I've come up with something like the following, but it still sounds silly to me.

Chimpanzees, a species closely related to humans, are known to exhibit complex behaviors like altruism, self-awareness, play, tool making, and abstract problem solving. One evening, your friend the neuroscientist excitedly tells you that he has stumbled upon a tremendous discovery. The chimpanzee brain is not composed of neurons like the human brain, he claims, but rather of silicon structures that resemble the microprocessors in a computer. You doubt his sobriety and the veracity of his claim because:
A. It contradicts a large body of scientific evidence.
B. If the chimpanzee brain were not composed of neurons, the chimpanzee would not exhibit the complex behaviors enumerated above.
C. Both A and B.
D. Neither A nor B.
E. You would not doubt the veracity of your friend's claim.
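As an aside on analysis: the materialist/spiritualist prediction above could be checked by cross-tabulating answers to this item against a worldview question. A minimal sketch, with made-up field names and sample data:

```python
from collections import Counter

# Hypothetical responses: (self-reported worldview, option chosen above).
# The worldview labels and sample data are invented for illustration.
responses = [
    ("materialist", "E"),
    ("materialist", "A"),
    ("spiritualist", "C"),
    ("spiritualist", "C"),
    ("materialist", "E"),
]

# Cross-tabulate worldview against the chosen option A-E.
crosstab = Counter(responses)
for (worldview, option), count in sorted(crosstab.items()):
    print(f"{worldview:>12} chose {option}: {count}")
```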
And as an aside, if you dig through the legal definitions, an AI would probably be classified as a corporation, with the rights and responsibilities that entails, rather than as a person. In its own way, that's actually a better deal than we get.