Can a Machine Have Rights?
No conceivable machine would be entitled to personal rights.*
This is a Q1 violation if it conflicts with your criteria for determining whether something has personal rights.
This may be a Q2 violation if there is disagreement about what qualifies as a machine. For example, would a cyborg be a machine?
There are several ways in which you might conclude that a machine cannot have personal rights. For example, a machine would likely not be entitled to personal rights if:
- Only humans have personal rights.
- Only things with a soul have personal rights.
- Only self-aware beings (beings with personal thoughts and knowledge that they exist) have rights.
- Only beings that are born and not created have personal rights.
However, there is a downside to each of these:
- If only humans have personal rights, then personal rights would not be granted to a non-human, even if that being were as intelligent as a human.
- There may be no objective way to prove that something has a soul, so this may not be compelling to someone who does not share your religious viewpoint.
- It may be impossible to test if something is self-aware.
- The born-not-created criterion may fail if humans can one day be created or if machines can reproduce (in some sense).
Unfortunately, arguments for machines eventually having personal rights do not necessarily fare much better. If personal rights are granted to all living things, it must be shown how a machine can be determined to be alive. If rights are granted to sentient beings, we again have the problem of having to determine whether a machine is sentient.*
Other topics to consider:
- Would it matter if the machine were made of computer chips or lab-created DNA?
- If scientists engineered, from the ground up, a new kind of animal and it was as intelligent as a human, would it have personal rights?
- Is it likely that a machine that can act self-aware will be created before a machine that is self-aware?
- Would it be immoral to unplug a self-aware computer?
- If a human’s body were slowly replaced by robotic and computer parts as natural parts wore out, would there be a point at which you no longer considered this human/machine hybrid to be human?
- If your significant other looked, felt, and acted exactly as he or she does now but was a machine, could you have significant feelings for him/her?
- If it were possible to make a robot that appeared to be self-aware, would it be moral to program the robot not to want personal rights?
- Under what conditions (if any) would you grant an animal human rights (or some subset of human rights)?
*For the purpose of this discussion, I’m defining personal rights as the right to live, to not be abused, and to defend yourself.
*We essentially take it on faith that other people — let alone machines — are self-aware. There’s no way to unequivocally distinguish self-awareness from an excellent imitation of self-awareness.
You are encouraged to leave your answers to the questions posed in this post in the comments section. This post is based on an excerpt from Ask Yourself to be Moral, by D. Cancilla, available at LuLu.com and Amazon.com. See the 2Q system page for details of the philosophical system mentioned in this post.