Is There Morality Without Free Will?
Free will is an illusion, so nobody is responsible for their actions.
This may be a Q1 violation if you sometimes hold people responsible for their actions, or if you sometimes want credit for your own.
This is a Q2 violation if you would criticize the behavior of someone who believes free will is an illusion.
The implication here is that if there is no such thing as free will, you really aren’t choosing whether to do bad or good — you’re just doing what your deterministic body must do, governed by the rules of physics. So if you are not choosing to do bad or good, you cannot be held responsible for your actions any more than a clock can be held responsible for the time of day.
But does it follow from this that all actions are allowed — that there is no meaningful distinction between good and bad behavior? And if there is no responsibility without free will, does this imply that actions should have no consequences?
Imagine that the government builds a completely automated robot sentry with powerful arms and a machine gun. Imagine also that the sentry is struck by lightning, which scrambles its programming so that it considers anything that moves to be an enemy that must be destroyed. Which of these statements (if any) would you say is true of the malfunctioning sentry?
- The sentry has no free will.
- If the sentry kills a person, it is a murderer.
- The sentry is evil.
- The sentry is not responsible for its actions.
- Because the sentry is not responsible for its actions, it should not be punished.
- Because the sentry is not responsible for its actions, it should not be stopped.
- Because the sentry is not responsible for its actions, it should not be considered "bad."
- The sentry should be stopped and reprogrammed.
- The sentry should be destroyed.
Which (if any) of these do you think would be an effective counter to the statement, "Free will is an illusion, so nobody is responsible for their actions"?
- If you do certain things because you have no choice, then I have no choice but to hold you responsible.
- We are essentially machines, and as such are programmed to behave in a certain way. Part of that programming compels us to detect and interfere with other human machines that have bad programming. We call this holding people responsible for their actions.
- As a biological system, I am set up to defend myself. Holding you responsible for your actions is part of that defense.
- Subjectively, our behavior feels like free will, so we might as well treat it like free will.
You are encouraged to leave your answers to the questions posed in this post in the comments section. This post is based on an excerpt from Ask Yourself to be Moral, by D. Cancilla, available at LuLu.com and Amazon.com. See the 2Q system page for details of the philosophical system mentioned in this post.