Morality
- 1 Explicating our innate moral system
- 2 Key idea: Morality, intention, and causality are inseparable
- 3 Inaction can be a contributing factor, but not a cause
- 4 Counterproductive fragments?
- 5 Is universal morality possible or desirable?
- 6 Morality of survival
- 7 A moral system should justify itself?
1 Explicating our innate moral system
Here we are going to do the following:
- Assume that we evolved an innate moral system.
- Formally define "moral system" and "quandary".
- Use quandaries arising from universal moral principles to pinpoint our innate moral system. We want to arrive at a statement such as "don't harm people".
- Construct a moral system without quandaries.
- 1.1 Innate morality, natural morality
- 1.2 Formalizing quandary-free moral systems
- 1.3 Explicating our innate moral system
1.1 Innate morality, natural morality
See these:
- https://en.wikipedia.org/wiki/Natural_morality
- https://en.wikipedia.org/wiki/Evolution_of_morality
- https://www.nytimes.com/2006/10/31/health/psychology/31book.html
We have morality, but we can't say what it is.
Why do most people agree that the utilitarian surgeon, who kills 1 patient to save 5 patients, is wrong?
1.2 Formalizing quandary-free moral systems
Key ideas:
- A moral system is a formal system that contains modal logic.
- A quandary is a formula with the shape "(must S) and (must not S)".
- Conjecture: If a moral system only requires or only forbids but never both requires and forbids, then it won't have quandaries.
A moral system is a formal system that contains modal logic. A formal system has a formal language, a set of axioms, and a set of inference rules. A formal language has an alphabet and a grammar (a set of formation rules).
"Moral system" is sometimes also called "morality".
Two kinds of moral statements are requirement and forbiddance. Synonyms for "requirement" are necessity, obligation, duty. Synonyms for "forbiddance" are prohibition, restriction.
A quandary is a formula with the shape "(must S) and (must not S)".
A quandary is almost a contradiction. A contradiction is both true and false; likewise, a quandary both requires and forbids. Note the difference in where the "not" appears:
- A quandary has the shape "(must S) and (must not S)".
- A contradiction has the shape "(must S) and not (must S)".
A note about language: in this discussion, we never write "you must avoid harming others"; we always write "you must not harm others". We don't use negative verbs such as "avoid" in a formal moral system; we use only positive verbs in this discussion.
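A minimal formal sketch of these shapes, assuming the standard deontic reading in which the operator O abbreviates "must" (obligation) and S is any positive action statement; the notation here is an illustration, not part of the original definitions:

```latex
% O(S) reads "must S"; S is a positive action statement (no "avoid").
% Quandary: the system derives both "must S" and "must not S".
\mathrm{quandary}(S) \;\equiv\; O(S) \land O(\lnot S)
% Contradiction: the system derives "must S" and also denies "must S".
\mathrm{contradiction}(S) \;\equiv\; O(S) \land \lnot O(S)
% One reading of the conjecture: if every theorem of the system has the
% form O(S) (requirements only), or every theorem has the form
% O(\lnot S) (forbiddances only), then no pair O(S), O(\lnot S) is
% derivable, so the system has no quandaries.
```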
1.3 Explicating our innate moral system
Here we repeat the following steps until we arrive at a quandary-free moral system:
- Propose a universal moral principle that at first seems good.
- Find that it raises a quandary.
- Revise the moral principle to immunize it against the quandary.
Here we try to state our innate moral system in English.
Given an action, most people agree on whether that action is good or bad, but most people cannot satisfactorily define what good is and what bad is.
What I think we believe is moral:
- Retaliating.
- Forcing children to do things for their own good.
See Frans de Waal's TED talk "Moral behavior in animals".
2 Key idea: Morality, intention, and causality are inseparable
Morality cannot be determined by merely looking at the outcomes. Determining morality requires knowing intention and understanding the chain of causes.
Why does intention matter? How do we know others' intentions?
3 Inaction can be a contributing factor, but not a cause
Key idea: Inaction cannot be a cause, but inaction can be a contributing factor. A contributing factor shares some blame.
In the trolley problem, utilitarianism suggests killing the smallest number of people, but our innate morality suggests unconditional inaction: we refuse to interfere with the natural course of events.
Guilt, harm, blame, and causality:
This suggests that our moral principle may be "do not cause harm", and that we feel guilt when we cause harm. Blaming requires understanding causality.
If I do nothing, then I bear less responsibility when the runaway tram crashes into people. If I throw the switch, I bear more responsibility.
The solution to the trolley problem: Inaction cannot be a cause, but can be a contributing factor. If you do not act, then your existence does not affect the natural course of events.
The runaway tram problem differs from the crashing airliner problem. The airliner pilot has a third option: killing everyone on the plane. The tram driver doesn't have that option.
What do we evolve morality for?
Morality improves species survival, but does not maximize it. Here is a proof by contradiction. Suppose that it is moral to do everything that promotes the survival of the species. This raises a quandary: the principle requires the utilitarian surgeon to kill 1 patient to save 5 patients, but we feel that such killing is immoral. Why don't doctors kill people to save more people? Why do we prefer letting people die to making people die? https://plato.stanford.edu/entries/doing-allowing/
Suppose that two men are starving. It is moral for a man to sacrifice himself to be eaten by the other man, but it is immoral for a man to kill and eat the other man. Why this asymmetry? There are four possible actions:
- X kills X. Y eats X's carcass.
- X kills Y. X eats Y's carcass.
- …
The outcome is the same: one of them will die anyway. Why is self-sacrifice moral? Why is Jesus the most moral person?
We are willing to sacrifice ourselves in an impending doom, in a hopeless situation; but we refuse to sacrifice ourselves so that the utilitarian doctor may save 5 other people. If we are going to die 5 hours from now, then dying an hour earlier doesn't make a difference. We are willing to sacrifice one day of life expectancy, but not 60 years of life expectancy. Why does our willingness to sacrifice depend on how long we expect to live? Why does our willingness to sacrifice depend on how far we look into the future? A young suicidal man readily sacrifices himself even though, had he not killed himself, he could have expected to live for 60 more years. Does a suicidal person know that a human, on average, lives for 80 years? What does a suicidal person think about life expectancy? How long do you expect to live? I want to live forever.
The opposite of a suicidal person is a person who wants to live forever?
One thing is clear in our evolved morality: It is moral to harm oneself, but it is immoral to harm others. Did we evolve altruism?
Suppose there are two groups: one whose members actively harm others, and one whose members avoid harming others. Which group is more likely to survive?
Which is true: did we evolve morality so that we may form groups, or did we form groups so that we may evolve morality?
Morality is what is required to form a group. A group can only exist if most of its members are moral. Morality is whatever prevents the collapse of the group. ???
How do we balance the individual's will to live and the group's survival?
Morality evolved to minimize feeling guilty?
Morality evolved to minimize the harm that a group inflicts on its members?
Why do we feel guilty?
Feeling guilty requires knowing causality. We feel guilty because we think we cause harm.
Imagining evil vs doing evil
What is the relationship between morality, agency, cause, and guilt?
What is the relationship between intention and morality? Is an action with good intention but a bad outcome moral?
4 Counterproductive fragments?
- 4.1 Ethics is the study of moral quandaries?
- 4.2 Nature is amoral. Why should we be moral?
- 4.3 Default-allow or default-forbid?
4.1 Ethics is the study of moral quandaries?
The goal of ethics is to create a moral system free of moral quandaries?
4.2 Nature is amoral. Why should we be moral?
What difference is there between dying today and dying tomorrow? Aren't we all dead in the long run anyway?
4.3 Default-allow or default-forbid?
There are two rules of conduct:
- Everything is allowed unless forbidden. Fast, loose, and entrepreneurial. Liberal.
- Everything is forbidden unless allowed. Safe, slow, bureaucratic. This makes sense for computer security, but does it make sense for humans? See the sketch below.
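A small sketch of the two rules as policy checks, in Python; the rule sets and action names are hypothetical illustrations, not taken from the text above:

```python
# Hypothetical illustration of the two rules of conduct as policy checks.
FORBIDDEN = {"steal", "kill"}            # blacklist used by default-allow
ALLOWED = {"trade", "speak", "travel"}   # whitelist used by default-forbid

def default_allow(action: str) -> bool:
    """Everything is allowed unless it is explicitly forbidden."""
    return action not in FORBIDDEN

def default_forbid(action: str) -> bool:
    """Everything is forbidden unless it is explicitly allowed."""
    return action in ALLOWED

# The two defaults differ only on actions that neither list mentions:
print(default_allow("invent"))   # True: not forbidden, so allowed
print(default_forbid("invent"))  # False: not allowed, so forbidden
```

The difference shows up only for actions that no rule mentions, which is why default-forbid feels safe but slow, and default-allow feels fast but loose.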
5 Is universal morality possible or desirable?
- 5.1 What?
- 5.2 The problem with prioritizing individual survival over species survival
- 5.3 Is there an ultimate moral quandary?
- 5.4 Moral particularism
- 5.5 Improbability of equality of outcome in doomsday spaceship scenario
- 5.6 Contextual/circumstantial morality/ethics
- 5.7 Natural morality? Survival?
- 5.8 Survivalism moral quandary: two people on a sinking ship, in which only one person can be saved
5.1 What?
Are there always problems with moral systems? Do ethical dilemmas show that there is no universal morality? https://philosophynow.org/issues/60/Why_You_Shouldnt_Be_A_Person_Of_Principle
If every moral system is problematic, why should we have any moral system?
Egocentric survivalism's answer to the trolley problem is: "It doesn't matter what you do, because it doesn't have anything to do with your survival."
Chance-survivalism's answer to the trolley problem is: "You should act in the way that maximizes the human race's chance of survival." But you don't know whom to save to best ensure the survival of the human race.
Problem: If the 5 people are all homosexual, and the 1 person is heterosexual, then chance-survivalism implies that you should direct the train toward the 5 homosexual people? What if those 5 homosexuals would have found a cure for cancer, and that heterosexual would have become a war criminal? What if it were the other way around?
5.2 The problem with prioritizing individual survival over species survival
Survivalism suffers from the following problem. Consider this dilemma: A superpowered alien abducts you, starves you, and offers you two options:
- If you eat the food, then he destroys the Earth, killing all humans on Earth.
- If you don't eat the food, then he leaves the Earth alone.
The problem: survivalism prescribes that you eat the food, and let everyone else go to hell.
But it makes sense. Even utilitarianism suggests that you get into
5.3 Is there an ultimate moral quandary?
Is there a situation in which no morality has any solutions?
How do we generate moral quandaries?
Given a moral system, can we always generate a moral quandary?
5.4 Moral particularism
https://philosophynow.org/issues/60/Why_You_Shouldnt_Be_A_Person_Of_Principle
Is there a universal moral principle that coincides with the majority intuition about these issues?
- Is it moral to kill fewer people to save more people?
- Is it moral to kill a serial killer to prevent 100 murders?
- Is it moral to kill a healthy innocent person and distribute his organs to save 5 people in need?
- Is it moral to annex a mismanaged country and improve it?
5.5 Improbability of equality of outcome in doomsday spaceship scenario
Consider this "doomsday spaceship" scenario:
- A huge asteroid will hit the Earth 1 week from now. That will kill all 7 billion people.
- But we have one spaceship that can save 1000 people. That is the only way out of Earth.
Which people should we save? Why? There is no satisfactory answer to this; we should just use a truly random lottery. But if we pick people randomly, the ship will be full of poor people, because the majority of the Earth is poor. Should pregnant women be prioritized over non-pregnant women? Should older women be prioritized over younger women? Should women be prioritized over men? Should children be prioritized over women? Nobody should be prioritized. If we don't pick people randomly, then we don't practice what we preach about equality.
With a lottery, all 7 billion people have a chance to board the spaceship, but it is physically impossible for all 7 billion people to actually board the spaceship. No amount of political correctness will change the laws of nature.
Of course, if the spaceship were big enough for 7 billion people, we could have equality of outcome. The question is how to make a spaceship that big.
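A small simulation of the lottery point above; the 70% poor share and the seat count are assumptions for illustration only, not figures from the text:

```python
# Illustrative simulation: a uniform lottery over a mostly-poor population.
# The 70% "poor" share is an assumption for illustration.
import random

random.seed(0)
seats = 1000
poor_fraction = 0.7

# With 1000 seats drawn from billions of people, sampling without
# replacement is well approximated by independent draws.
poor_winners = sum(random.random() < poor_fraction for _ in range(seats))
print(f"poor winners: {poor_winners} of {seats}")  # roughly 700, +/- about 15
```

Under these assumptions roughly 700 of the 1000 seats go to poor people, so a truly random lottery produces the "ship full of poor people" described above: equality of chance does not produce equality of outcome.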
5.6 Contextual/circumstantial morality/ethics
Does context change the goodness of an action? Is context relevant to morality? Do circumstances affect judgment? Example: Stealing is wrong. Is stealing in distress, to survive, less wrong? Self-defense can justify killing. Why can't self-defense justify stealing? Should intentional killing be more wrong than unintentional killing? Rioting is wrong. But is rioting to overthrow a tyrant wrong?
Is killing a terrorist good?
5.7 Natural morality? Survival?
Clarence Dember: "Morality is a consideration among the living about that which affects survival." (http://atheistnexus.org/m/discussion?id=2182797%3ATopic%3A131131)
https://en.wikipedia.org/wiki/Natural_morality
5.8 Survivalism moral quandary: two people on a sinking ship, in which only one person can be saved
Consider this scenario:
- Two people X and Y are on a sinking ship in the middle of the ocean.
- There is only one way to safety: by a lifeboat.
- But the lifeboat can only carry one person.
Remember that survivalism is about the species, not the individual.
Survivalism implies that we should prefer the one most fit to continue the survival of the species. Survivalism implies survival of the fittest.
- It is moral to sacrifice oneself to let the other live.
- It is moral for one to kill the other.
- It is not moral to die together.
Survivalism seems to suggest that they should fight until one dies, but without hurting each other so badly that neither survives.
Here survivalism is ambivalent about egoism and altruism.
6 Morality of survival
It is moral to survive.
If X tries to kill Y first, then it is moral for Y to kill X.
If X tries to kill Y first, then it is moral for Z to help Y survive.
Here, an extremist is a member of ISIS, and a victim is a victim of ISIS. Therefore, it is moral for us to help the victims defend against the extremists. It is not moral for us to kill the extremists, but it is moral for us to help the victims kill the extremists, and it is moral for the victims to kill the extremists, because the extremists attacked the victims first.
7 A moral system should justify itself?
"What moral system should we subscribe to?" is a moral judgement about morality.
A moral system should suggest itself as the answer to that question?