Moral Dilemma
Moderator: Alyrium Denryle
- fuzzymillipede
- Youngling
- Posts: 96
- Joined: 2005-03-17 03:05pm
Moral Dilemma
I have been thinking about morality as of late, and have run into a dilemma of sorts.
First off, I believe that there is no objective morality; the universe doesn't give a shit if a giant asteroid destroys the Earth, or if people kill each other, etc. Despite this, I still believe that some things are right while others are wrong. It sickens me to think about things that are "wrong," like the deprivation of human rights. But I cannot come up with a reason why these things are wrong, other than that they personally make me suffer.
This leads me to my problem. If each individual judges what is moral or immoral, what good is it? In this way, the only thing resembling "objective" morality is a group consensus on what is right and wrong. How can I argue that my point of view is morally justified, when all I have to point to are my subjective experiences with the issues? I could go on and on about the greater good, acting for the benefit of mankind, etc, but when I get right down to it, the only reason I value these concepts is because they make me feel good inside. I feel like I am a robot programmed by evolution and my environment, and the more I think about it, the more it appears to be true. It seems that the only reason we can all agree on moral issues is because we share the same "programming." But take away that common programming, and we are left with nothing.
- fuzzymillipede
- Youngling
- Posts: 96
- Joined: 2005-03-17 03:05pm
Re: Moral Dilemma
To clarify, the main problem I encountered was an apparent inability to morally justify any argument I made. It boils down to something like this:
Me: What you are doing is wrong!
Them: Why?
Me: Because it feels wrong to me, and to other people that share my views.
Them: That doesn't prove anything!
It would seem that the only way to debate like this is to first agree on something, like human rights, and argue from there.
Re: Moral Dilemma
fuzzymillipede wrote: First off, I believe that there is no objective morality; the universe doesn't give a shit if a giant asteroid destroys the Earth, or if people kill each other, etc.
You are talking about absolute morality. Objective moral standards are different- they are good not because they are inherently good, but because they achieve good results for people.
Unless you are talking about Objectivists
fuzzymillipede wrote: If each individual judges what is moral or immoral, what good is it?
At the risk of stealing Mike's thunder, if an individual tells you a bridge is safe or not, what good is their guarantee? The answer to both is similar- individuals may judge, but you can always refer to what occurs in reality. For example, slavery is bad, not because people feel that way, but because owning other human beings is degrading to the slave and leads to extreme abuses of power.
fuzzymillipede wrote: In this way, the only thing resembling "objective" morality is a group consensus on what is right and wrong.
You do realize that a good number of the moral opinions held here are minority opinions? Like not persecuting gays? We don't do consensus here- we follow the ideal of the White Rose.
fuzzymillipede wrote: How can I argue that my point of view is morally justified, when all I have to point to are my subjective experiences with the issues?
Statistics and logic are a good start.
fuzzymillipede wrote: I could go on and on about the greater good, acting for the benefit of mankind, etc, but when I get right down to it, the only reason I value these concepts is because they make me feel good inside.
You act good because you have a conscience? Shocking!
Also, the greater good of mankind is not a very good standard- it doesn't cover xenos, other animals, artificial lifeforms, etc. Morality should not be judged on how related something is to you- that is tribalism, the great enemy.
fuzzymillipede wrote: I feel like I am a robot programmed by evolution and my environment, and the more I think about it, the more it appears to be true. It seems that the only reason we can all agree on moral issues is because we share the same "programming." But take away that common programming, and we are left with nothing.
Yep. Sort of the reason you are attracted to the opposite sex. Or why you feel hungry and eat. Or... essentially all your instincts are programming.
As for agreeing on moral issues, I'd point out that different people have different gut decisions for gay rights. Obviously, the programming is partially cultural. However, there is only one best way to ensure the best results (minimal suffering basically).
As for left with nothing... how do you think the programming got there in the first place? Those that didn't have it didn't have kids- morality is a requirement for social living. If you are solitary, you won't develop it, but if you live in a group, you will tend towards similar moralities, with some differences based on the nature of the group. Groups that have no young stage will lack our cute instincts and our protectiveness towards the young.
Or, to put it bluntly, there is a reason chaotic evil societies are rare- they don't work.
Last edited by Samuel on 2008-12-12 02:27am, edited 1 time in total.
Re: Moral Dilemma
fuzzymillipede wrote: To clarify, the main problem I encountered was an apparent inability to morally justify any argument I made. It boils down to something like this:
Me: What you are doing is wrong!
Them: Why?
Me: Because it feels wrong to me, and to other people that share my views.
Them: That doesn't prove anything!
It would seem that the only way to debate like this is to first agree on something, like human rights, and argue from there.
How about showing it is causing harm to other people? They are right when they say appealing to feelings is invalid- after all, they have their own feelings and they weren't harmed by their actions.
- fuzzymillipede
- Youngling
- Posts: 96
- Joined: 2005-03-17 03:05pm
Re: Moral Dilemma
Yes, we can easily use statistics and facts to prove something objectively, but not morally. It can be proven that denying a group of people certain rights will cause members of that group to suffer. But the best way to say that denying people their human rights is morally wrong is to say that causing or allowing people to suffer is morally wrong.
Now, practically everyone except for sociopaths will agree that inflicting suffering onto people is morally wrong, but why do they believe that? Because the idea of hurting others invokes their conscience, causing them to suffer. The problem is that the concept of morality has been regressed to an innate programming by evolution, which boils down to "I think it's wrong because it feels wrong," the very idea I was trying to avoid!
How can we morally justify the natural processes that programmed us? Just because it makes sense to evolve characteristics like empathy and loyalty doesn't morally justify these things, or any resulting behaviors, does it?
- Alyrium Denryle
- Minister of Sin
- Posts: 22224
- Joined: 2002-07-11 08:34pm
- Location: The Deep Desert
- Contact:
Re: Moral Dilemma
fuzzymillipede wrote: I have been thinking about morality as of late, and have run into a dilemma of sorts.
First off, I believe that there is no objective morality; the universe doesn't give a shit if a giant asteroid destroys the Earth, or if people kill each other, etc. Despite this, I still believe that some things are right while others are wrong. It sickens me to think about things that are "wrong," like the deprivation of human rights. But I cannot come up with a reason why these things are wrong, other than that they personally make me suffer.
This leads me to my problem. If each individual judges what is moral or immoral, what good is it? In this way, the only thing resembling "objective" morality is a group consensus on what is right and wrong. How can I argue that my point of view is morally justified, when all I have to point to are my subjective experiences with the issues? I could go on and on about the greater good, acting for the benefit of mankind, etc, but when I get right down to it, the only reason I value these concepts is because they make me feel good inside. I feel like I am a robot programmed by evolution and my environment, and the more I think about it, the more it appears to be true. It seems that the only reason we can all agree on moral issues is because we share the same "programming." But take away that common programming, and we are left with nothing.
I have had the same moral dilemma in the past. Here is how I solved it.
Look at morality as being functional. It is a problem solving tool-kit. Humans evolved a sense of empathy and the capacity for moral thought to facilitate the formation of cooperative social groups. The demands of inter-group competition and intra-group competition (sexually antagonistic selection, male-male competition etc) affect the outcome, and the end result is a group moral phenotype, which is a set of explicit values built upon a scaffold of empathy (essentially). This phenotype has functional consequences. You can actually think of these different moral phenotypes as cultural species. Now, which one dominates the planet? Which is, in an evolutionary sense, the most fit, held by the most people with the strongest economies and the highest standard of living? That held by western countries (it is not perfect, but it is also steadily improving). Those that do not possess it are converging on it.
Now, what you can do in order to actually develop an ethical system is to strip out the background noise. Because many aspects of our culturally accepted ethics are also the products of other factors (like inter-sexual competition) you have to strip out that stuff and only look at the root of the ethical system in question. In the case of western societies, here is what these basically are.
Equality
Fairness
Maximization of Utility (in the sense of Utilitarianism, both Hedonistic and Preference)
That is it. Happiness is Good. Equality is Good. Fairness is Good. You know this because you feel it intuitively (and in fact people in every culture do, they just have competing demands that prevent application. ex. FGM as a result of inter-sexual competition) and because when applied, they work to solve social problems.
All you have to do is construct an internally consistent system to guide decision making.
fuzzymillipede wrote: How can we morally justify the natural processes that programmed us? Just because it makes sense to evolve characteristics like empathy and loyalty doesn't morally justify these things, or any resulting behaviors, does it?
Evolution created the existence of morality. Were it not for selection, it would not exist in the first place and we would have no concept of "should".
GALE Force Biological Agent/
BOTM/Great Dolphin Conspiracy/
Entomology and Evolutionary Biology Subdirector:SD.net Dept. of Biological Sciences
There is Grandeur in the View of Life; it fills me with a Deep Wonder, and Intense Cynicism.
Factio republicanum delenda est
Re: Moral Dilemma
fuzzymillipede wrote: Yes, we can easily use statistics and facts to prove something objectively, but not morally. It can be proven that denying a group of people certain rights will cause members of that group to suffer. But the best way to say that denying people their human rights is morally wrong is to say that causing or allowing people to suffer is morally wrong.
That is the definition of wrong- it increases suffering.
fuzzymillipede wrote: Now, practically everyone except for sociopaths will agree that inflicting suffering onto people is morally wrong, but why do they believe that?
Sociopaths believe it too- they just don't care.
fuzzymillipede wrote: How can we morally justify the natural processes that programmed us? Just because it makes sense to evolve characteristics like empathy and loyalty doesn't morally justify these things, or any resulting behaviors, does it?
Nope. However, behaviors that enable you to live in a society are justified by virtue of the fact that you live in a society.
fuzzymillipede wrote: The problem is that the concept of morality has been regressed to an innate programming by evolution, which boils down to "I think it's wrong because it feels wrong," the very idea I was trying to avoid!
Why? Look at the rationale for different feelings and figure out which are necessary to reduce suffering. After all, we are programmed to rut like horses, yet we don't... well, most of us don't.
AD can explain better- evolutionary biology is awesome that way.
- fuzzymillipede
- Youngling
- Posts: 96
- Joined: 2005-03-17 03:05pm
Re: Moral Dilemma
So, is this right?
"People think things are morally wrong because they have been programmed to think that way by evolution in order to facilitate social behavior, which has made us successful as a species."
Logically, scientifically, and factually, this makes perfect sense. But even though we evolved the concept of morality as a species, there is no way to escape morality being subjective to each individual, is there? It could be put forth that morality can be "objective" because it is a trait we developed as a species, and thus all share. But how can I use this to justify my argument if my opponent doesn't believe in evolution? Will I be forced to wait until my version of morality out-competes his via natural selection?
- Alyrium Denryle
- Minister of Sin
- Posts: 22224
- Joined: 2002-07-11 08:34pm
- Location: The Deep Desert
- Contact:
Re: Moral Dilemma
fuzzymillipede wrote: So, is this right? "People think things are morally wrong because they have been programmed to think that way by evolution in order to facilitate social behavior, which has made us successful as a species." Logically, scientifically, and factually, this makes perfect sense. But even though we evolved the concept of morality as a species, there is no way to escape morality being subjective to each individual, is there? It could be put forth that morality can be "objective" because it is a trait we developed as a species, and thus all share. But how can I use this to justify my argument if my opponent doesn't believe in evolution? Will I be forced to wait until my version of morality out-competes his via natural selection?
Pretty much. But you need not sit back and wait. Memes are the rapidly spreading little brothers of genes.
GALE Force Biological Agent/
BOTM/Great Dolphin Conspiracy/
Entomology and Evolutionary Biology Subdirector:SD.net Dept. of Biological Sciences
There is Grandeur in the View of Life; it fills me with a Deep Wonder, and Intense Cynicism.
Factio republicanum delenda est
-
- Youngling
- Posts: 64
- Joined: 2008-04-22 10:52pm
Re: Moral Dilemma
fuzzymillipede wrote: Logically, scientifically, and factually, this makes perfect sense. But even though we evolved the concept of morality as a species, there is no way to escape morality being subjective to each individual, is there? It could be put forth that morality can be "objective" because it is a trait we developed as a species, and thus all share. But how can I use this to justify my argument if my opponent doesn't believe in evolution? Will I be forced to wait until my version of morality out-competes his via natural selection?
Pick a first principle that every (sane) person can agree with, like: "Human suffering is evil. Reducing human suffering is good".
From there, one can objectively analyze actions based on their net effect on human suffering.
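For instance, here is a minimal toy sketch of what that kind of analysis could look like (an illustration only, not something anyone in this thread actually proposed), assuming suffering could be scored numerically; the names and figures below are invented for the example.
Code:
# Toy illustration: once "less suffering is better" is granted as the first
# principle, comparing two actions becomes a calculation over their outcomes.
# All names and numbers here are made up.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    people_affected: int
    suffering_per_person: float  # arbitrary units; higher means worse

def net_suffering(outcomes):
    # Total suffering an action causes across everyone it touches.
    return sum(o.people_affected * o.suffering_per_person for o in outcomes)

def less_bad(action_a, action_b):
    # Under the stated first principle, the "better" action is simply the one
    # whose outcomes add up to less total suffering.
    name_a, outcomes_a = action_a
    name_b, outcomes_b = action_b
    return name_a if net_suffering(outcomes_a) <= net_suffering(outcomes_b) else name_b

# Invented comparison: denying a minority its rights vs. mildly inconveniencing everyone else.
deny_rights = ("deny rights", [Outcome("group loses basic rights", 1000, 5.0)])
grant_rights = ("grant rights", [Outcome("majority mildly inconvenienced", 10000, 0.1)])

print(less_bad(deny_rights, grant_rights))  # prints "grant rights"
The hard part, of course, is agreeing on the scores and on who counts, not doing the arithmetic.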
- Ace Pace
- Hardware Lover
- Posts: 8456
- Joined: 2002-07-07 03:04am
- Location: Wasting time instead of money
- Contact:
Re: Moral Dilemma
Quote: Pick a first principle that every (sane) person can agree with, like: "Human suffering is evil. Reducing human suffering is good".
Uh no, plenty of people do not agree with that as a first principle, but think it comes out of later things.
You and Samuel both do not understand what fuzzymillipede is asking (if I understand him correctly). He is asking how you argue against moral relativism. If we understand that morality is a conscious choice, how can you logically argue with someone who has a consistent system of morality which is different from yours?
fuzzymillipede wrote: But take away that common programming, and we are left with nothing.
That is true, but this common programming is there for a reason. If you also choose to strip away your instincts to survive, procreate and be happy, then you are left with pretty much nothing.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
Re: Moral Dilemma
To play a little devil's advocate in the interest of arguing --
Samuel wrote: That is the definition of wrong- it increases suffering.
Why? Because you say so?
Edit: typo
A Government founded upon justice, and recognizing the equal rights of all men; claiming no higher authority for existence, or sanction for its laws, than nature, reason, and the regularly ascertained will of the people; steadily refusing to put its sword and purse in the service of any religious creed or family is a standing offense to most of the Governments of the world, and to some narrow and bigoted people among ourselves.
F. Douglass
Re: Moral Dilemma
Ace Pace wrote: You and Samuel both do not understand what fuzzymillipede is asking (if I understand him correctly). He is asking how you argue against moral relativism. If we understand that morality is a conscious choice, how can you logically argue with someone who has a consistent system of morality which is different from yours?
And it was already answered when Samuel said he was talking about absolute morality. The non-sequitur was saying the universe doesn't give a shit, therefore there's no objective morality. The fact is, to make it objective, we don't need to have some physical law in place, it is enough to be able to "calculate" it. Moral choices are made, in one way or another, for the benefit (increased well-being, if you will) of sentient beings. How you calculate it probably depends on what school of philosophy you're from, but they all have a framework, "rules" that can be applied to any situation.
Bottom line is, if you even believe you can debate the morality of an action in ways other than "it feels right", you believe in some kind of objective morality.
I like pigs. Dogs look up to us. Cats look down on us. Pigs treat us as equals.
-Winston Churchill
I think a part of my sanity has been lost throughout this whole experience. And some of my foreskin - My cheating work colleague at it again
- fuzzymillipede
- Youngling
- Posts: 96
- Joined: 2005-03-17 03:05pm
Re: Moral Dilemma
Ace Pace wrote: You and Samuel both do not understand what fuzzymillipede is asking (if I understand him correctly). He is asking how you argue against moral relativism. If we understand that morality is a conscious choice, how can you logically argue with someone who has a consistent system of morality which is different from yours?
More like, in spite of moral relativism, how can you logically argue with someone who has a consistent system of morality which is different from yours? I have already accepted the concept of moral relativism to be true.
Quote: The non-sequitur was saying the universe doesn't give a shit, therefore there's no objective morality. The fact is, to make it objective, we don't need to have some physical law in place, it is enough to be able to "calculate" it. Moral choices are made, in one way or another, for the benefit (increased well-being, if you will) of sentient beings. How you calculate it probably depends on what school of philosophy you're from, but they all have a framework, "rules" that can be applied to any situation.
There are plenty of choices that increase the well-being of one group while sacrificing another group. Some are considered to be moral, some are not. In some cases, the moral choice isn't even the one that causes a net reduction in suffering. Take for example, the care of a profoundly retarded person over many years. The person may be so mentally deficient that they lack sentience, but society as a whole will sacrifice some of their well-being (time, money, etc) to care for this person. What reason do we have to care for this person, other than that it makes us feel good inside? We have been programmed by evolution to care for the sick, but this natural urge can be "misguided," if we hold that an absolute moral truth is defined by a net increase of well-being for all sentient beings.
Quote: Bottom line is, if you even believe you can debate the morality of an action in ways other than "it feels right", you believe in some kind of objective morality.
In the end, it always comes down to whether something just "feels right" or not. Do the ideals and rationale of the programmer matter to the robot? Do the societal reasons for the evolution of morality truly matter to an individual's sense of right and wrong? Both the robot and the individual are performing pre-programmed behavior. The robot would say: "This is the right thing to do because it conforms to my set goals." The human would say: "This is the right thing to do because it conforms to my sense of morality." But it was a separate entity that defined the goals of the robot, and evolution that defined our sense of morality. How are the origins of these imperatives relevant to the entity that exists only to carry them out?
Re: Moral Dilemma
fuzzymillipede wrote: It would seem that the only way to debate like this is to first agree on something, like human rights, and argue from there.
I'd say you are exactly right. Every system of morality is based on subjective, prioritizing judgment calls. Individuals decide what ideals and principles they value over others, and morals are built as experience demonstrates what sort of actions advance those priorities and what sort of actions detract from them. When humans, as a society, are able to come together and agree upon priorities that they will pursue together, it then becomes possible to debate the best means of advancing those priorities. If no agreement on priorities can be reached, however, then you're stuck trying to convince each other that their priorities are wrong, and that yours are right - there's no way to do that logically - it's a subjective judgment call based on "value."
"As James ascended the spiral staircase towards the tower in a futile attempt to escape his tormentors, he pondered the irony of being cornered in a circular room."
-
- Youngling
- Posts: 64
- Joined: 2008-04-22 10:52pm
Re: Moral Dilemma
fuzzymillipede wrote: More like, in spite of moral relativism, how can you logically argue with someone who has a consistent system of morality which is different from yours? I have already accepted the concept of moral relativism to be true.
Well, basically you can't. But you can judge them, based on your moral system. I can't argue with a volcano, either, but I can get the hell away if I think it's dangerous.
Also, relatively few people have a consistent system of ethics, and you can point out inconsistencies to them. Not that you can expect rationality on something this close to religion, anyway.
Re: Moral Dilemma
fuzzymillipede wrote: There are plenty of choices that increase the well-being of one group while sacrificing another group. Some are considered to be moral, some are not. In some cases, the moral choice isn't even the one that causes a net reduction in suffering. Take for example, the care of a profoundly retarded person over many years. The person may be so mentally deficient that they lack sentience, but society as a whole will sacrifice some of their well-being (time, money, etc) to care for this person. What reason do we have to care for this person, other than that it makes us feel good inside? We have been programmed by evolution to care for the sick, but this natural urge can be "misguided," if we hold that an absolute moral truth is defined by a net increase of well-being for all sentient beings.
In the end, it always comes down to whether something just "feels right" or not. Do the ideals and rationale of the programmer matter to the robot? Do the societal reasons for the evolution of morality truly matter to an individual's sense of right and wrong? Both the robot and the individual are performing pre-programmed behavior. The robot would say: "This is the right thing to do because it conforms to my set goals." The human would say: "This is the right thing to do because it conforms to my sense of morality." But it was a separate entity that defined the goals of the robot, and evolution that defined our sense of morality. How are the origins of these imperatives relevant to the entity that exists only to carry them out?
Bullshit. There are certainly plenty of choices that result in one group having to sacrifice well-being for the sake of another; that's why there are ethical arguments at all. But the only discussion that needs to be had is the relative weighting of what's more important. Is that person's life more important than the happiness of these thousand people? Should this person have to spend most of their life caring for another so that person can live at all, with a reduced quality of life (like in your example)? But these distinctions only result in different schools of ethics, not in moral relativism. Once you've established these "rules", the answer is computable from the data.
What "feels right" is a construct of evolution which results in the continuation of the species, not a moral outcome. We don't leave the weak to die and we don't take every chance we can to procreate because our morality overrules our evolutionary drive. Society has "programmed" us, like robots, to know what's right and wrong in most cases. We can deconstruct this feeling and see why what feels right is also usually what's best for society.
I like pigs. Dogs look up to us. Cats look down on us. Pigs treat us as equals.
-Winston Churchill
I think a part of my sanity has been lost throughout this whole experience. And some of my foreskin - My cheating work colleague at it again
- Alyrium Denryle
- Minister of Sin
- Posts: 22224
- Joined: 2002-07-11 08:34pm
- Location: The Deep Desert
- Contact:
Re: Moral Dilemma
Quote: What "feels right" is a construct of evolution which results in the continuation of the species, not a moral outcome. We don't leave the weak to die and we don't take every chance we can to procreate because our morality overrules our evolutionary drive.
Um... No. I love how people have this caricature of evolution, and that the selective pressures present on a population automatically lead us to do things like leave the weak to die. We have evolved the basis of robust ethics via the evolution of empathy and compassion that you can see in humans as young as 18 months old (which is really just the fastest they can, you know... do anything).
From this, all of our ethical systems are constructed, along with the other selective pressures acting upon our behavioral phenotypes. Our moral intuition is the end result of pressures such as multilevel selection for group cooperation, sexually antagonistic selection, and intra-group competition. Cultural evolution comes in and fills in the details, but it cannot actually "override" evolution (it can trick it... but not override it. And a lot of what we consider unethical in other cultures is just the result of these factors combined with one sub-group having power over the other. FGM as a result of males having much more power than females and thus the ability to act to prevent cuckoldry without social restraint as an example).
Think of it this way:
If someone were to have evolved the ability to act in ways that were consistently counter to what increases their fitness (Not just get faulty information and miscalculation in a bet hedging strategy, but actually overriding evolution) they would be selected out of the system faster than a lemming that likes to BASE Jump.
Now, cultural evolution can trick some of these things. Because evolution via meme is not under the same genetic constraints (They can transfer horizontally in the population) group selection can work. The more successful memes (ethical memes are in this category) root themselves in cognitive processes that the individual holding the meme cannot override, and selection cannot decouple. In this way you get an interesting dynamic between the individual seeking a marginal fitness increase, and the rest of the group. The way we have responded to these memes is with the ability to very easily rationalize our choices to get rid of cognitive dissonance. This is where you get things like rape myths (she's asked for it, dressing all smutty like that) justifications for cheating (the relationship is shitty, he/she is two states away...) and rationalizations for certain other crimes (killing them would be doing the world a favor).
The simple fact is, whenever those come into play, your subconscious has already made the decision. The inverse is true. When you invoke ethics when making a non-contrived decision (read: a real moral decision, not one contrived in a thought experiment), your subconscious processes have already made the decision (there have been brain scans showing this, I will have to talk to Maybird for the references). Your conscious mind just needs to take a moment to convince yourself that it was the right one.
...
I won't even get into the idea that all formal ethics do is give you a convenient way to rationalize a decision you have already made.
GALE Force Biological Agent/
BOTM/Great Dolphin Conspiracy/
Entomology and Evolutionary Biology Subdirector:SD.net Dept. of Biological Sciences
There is Grandeur in the View of Life; it fills me with a Deep Wonder, and Intense Cynicism.
Factio republicanum delenda est
- fuzzymillipede
- Youngling
- Posts: 96
- Joined: 2005-03-17 03:05pm
Re: Moral Dilemma
Quote: Bullshit. There are certainly plenty of choices that result in one group having to sacrifice well-being for the sake of another; that's why there are ethical arguments at all. But the only discussion that needs to be had is the relative weighting of what's more important. Is that person's life more important than the happiness of these thousand people? Should this person have to spend most of their life caring for another so that person can live at all, with a reduced quality of life (like in your example)? But these distinctions only result in different schools of ethics, not in moral relativism. Once you've established these "rules", the answer is computable from the data.
What "feels right" is a construct of evolution which results in the continuation of the species, not a moral outcome. We don't leave the weak to die and we don't take every chance we can to procreate because our morality overrules our evolutionary drive. Society has "programmed" us, like robots, to know what's right and wrong in most cases. We can deconstruct this feeling and see why what feels right is also usually what's best for society.
Do you really think that the morality of a decision to the individual making said decision is a direct result of the ethical/logical deductions made by said individual? Rather, the deductions are just the individual weighing which outcome feels the "best." The moral decision will always be the one that feels "right."
Give me an example of a moral decision that doesn't feel "right" to the individual. Others may judge an individual's actions differently, but that is irrelevant to what the individual thinks of his decision. Take for example, the varying attitudes on porn. Some people feel that viewing porn is highly immoral, while others have no problem with it. Why? Because due to their societal programming, one group feels "bad" when they think about porn. Each group has their own rationale on why porn is immoral or not, but the individual's view on the morality of porn is ultimately decided by their gut feeling.
Luckily, people can be "reprogrammed" on these societal issues, but that requires the individual to cast away certain ideals they hold. It is impossible for an individual to make a decision that "feels right," and violate their societal/cultural ideals in the process, unless the decision feels so "right" for other reasons that it overrides the "wrong." They can use all the logic they want, but they won't feel that the decision is moral as long as it feels "wrong." This is why the logic used by one group to justify their morality often has no effect on another group's views.
Re: Moral Dilemma
Surlethe wrote: To play a little devil's advocate in the interest of arguing --
Samuel wrote: That is the definition of wrong- it increases suffering.
Why? Because you say so?
Yes. We can argue until we're blue in the face about the meaning of a word, but if we're using the same word to mean different things, we're using different words. Language is an arbitrary assignment of sounds or letters to objective reality. The words don't matter as long as both people know what they mean in a certain context.
Re: Moral Dilemma
Mandatory self promotion (and theists)!
http://talk.thinkingmatters.org.nz/2008 ... mment-1392
Ace Pace wrote: You and Samuel both do not understand what fuzzymillipede is asking (if I understand him correctly). He is asking how you argue against moral relativism. If we understand that morality is a conscious choice, how can you logically argue with someone who has a consistent system of morality which is different from yours?
Obviously, if another person believes the opposite, we kill them. It is what they would have wanted.
How can you make a coherent ethical system without basing it on that?
fuzzymillipede wrote: We have been programmed by evolution to care for the sick, but this natural urge can be "misguided," if we hold that an absolute moral truth is defined by a net increase of well-being for all sentient beings.
Well, we want the weak to be cared for because we could end up in their place. We take care of them because it could be us in that position, or someone we care about.
Of course, if there are no brain functions, we pull the plug.
fuzzymillipede wrote: Rather, the deductions are just the individual weighing which outcome feels the "best." The moral decision will always be the one that feels "right."
Heuristic- it is hard to do the full calculations.
fuzzymillipede wrote: Give me an example of a moral decision that doesn't feel "right" to the individual.
Huckleberry Finn.
fuzzymillipede wrote: Take for example, the varying attitudes on porn. Some people feel that viewing porn is highly immoral, while others have no problem with it.
Some people feel people who are different from them are bad and try to claim moral sanction. It doesn't mean their claim is a moral decision- just that they are feeling the urges in a similar place.
For example, who could hate radiant cg?
fuzzymillipede wrote: Luckily, people can be "reprogrammed" on these societal issues, but that requires the individual to cast away certain ideals they hold. It is impossible for an individual to make a decision that "feels right," and violate their societal/cultural ideals in the process, unless the decision feels so "right" for other reasons that it overrides the "wrong." They can use all the logic they want, but they won't feel that the decision is moral as long as it feels "wrong." This is why the logic used by one group to justify their morality often has no effect on another group's views.
People have to use their own ideals to overwrite the wrong one... aka realize that their ethics were inconsistent.
Feil wrote: Surlethe wrote: To play a little devil's advocate in the interest of arguing --
Samuel wrote: That is the definition of wrong- it increases suffering.
Why? Because you say so?
Yes. We can argue until we're blue in the face about the meaning of a word, but if we're using the same word to mean different things, we're using different words. Language is an arbitrary assignment of sounds or letters to objective reality. The words don't matter as long as both people know what they mean in a certain context.
You fool! Soon they will realize the English language is arbitrary!
Re: Moral Dilemma
Feil wrote: Surlethe wrote: To play a little devil's advocate in the interest of arguing --
Samuel wrote: That is the definition of wrong- it increases suffering.
Why? Because you say so?
Yes. We can argue until we're blue in the face about the meaning of a word, but if we're using the same word to mean different things, we're using different words. Language is an arbitrary assignment of sounds or letters to objective reality. The words don't matter as long as both people know what they mean in a certain context.
No, I don't think Samuel simply meant that "wrong" denotes "increased suffering"; he seems to have seized upon some variant of an ethical system based on the minimization of (human?) suffering. That's all good and well, but if the only criterion for its truth is "it is the consensus", as is the case in language, then it is quite possible for someone to self-consistently reject it. And that returns the argument to fuzzymillipede's original observation: "In this way, the only thing resembling "objective" morality is a group consensus on what is right and wrong."
Drawing an analogy to language doesn't help the point that morality is universally something like utilitarianism or secular humanism; billions of people self-consistently reject each others' linguistic definitions by the very nature of speaking different languages.
A Government founded upon justice, and recognizing the equal rights of all men; claiming no higher authority for existence, or sanction for its laws, than nature, reason, and the regularly ascertained will of the people; steadily refusing to put its sword and purse in the service of any religious creed or family is a standing offense to most of the Governments of the world, and to some narrow and bigoted people among ourselves.
F. Douglass
Re: Moral Dilemma
"Wrong" and "right" are words, and therefore they are language. Because they are language, they are arbitrary by definition. They can represent any number of objective observables, like how well an action conforms to the teachings of [insert holy book here], or how well an action conforms to what I like, or (as is the case with Samuel's super-simplification of utilitarianism) its net contribution to human suffering or happiness. Pointing out that the word's meaning is arbitrary and then jumping to the conclusion that the thing the word is representing is arbitrary is laughable if you actually think it through - correct me if I'm wrong, but this is what it seems like you're doing. Of course "right" and "wrong" are arbitrary. So is "red".Surlethe wrote:Drawing an analogy to language doesn't help the point that morality is universally something like utilitarianism or secular humanism; billions of people self-consistently reject each others' linguistic definitions by the very nature of speaking different languages.
Re: Moral Dilemma
Alyrium Denryle wrote: Um... No. I love how people have this caricature of evolution, and that the selective pressures present on a population automatically lead us to do things like leave the weak to die. We have evolved the basis of robust ethics via the evolution of empathy and compassion that you can see in humans as young as 18 months old (which is really just the fastest they can, you know... do anything).
snip
Sorry. I really didn't mean to pull the old "altruism is against evolution" crap out. I really should know better.
My intention was to argue against a purely evolved notion of morality- to show that what is genetically programmed into us is often not the ethical decision. I picked a very bad and fallacious example, sorry. Perhaps I should have picked non-sentient beings as an example. No one's going to argue that eating the offspring of rivals to increase the chance of survival of your own is ethical. Note, I'm not arguing that our ability and desire to make ethical choices was not evolved, merely that the "feels right" approach based on any source, evolution or indoctrination, can be flawed and as such is not a good way to measure morality.
I like pigs. Dogs look up to us. Cats look down on us. Pigs treat us as equals.
-Winston Churchill
I think a part of my sanity has been lost throughout this whole experience. And some of my foreskin - My cheating work colleague at it again
Re: Moral Dilemma
fuzzymillipede wrote: Do you really think that the morality of a decision to the individual making said decision is a direct result of the ethical/logical deductions made by said individual? Rather, the deductions are just the individual weighing which outcome feels the "best."
I never said people always go through the process of making logical deductions to make their decisions. I said it's the only way to defend such decisions.
fuzzymillipede wrote: The moral decision will always be the one that feels "right."
Give me an example of a moral decision that doesn't feel "right" to the individual.
How about one that doesn't feel right either way? Trolley car problem. 5 people tied to the track about to get hit by the train (trolley car in the original problem, hence the name). You can divert the train onto a separate line, but a small child is playing on that track and will get hit. What do you do? Neither option feels right, but I bet you could still debate out an answer, couldn't you?
fuzzymillipede wrote: Others may judge an individual's actions differently, but that is irrelevant to what the individual thinks of his decision. Take for example, the varying attitudes on porn. Some people feel that viewing porn is highly immoral, while others have no problem with it. Why? Because due to their societal programming, one group feels "bad" when they think about porn. Each group has their own rationale on why porn is immoral or not, but the individual's view on the morality of porn is ultimately decided by their gut feeling.
What you're saying here is that what feels right IS moral, and thus everyone has their own morality. Jack the Ripper thought he was being moral by killing prostitutes, so he was. Ditto for abortion clinic bombers. I'm saying, just like in these very obvious cases, there is an answer to whether porn (and everything else) is moral or not that can be debated and settled upon with an objective code of ethics, ie one that can be written down and adhered to.
fuzzymillipede wrote: Luckily, people can be "reprogrammed" on these societal issues, but that requires the individual to cast away certain ideals they hold. It is impossible for an individual to make a decision that "feels right," and violate their societal/cultural ideals in the process, unless the decision feels so "right" for other reasons that it overrides the "wrong." They can use all the logic they want, but they won't feel that the decision is moral as long as it feels "wrong." This is why the logic used by one group to justify their morality often has no effect on another group's views.
Not sure what you're saying here, but it sounds like you mean that if your actions conform to society's expectations, then you're moral. If so then the reverse should also apply: if your actions do not conform to society's expectations, then you're not moral. Which means there are still plenty of societies in the world where homosexuality is immoral. Sounds like a terrible way to judge morality.
I like pigs. Dogs look up to us. Cats look down on us. Pigs treat us as equals.
-Winston Churchill
I think a part of my sanity has been lost throughout this whole experience. And some of my foreskin - My cheating work colleague at it again