Criteria for Morality.

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Criteria for Morality.

Post by Boyish-Tigerlilly »

Help if you wish: I am working, in ethics, on some ideas about morality, and I am trying to come up with some practical rules/criteria for a moral theory based on rationality.


How are these, and should I add anything, in your opinion? They are meant to be used in concert with one another, or in a hierarchy like prima facie duties.



1. Principles of Non-necessity and Necessity
2. Auxiliary Mandate: Are there other options for accomplishing the same goal, or one which proves less economically/physically damaging?

3. Practicality: Is the moral maxim or other available option practically doable, economically or physically? Something is useless if it can't be realistically achieved.

4. Cultural Comparison and Purpose of Morality: Morality is an adaptation and a tool; if it does more harm than good on a macro scale, one usually shouldn't do it, unless there is a mitigating circumstance (i.e., an emergency).

If a moral maxim, applied on a macro scale, causes damage (fear, paranoia, needless deaths), you shouldn't follow it, because logically, what would be the point of a moral system everyone could apply if it prevented a culture from providing for its basic function? It would irrationally run counter to the purpose of a moral system/culture. (This is why I think it is generally good to have a system of justice/rights: while it might prevent citizens from pursuing "ultimate" happiness in the short run, it prevents problems in the long run.)


The goal is to create a responsible, rational, and logical argument for right/wrong. Oh, and I will give name credit in the credits of my work to anyone who wants something in, or who thinks they have a logical idea.
Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Post by Darth Wong »

At some point you're going to have to simply ask that the reader accept your proposed "purpose of morality" (whatever you decide it is) as self-evident. As perverse as it is, there is no real way to prove logically that "promote the general welfare of humanity and minimize suffering and injustice" is intrinsically superior to "promote the prosperity of the fittest in society and ruthlessly crush the rest" as an overriding goal, even if it seems manifestly obvious to you.
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing

"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC

"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness

"Viagra commercials appear to save lives" - tharkûn on US health care.

http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

Darth Wong wrote:At some point you're going to have to simply ask that the reader accept your proposed "purpose of morality" (whatever you decide it is) as self-evident. As perverse as it is, there is no real way to prove logically that "promote the general welfare of humanity and minimize suffering and injustice" is intrinsically superior to "promote the prosperity of the fittest in society and ruthlessly crush the rest" as an overriding goal, even if it seems manifestly obvious to you.
That was my primary fear. I am worried about making sweeping assumptions, but I don't think I will be able to completely avoid it. As for my purpose of morality, I just cannot see a tool as intentionally being maladaptive. Who would create a system of values which actively seeks to do the most damage, for example?

I guess I could go from a naturalistic or cultural perspective: it has been shown, through observation, that the purpose of culture is survival and flourishing as a species. But then I am afraid of the naturalistic fallacy.
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

Would humanity be seen as a more socially competitive or cooperative species? It seems like things work out more when people cooperate. I also took a look at some other primate groups, like the bonobo, and it seems that way for them too.
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space

Post by Kuroneko »

Boyish-Tigerlilly wrote:1. Principles of Non-necessity and Necessity
Hn?
Boyish-Tigerlilly wrote:2. Auxiliary Mandate: Are there other options for accomplishing the same goal, or one which proves less economically/physically damaging?
However, what kind of goals are worthy of pursuit in the first place?
Boyish-Tigerlilly wrote:3. Practicality: Is the moral maxim or other available option practically doable, economically or physically? Something is useless if it can't be realistically achieved.
Surely not! A partial success is often much better than nothing! For example, eliminating poverty is something that is not economical or practical. However, that does not mean that efforts to reduce it should not be made. A more colorful example is the penal system, which fails to eliminate murder.
Boyish-Tigerlilly wrote:4. Cultural Comparison and Purpose of Morality: Morality is an adaptation and a tool; if it does more harm than good on a macro scale, one usually shouldn't do it, unless there is a mitigating circumstance (i.e., an emergency).
This does not seem to be substantially different from restricted utilitarianism.
Darth Wong wrote:As perverse as it is, there is no real way to prove logically that "promote the general welfare of humanity and minimize suffering and injustice" is intrinsically superior to "promote the prosperity of the fittest in society and ruthlessly crush the rest" as an overriding goal, even if it seems manifestly obvious to you.
It may be possible as an analogue of Rawls' argument for egalitarianism. With no more axiom than 'bias is bad', ask the person to mentally remove themselves from their specific context, and ask a question along the lines of: 'what would you prefer if you were to exchange places with a completely random person in the world?' On the other hand, it is not at all clear whether an egalitarian distribution of rights is actually optimal under this argument. Under strict probabilistic rules, the rational thing to do would be to maximize expectation, and that might allow extreme skewing in order to push the average happiness (or whatnot) up.
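The expectation-maximization worry can be made concrete with a toy sketch (all numbers are invented for illustration, and the square root is just a stand-in for a risk-averse utility function, not anything from Rawls):

```python
# Toy model of the veil-of-ignorance argument: you will be dropped into a
# society as a uniformly random member. Numbers are invented for illustration.
egalitarian = [5] * 10       # everyone equally well off
skewed = [100] + [0] * 9     # one person very well off, nine destitute

def expected_welfare(society):
    """Expected welfare of a random member (risk-neutral expectation)."""
    return sum(society) / len(society)

def risk_averse_welfare(society):
    """Expected utility with diminishing returns (sqrt as a toy utility)."""
    return sum(w ** 0.5 for w in society) / len(society)

# A strict expectation-maximizer behind the veil prefers the skewed society...
print(expected_welfare(egalitarian), expected_welfare(skewed))        # 5.0 10.0
# ...while even mild risk aversion flips the preference back.
print(risk_averse_welfare(egalitarian), risk_averse_welfare(skewed))  # ~2.24 1.0
```

This is exactly the skewing problem: under risk-neutral expectation, extreme inequality can "win" whenever it pushes the average up, so the egalitarian conclusion depends on an extra assumption (such as risk aversion) beyond 'bias is bad'.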


Oh, and as far as that debate, I'll get to it by tomorrow morning at the latest. Apologies for the delay.
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

Kuroneko wrote:This does not seem to be substantially different from restricted utilitarianism.
Well, I like the basic premise of utilitarianism, but I don't like the whole happiness idea. I think that's wrong. I think it should be something more substantial and less subjective. But since humans are largely social creatures, and culture is important, there has to be something which, outside of emotions or feelings, can be used to judge safety, health, and stability, and I think trying to act the most logical/rational can help, when combined with intensive science/math/logic training.
Kuroneko wrote:However, what kind of goals are worthy of pursuit in the first place?
Personally, I would see to it that people are mentally, physically, and economically stable in general. Everyone, to the best of our abilities, should be provided with the basic necessities, since culture's purpose is to provide for the bio/integral and instrumental needs of a civilization. This does not mean that they get hand-outs if they have alternatives.

I think the most worthy goals are learning, safety, and health. I just want to avoid the subjective emotional aspects and keep morality closer to something that can be calculated more easily. Some of the above are sketchy, but I think overall health and stability can work with other tools. Still thinking of stuff.

Kuroneko wrote:Surely not! A partial success is often much better than nothing! For example, eliminating poverty is something that is not economical or practical. However, that does not mean that efforts to reduce it should not be made. A more colorful example is the penal system, which fails to eliminate murder.

When I say practicality is a criterion, I am not saying drop everything, but only do what is reasonably within reach. I try to be a realist and a "minimally decent citizen." I am not gonna be Mr. Charity, and I don't think I am gonna be immoral for it. The idea of the greatest good, I believe, is taken too far. You can never do good enough if you are always trying to do the greatest for the greatest number.

My basic idea was to only worry about that which can be realistically achieved, and not to overdo it. For instance, I would think it practical to alleviate some problems with poverty, but trying to do anything and everything, or expecting to complete the task, is impractical. Don't worry about things you have no realistic control over. What is practical is what you are capable of doing while not hurting yourself needlessly in the process. People have to have some self-interest, at least to a degree.

I especially think you shouldn't sacrifice your own life unless you have a reasonable chance to save a number of people greater than or equal to your own, or people who are of higher value, because exchanging one life for another is tricky. At least then the cost-benefit would be positive.

Kuroneko wrote:A more colorful example is the penal system, which fails to eliminate murder.
Good example. I think the penal system is horribly inefficient. It could be better, but many options are totally impractical. They really need to look at auxiliary options and compare them to find the most practical approach to fixing the problem. If punishing criminals by execution doesn't prevent more murders/crimes as a deterrent, then what are the practical options? Life in prison is one. People have to choose what's doable. A radical new change in the system might be wonderful, but if no one would tolerate it or follow it, there is little point.

Boyish-Tigerlilly wrote:
1. Principles of Non-necessity and Necessity

Hn?
Basically, this idea is meant to work in tandem with other principles. If you look at a situation, I believe you can divide choices/items into things that you need to do and things that you don't need to do, or that are just a convenience. I don't think convenience should be an excuse to do something.

For example: theft. Theft, in general, I would consider immoral because it violates the basic function of a culture. Now, there is no rational reason, other than selfish want or desire, to steal a non-necessity item, so I can't see how it could ever be moral, especially since there are other ways to get the same object. Get a job and stop being a leech.

Necessities, on the other hand, are things like food, water, medicine, or basic shelter.

In modern society, there are many auxiliaries. You can acquire basics by many means: your own job and money, insurance plans, social security, welfare. But if you can't get them through these means, and no family members can help you, then what are you left to do? Starve? Die? I think not. I think this minority of people, if they cannot get jobs, if they cannot get money, despite the availability, would be OK to steal it. Of course, this is an extreme, so I would suppose it should do the least amount of damage possible.

Of course, this isn't a license for anyone to steal, since this is only geared toward rare minority cases. It should do the least damage possible, if it happens at all.

1. Are there other options, and is what you need a necessity? If not, survival is better than death, and so is social stability; since the group doing this would be a minority, and the majority wouldn't be allowed to do this, it should do a minimum of damage.

2. Never steal a non-necessity, because I can't see there being a rational reason for it, and it would lead to far more social damage.


I think the principle of necessity can be combined with auxiliaries and the rest to form opinions on euthanasia, suicide, and abortion.
Nova Andromeda
Jedi Master
Posts: 1404
Joined: 2002-07-03 03:38am
Location: Boston, Ma., U.S.A.

Post by Nova Andromeda »

--Darth Wong is right in that you need to start from somewhere before you can use logic to get anywhere. If I were you I would strictly define what you mean by right and wrong. Once you have done that you can then go about analyzing which actions are right and which are wrong.
-As a side note, you could start from the observation that people have a "goal." Strictly speaking, they receive input and transform it into output. The nature of this transform is the goal I'm referring to, and it includes everything a person wants (plus other stuff, such as involuntary actions). One could start their analysis from this direction. It might also help to know that a person only does something because they want to, because they are forced to, or by accident.
Nova Andromeda
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

What confuses me is how ethicists come up with stuff without sounding like they completely make shit up. Many theories I have read seem totally arbitrary.
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

Take utilitarianism: they say the most useful is the most happiness, but I always thought of "utilitarian" as meaning the most efficient and the most practical for a situation/thing. That seems more in sync, to me, with the definition of utility.
Nova Andromeda
Jedi Master
Posts: 1404
Joined: 2002-07-03 03:38am
Location: Boston, Ma., U.S.A.

Post by Nova Andromeda »

Boyish-Tigerlilly wrote:What confuses me is how ethicists come up with stuff without sounding like they completely make shit up. Many theories I have read seem totally arbitrary.
--It's not like ethicists are necessarily trained in logical thought, and let me point out that most morality systems are not logically thought out at all. Some, if not many, are rationalized after the fact. Add to this the fact that morality systems (the vast majority) start from some base set of assumptions, and I'm not surprised people just make s*#$ up.
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space

Post by Kuroneko »

Boyish-Tigerlilly wrote:Take utilitarianism: they say the most useful is the most happiness, but I always thought of "utilitarian" as meaning the most efficient and the most practical for a situation/thing. That seems more in sync, to me, with the definition of utility.
The most striking feature of utilitarianism is that one can redefine 'utility' with little change to the overall structure of the system. For example, cat-utilitarianism would hold that the most morally preferable alternative is the one that maximizes the number of cats. Very silly, but...

As for happiness in particular, I agree completely--it's far from enough by itself. Would you rather be an unhappy Socrates or a contented pig? Which one does the utilitarianism of Bentham or Mill say should be preferable?
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

Yes. Personally, Mill's seems more refined and useful, but it needs further modification: something more concrete, yet non-Kantian. :D

I still don't get Kant's justification of capital punishment.
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space

Post by Kuroneko »

Boyish-Tigerlilly wrote:I still don't get Kant's justification of capital punishment.
It parallels the universalizability of maxims. Basically, those who commit a moral crime should be treated according to the maxims of their own actions while still preserving their dignity. Thus, to respect a murderer, one must execute him. Leaving him unpunished, on the other hand, would effectively say that he is not a moral agent (things are not punishable; people are), and thus disrespect his humanity.
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

I read in my textbook that it deprives people of dignity by killing them. It is the ultimate removal of dignity.


He said you should treat a crime with a punishment equal to the crime, so that justice cannot stay unbalanced. But wouldn't that be using a person as a means to an end? To achieve balance in justice?

Also, he seems to disagree with the concept of relative punishment, and then agree with it, since he says torture for torture would be wrong.

Ah well. He is confusing. I like Mill better. Kant is overly obtuse and esoterically absolute.
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space

Post by Kuroneko »

Boyish-Tigerlilly wrote:I read in my textbook that it deprives people of dignity by killing them. It is the ultimate removal of dignity.
However, it is substantially different in the case of the murderer. Things are not morally responsible, but people are. By not punishing, we would effectively deny that such a person is even a person. The 'relative justice' comes in because the best punishment is according to the maxim of the criminal's own actions.
Boyish-Tigerlilly wrote:He said you should treat a crime with a punishment equal to the crime, so that justice cannot stay unbalanced. But wouldn't that be using a person as a means to an end? To achieve balance in justice?
No, because justice is a purely metaphysical concept.
Boyish-Tigerlilly wrote:Also, he seems to disagree with the concept of relative punishment, and then agree with it, since he says torture for torture would be wrong.
That would be immoral, because in such a case the retributionary torturer makes himself inhuman. The point is not that relative punishment is denied, but that it is limited so as not to increase immorality.
Boyish-Tigerlilly wrote:Ah well. He is confusing. I like Mill better. Kant is overly obtuse and esoterically absolute.
Perhaps. I would say that Mill is overly simplistic.
Boyish-Tigerlilly
Sith Devotee
Posts: 3225
Joined: 2004-05-22 04:47pm
Location: New Jersey (Why not Hawaii)

Post by Boyish-Tigerlilly »

Ahh, thanks. Now his position makes a bit more sense.


I am not against CP, however, as many utilitarians are. I think it would be right, but only if they could find a way to make it more cost-effective, more productive of revenue, or more practical.
Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Post by Darth Wong »

One of the biggest problems with ethics in general is the fact that people already know what they consider to be moral, so you tend to end up with what rather obviously appear to be various people's struggles to generate some kind of ad hoc explanation for a conclusion that is usually pre-ordained.
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space

Post by Kuroneko »

Boyish-Tigerlilly wrote:I am not against CP, however, as many utilitarians are. I think it would be right, but only if they could find a way to make it more cost-effective, more productive of revenue, or more practical.
Remember that you're also under no obligation to accept every one of Kant's applications of his own principles. Some of them are rather... iffy.
Darth Wong wrote:One of the biggest problems with ethics in general is the fact that people already know what they consider to be moral, so you tend to end up with what rather obviously appear to be various peoples' struggles to generate some kind of ad hoc explanation for a conclusion that is usually pre-ordained.
There is certainly a lot of that, but attitudes towards ethics are far from homogeneous. There are even some with the extreme opposite kind of view--they accept every consequence of an ethical system unquestioningly, regardless of what their prior moral intuition told them (if anything).
Frank_Scenario
Padawan Learner
Posts: 155
Joined: 2002-11-10 12:23am

Post by Frank_Scenario »

Darth Wong wrote:One of the biggest problems with ethics in general is the fact that people already know what they consider to be moral, so you tend to end up with what rather obviously appear to be various people's struggles to generate some kind of ad hoc explanation for a conclusion that is usually pre-ordained.
Kuroneko wrote:There is certainly a lot of that, but attitudes towards ethics are far from homogeneous. There are even some with the extreme opposite kind of view--they accept every consequence of an ethical system unquestioningly, regardless of what their prior moral intuition told them (if anything).

I tend to think that the role of moral theory is to clarify and systematize our pre-theoretic intuitions. When it comes to ordinary events, our moral intuitions will work just fine; it's when we start getting to the extraordinary that we run into problems and need to reference some system. For example, we know that rape is wrong, and any theory on which rape was permissible (or worse, obligatory or laudable) would immediately be dismissed. In this way, our intuitions can act as the final test for a theory. In this scenario, our intuitions are informed by the whole of human history and moral thought. However, there are plenty of issues where our intuitions aren't well-formed. Cloning is one such issue; it simply wouldn't have been possible to engage in substantive discussion about cloning until recently. Abortion is another; while it's existed for some time, and been the subject of widespread disapproval, our understanding of what goes on in pregnancy and of what reproductive rights women have has changed. This means that our pre-theoretic intuitions probably won't be sufficient to act as the final test of the theory. Therefore, if an otherwise uncontroversial theory would tell us that abortion is acceptable (or unacceptable) and we think we intuitively disagree, we ought to abandon our intuitions.

Unfortunately, I have trouble formalizing these notions. I don't really specialize in ethics, so I don't keep up on the terminology and the state of the art.
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space

Post by Kuroneko »

Frank_Scenario wrote:
Kuroneko wrote:There is certainly a lot of that, but attitudes towards ethics are far from homogeneous. There are even some with the extreme opposite kind of view--they accept every consequence of an ethical system unquestioningly, regardless of what their prior moral intuition told them (if anything).
I tend to think that the role of moral theory is to clarify and systematize our pre-theoretic intuitions. When it comes to ordinary events, our moral intuitions will work just fine; it's when we start getting to the extraordinary that we run into problems and need to reference some system. For example, we know that rape is wrong, and any theory on which rape was permissible (or worse, obligatory or laudable) would immediately be dismissed. In this way, our intuitions can act as the final test for a theory.
I had unrestricted utilitarianism in mind. Although such situations would be quite extraordinary, it is possible for there to be situations in which rape is morally obligatory or even supererogatory under such a system, regardless of desert. For many people, myself included, pre-theoretic moral intuition includes the principle that despicable acts should not be put upon the undeserving. Such extraordinary cases are at best a necessary evil, but never a moral act.
Frank_Scenario wrote:Abortion is another; while it's existed for some time, and been the subject of widespread disapproval, our understanding of what goes on in pregnancy and of what reproductive rights women have has changed. This means that our pre-theoretic intuitions probably won't be sufficient to act as the final test of the theory. Therefore, if an otherwise uncontroversial theory would tell us that abortion is acceptable (or unacceptable) and we think we intuitively disagree, we ought to abandon our intuitions.
A problem comes in when there are two competing theories that are for the most part extensionally equivalent but differ in a few cases--say, on abortion. The problem is then how to decide between the two theories without begging the question on some of those cases.