
A Rational Definition of Morality

GreyICE
Is it possible to arrive at a rational definition of morality that you can use to judge situations?

I'm asking because I don't personally believe there is a rational basis for morality. I have a code of personal ethics, and I believe it is possible to determine actions that, as a society, will have a positive or negative impact upon the society, but I can't figure out a rational way to create moral judgments.

So here's my question: Is there any way we can judge any subset of human actions - murder, necrophilia, charitable donations, homosexuality, idol-worship, volunteer work, saving someone's life, whatever - right or wrong without positing a force greater than ourselves that establishes the criteria for this right or wrong?

P.S. Not really looking for Theists' perspectives on the matter.
 
No, I don't think so.

Probably the best I have heard is a sort of reworking of Kant's schema. Take the ends principle and extend it into the idea of a moral sphere where everyone works according to the ends principle. If someone breaks the moral sphere and begins to treat someone else as a means and not an end, then we are morally justified in not respecting that person's wishes.

So, when the Nazis come knocking at the door I can still blithely lie to their faces about the Jewish family I have hidden in the basement.

I still don't think it helps much when there are five workers on one track and one worker on the other track and a train headed their way.
 
You might want to look into Objectivism. It's Ayn Rand's secular philosophy positing a series of rational moral ideas.
 
Well, there have been attempts by philosophers, but basically, there is always something you have to 'just assume' to be good or bad.

For instance, utilitarianism is rational in a way, but you still need to insert some basic values into the system for it to be usable.
 
Is it possible to arrive at a rational definition of morality that you can use to judge situations?

I'm asking because I don't personally believe there is a rational basis for morality. I have a code of personal ethics, and I believe it is possible to determine actions that, as a society, will have a positive or negative impact upon the society, but I can't figure out a rational way to create moral judgments.

So here's my question: Is there any way we can judge any subset of human actions - murder, necrophilia, charitable donations, homosexuality, idol-worship, volunteer work, saving someone's life, whatever - right or wrong without positing a force greater than ourselves that establishes the criteria for this right or wrong?

P.S. Not really looking for Theists' perspectives on the matter.
What you propose as an alternative isn't really "moral" either, or at least violates ethics as I see it. Plus, you run into the problem of zero evidence for any sort of force that dictates rules.
 
So here's my question: Is there any way we can judge any subset of human actions - murder, necrophilia, charitable donations, homosexuality, idol-worship, volunteer work, saving someone's life, whatever - right or wrong without positing a force greater than ourselves that establishes the criteria for this right or wrong?

Morality based upon force is no morality at all: it's an ultimatum or contract of some sort. Morality is a personal recognition -- not an externally dictated rule.
 
IMO, the best we can do is to understand why we have a moral sense in the first place and what that moral sense is and then work from there.

If you haven't read it, I highly recommend Shermer's The Science of Good and Evil.
 
I'm a "market force" kind of guy. Meaning, if you try to kill me, I'll stop at nothing to stop you - I'm willing to bet my life against your act. Ditto if you try to kill a loved one. Is murder 'bad' from a rational point of view? Dunno, don't care, but I'll kill you if you try it. I just won't put up with it, and I'm not going to sit around in an armchair debating the merits. You don't get to kill me or my family, period.

Take it down a step, say freedom. I can see fighting a war to protect my freedom. It's not quite as important and immediate as a murder, so we will try diplomatic means, etc., but if it comes down to it, ya, I'll kill you if you try to irrevocably remove my freedoms.

Take it down several more steps, and I'll sit around in an armchair and argue whether it is moral to lie. I'll do things like enact laws, pay a police force, etc., to keep you from lying (such as, say, false advertising, cooking your books, etc.), but I certainly won't kill you for it.

Etc. Essentially you end up with a marketplace, where what is right is more or less determined by what the society as a whole is willing to do about something. I recognize the possibility for abuse or failure here - honor killings are very immoral to me, but not in some cultures. But I don't see an alternative, any way to develop a rational set of morals that everyone agrees with. In time, as people become educated about various ways of living, I trust that we will settle upon a set of guidelines that works for the vast majority of people.
 
The way I see it, the fundamental basis for morality should be empathy. I think that it's immoral to cheat a person because I realize that I too am a person. If it is okay to cheat people, then it's okay to cheat me. If you believe it's okay for you to kill another person for economic gain, then you're saying that it is okay for you to be killed for the same reasons.
 
I believe we don't need such a definition (nor a religious one, btw). I believe that our morality is already encoded in our biology and shaped (to a lesser extent) by our culture. Now, note that I also believe we will not have an answer unless we could create a society from scratch.

Every individual we know is already shaped; what we see is the behavior resulting from the combination of biological and cultural (expressed and hidden) rules. In order to explore your questions we would have to choose certain individuals (the less culturally shaped, probably only cultural skeptics :p), take them to an isolated situation (an island), and then let them interact (without any possible contact with current civilization) for a fairly long period of time.

As control groups we would need at least two more: one composed of criminals (not sociopaths or serial killers) and a second of "normal individuals" (if such a thing exists), like maybe woos who believe they behave morally because of hell.

In the end, I also believe that we can BELIEVE a lot of things regarding why we behave the way we do, and all of them will be wrong. Take for instance "freedom". It is a key concept for every society that also believes in two others: "responsibility" and "morality". And I believe there is no such thing; we have this feeling of being free, but in reality our biology (and to a lesser extent our culture) "chooses for us".
 
What you propose as an alternative isn't really "moral" either, or at least violates ethics as I see it. Plus, you run into the problem of zero evidence for any sort of force that dictates rules.
Morality based upon force is no morality at all: it's an ultimatum or contract of some sort. Morality is a personal recognition -- not an externally dictated rule.
Well, consider the system proposed by Isaac Asimov in Foundation - a system where a 'higher intelligence' has created a set of rules to live by. The higher intelligence intends only good in the long run for those following the Foundation, so it becomes moral to follow the dictates of the higher intelligence, and immoral to oppose them. Foundation is really entirely about religion and the nature of higher intelligence.
AkuManiMani said:
The way I see it, the fundamental basis for morality should be empathy. I think that it's immoral to cheat a person because I realize that I too am a person. If it is okay to cheat people, then it's okay to cheat me. If you believe it's okay for you to kill another person for economic gain, then you're saying that it is okay for you to be killed for the same reasons.
Morality is extended personal ethics? An interesting idea, but not necessarily one I'd endorse. For instance, I wouldn't particularly like to be cheated, and I don't cheat others, but where do you draw the line? Most of us would be willing to take a person's money for something they don't really need if they wanted to give the money to you. How far a stretch is it from taking someone's money for a product or service they may or may not need (because they insist on buying it) to televangelism? And how far from there to the shell game on the street?

I believe we don't need such a definition (nor a religious one btw). I believe that our morality is already encoded in our biology and shaped (to a lesser extent) by our culture. Now, note that I also believe we will not have an answer unless we could create a society from nowhere.

I don't believe in any morality. I don't think there's a morality gene, or hardwired instincts that certain things are immoral. There are probably various genes and traits we have hardwired that enhance survival, but morality can't simply be about propagation of your personal genetic code. If it was hardwired into me, wouldn't I lack a choice about believing it?

IMO, the best we can do is to understand why we have a moral sense in the first place and what that moral sense is and then work from there.

If you haven't read it, I highly recommend Shermer's The Science of Good and Evil.

Damn, I'm going to have to pick that up. Looks interesting.
 
Morality is extended personal ethics? An interesting idea, but not necessarily one I'd endorse. For instance, I wouldn't particularly like to be cheated, and I don't cheat others, but where do you draw the line? Most of us would be willing to take a person's money for something they don't really need if they wanted to give the money to you. How far a stretch is it from taking someone's money for a product or service they may or may not need (because they insist on buying it) to televangelism? And how far from there to the shell game on the street?
How does empathy not help against this?
 
How does empathy not help against this?

"If I was that stupid I'd deserve to lose my money." That help you understand?

If we assume absolute sympathy for others, then we develop other problems. If we assume limited sympathy, we have to account for the fact that people consider themselves superior (yes, the vast majority consider themselves above average) so they see no problems with doing things that they'd never fall for, or which they'd see through. Self-justification, a wonderful thing.
 
"If I was that stupid I'd deserve to lose my money." That help you understand?

If we assume absolute sympathy for others, then we develop other problems. If we assume limited sympathy, we have to account for the fact that people consider themselves superior (yes, the vast majority consider themselves above average) so they see no problems with doing things that they'd never fall for, or which they'd see through. Self-justification, a wonderful thing.

Did you mean to use sympathy in that reply? You are responding to a point made about empathy. They are not identical.

If you meant

"If we assume absolute empathy for (with?) others, we develop other problems"

I will agree with you, since an absolute empathy interferes with rational self-interest.

DR
 
Morality is whatever you decide it is. It's entirely subjective.

While morality is heavily subjective, I would say that some systems of morality have more objective value than others -- i.e., they leave fewer dead/miserable people if followed.
 
Did you mean to use sympathy in that reply? You are responding to a point made about empathy. They are not identical.

If you meant

"If we assume absolute empathy for (with?) others, we develop other problems"

I will agree with you, since an absolute empathy interferes with rational self-interest.

DR
I apologize. I did mean empathy. And the problem I see with an empathy-based system is exactly what you said: at some point empathy is going to run smack into self-interest. Since empathy is abstract, and self-interest is very real (for the self-interested person), the system breaks.

Now we have three personal options at this point. The first is to declare that you broke the system, and that you are penitent. Then the question arises, 'would you do it again?' Since the answer is usually yes, the system becomes absolute when applied to others (no self-interest) but relative when applied to yourself - slightly hypocritical, to say the least (and yes, we see it all the time). The second option is to say that it wasn't wrong, and adjust your morality. Then morality becomes secondary to best interest. The third option is to say that it was wrong to break it, and resolve that obeying personal self-interest over abstract morality is wrong. That is the 'enlightened' view, but it runs into implementation problems in the real world.
 
Is it possible to arrive at a rational definition of morality that you can use to judge situations?


Does a rational definition have to give universal/user-independent results, or is it enough if the method is objective?
 
Does a rational definition have to give universal/user-independent results, or is it enough if the method is objective?

I'd say that the results for two people in similar situations would have to be reasonably similar. I mean, the definition "It is moral to do whatever you want to do, and immoral to do anything you don't want to do" is completely objective ("Do what thou wilt shall be the whole of the law") while giving you very little consistency. I don't think that's a moral code, in the common vernacular.

Morality needs to be applied to situations to be morality. Otherwise it's personal ethics.
 
