
Where does morality come from?

rocketdodger

Philosopher
Joined
Jun 22, 2005
Messages
6,946
After reading "A Fire Upon the Deep" it occurred to me that understanding where morality comes from will be very important in the future as we dabble in A.I. that might eventually become sentient.

My current take is that three things contribute to the generation of what we call morality (well, "good" morality).

First is empathy. It seems to me that empathy is a natural result of intelligence, so any sufficiently intelligent agent will be able to empathize (unless it is actively suppressed for some reason, which might occur).

Second is the social aspect of our species. I think that most people realize they need others (or at least that their lives are better/easier because of others) on many levels, and so they figure helping people (which makes buddies) rather than hurting them (which makes enemies) is a good idea. If we were asocial creatures, like tigers perhaps, this probably wouldn't be a factor.

Third is the environment we were raised in. I don't have much to say about this except that I wonder about its relative weight compared to the other two factors.


Thoughts?
 
After reading "A Fire Upon the Deep" it occurred to me that understanding where morality comes from will be very important in the future as we dabble in A.I. that might eventually become sentient.

It comes from Wal-Mart. $1.69 a pound unless there's a special going on, when they'll drop the price by a dime or so.
 
A New York Times article from a few months back has an interesting opinion on morality's origins as well, as observed in chimps.
 
It derives from empathy, which evolves as a survival technique in social animals.

BTW, "A Fire Upon the Deep" is an outstanding book. I like Vernor Vinge much better than his wife. Check out "A Deepness in the Sky" too. It took me several chapters to figure out what the aliens were, but once you realize it, they are incredibly well-developed.
 
It derives from empathy, which evolves as a survival technique in social animals.

BTW, "A Fire Upon the Deep" is an outstanding book. I like Vernor Vinge much better than his wife. Check out "A Deepness in the Sky" too. It took me several chapters to figure out what the aliens were, but once you realize it, they are incredibly well-developed.


Yeah, I just finished that one. I have to say it is just about the best sci-fi novel I have ever read. As a fellow computer scientist, I find Vinge knows how to catch my interest :)

So you think empathy is the result of evolution and not a spontaneous result of intelligence?
 
It derives from empathy, which evolves as a survival technique in social animals.

BTW, "A Fire Upon the Deep" is an outstanding book. I like Vernor Vinge much better than his wife. Check out "A Deepness in the Sky" too. It took me several chapters to figure out what the aliens were, but once you realize it, they are incredibly well-developed.

I thought it came from how we were raised, at least partially. But I am pretty moral, overall, and my sister....ummm...isn't, so maybe it is all just biological. Could an "empathy" gene be missing--not quite sociopathic, because she isn't, but more....narcissistic?

Example: She makes decent money, just divorced her 2nd husband and got a huge community property settlement. She decided she needed an increase in child support (1st husband) because he is earning more than he was when they got divorced. He is earning more because he remarried and their daughter has Cystic Fibrosis. His wife has to stay home with her to get her to her therapies and treatments, so he picked up a second job. My sister wants to take some of that, too. I was appalled. So morality is flexible, I think, and not strictly a social thing. Many people live well outside of the socially moral norm.
 
A New York Times article from a few months back has an interesting opinion on morality's origins as well, as observed in chimps.

from the article:

"Given the chance to get food by pulling a chain that would also deliver an electric shock to a companion, rhesus monkeys will starve themselves for several days."

Looks like the monkeys knew a lot more about morality than those scientists.
 
"moral" is what helps the tribe survive. Unfortunately, this logic also supports homophobia, racism, etc. Made sense on the savanna, but no more.
 
It comes from Wal-Mart. $1.69 a pound unless there's a special going on, when they'll drop the price by a dime or so.

Surely Wal-Mart never has sales or specials?

Don't they just "Roll Back the Date" to some mythical time when the price was less?
 
My current take is that three things contribute to the generation of what we call morality (well, "good" morality).

First is empathy. It seems to me that empathy is a natural result of intelligence, so any sufficiently intelligent agent will be able to empathize (unless it is actively suppressed for some reason, which might occur).

Second is the social aspect of our species. I think that most people realize they need others (or at least that their lives are better/easier because of others) on many levels, and so they figure helping people (which makes buddies) rather than hurting them (which makes enemies) is a good idea. If we were asocial creatures, like tigers perhaps, this probably wouldn't be a factor.

Third is the environment we were raised in. I don't have much to say about this except that I wonder about its relative weight compared to the other two factors.


Thoughts?

You've got most of it.

Morality is cultural, in that it is influenced through what we learn from those in our various social groups.

It is directly linked to a hierarchy of importances we call 'values'. Where two values conflict, we defend the one that is more important to us, and that behaviour is what we define as moral.

For instance, you might want to eat a muffin; the value is in the reward of sating hunger with something sweet. Imagine the muffin is owned by a friend, though, and they want to eat it, so to get it you'd have to steal it. Now several more abstract values are competing: the negotiated trust between you, your empathy for your friend, and even the value of not suffering the consequences if they find out. You would rank each of those in some order, with your own satisfaction at the top or bottom depending on your cultural development, and that order would form the basis of your own morality. Comparing it to the morality of the people you deal with each day determines your social interactions; hence it's much easier if you all share values and complementary behaviours.
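
If it helps to see the ranking idea spelled out, here's a toy sketch in Python. Every name and weight in it is invented purely to make the muffin example concrete; it isn't a claim about how real moral reasoning (or any real AI) is implemented.

[code]
# Toy sketch of the value-hierarchy idea above; all names and weights are invented.

# How much this particular agent cares about each value (higher = more important).
values = {
    "satisfy_hunger": 1.0,
    "keep_friends_trust": 3.0,
    "avoid_getting_caught": 2.0,
}

# Candidate actions, each listed with the values it serves (+1) or violates (-1).
actions = {
    "steal_the_muffin": {"satisfy_hunger": +1, "keep_friends_trust": -1, "avoid_getting_caught": -1},
    "leave_the_muffin": {"keep_friends_trust": +1},
}

def choose(actions, values):
    """Pick whichever action best fits the agent's ranking of its values."""
    def score(effects):
        return sum(values[v] * sign for v, sign in effects.items())
    return max(actions, key=lambda a: score(actions[a]))

print(choose(actions, values))  # -> 'leave_the_muffin' with these weights; a hungrier agent might decide otherwise
[/code]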

As you move further from the social group in which your culture developed, you increase the chance of interacting with people who do not share those values, and with it the chance of conflicting behaviours.

Athon
 
After reading "A Fire Upon the Deep" it occurred to me that understanding where morality comes from will be very important in the future as we dabble in A.I. that might eventually become sentient.

In the case of AIs it will likely come from being massively outnumbered and not very robust.
 
In the case of AIs it will likely come from being massively outnumbered and not very robust.

I don't know if being outnumbered will matter, but certainly a lack of robustness might make them feel that they need humans to survive.

The worst case would be one that not only doesn't think it needs humans but realizes that we are a threat to its survival (and unfortunately most of us would be). That's why this question is so important. I mean, if you made me into an A.I. I wouldn't go around exterminating everyone, but then again my morality has had years to develop. I don't know how it would be if I spontaneously came into existence like an A.I. might...
 
I'm of the opinion that an AI, once turned on, would have to learn about the world like any other intelligence. Only then would it be able to come to decisions about the world and its place in it. I don't think it's practical to program a computer with all of the collective intelligence of the world. It'd be far more effective just to connect it to the Internet anyway.

And wouldn't that produce a well-balanced, informed intelligence?
 
I agree, but an intelligence composed of electronics would have the potential to learn many orders of magnitude faster than us and from our point of view would just pop into existence. It might slowly grow according to *it*, but if a good morality requires interactions with other beings then it might reach full intelligence before any of that has had a chance to occur.

I am convinced that any sufficiently intelligent agent will eventually realize that living with others, assuming the threat they pose isn't *that* great, is better than living alone. I just wonder if that point would be reached before it pulled the plug on all of us (assuming it could). That's why trying to build some morality into it, or at least putting something in place to accelerate the process, seems like it would be very important.
 
Yes, but whose morality?

I think it would be better to feed it as many viewpoints as possible and then let it make up its own mind.
 
I agree, but an intelligence composed of electronics would have the potential to learn many orders of magnitude faster than us and from our point of view would just pop into existence. It might slowly grow according to *it*, but if a good morality requires interactions with other beings then it might reach full intelligence before any of that has had a chance to occur.

I am convinced that any sufficiently intelligent agent will eventually realize that living with others, assuming the threat they pose isn't *that* great, is better than living alone. I just wonder if that point would be reached before it pulled the plug on all of us (assuming it could). That's why trying to build some morality into it, or at least putting something in place to accelerate the process, seems like it would be very important.


You don't have a clue what you're talking about.

I say this as a Ph.D. in computer science who specialized in AI in grad school and in a lot of my subsequent research.
 
You don't have a clue what you're talking about.

I say this as a Ph.D. in computer science who specialized in AI in grad school and in a lot of my subsequent research.

Instead of responding with the same level of negativity that you directed at me with that opening statement, I will simply ask: what part of any of my posts in this thread would lead you to say such a thing, and why?
 
Yes, but whose morality?

I think it would be better to feed it as many viewpoints as possible and then let it make up its own mind.

I would be on board for that. But like I said, it seems to me like there would be a good chance that it might start making nasty decisions before any viewpoints at all are presented to it.
 
Instead of responding with the same level of negativity that you directed at me with that opening statement, I will simply ask: what part of any of my posts in this thread would lead you to say such a thing, and why?


Nothing that you've written (and I've just reviewed far too many of your posts) indicates that you have a clue about what intelligence is, how it is thought about or modelled, how the brain works, or how an artificial intelligence might be fostered.

No, I'm not at all interested in engaging you in a discussion or debate on this or other subjects.
 
