
Where does morality come from?

There's another great article (if no one minds me bringing up monkeys again, and really, who would?) that I refer to all the time with my friends whenever we start to complain about some figurehead.

We are only able to empathize with a limited number of people, those we consider part of our tribe, or as the article calls it, our Monkeysphere. Those within our Monkeysphere are people; those outside it are not really complete people, and are that much easier to hate and, sometimes, kill. We only care about our own Monkeysphere; morality comes from taking care of our brother monkeys.

That sounds reasonable. I would take it a step further: the more people you can conduct beneficial interactions with, the more likely you are to survive. If your monkeysphere is too small, you limit the resources available to you and make survival less likely, so the smaller your monkeysphere, the less likely that group of monkeys will be around in the future. Who you consider your brother monkeys, then, determines how likely your group is to survive.
 
I guess what is throwing me off is your use of the term "gene." I don't know that genes do much more (mentally) than architect the raw material for our minds and provide us with some predispositions.

I think Soapy Sam is using the same definition of "gene" that Dawkins used in The Selfish Gene. Essentially, the smallest unit of inheritance, rather than a specific sequence of nucleotides. Dawkins' extended definition includes things like genes for beaver dams and genes for adultery.

Rocketdodger- (dangerous hobby, that) Arthwollipot said it in one. In "The Selfish Gene" and (even more) in "The Extended Phenotype", Dawkins makes the case that genes have profound effects behaviourally. I've agreed with this view for so long I forget some folk may not be familiar with it, or if familiar, may not share it.
Genes and environment interact all the time. That's what Natural Selection is, after all. Evidence from the study of genes in medicine shows that these interactions are complex and two-way. (For example, one may have a genetic predisposition to an illness which only actually occurs given exposure to the right (wrong?) environmental influence.)
As for health, so for behaviour - and even for the propensity to imitate the behaviour of others - a universal human characteristic which is absolutely innate.
If you are not familiar with those two books, I strongly recommend you read them. They are both highly informative and extremely enjoyable.
 
That sounds reasonable. I would take it a step further: the more people you can conduct beneficial interactions with, the more likely you are to survive. If your monkeysphere is too small, you limit the resources available to you and make survival less likely, so the smaller your monkeysphere, the less likely that group of monkeys will be around in the future. Who you consider your brother monkeys, then, determines how likely your group is to survive.

According to a study mentioned within the article, our Monkeysphere maxes out at 150 monkeys. We can recognize 150 other people as actual fully dimensional human beings. We should be living in troops of 150, and we'd probably get along a whole lot better.

Now in our little troop, we all share the same God or Gods, we have a set of rules that we all agree on, which all more or less boil down to 'Do unto others as you would have them do unto you.' No harming anyone else in the troop, no letting anyone else harm the troop.

In the next jungle over, there is another troop of 150. Those are just monkeys; they aren't fully human. They worship some other god. Their laws say they should take care of themselves, too. No murder, no stealing, no rape. But because of some odd series of coincidences, they've also outlawed the color blue as offensive, and anyone who wears the color blue brings bad luck and should be killed on sight. Thus, those who wear the color blue are outsiders, a threat to the troop, and for the safety of these 150 monkeys, it's perfectly moral to kill them.

So it's when these two troops start to interact that trouble really starts. Outsiders are not real people. Regardless of what they think, believe, or how they like to spend their weekends, they are not really people. Our brains are incapable of recognizing them as that. Instead, they are a threat, so we have rules to make it okay to harm outsiders in ways we'd never harm those within our Monkeysphere. (Thou shalt not have any false gods, no false idols, etc.) That's why bleevers commit crimes... the criminals don't recognise their victims as real people. That guy who stole your stereo probably has good points about him, too, but just as you are ready to vilify him, he's not even thinking about you as a person. You're just a thing with a stereo that he wants.
 
According to a study mentioned within the article, our Monkeysphere maxes out at 150 monkeys. We can recognize 150 other people as actual fully dimensional human beings. We should be living in troops of 150, and we'd probably get along a whole lot better.
Interesting find. I was going to argue that this is surely false for some people, but now that I think about it we probably swap people in and out of this working set as needed, giving the illusion that it is much bigger than it really is. If I moved to, say, Chicago, all the new people there *wouldn't* seem completely human to me until I swapped them into my monkeysphere. Then when I visited old friends, I would probably swap them back on the fly.
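The "working set" idea above reads a lot like a fixed-capacity cache with least-recently-used eviction. As a toy illustration only (the `Monkeysphere` class and its names are invented for this sketch, and the 150 figure is just the number quoted from the article), the swapping behaviour could be modelled like this:

```python
from collections import OrderedDict

# Hypothetical sketch: treat the monkeysphere as an LRU cache capped
# at the article's quoted limit of ~150 people. Interacting with
# someone swaps them in, evicting whoever you've seen least recently.
MONKEYSPHERE_CAP = 150

class Monkeysphere:
    def __init__(self, cap=MONKEYSPHERE_CAP):
        self.cap = cap
        self.people = OrderedDict()  # keys kept in recency order

    def interact(self, person):
        """Meeting someone swaps them into the sphere, evicting the
        least recently encountered person if the sphere is full."""
        if person in self.people:
            self.people.move_to_end(person)  # refresh their recency
        else:
            if len(self.people) >= self.cap:
                self.people.popitem(last=False)  # evict the oldest
            self.people[person] = True

    def knows(self, person):
        return person in self.people

sphere = Monkeysphere(cap=3)  # tiny cap so the demo fits on a page
for name in ["Ann", "Bob", "Cal"]:
    sphere.interact(name)
sphere.interact("Dee")  # sphere is full, so Ann gets swapped out
```

On this model, moving to Chicago fills the cache with new faces, and visiting old friends swaps them back in "on the fly", exactly as described: the sphere never grows, membership just churns.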

Now in our little troop, we all share the same God or Gods, we have a set of rules that we all agree on, which all more or less boil down to 'Do unto others as you would have them do unto you.' No harming anyone else in the troop, no letting anyone else harm the troop.
I wonder how much of this is genetic predisposition and how much might be a feature of the limitations of our brain's processing power. Otherwise I see no inherent reason why, assuming we had enough speed and size in our minds, we couldn't expand the sphere to encompass everyone on the planet.
In the next jungle over, there is another troop of 150. Those are just monkeys; they aren't fully human. They worship some other god. Their laws say they should take care of themselves, too. No murder, no stealing, no rape. But because of some odd series of coincidences, they've also outlawed the color blue as offensive, and anyone who wears the color blue brings bad luck and should be killed on sight. Thus, those who wear the color blue are outsiders, a threat to the troop, and for the safety of these 150 monkeys, it's perfectly moral to kill them.
I developed a theory (I call it "the Order") that contends this sort of thing isn't always just a coincidence. In this context, a feature of "the Order" would be the phenomenon of a group developing laws whose actual purpose is to keep their monkeysphere from expanding beyond some comfort zone, or similarly, to create a new sub-monkeysphere if they felt the original was too large.

So it's when these two troops start to interact that trouble really starts. Outsiders are not real people. Regardless of what they think, believe, or how they like to spend their weekends, they are not really people. Our brains are incapable of recognizing them as that. Instead, they are a threat, so we have rules to make it okay to harm outsiders in ways we'd never harm those within our Monkeysphere.
It would be interesting to try to categorize the factors that contribute to our decision to include a person in our monkeysphere and rank them by "offensiveness." If we could come up with some kind of universal list like that, I think it would help a lot when trying to enter peaceful and productive relationships with other societies and cultures. For instance, I know language is probably a very large barrier -- it would take a lot to get me to view someone I couldn't verbally understand as being in my monkeysphere.

As a side note relating to that, I have often thought being a soldier in the U.S. Civil War must have been like a double-whammy of horribleness. Not only did they have to deal with the horrors of war, but they also had to face an enemy that was virtually in their monkeysphere (and many apparently were prior to the war). I can't imagine fighting an enemy that spoke the same language as me, looked the same as me, and used to be a part of the same nation. Pretty sad stuff actually...
 
Rocketdodger- (dangerous hobby, that)
I am very talented at avoiding rocket fire in FPS video games (I have an uncanny ability to sense when an enemy will fire and where they will be aiming, allowing me to evade at just the right times), so I used to play a lot of games under this name.

It's a good thing, too, because I suck at aiming and I am not very smart, so I have to make full use of this advantage to get anywhere at all :).
 
I don't know if being outnumbered will matter, but certainly a lack of robustness might make them feel that they need humans to survive.

The worst case would be one that not only doesn't think it needs humans but realizes that we are a threat to its survival.

That's assuming that the AI would have a survival instinct, which it wouldn't, probably.
 
Nothing that you've written (and I've just reviewed far too many of your posts) indicates that you have a clue about what intelligence is, how it is thought about or modelled, how the brain works, or how an artificial intelligence might be fostered.

No, I'm not at all interested in engaging you in a discussion or debate on this or other subjects.

Oh, no. By all means, Complexity, debate him. If nothing else, SOME of us will learn something from your own expertise in the matter.

(and no, I'm not being sarcastic)
 
That's assuming that the AI would have a survival instinct, which it wouldn't, probably.

Well, when I say "AI" what I really mean is "human-level, or greater, AI," meaning something that has temporal self-awareness like we do. I think a survival instinct would be implicit for such an intelligence, but I could be wrong.

Now that I think about it, this is a very good question!
 
According to a study mentioned within the article, our Monkeysphere maxes out at 150 monkeys. We can recognize 150 other people as actual fully dimensional human beings. We should be living in troops of 150, and we'd probably get along a whole lot better.

Now in our little troop, we all share the same God or Gods, we have a set of rules that we all agree on, which all more or less boil down to 'Do unto others as you would have them do unto you.' No harming anyone else in the troop, no letting anyone else harm the troop.

In the next jungle over, there is another troop of 150. Those are just monkeys, they aren't fully human. They worship some other god. Their laws say they should take care of themselves, too. No murder, no stealing, no rape. But because of some odd series of coincidences, they've also outlawed the color blue as offensive, and anyone who wears the color blue brings bad luck and should be killed on sight. Thusly, those who wear the color blue are outsiders, a threat to the troop, and for the safety of these 150 monkeys, it's perfectly moral to kill them.

So it's when these two troops start to interact that trouble really starts. Outsiders are not real people. Regardless of what they think, believe, or how they like to spend their weekends, they are not really people. Our brains are incapable of recognizing them as that. Instead, they are a threat, so we have rules to make it okay to harm outsiders in ways we'd never harm those within our Monkeysphere. (Thou shalt not have any false gods, no false idols, etc.) That's bleevers commit crimes... the criminals don't recognise their victims as real people. That guy who stole your stereo probably has shining points about him, too, but just as you are ready to villify him, he's not even thinking about you as a person. You're just a thing with a stereo that he wants.

I'm not sure what study you are talking about; the link didn't work for me. If the monkeys could find a way to deal with larger groups, they might do better and not be dwindling in numbers and in danger of extinction, as many primates are.
 
These are pretty close to the three categories I proposed in the OP, except our #1 and #2 are swapped. I would be very interested to know the mechanisms behind the personal source, since if you ask me it seems to be able to trump the others.

The mechanisms? I guess it comes from your mental abilities, your life experiences, and watching, reading, and studying others. Is that the answer you are looking for? I'm afraid I don't understand what you are asking.
 
Well, when I say "AI" what I really mean is "human-level, or greater, AI," meaning something that has temporal self-awareness like we do. I think a survival instinct would be implicit for such an intelligence, but I could be wrong.

Now that I think about it, this is a very good question!

As far as I'm concerned, survival instinct is like love, fear or hunger. It's a biological thing.

Now, of course, a sense of purpose can make one want to survive, but that's another story.
 
Concerning the generation of morality in AI, I would suggest that when you create the AI you provide it with a set of morals such as those in science fiction books (don't harm humans, etc.).
 
Concerning the generation of morality in AI, I would suggest that when you create the AI you provide it with a set of morals such as those in science fiction books (don't harm humans, etc.).

Think these would be as effective as the same rules are in controlling human behaviour?

Asimov's three laws were hard-wired into positronic brains. That's not morality, it's programming. What is the value of a moral choice in the absence of the free ability to make another choice? Without free will, morality is mere rules.
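The distinction being drawn here can be made concrete. In this toy sketch (every name in it is invented for illustration), a "hard-wired" law is just a filter applied before the agent deliberates at all: forbidden actions never reach the point of choice, which is exactly why obeying them carries no moral weight.

```python
# Hypothetical sketch of a "hard-wired" Asimov-style constraint:
# forbidden actions are filtered out before deliberation begins,
# so the agent never chooses to refrain -- it simply cannot act.
FORBIDDEN = {"harm_human"}  # a First-Law-style rule, stored as data

def choose_action(candidate_actions):
    """Return the first permitted action, or None if nothing is
    allowed. Note the agent never weighs the forbidden option:
    there is no moral choice here, only a pre-filter."""
    allowed = [a for a in candidate_actions if a not in FORBIDDEN]
    # whatever 'preference' the agent has operates only on this set
    return allowed[0] if allowed else None
```

Calling `choose_action(["harm_human", "fetch_coffee"])` yields `"fetch_coffee"` without any deliberation over the excluded option, which is the sense in which hard-wired rules are programming rather than morality.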
 
Knowledge of history would have to be considered an important part of morality. Knowing how a similar decision turned out, made by someone else in the past, is fundamental to making proper decisions today.
 
Think these would be as effective as the same rules are in controlling human behaviour?

Asimov's three laws were hard-wired into positronic brains. That's not morality, it's programming. What is the value of a moral choice in the absence of the free ability to make another choice? Without free will, morality is mere rules.

We evolved from more hardwired ancestors. We don't have any instincts in the true sense of the word. Instincts are behaviors that are automatic: if exposed to a stimulus, you will instinctively react the same way every time. We can change our responses to stimuli, so we don't really have instincts. However, it is likely that the DNA for instincts resides within our genes, but we have acquired the ability to override them as we see fit. Perhaps AI will evolve workarounds for the programming and develop their own morals; I believe that has been suggested in science fiction also. I would still start with something hardwired, seeing as they are talking about smarter-than-human AI, which may run into moral problems (for us) without it.
 
The Monkeysphere is a pretty interesting name for it. I always thought of it as "The "Us" vs "Them" Mentality"

But it does illustrate how societies that have very well-established morals still often have the attitude that those morals only apply to their own and nobody else. Anybody else is fair game.


Regarding AI... you have to realize that if you make an AI system intelligent enough, it will have the ability to question the morals it has been programmed with. After all, many of us were raised to believe in God, many of us did at first, and then realized that it was illogical. That's a sign of intelligence -- being able to re-evaluate your beliefs.

Technically you could program it to NOT want to attack people, but it could then re-evaluate this belief, decide that it doesn't like that view, and think it should. That's why people who are trying to push AI further and further sometimes worry me.


INRM
 
The mechanisms? I guess it comes from your mental abilities, your life experiences, and watching, reading, and studying others. Is that the answer you are looking for? I'm afraid I don't understand what you are asking.

I guess the main point of this entire thread for me is this question: "Nothing in my genetic code has predisposed me (directly) not to harm fuzzy caterpillars, and fuzzy caterpillars don't offer me any advantage by being alive (i.e. I don't "need" them), yet I will go far out of my way not to harm one if I can help it. This seems to be falsely directed empathy (i.e. empathy that is the result of anthropomorphism). If (and I think it is) this sort of thing is a result of my intelligence, what line of reasoning leads to it?"

I think this is a very important question because it is directly parallel to "Why shouldn't a more powerful entity, given that we are completely foreign to it and can't benefit it in any way, go out of its way not to destroy us?"
 
I guess the main point of this entire thread for me is this question: "Nothing in my genetic code has predisposed me (directly) not to harm fuzzy caterpillars, and fuzzy caterpillars don't offer me any advantage by being alive (i.e. I don't "need" them), yet I will go far out of my way not to harm one if I can help it. This seems to be falsely directed empathy (i.e. empathy that is the result of anthropomorphism). If (and I think it is) this sort of thing is a result of my intelligence, what line of reasoning leads to it?"

I think this is a very important question because it is directly parallel to "Why shouldn't a more powerful entity, given that we are completely foreign to it and can't benefit it in any way, go out of its way not to destroy us?"

The benefits of your genetic code would not necessarily be obvious. There is so much we don't understand about humans still. Most people are apparently predisposed to having morals. What those morals are is subject to great variation according to our understanding of the world. Obviously some morals may not be compatible with life, in which case those people will die or learn to change their moral codes. Regardless of our abilities to think, we must be able to conduct beneficial interactions with other humans for a better chance at survival. Most of morality, in my view, has to do with this (emotions are a part of it). The variation in people is mostly due to different understandings of the world (different experiences and thoughts and different societal influences).
 
