Evolution: Is there any survival value for human consciousness?

That's an interesting conclusion. Passing the Turing test would not allow for distinguishing machine and human intelligence. So, if that is the case, contrary to some posters here, intelligence is very different from self awareness.
So, if that is the case, is there any survival benefit from self awareness?

I took that distinction into account in my answer, and others seem to have done the same.
 
You haven't proven that self awareness is any different from any other awareness...

You think it's cool only because it raises questions, but that doesn't make self awareness any more ethereal than plain awareness.

It's as emergent a property as any other neurological function. We are able to communicate it among ourselves through a more complex kind of education, so perhaps other creatures are self aware and simply do not communicate it to their own species or any other. Maybe the fact that they are self aware doesn't alter their behavior, and it might not have for us if we hadn't decided to start communicating more...

I don't know, Perpetual. Self awareness doesn't seem any more strange than awareness itself. If I drink a beer, I am aware that I enjoy the beer. I may completely lack all function of memory, but my limbic system still gives me a sense of taste and pleasure, so I am still enjoying the beer. Self awareness seems more a construct of our language than something that actually exists (assuming this is what we're actually talking about; there is no neuroscientific evidence for self awareness unless we equate self awareness with consciousness, which shouldn't be the case anyway).

Interesting response. Would a machine that passed the Turing test necessarily be aware?
If so, would that necessarily make it self aware?
 
Interesting response. Would a machine that passed the Turing test necessarily be aware?
If so, would that necessarily make it self aware?

Well, that's more the question I can't answer, because self-awareness seems almost...irrelevant, as it's nothing more than normal awareness =\ Like I said, for all we know most animals could be self aware to the extent their faculties and subjectivity allow (remember, we are as aware as our brains allow us to be; you can damage parts of a human's brain, but there doesn't seem to be any damage that can take away general awareness and leave self awareness, or vice versa, which is why they are the same thing)

A machine that passes the Turing test is not conscious; it can never be. It is only reasonably similar to us, but consciousness is inherently dependent on neuronal machinations, not scripted machinations from a programming language, even if they perform similar functions.

It sounds nitpicky, I know, but the fact is we DON'T understand much about the brain other than that it's driven by action potentials and that they elicit responses. Computer code is 100% understood and predictable, whereas the brain is...not as predictable. Because of that I am VERY wary of comparing our brains and how they operate to a machine, which is why I also find the Turing test a dubious comparison to awareness as organisms may have it.
 
Is the Turing test truly valuable for determining consciousness, though, or only for determining how well the non-human subject can imitate a human? It's also highly subjective: spend half an hour on Facebook and you'll likely come across humans who would both fail to distinguish a modern chatbot from a human and fail to prove that they were human to a more intelligent interrogator.

If we could understand and replicate the conditions necessary for consciousness to arise, I don't see why we couldn't create a conscious machine, eventually.
 
What's the big kerfuffle?

Even without properly defining the inclusivity of the words intelligence and self-awareness, I think you can make some sense of it. If you can imagine a society of 'smart' robots, you should be able to imagine robots so 'smart' that they process other robots' possible subjective experiences. That's pretty much theory of mind. If it's possible to imagine robots representing other robots' beliefs, feelings, etc., why does self-consciousness seem like such a leap? It really shouldn't. It's theory of mind applied to yourself, which means you have much more accurate information and much more of it... maybe that's what makes ALL the difference?
 
Yeah, I question it as well. I have the book in dead-tree form so I can't search it electronically but I'll see if I can find it. IIRC, it did have a reference to a medical journal. Stand by ......
Aha! I found it.

For example, a newborn's brain expects faces: even when they are less than ten minutes old, babies will turn toward face-like patterns, but not to scrambled versions of the same pattern. By two and a half months, an infant will express surprise if a solid object appears to pass through another object, or if an object seems to disappear, as though by magic, from behind a screen.

The reference in the book for the first sentence is "Johnson and Morton, 'CONSPEC and CONLERN'" which leads here.
 
That leads to the question of memory's role in consciousness: can you literally have no ability to create memory and yet be conscious? I would say yes. Can you have no memory and be self aware? I wouldn't think so.

It's a complex area - Clive Wearing lost the ability to store long-term memories through illness, so he only has literally a few seconds of memory, yet he appears to be conscious and self-aware, but who knows what really goes on in his mind:



Of course, his general behaviour and thought patterns were already well-established. He may well just be reusing pre-damage patterns over and over...
 
I think there is general consensus that consciousness is a matter of degree and not exclusive to humans. It seems to be an emergent property of brains and possibly quite independent of intelligence. General intelligence also appears to be a fabricated concept as it consists of so many different domains. You can be absolutely useless at maths and general logical reasoning but brilliant at social interaction and communication (or vice versa). Does this in any way make someone less conscious or self aware? I don't think so.

I'm no expert, but intelligence seems to be seated in how well different brain modules process the data they evolved for, and consciousness more in the interaction between those modules. Brain damage to specific areas of the brain can be very 'narrow' in its impact, affecting only speech (to the point where it might only damage your use of verbs, for example) or only math, etc. I know of no damage, except massive, extended damage, that seems to have an impact on consciousness.
As far as modeling the intentions of other people having something to do with it, do you think neurological disorders like autism or Asperger's, where people have difficulty deducing the intentions of others, mean those people are less conscious or self aware? I find that absurd. It could even be argued that autism makes you hyper self-aware, directing most of your thoughts inwards instead of towards other people (just an idea, I don't really know).
This might make the ability to model other people just another specialized module.

This makes the concept even more slippery, since neither intelligence nor the internal modeling of other people's thought processes seems to impact self-awareness.

Any thoughts on this? I'm just speculating.
 
It's a complex area - Clive Wearing lost the ability to store long-term memories through illness, so he only has literally a few seconds of memory, yet he appears to be conscious and self-aware, but who knows what really goes on in his mind:



Of course, his general behaviour and thought patterns were already well-established. He may well just be reusing pre-damage patterns over and over...

Yeah, when I thought up the scenario of no memory, I meant literally that no memory could ever be formed, but he's an interesting test case.

If you can't form memory, you really only have sensory input faculties. (We can suppose that memory itself is the recreation of sensory sensations without the actual physical input; i.e., memory fakes experiences, and in that sense we have memory. This isn't what memory actually is, but it's reasonably close.)

If you're just a sensory apparatus with no memory, I wouldn't think you could be self aware, but I don't know... I mean, we talked about bees before. Bees can take in sensory input like pheromones and act on it; is that self awareness, because they themselves DO act on input? It seems superficial, because I doubt there's some internal monologue that says "I must go this way"; it doesn't seem like a decision so much as just a reaction. But then EVERYTHING we do is really just a reaction, and the illusion is that self awareness exists. Really, we observe it and call it something special, self awareness, when it's just the emergent behavior of many reactions to stimuli resulting in whatever we're observing.

Koch made note of this in The Quest for Consciousness, stating that for the most part we are very reactive zombies.
 
If you can't form memory, you really only have sensory input faculties. (We can suppose that memory itself is the recreation of sensory sensations without the actual physical input; i.e., memory fakes experiences, and in that sense we have memory. This isn't what memory actually is, but it's reasonably close.)

If you're just a sensory apparatus with no memory, I wouldn't think you could be self aware, but I don't know...

That might well be true, I also couldn't say, but the reverse certainly is not. Almost all animals, even very simple ones, have memory (and very long memories) and are able to learn; insects sure can (and darn quickly at that).

As an aside, I just recalled that in the 1950s memory experiments were conducted on planarian flatworms showing that they do learn and remember. (It was also claimed that chopping one up and feeding it to others transferred the memory 'chemically'; this, though, was never replicated and was attributed to observer bias.)

I don't think anyone would argue that a flatworm is conscious.
 
Sorry for the late answer....

That's an interesting conclusion. Passing the Turing test would not allow for distinguishing machine and human intelligence. So, if that is the case, contrary to some posters here, intelligence is very different from self awareness.
(my emphasis)

That does not necessarily follow. The successful Turing contender will effectively simulate intelligence, but it may not actually be intelligent. After all, the builder of the system is intelligent and has the potential to predict a sufficient structure of intelligent replies.

So, if that is the case, is there any survival benefit from self awareness?

Self awareness without intelligence? Difficult to imagine, so difficult to say if it has survival value.

Actually, to judge the survival value of any trait, one has to ask: Under which circumstances? The trait that is an advantage in some circumstances could be deadly in others.

Hans
 
Sorry for the late answer....


(my emphasis)

That does not necessarily follow. The successful Turing contender will effectively simulate intelligence, but it may not actually be intelligent. After all, the builder of the system is intelligent and has the potential to predict a sufficient structure of intelligent replies.



Self awareness without intelligence? Difficult to imagine, so difficult to say if it has survival value.

Actually, to judge the survival value of any trait, one has to ask: Under which circumstances? The trait that is an advantage in some circumstances could be deadly in others.

Hans

The point of the Turing test is that, if an entity (machine) can pass it, there is no distinction that can be made between its intelligence and that of a human; consequently, some would conclude it is an intelligent machine. It may not necessarily follow that the machine is self aware.
 
The point of the Turing test is that, if an entity (machine) can pass it, there is no distinction that can be made between its intelligence and that of a human; consequently, some would conclude it is an intelligent machine. It may not necessarily follow that the machine is self aware.

Slight nitpick here: the original Turing test stipulates that communication is conducted through teletype terminals. Modern set-ups will probably be more sophisticated, but a Turing test will still have a limited scope, in time and in communication method. It is quite conceivable that a machine may, in a practical test, successfully simulate intelligence without actually being intelligent.

You may also understand 'Turing test passed' as 'indistinguishable from intelligence' in a general scope, in which case we must conclude that the machine is indeed intelligent. If such a machine is one day built, it shall be interesting to see if it turns out to be self-aware, but I would expect so.

Hans
 
You may also understand 'Turing test passed' as 'indistinguishable from intelligence' in a general scope, in which case we must conclude that the machine is indeed intelligent. If such a machine is one day built, it shall be interesting to see if it turns out to be self-aware, but I would expect so.

Yes, the Turing test is very limited and might 'fool' you into believing that a machine is intelligent and self-aware without it really being so. I suppose it depends on how the 'machine' is set up.
You could write software that has a response for every eventuality, parsing the input and picking a reply, remembering what has gone before and responding appropriately, etc., but I don't think it would be self-aware or conscious.
On the other hand, a model of a human brain, with all its 100 billion or so neurons, their axons and numerous dendrites, and the way they interact and are connected with each other, would have to be self-aware and conscious.
With our current technology it will probably be a pretty slow thinker though:D.
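To make the "scripted responses" idea concrete, here is a minimal, purely illustrative sketch of such a lookup-based responder (all names and rules are invented for illustration; this is not any real chatbot's code). It parses input, matches patterns, remembers what has gone before, and picks a canned reply. However fluent it might sound, every behaviour is an explicit table entry written in advance by its author:

```python
import re

# Hand-written pattern -> reply table: the author predicts the conversation.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\byou\b", re.IGNORECASE), "We were talking about you, not me."),
    (re.compile(r"\?$"), "What do you think?"),
]
FALLBACK = "Tell me more."

history = []  # 'remembering what has gone before'

def respond(text: str) -> str:
    history.append(text)
    # Refer back to earlier input, as described above.
    if len(history) > 1 and text.strip().lower() == "what did i say?":
        return f"You said: '{history[-2]}'"
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    return FALLBACK

print(respond("I feel confused"))   # Why do you feel confused?
print(respond("What did I say?"))   # You said: 'I feel confused'
```

The point of the sketch is the contrast drawn in this post: every response here is traceable to an explicit rule, so the program's behaviour is 100% understood and predictable in a way the poster argues a brain is not.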

But I think we might be drifting slightly off-topic again, any thoughts on my post #68, the question of self-awareness/consciousness?
 
I think there is general consensus that consciousness is a matter of degree and not exclusive to humans. It seems to be an emergent property of brains and possibly quite independent of intelligence. General intelligence also appears to be a fabricated concept as it consists of so many different domains. You can be absolutely useless at maths and general logical reasoning but brilliant at social interaction and communication (or vice versa). Does this in any way make someone less conscious or self aware? I don't think so.

I'm no expert, but intelligence seems to be seated in how well different brain modules process the data they evolved for, and consciousness more in the interaction between those modules. Brain damage to specific areas of the brain can be very 'narrow' in its impact, affecting only speech (to the point where it might only damage your use of verbs, for example) or only math, etc. I know of no damage, except massive, extended damage, that seems to have an impact on consciousness.
As far as modeling the intentions of other people having something to do with it, do you think neurological disorders like autism or Asperger's, where people have difficulty deducing the intentions of others, mean those people are less conscious or self aware? I find that absurd. It could even be argued that autism makes you hyper self-aware, directing most of your thoughts inwards instead of towards other people (just an idea, I don't really know).
This might make the ability to model other people just another specialized module.

This makes the concept even more slippery, since neither intelligence nor the internal modeling of other people's thought processes seems to impact self-awareness.

Any thoughts on this? I'm just speculating.

Intelligence is an elusive concept, as you point out. It seems to me to be related to how well an organism reacts to and deals with its environment, especially when presented with novel or unusual situations. An amoeba reacts to its environment in very predictable ways (hard wiring seems to be a good analogy). In contrast, a dog can vary its reaction depending on circumstances, a human even more so.

Consciousness, on the other hand, is a totally subjective experience, so it eludes any definition that can be scientifically analyzed. I agree with those writers who have linked consciousness to language and memory. Although it is not evidence, I would point out that humans have no memories (and, I believe, no sense of consciousness) before acquiring language. The human brain develops as synapses are created in response to repeated stimuli. It seems that when a sufficient number of synapses are created, as part of language acquisition, consciousness and memory emerge. Clearly, this is a much debated area, because there is no scientific evidence for consciousness other than the fact that most humans claim to experience it.
 
Intelligence is an elusive concept, as you point out. It seems to me to be related to how well an organism reacts to and deals with its environment, especially when presented with novel or unusual situations. An amoeba reacts to its environment in very predictable ways (hard wiring seems to be a good analogy). In contrast, a dog can vary its reaction depending on circumstances, a human even more so.

Consciousness, on the other hand, is a totally subjective experience, so it eludes any definition that can be scientifically analyzed. I agree with those writers who have linked consciousness to language and memory. Although it is not evidence, I would point out that humans have no memories (and, I believe, no sense of consciousness) before acquiring language. The human brain develops as synapses are created in response to repeated stimuli. It seems that when a sufficient number of synapses are created, as part of language acquisition, consciousness and memory emerge. Clearly, this is a much debated area, because there is no scientific evidence for consciousness other than the fact that most humans claim to experience it.
Unfortunately, the bold part above (that the only evidence for consciousness is that people claim to experience it) can also be said about religious faith, which is equally non-scientific.
 
Intelligence is an elusive concept, as you point out. It seems to me to be related to how well an organism reacts to and deals with its environment, especially when presented with novel or unusual situations. An amoeba reacts to its environment in very predictable ways (hard wiring seems to be a good analogy). In contrast, a dog can vary its reaction depending on circumstances, a human even more so.

Consciousness, on the other hand, is a totally subjective experience, so it eludes any definition that can be scientifically analyzed. I agree with those writers who have linked consciousness to language and memory. Although it is not evidence, I would point out that humans have no memories (and, I believe, no sense of consciousness) before acquiring language. The human brain develops as synapses are created in response to repeated stimuli. It seems that when a sufficient number of synapses are created, as part of language acquisition, consciousness and memory emerge. Clearly, this is a much debated area, because there is no scientific evidence for consciousness other than the fact that most humans claim to experience it.

Is a blind deaf-mute person (e.g. Helen Keller) who never learns sign language or Braille conscious? Does he/she have intelligence? Is he/she capable of forming memories? Does "consciousness" vary in any way with the obvious variation in memory and intelligence between a human with trisomy 21, a musical savant, and Steve Jobs? Does a chimp who learns sign language, or a parrot who learns to communicate with us by using our own words, suddenly become "conscious"?

As always, these discussions inevitably devolve into semantic arguments because of the lack of a clear definition of terms (specifically "consciousness"). Even the science of NCC (the neural correlates of consciousness) is problematic in that all the experiments conflate attention with consciousness.

Self awareness, attention, intelligence, planning, memory, communication, emotion and sensory perception are all aspects of brain activity, and all of these brain functions are also found in other animals. The human's ability to plan is more detailed, and his system of communication is (as far as we are aware) more elaborate, but he does not possess any ability that is not present in other living forms on planet earth. Indeed, other animals hear things we do not hear, see things we do not see, and perform feats of strength and skill of which we are not capable.

All the various aspects discussed so far here as "consciousness" will be altered by brain damage, or by chemical or electrical interference in brain activity; i.e., they are brain functions.

If you mean by "consciousness" (as I think you do), our subjective feeling of being here, alive, perceiving the world around us, within our own shell separate from those around us, then your question as to whether it is evolutionarily adaptive is misguided IMHO. That feeling you have of being you is your brain doing what it does, and it isn't any different than what your dog's brain is doing when he decides he needs to pee and comes to find you to let him out (sensing, remembering, planning, and communicating the plan to achieve his goal).

Brain function is clearly evolutionarily adaptive; consciousness is just a vague term that we all seem to understand but that nobody can really pin down.
 
Although it is not evidence, I would point out that humans have no memories (and, I believe, no sense of consciousness) before acquiring language.
Plenty of memories, no conscious recall. During the pre-linguistic period, a mass of memories is being laid down, not least the elements of language itself. However, there is little evidence of conscious recall of pre-linguistic events in most people. There are a few people who claim to have authentic memories prior to this, but it's extremely difficult to verify them as true memories of that period rather than retrospective confabulations.
 
