AkuManiMani said:
Earlier, I brought up examples of computation and cognition in the human brain that are not conscious.
Which isn't relevant.
If you understood what I'm trying to get at, you would see that it's extremely relevant.
AkuManiMani said:
We know at this point that the brain is the medium for the phenomenon we call consciousness but there is not yet any understanding of what it is, exactly.
Reflection.
I recall you bringing up self-referencing programs earlier as an example of conscious awareness. I agree with you that it clearly has a critical role in intelligence, but when I say 'consciousness' I am not referring merely to intelligence.
Reflection is clearly not identical to awareness. It's essentially a class of internal feedback that takes place not only in the brain but also in cells, as a means of self-regulation.
I think we can agree that even though our cells regulate their own gene expression, they are not conscious -- at least not in the sense that we experience consciousness. The crux of what I'm saying is that a feedback response is not necessarily identical to awareness. The fact that such processes are continually at work in our bodies even when we are not lucid is proof of that. To assume that the two are identical (even in the case of computational reflection) would be a leap of faith.
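To make the distinction concrete, here's a rough sketch in Python of the sort of feedback loop I'm talking about -- a toy negative-feedback regulator, loosely in the spirit of a cell holding the level of some product steady. The names, numbers, and the whole setup are invented purely for illustration:

```python
# Toy negative-feedback regulator, loosely analogous to a cell adjusting
# expression of a gene in response to the level of its product.
# It continually reads and corrects its own state, yet nobody would
# call it aware of anything.

class Regulator:
    def __init__(self, setpoint=1.0, gain=0.1):
        self.setpoint = setpoint   # target level (arbitrary units)
        self.gain = gain           # how strongly output reacts to error
        self.output = 0.5          # current "expression rate"

    def step(self, measured_level):
        # Sense own state, compare to the target, correct the difference.
        error = self.setpoint - measured_level
        self.output = max(0.0, self.output + self.gain * error)
        return self.output
```

Every step is self-referential in the loose sense -- the system reads and adjusts its own state -- but that alone gives us no reason to think it experiences anything.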
AkuManiMani said:
The very phenomenon of experiencing these frequencies as color is a part of what we call consciousness.
Why do you say that?
It seems I'm having a hard time communicating what I'm actually getting at, so I'll try to use an example you might find 'relevant'.
Let's say that you've constructed an automaton with its own onboard AI system whose main function is to vacuum floors without constant human supervision. It's equipped with adaptive, reflective programming and has optical sensors, responds to sounds and vibrations, and, for good measure, let's just say it even has some chemosense to detect spills and odors.
Now, assuming that it has all the computational functions that you identify with consciousness, how do we know it experiences EM radiation the way we do? How about smells or sounds? Better yet, how would we know whether it experiences its chemosense as taste, smell, or something entirely different? A more important question would be: even though it can algorithmically respond to external and internal stimuli, how would we know it subjectively experiences them as qualitative phenomena at all?
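If it helps, here's a caricature in Python of what such a control loop might look like. Every sensor name, threshold, and rule here is made up on the spot -- the point is only what such a description does and doesn't tell us:

```python
# Caricature of the smart-vac's control loop: it maps sensor readings to
# actions and even inspects its own recent behavior ("reflection"), but the
# code says nothing about whether any of this is experienced as anything.

def choose_action(light_level, sound_level, chemical_reading, history):
    if chemical_reading > 0.8:          # detects a spill or odor
        action = "scrub"
    elif light_level < 0.2:             # too dark to navigate by optics
        action = "navigate_by_bump_sensors"
    elif sound_level > 0.9:             # loud noise or vibration nearby
        action = "pause"
    else:
        action = "vacuum"

    # Reflective step: consult its own recent behavior and adjust.
    if history[-3:] == ["pause", "pause", "pause"]:
        action = "resume"

    history.append(action)
    return action
```

You can add as many sensors and reflective layers as you like; the description still only fixes what the machine does, never whether the incoming light is experienced as color, the chemistry as smell, or anything as anything.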
I'll use another example to illustrate what I'm trying to convey.
There's a condition in which a person's brain responds to visual information that comes in through the eyes. For all intents and purposes, the brain "sees". The only problem is that people with this condition are not consciously aware of this sight -- they don't actually see. I believe the condition is called blindsight. There are numerous examples one could point to of the brain sensing and performing complex functions with only partial conscious awareness, or without any conscious awareness at all.
Ah. Evolution. If pain was pleasurable we'd all be dead.
It's fully conceivable that evolution could have brought about the pain response (i.e., avoiding or retreating from a negative stimulus) without there necessarily being a conscious sensation of suffering.
AkuManiMani said:
There's nothing in the currently known laws of physics that accounts for subjective experience.
Baloney. There is nothing in subjective experience that raises the slightest problem for a purely physical explanation.
My point is we don't have that explanation yet. Are color, taste, sadness, joy, etc. physical properties of matter? What force is the carrier of pain? Is subjective perception inherent in any physics equation? Why is there subjective experience at all, and what physical principle makes it happen?
Why is it that a particular subset of chemical systems (i.e., organisms) not only physically interacts with and responds to events but sometimes experiences them as well?
AkuManiMani said:
What is it really? We honestly don't know yet.
Yes. Yes we do. We don't know every detail of how the brain functions, but we know perfectly well that part of its function results in subjective experience.
Pointing out that we have an idea of where it happens is not the same as demonstrating that we know what subjective experience is or how it happens.
AkuManiMani said:
While it's not justified to fill in that gap [in our understanding] with unsubstantiated 'magic' solutions, it's also not productive to pretend that it's not there.
Perhaps you could point it out to me, because I sure can't see it.
PixyMisa, have you ever sleepwalked or known someone who has? The person walks around and can be responsive to external stimuli, but without any conscious awareness of the experience or their actions. I've personally had to be told of things I've done while sleepwalking after the fact, because I had no conscious experience of it at all.
If autonomous behavior and response is not a guarantor of consciousness in a human being, on what grounds can we assume that it is in the case of an inanimate automaton? What would make the smart-vac robot in my earlier example conscious while the sleepwalker is clearly not?
Better yet, what if one were to use AI technology similar to that of the hypothetical smart-vac in a very different way? Say we had a person in a coma, and we had the technology for a direct computer interface with their brain and nervous system, to the point where an AI system could control the motor functions of their body. Let's say that this system could also use the sensory information coming into the brain. With this in place, hypothetically, it would be possible to have a comatose person behave in an autonomous manner and even program them to perform relatively complex tasks.
By your definition, would they be conscious? How about the AI put in place to control their behavior and motor functions? Would you consider the AI to be their consciousness? Why or why not?
AkuManiMani said:
Your computer = conscious is as unfounded as the 'Overmind' postulate.
Have you read Dennett on this? Or Hofstadter? Do you know why Dennett regards a device as simple as a thermostat as conscious and not qualitatively different from a human brain or a human mind?
If not, then go read. Hofstadter's Gödel, Escher, Bach in particular.
If so, then why are you spouting rubbish?
I'm familiar with Dennett's definition of consciousness and why he states that a thermostat is not qualitatively different from the human brain/mind. Essentially, his argument is that the difference between the two is a matter of degree rather than kind, and that the only thing that truly distinguishes human 'consciousness' from thermostat 'consciousness' is that the process in question is more complex in humans.
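Just so we're clear on what's being compared, here's about all there is to a thermostat as a control system -- a throwaway Python sketch, not anyone's actual argument:

```python
# A thermostat reduced to its feedback loop: it has a target state, senses
# the world, and acts to close the gap. That is its entire repertoire.

def thermostat(current_temp, target_temp=21.0, hysteresis=0.5):
    if current_temp < target_temp - hysteresis:
        return "heat_on"
    if current_temp > target_temp + hysteresis:
        return "heat_off"
    return "no_change"
```

On that view, a brain and this function differ only in how many such loops are stacked together and how intricately they interact.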
I fully understand this position but, for reasons I've already mentioned, I find it extremely lacking.