Z said:
We don't perceive our unconscious processing. That's what makes it UNconscious.
You're right: the part of your brain that identifies as 'you' does not perceive the parts of the brain that do not identify as 'you'. That doesn't mean those parts aren't conscious themselves, only that they are not part of YOUR consciousness.
That still leaves one having to identify the physical difference between periods when a conscious 'you' is online and when it is not.
Z said:
I.e., we have to identify WHAT it is.
Which requires defining what we're looking for.
Can we at least agree that the issue has to be framed properly before it can be scientifically approached?
Z said:
And here you go assuming your conclusion in your definition. That's a form of question begging, Z.
Show me what else consciousness involves, and we'll talk.
Sensations with distinctive qualities and volition are a couple things that come to mind.
Z said:
I've already defined the terms I'm using in other threads. I'll just repeat them here:
Mind is basically a kind of virtual space generated by the "wetware" of the brain which contains all the elements of one's psyche, like memories, memes, etc. -- kind of like a biological database. [It may be a feature specific to neural tissue -- I'm still entertaining the notion that other tissue types may support something equivalent]
No big problem with this one.
Okay, so we can tentatively agree on this part.
Z said:
Consciousness would be a kind of active brain state during which the "lights" of the mind are "turned on", in some sense. It's during this state that the subject can subjectively experience mental elements as qualia. One's conscious mental activity is more energy intensive and, I suspect, corresponds to the metabolically more active areas of the brain seen in PET scans and the like.
Several problems:
1) 'Active brain state' assumes that brains are necessary for consciousness. That's not to say we should assume consciousness can exist without any brain, but it does imply that only a brain - i.e. the mass of fat and neurons in our skulls - can produce consciousness. I would replace that with 'sensory processing state'.
Right now it seems a very safe bet that certain material properties of brain tissue are necessary to produce consciousness. Since we do not yet know what these properties are, or whether they are replicable on a different substrate, I figured I'd just define it as a 'brain state'.
Z said:
2) Use of the word 'mind' in the definition is inherently dualistic to some people, but given your definition of 'mind' as a virtual workspace, it's not a large problem.
You may or may not have noticed, but I have absolutely no regard for people's ideological quibbles. Personally, I think the only tenable metaphysical assumption is some form of monism. Even if the 'mind' is not an atomic substance [I don't think it is], it must still be a physical entity in order to interact with the brain. IMO, it's probably more accurate to think of the mind as something the brain generates rather than just something the brain does.
Z said:
3) Use of the term 'qualia' is redundant, pointless, and irrelevant. Shorten that statement to 'the subject can experience mental elements'. Or, since we're talking about a mental activity to begin with, simply 'the subject can experience'.
Why type all of that when I can sum it up with one word: qualia? I'm not going to avoid using the term because it makes some folks squeamish.
Z said:
Lucidity would be the degree of vividness of one's conscious experience; how "brightly" the dimmer switch of one's mind is turned up. High lucidity would be the periods when the subject is fully awake, or when they're experiencing a highly vivid hallucination/dream. Periods of low lucidity would be mental states like delirium, or when the subject is "fading" into sleep. Zero lucidity would be mental states of complete unconsciousness, like comas and deep sleep.
No problem there. There's also non-lucid full awareness, or the zombie-like state that many people involved in a routine fall into.
Awareness is the mental extent of one's short-term memory which -- to stick with the computer analogy -- would be equivalent to one's RAM. One's awareness would be a rough measure of how many different mental elements one can be conscious of [i.e. the mental scope of their lucidity]. Stimuli and mental elements that a subject is not conscious of at all would be completely outside of their awareness.
Not so sure about this one; the distinction between 'awareness' and 'lucidity' seems vague in your definition. It is, after all, possible to be aware of things on one level without being consciously aware of them on another. There is a level of sensory awareness which remains subconscious and allows us to react to stimuli we are otherwise unconscious of, for example. Then, of course, we can get into questions of peripheral awareness, sensory assimilation, and so forth.
I guess I'll clarify a bit more on this part. Generally speaking, I conceive of the mind as a kind of informational field maintained by neural activity; it's the virtual space of our mental software and conscious activity.
I'm not sure how deep this analogy may go in reality, but I find it useful to think of consciousness as a light illuminating one's mental space. Awareness is the 'volume' of conscious activity in mental space, while lucidity refers to the 'density' of conscious activity in a given volume of awareness. Qualia would be the spectral patterns created as the "light" [i.e. consciousness] passes thru mental elements like sensory data.
In the scheme I'm working from, consciousness is the subject and experiences/qualia are emanations of the subject. Subjects do not merely 'have' experiences -- in a very literal sense -- experiences ARE the subject.
Z said:
CAM is an acronym for Consciously Accessible Mind. As would be expected, this denotes the mental space that one's conscious activity is confined to.
'Consciousness' works well without adding extra terms.
Qualia are mental data within a subject's awareness.
Or 'sensations'.
Experience collectively refers to all the qualia within a subject's awareness.
Or 'consciousness'.
They're just labels. I'm going thru the trouble of defining them so you at least have some idea of what I'm talking about when I use the terms.
Z said:
When we identify what physically constitutes mind & consciousness and possess a scientific theory of such [complete with falsifiable predictions], I'll be content. Until then I'll continue to maintain that we don't know what consciousness is, and you can continue to suck on your SRIP pacifier.
May I ask - why do you feel the need to be rude and uncivil during these discussions? Does it add anything to the discussion that was not present? Does it 'score points'? I think not.
I tend to get a bit snippy and frustrated when I don't feel I'm getting a point across, and I resort to sarcasm in an attempt to drive it home. It's a bad habit and I apologize.
Z said:
What physically constitutes mind & consciousness? Chemo-electrical activity. That's it. That's the entire she-bang. There - are you happy now?
IMO, knowing that it has something to do with the electrochemical activity of the brain is a good start, but not nearly enough. I'm hoping to see something considerably more rigorous developed. Perhaps something like a table that charts the range of human qualia and could potentially predict other qualia that humans may not experience, as well as the physical conditions necessary to generate them. I'd like science to be able to physically identify mental elements like memes.
We need a comprehensive and rigorous theory of mind/consciousness if we're to ever have any hope of creating synthetic consciousness. Until we have such a theory, as far as I'm concerned, AI researchers are in the same epistemic boat as alchemists were before the advent of modern chemistry.
Z said:
Overall, your definitions are OK; but your definition of 'consciousness' includes terms and concepts that are themselves unproven, unsupportable, and possibly irrelevant. I think that's why we cannot agree as to the nature of consciousness to begin with. If you and I were to enter into this discussion rationally, I think you'd have to begin by explaining and defining qualia, mind, and such, and we'd have to work toward a mutual understanding of these terms. I, for one, think qualia (conceptually) exist, but are utterly irrelevant; my sense of 'redness' is no more relevant than your sense of 'salmon-flavored'; the two can even be the same sense. What IS relevant is that my sense of 'redness' applies to those objects I call 'apples', and that your sense of 'salmon-flavored' does not.
'Qualia', then, are a non-issue with me. It seems readily apparent - blatantly obvious, in fact - that anything with sensations has qualia. Even machines.
Sensations are qualia. Sensations/perceptions/qualia -- whatever one wants to call them, they are inherent features of consciousness. Many here are of the view [including myself] that the field of AI is not scientifically equipped to handle this problem yet.
I've no doubt that it should be possible to create conscious machines. The only problem is that all evidence indicates that what we call qualia are a feature specific to the biophysical conditions of the brain. Until we have a rigorous scientific understanding of what they are, and how brains generate them, it's futile to attempt creating consciousness artificially.
Z said:
And quite a few of your terms above - 'CAM', for example - seem redundant, when other terms already exist that encompass those concepts.
I think CAM has a lot of overlap with the GWT (Global Workspace Theory) model of consciousness. In my opinion tho, terms are not an issue as long as the concepts they denote are in order.
Z said:
Still - at least you're willing to offer your definitions. That's a good start. And I've offered my objections to your definition of 'consciousness', or at least, where I perceive we need to start to reach some level of agreement. That's a far better start than many have made here.
Thanks!
NP.
I'll also try and tone down my snippiness.
