
Has consciousness been fully explained?

Status
Not open for further replies.
Seems other computationalists think that physics is not useful in explaining consciousness.
.... And, contra westprog, these neuroscientists are by and large not physicists, suggesting that the best people to work on explanations of consciousness are not physicists.
 
And others

However, the question as to what specific structures give rise to what specific components of "consciousness" (again stuff like memory, proprioception, language, empathy, etc. are all components of consciousness) and how they do it are questions that I predict will be answered by neuroscience (or related fields) and not by physics.
 
Not sure about that criticism - I discount dualism, and see Libet's results as indications that subconscious processes rather than conscious ones drive volition (or at very least do not contradict that).

There are other examples indicating temporal compensation to bind events to awareness, such as this. The ubiquity of confabulation being discovered in various fields (e.g. marketing research ...) suggests to me that the narrative generator with its confabulatory facility may well be fundamental to conscious awareness. I have read research where students were given subconscious hints to solve puzzles in various situations, and when asked to explain how they arrived at the answers, almost all confabulated more-or-less plausible explanations that were demonstrably untrue or actually impossible in the circumstances. Unfortunately, I can't find details of it. I'll keep looking.
Okay. That explains why you have come to the conclusions you hold. I don't find the evidence quite as persuasive as you do, but that's merely a difference of opinion regarding the plausibility of various explanations for and limitations on that evidence. Thanks for taking the time to explain your point of view to me.
OK, I think we were at cross-purposes to some degree. If you assume that the processes that are always turned off when someone is unconscious are part of consciousness, then I think you may be including too much in your definition of consciousness. My point was that consciousness requires the processes that are always turned off when someone is unconscious, but that those processes are not necessarily part of consciousness. Also, there is a problem with the definition and identification of unconsciousness. It's generally characterised by a limp and unresponsive individual, but one can be conscious while limp and unresponsive (e.g. paralysis).
That's a valid point. What is included in the definition of consciousness does appear to be in dispute.
I often feel I'm conscious while asleep (in ordinary dreams or lucid dreams). It may be a limited form of consciousness, but where does it fit in? Is a sleepwalker conscious?
No, I don't generally consider sleepwalkers or dreamers to be conscious, although others do argue for that point. I think that consciousness, at least in animals other than oneself, is recognized by their ability to respond to current environmental stimuli. We can often discern when another creature is dreaming as well.

There is a good argument to be made for considering dreamers to be experiencing a form of consciousness. I usually hear it from people claiming that the dreamer is experiencing consciousness in another reality. But this thread has been full of surprises in that regard.
The problem seems to be that we don't have sufficiently precise terminology - consciousness, unconsciousness, awareness, self, all seem to have multiple common context-dependent meanings and usages, quite apart from our own personal interpretations and usage.
Indeed. It's sometimes hard to discern whether or not I agree with someone because those terms can be interpreted so differently by others.
 
...
When I speak of consciousness or 'subjective stuff' I don't necessarily mean reflexive self-awareness. I mean simply the raw experience of anything -- just what the heck is it? Doesn't that question even give you any pause?
Not really, no. AIUI, 'subjective' stuff is stuff that happens to you, your experience. The raw experience of anything is the pattern of activation of neurons in the brain that it elicits. When it's your brain, that particular pattern of activation is your subjective experience of whatever the perception may be.

...the 'qualia' I'm referring to aren't just some metaphysical abstraction but a label for something I live every moment of my waking (and dreaming) life. They are the raw 'stuff' our experiences are composed of. They are what all our scientific observations are 'made of'. They are what's indubitably real beyond all doubt. If we can't fully and meaningfully integrate them with our physical model of what's 'outside' [and I don't mean simply a hand-waving assumption that it's just in the model somewhere] then we have a huge scientific as well as philosophical problem on our hands.
There's no extra physical raw 'stuff' there - subjective experiences are the patterns of neural activation in the brain. Stimulate different parts of the brain, and you'll get a change of mood, or a memory, a sensory perception, or whatever. The pattern of neural activity is the subjective experience. If you think something feels hot, you are comparing it to the patterns of neural activation you get from touching hot things; when you look at an old photo of your family, the recognition triggers associated old memory patterns and perhaps related emotional states - and you might label the overall effect as nostalgia.
 
...
I think that consciousness, at least in animals other than oneself, is recognized by their ability to respond to current environmental stimuli.
Dreamers can respond to current environmental stimuli by incorporating them into the dream (alarm becomes fire-engine, full bladder becomes search for dream toilet, etc). But that apart, I don't really like it - responsiveness to the environment is a fundamental characteristic of life, something all living organisms do, so I don't think it's a very useful identifier of consciousness. OTOH if all living things are conscious, because consciousness is response to the environment, perhaps the title of the thread ought to be changed to 'Has conscious awareness been fully explained?', or something.

There is a good argument to be made for considering dreamers to be experiencing a form of consciousness.
I agree. The motor and some other areas are disengaged (more or less), and the narrative generator free-wheels using recent experiences and/or unfiltered memories, and conscious awareness seems to be active as the narrative subject.

I usually hear it from people claiming that the dreamer is experiencing consciousness in another reality. But this thread has been full of surprises in that regard.
I suppose it depends what they mean by 'another reality'. My dreams have fairly consistent generic geographies, buildings, and situations over time, and I often remember a place or situation from a previous dream, but if it qualifies as a 'reality' it is not a physical reality but one constructed internally from extrapolations and reconfigurations of bits and pieces of memory and experience. I consider myself fortunate in often dreaming that I am on holiday somewhere nice - it's not physically real, but it doesn't matter at the time :D

Indeed. It's sometimes hard to discern whether or not I agree with someone because those terms can be interpreted so differently by others.
Yes, it's tiring and tiresome having to constantly try to decipher (or just guess at) the particular semantics of a term in various contexts.
 
Dreamers can respond to current environmental stimuli by incorporating them into the dream (alarm becomes fire-engine, full bladder becomes search for dream toilet, etc). But that apart, I don't really like it - responsiveness to the environment is a fundamental characteristic of life, something all living organisms do, so I don't think it's a very useful identifier of consciousness.
I understand what you are saying; I've experienced such dreams. On the other hand, people in a coma are definitely alive, but not responding to the environment. One of the issues in studying or just discussing consciousness is defining what we mean by it.
OTOH if all living things are conscious, because consciousness is response to the environment, perhaps the title of the thread ought to be changed to 'Has conscious awareness been fully explained?', or something.
I don't know if all living things are conscious. I think mammal and bird species, without exception, are conscious. I think reptiles and fish are, but I'm not as sure about that. Insects on down through amoeba, depending on my preferred definition at the time, might or might not be considered conscious. Bacteria and viruses? Perhaps it's better to think of them as being on a scale of decreasing consciousness? And what about plants? Where should they be placed on such a scale?

Yes, it's tiring and tiresome having to constantly try to decipher (or just guess at) the particular semantics of a term in various contexts.

Yes, it certainly can be. But when the other person is pleasant to talk with, I find it helps me sharpen my own thinking about what such terms represent.
 
AkuManiMani said:
...
When I speak of consciousness or 'subjective stuff' I don't necessarily mean reflexive self-awareness. I mean simply the raw experience of anything -- just what the heck is it? Doesn't that question even give you any pause?

Not really, no. AIUI, 'subjective' stuff is stuff that happens to you, your experience. The raw experience of anything is the pattern of activation of neurons in the brain that it elicits. When it's your brain, that particular pattern of activation is your subjective experience of whatever the perception may be.

AkuManiMani said:
...the 'qualia' I'm referring to aren't just some metaphysical abstraction but a label for something I live every moment of my waking (and dreaming) life. They are the raw 'stuff' our experiences are composed of. They are what all our scientific observations are 'made of'. They are what's indubitably real beyond all doubt. If we can't fully and meaningfully integrate them with our physical model of what's 'outside' [and I don't mean simply a hand-waving assumption that it's just in the model somewhere] then we have a huge scientific as well as philosophical problem on our hands.

There's no extra physical raw 'stuff' there - subjective experiences are the patterns of neural activation in the brain. Stimulate different parts of the brain, and you'll get a change of mood, or a memory, a sensory perception, or whatever. The pattern of neural activity is the subjective experience. If you think something feels hot, you are comparing it to the patterns of neural activation you get from touching hot things; when you look at an old photo of your family, the recognition triggers associated old memory patterns and perhaps related emotional states - and you might label the overall effect as nostalgia.

Oh, jeeze...I find the sheer lack of curiosity in the above post astonishing. As I've already told another poster some time ago...

"Not good enough. Not nearly good enough. I've mentioned before my criteria for a legit theory of consciousness. It must, at bare minimum, be able to adequately answer these questions:

'What is it about particular neural processes that causes some sensory input to be felt as a particular sensation or experience? What physical property differentiates the quality of these experiences? How is this process expressed through the biochemistry of neurons? What part of the system actually has the experience(s), and what are the relevant physical properties of this portion of the system that cause it to be subjectively sensible?'

Any alleged scientific model of consciousness that cannot address these questions is just handwaving bull, as far as I'm concerned."


The brutal fact of the matter is that at our current state of the art all we have is a cargo cult level of understanding with regard to consciousness. The persistent failure to acknowledge and face this fact is probably the greatest hindrance to scientific progress in this area. Why bother trying to gain a deeper grasp of something if you think your understanding is sufficient? How can one work towards addressing a problem if they cannot -- or will not -- even acknowledge to themselves that there is one to begin with?

... I hear a lot of talk in these discussions about the computational architecture of the brain, but whenever the topic of the subjective aspect of the whole enterprise is broached, there are just a few head-scratches and someone tries to sweep the issue under the rug. Knowing the functional details of the brain is all well n' good, but it tells us nothing of the physics of conscious experience IAOI.
 
Oh, jeeze...I find the sheer lack of curiosity in the above post astonishing.
I've been thinking about how some people just don't seem to feel that there really is any "explanatory gap" that needs to be filled (between a pattern of neuronal firings in your brain on one hand and a first hand phenomenal experience on the other).

Perhaps it's a bit like looking at the Necker Cube or similar and once you have become accustomed to one particular point of view your mind just won't flip to the other any longer, no matter how hard other people try to point out the "problem" with the way you're looking at it. (Obviously this could be said to apply in both directions.) I wonder if a lot of the discussions on this thread essentially boil down to people talking right past each other over and over again because they are basically blind to the other point(s) of view, even though they may think they are not.

Anyway, I've recently read David Chalmers' original Facing Up to the Problem of Consciousness paper again and also his later Moving Forward on the Problem of Consciousness where he responds to various criticisms to the first paper and adds further details. Ignoring for the moment his own theory (that he introduces in the last part of the first paper) I think these two papers give a pretty good summary of a number of different viewpoints and would recommend people read them if they haven't already.

Have you read these dlorde? Do you think you properly understand what "The Hard Problem of Consciousness" is truly about (even if you don't agree it's actually a hard problem) or is it more that you just can't quite "see" what the fuss is really about to start with? I hope this doesn't sound patronising - it's not meant to - I'm asking seriously because I do realise how sometimes our minds can get locked into one way of looking at a problem. I'd like to know which category you think describes your point of view best in Chalmers' second paper (unfortunately quite long). PixyMisa appears to have made it clear that he falls into the Type A materialist group (following along with Dennett) and perhaps that also applies to rocketdodger although I don't remember him being as explicit in any of his posts that I have read.
 
That is an architecture that no one knows well enough as of yet. First we have to define feeling before we can even know what we are dealing with. Feeling, at least from the perspective from which I view it, necessarily involves an entire network. Single neurons wouldn't get close since they would only be able to provide direct stimulus response action.
From the above it sounds to me that you are basically saying the HPC is beyond our reach right now, rather than it being something that will definitely dissolve away in due course as we gain a better understanding of the so called "easy problems" or perhaps not even really being a problem to start with. Is that correct? (I was also probably being rather lazy with the words I chose to use - "feeling" was my shorthand for experiencing what some call qualia, being self-aware, etc.)

However, if I've misunderstood and you do in fact have a definite position opposing the reality of the HPC, then can I also ask you which category fits your pov best (as per the 2nd Chalmers' paper linked to in my immediately previous post)?
 
I'm asking about the computer, viewed as an agent. Its keyboard is this agent's sense of touch: the computer can report where it is being touched, as you could if someone touched different spots on your hand.

If you reject applying the term to a computer, is that due to a functional difference?

If we accept that a computer has a sense of touch, and that it feels something, then we have to accept that everything has a sense of touch and feels something. I don't think that there's any evidence that non-living things feel anything. Hence there is a functional difference.
 
I largely agree, but I think you might also agree that certain core 'beliefs' are in-built -- like ideas concerning causality, categories, etc. We couldn't think at all if we did not have a framework on which to build. The blank slate idea is wrong.

Coincidentally, I was just reading a book about the Comanche, according to which one unifying factor among Amerindians was their non-acceptance of cause and effect.
 
Coincidentally, I was just reading a book about the Comanche, according to which one unifying factor among Amerindians was their non-acceptance of cause and effect.


I hope you questioned that statement, since it is clearly wrong. I'm not sure there is any unifying factor amongst Amerindian thought, but the most likely candidate would be non-acceptance of Western instrumental rationality, which is very different from saying that they had no notion of causality.
 
From the above it sounds to me that you are basically saying the HPC is beyond our reach right now, rather than it being something that will definitely dissolve away in due course as we gain a better understanding of the so called "easy problems" or perhaps not even really being a problem to start with. Is that correct? (I was also probably being rather lazy with the words I chose to use - "feeling" was my shorthand for experiencing what some call qualia, being self-aware, etc.)

However, if I've misunderstood and you do in fact have a definite position opposing the reality of the HPC, then can I also ask you which category fits your pov best (as per the 2nd Chalmers' paper linked to in my immediately previous post)?



I don't think there is a hard problem. There are several groups of folks who shouldn't think there is a hard problem either -- including dualists (their hard problem doesn't concern consciousness but the interaction of different substances); idealists (they have a hard problem of matter); and neutral monists (they have a hard problem of how to convert mind stuff into matter/energy since it should be possible).

The only group for whom there ever could be a hard problem of consciousness is physicalists. The 'problem' seems to disappear with a different way of looking at subjective experience (it's just neuron firings happening in you) and at feelings (they're not that odd, just behavioral impulses toward an action). It's just a language/conceptual issue.
 
... idealists (they have a hard problem of matter);

...

The only group for whom there ever could be a hard problem of consciousness is physicalists.
Presumably physicalists tend towards supervenient epiphenomena for thought just as idealists tend towards supervenient epiphenomena for energy/matter.

Thought at least seems to have a quality we could name 'intent', something the physical, as usually defined, lacks.

The 'problem' seems to disappear with a different way of looking at subjective experience (it's just neuron firings happening in you) and at feelings (they're not that odd, just behavioral impulses toward an action). It's just a language/conceptual issue.
There are certainly many ways to pretend one has The Answer, or gloss over the fact one doesn't.
 
Any thoughts on addressing the issues I raised in post 3907? 'Correct answers' are irrelevant if they sidestep the pertinent questions.
 
That's exactly what your p-zombie twin would say, since it would be convinced that *it's* conscious and not a p-zombie. You would say it's fooling itself. Can you give us a test that you use to convince yourself that you're not fooling yourself?

The p-zombie is not convinced it's conscious. A mechanism causes it to state that it's conscious.

How do we know that the condition which we experience, which we label consciousness, is actually consciousness? Because that's the label we give to what we experience. How is it possible for us to be wrong about it?
 
Me? I'm currently feeling the emotion of slight amusement, due to the mild but pleasant cognitive dissonance generated by this discussion. Sometimes it triggers a frustration response, but this time, amusement.

So this pattern of neural activation is a quale, is it? I thought there might be more to it than that.

No, it's not the pattern of neural activation. It's your feeling of mild amusement, which doesn't show up on the EEG.

The neural activation and the feeling of mild amusement are clearly linked, but equally clearly they aren't the same thing.
 
Thanks. That's a different perspective than I've heard before. Let me try to restate it: You feel that the 'illusion of consciousness' is the illusion that we can control ourselves via our conscious thoughts. You don't feel that is actually what happens. Have I understood you correctly?

Even if free will is disproven, consciousness exists as a passenger. Control is not fundamental to consciousness. It might be that our feeling that we control ourselves is erroneous. That we are there experiencing is a different thing altogether.
 
