My take on why the study of consciousness may not be as simple as it seems

What it can't tell you, in any way, is what it feels like to be a mouse.

Exactly!!! It's like that "What is it like to be a bat" paper. We can, in principle, find out every last detail of how a mouse feels hunger. The one thing we can never do is know what it feels like to be a hungry mouse.

Or even this: some insects can see way into the UV part of the spectrum. What is it like to see those colours? Do they scale their range of colours into what we experience? Do they see colours that we can't imagine? We can study their eyes and brains but we can't experience what it is like to see into the UV. Seeing UV as an insect has a quale that we can never know.
 
Exactly!!! It's like that "What is it like to be a bat" paper. We can, in principle, find out every last detail of how a mouse feels hunger. The one thing we can never do is know what it feels like to be a hungry mouse.

Or even this: some insects can see way into the UV part of the spectrum. What is it like to see those colours? Do they scale their range of colours into what we experience? Do they see colours that we can't imagine? We can study their eyes and brains but we can't experience what it is like to see into the UV. Seeing UV as an insect has a quale that we can never know.

And all that computing science and philosophy and neuroscience has really given us no more insight into what it feels like to be a mouse than the first person who considered the question.
 
Rather, that the sensation of the subjective experience of perception arises from the pattern of action potentials across various synapses in various brain structures.

Yes, this is what I imagine happens, mainly because I can't think of any other possibilities. However, just because I agree with this statement, I still don't understand, even in the loosest possible way, how we go from "the pattern of action potentials across various synapses in various brain structures" to "sensation of the subjective experience" or even just "sensation."

It's not like going from bricks to houses; it's going from wiring to sensation. How, broadly speaking, does sensation arise from wiring and switches? What's silly is that I think you are right, I just don't see how.

westprog,
My thermometer never feels hot or cold, it just is hot or cold and responds accordingly.

You may be surprised to hear that a number of people don't agree with that. They think that a thermometer feels hot and cold in exactly the same sense as a person. Or if not a thermometer, a thermocouple. Or a thermostat.

Well, even if they think that, it doesn't help any with understanding how consciousness arises. Unless we go for the 'consciousness is universal' model. I think that view has all sorts of problems but this is the wrong thread for that discussion.

Also, I think the multi-quote doesn't work for me because of my browser. I'm using chrome which is generally pretty good but it doesn't always behave. OOOoo, is that getting a bit anthropomorphic :)
 
If I do your leaf and tree experiment it will tell me all sorts of things about everything that isn't consciousness (using my understanding of the word consciousness). All the things you mention, awareness of groups of leaves rather than individuals, what goes through my head, etc., are all things I am conscious of. They aren't the consciousness.

Has it occurred to you that, since you (by your own admission) don't know anything about this issue, you might want to start at the very beginning?

By doing this exercise you are supposed to get an idea of what "the consciousness" is. If you can't ask yourself these kinds of questions then how on Earth do you hope to understand your own consciousness?
 
And all that computing science and philosophy and neuroscience has really given us no more insight into what it feels like to be a mouse than the first person who considered the question.

Neuroscience alone tells us that any component of experience reliant upon human-specific brain structures will be absent from the overall experience of being a mouse.

So, once again, you are just wrong.
 
Also, I think the multi-quote doesn't work for me because of my browser. I'm using chrome which is generally pretty good but it doesn't always behave. OOOoo, is that getting a bit anthropomorphic :)

I'm using Chrome too. I had no problem getting multiple quotes, though I had to do some editing by hand.

There might be people who think that computer programs "behave" - but I share the opinion given in my signature.
 
And all that computing science and philosophy and neuroscience has really given us no more insight into what it feels like to be a mouse than the first person who considered the question.

I disagree with this. Insight or understanding into the phenomenon of the subjective experience is not the same thing as the subjective experience itself. Neuroscience certainly has given us a great deal of insight into the process. In certain conditions, we can even "mind read" by using software that learns to recognize the pattern of certain types of brain scans to reveal information about the subjective experience the person is having. (I've mentioned before the example from the California Institute of Neuroscience where the subjective experience of seeing horizontal blues vs. vertical reds flips, and we can discern which experience a subject is having based on analysis of a brain scan. There are others. Again, the whole purpose of anaesthesia is to prevent or ameliorate the subjective experience of pain or discomfort.)
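
To make that concrete, here's a toy sketch in Python (with made-up data and hypothetical parameter choices, not the actual study's pipeline) of the kind of pattern classification such "mind reading" relies on: simulate scan patterns for two perceptual conditions and check how well an off-the-shelf classifier can tell them apart.

```python
# Toy sketch of brain-scan "mind reading": train a classifier to tell two
# perceptual conditions apart from (here, simulated) voxel activity patterns.
# Illustration only; it does not reproduce any particular study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 500

# Simulate scans: condition A ("horizontal blues") and B ("vertical reds")
# differ by a small, consistent spatial pattern buried in noise.
pattern = rng.normal(size=n_voxels) * 0.3
labels = rng.integers(0, 2, size=n_trials)            # 0 = A, 1 = B
scans = rng.normal(size=(n_trials, n_voxels)) + np.outer(labels, pattern)

# Decode: cross-validated accuracy well above 50% means which experience the
# subject reports can be predicted from the activity pattern alone.
clf = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(clf, scans, labels, cv=5).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

The point of the sketch is only that which experience a subject reports can be recovered from the activity pattern; it says nothing about what the experience feels like.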
 
More seriously, if you are not your sense organs and your neocortex (which is actually the structure where conscious experience of the sensory input takes place), then what are you?

Can you exist, but have no gender, no memories, no name (no language), no position, no posture, no sensations, etc?

I understand you're using the "I" (or from my p.o.v. the "you") to mean consciousness, but I think you're making a false distinction.

I knew it might be dangerous to invoke 'I' when dealing with consciousness :) It was just a rhetorical device. To be honest, I don't think it matters what I am. It really does lead down a path of arbitrary definitions and personal opinions. In practice, I think people equivocate when using the word 'I'. I know I do. Most of the time everyone understands what is meant and there's no harm done. I could make an argument for 'I' really being the consciousness only but it's so closely associated with our bodies that the distinction would prove to be a linguistic disaster. No one says my body is going out to buy some beer (unless they are pedantic Buddhists).
 
A stumbling block these "consciousness" threads never seem to face is what I'd call awareness vs. consciousness.

Self-referential information processing should allow every instance of awareness to be programmed using any computing substrate, whether mineral, plant, or animal. The thermostat exhibits awareness; so do plants and animals.

Add another layer of self-referential programming, and decisions based on awareness, as provided by sensors interacting with the environment, become possible. Lifeforms with neural systems seem to do this. This level of programming should also allow computers to use language-level communication with humans and, for most of us, to appear to pass Turing test criteria.

The strong AI proponents seem to have faith that this level of behavior is prima facie evidence of consciousness. Others like westprog and Fontwell appear to disagree.

Is this the Hard Problem?
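
For what it's worth, here's a minimal sketch of the two layers described above, with hypothetical class names and nothing more than an illustration: a plain thermostat loop that just reacts to a sensor, and a second, self-referential layer that monitors and reports on the first layer's own behaviour. Whether either layer amounts to consciousness is exactly what's in dispute.

```python
# Sketch of "awareness" vs. a self-referential layer on top of it.
# All names are hypothetical; this only illustrates the layering.

class Thermostat:
    """First-order 'awareness': senses temperature and responds."""
    def __init__(self, setpoint: float):
        self.setpoint = setpoint
        self.heating = False

    def step(self, temperature: float) -> None:
        self.heating = temperature < self.setpoint  # react to the sensed value


class SelfMonitor:
    """Second layer: observes and reports on the first layer's own state."""
    def __init__(self, device: Thermostat):
        self.device = device
        self.history = []

    def step(self, temperature: float) -> str:
        self.device.step(temperature)
        self.history.append(self.device.heating)
        # A report about its own behaviour, not just about the world.
        return f"heating on {sum(self.history)} of {len(self.history)} steps so far"


monitor = SelfMonitor(Thermostat(setpoint=20.0))
for reading in (18.0, 19.5, 21.0):
    print(monitor.step(reading))
```

Stacking more such layers changes what the system can report about itself, which is the behaviour the strong AI proponents point to; whether it ever amounts to feeling anything is the disputed part.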
 
Neuroscience alone tells us that any component of experience reliant upon human-specific brain structures will be absent from the overall experience of being a mouse.

So, once again, you are just wrong.

Which tells us nothing about what it feels like to be a mouse. This is the big question which has been solved by the Strong AI protagonists by ignoring it, or pretending that what we "feel like" is simply a matter of itemising all the switches and links in the brain, or saying that we don't really feel like anything, or claiming that everything feels like itself, and a domestic heating system has its own sense of purpose.
 
The strong AI proponents seem to have faith this level of behavior is prima facie evidence of consciousness. Other like westprog and Fontwell appear to disagree.

Is this the Hard Problem?

I agree that it's a matter of faith.
 
I disagree with this. Insight or understanding into the phenomenon of the subjective experience is not the same thing as the subjective experience itself. Neuroscience certainly has given us a great deal of insight into the process. In certain conditions, we can even "mind read" by using software that learns to recognize the pattern of certain types of brain scans to reveal information about the subjective experience the person is having. (I've mentioned before the example from the California Institute of Neuroscience where the subjective experience of seeing horizontal blues vs. vertical reds flips, and we can discern which experience a subject is having based on analysis of a brain scan. There are others. Again, the whole purpose of anaesthesia is to prevent or ameliorate the subjective experience of pain or discomfort.)

Yes, but we already know that certain types of consciousness are associated with certain external phenomena. We know that we look at cats and think about cats. It's not really that surprising that the thing we do our thinking with can be tracked in some way. But that still doesn't tell us what it feels like to be a mouse. Or a clever robot, for that matter - or even if it feels like anything.
 
I knew it might be dangerous to invoke 'I' when dealing with consciousness :) It was just a rhetorical device.
I understand that (and said so). I think your use of "I" this way points to a flaw in your understanding, though.

I could make an argument for 'I' really being the consciousness only but it's so closely associated with our bodies that the distinction would prove to be a linguistic disaster. No one says my body is going out to buy some beer (unless they are pedantic Buddhists).
Similarly, no one would ever say "I'm going out for some beer" and mean that their self but not their body was going to go out--not even a pedantic Buddhist (who I have heard use the phrase "this body-mind" in place of "I" as a way of training themselves to let go of attachment to the ego).

It's not that consciousness is "closely associated with our bodies", it's that it's a property or function of our bodies. You keep talking of it as if it's a separate and separable thing, even though it's not. (Again, not one single empirical example of that rarified consciousness--just conscious but not conscious of anything or a disembodied consciousness.)

ETA: This body/mind I think of as myself is about to go out to do a juggling gig. Some 25 years back I wrote a poem playing with the sort of mindless or no-self experience of juggling--what Csíkszentmihályi calls "flow" or others call being "in the zone". I titled the poem "Eye the Juggler" (a play on "I, the Juggler"--I wrote it for a poetry reading where I recited the poem while juggling in front of the audience). I remember one bit of it was a description of what juggling looks like to the audience, and then the observation, "On this side, it's just the same."

I would point out that neuroscience explains this stuff pretty well too--"muscle memory" or what happens when you practice a skill over and over again. (Some of the learning is letting unused synapses die off--especially in younger people, some of it is a form of chemically "strengthening" other synapses, and still other learning is actually growing new anatomy in the form of new synapses.)
 
Has it occurred to you that, since you (by your own admission) don't know anything about this issue, you might want to start at the very beginning?

By doing this exercise you are supposed to get an idea of what "the consciousness" is. If you can't ask yourself these kinds of questions then how on Earth do you hope to understand your own consciousness?

Well if I admitted to knowing nothing about this, then I was referring to the process by which the structure of the brain can give rise to consciousness.

On the other hand, I have done an inordinate amount of exercises like this, including 5 years of meditation. In fact it is the experience of exercises promoting the awareness of self, surroundings, thoughts and feelings that have led me to my understanding of what it feels like to be conscious. Interestingly, when I talk to people with a philosophical system which involves consciousness and who have practised for some time (Hindus, Buddhists), we generally seem to agree on what we mean by consciousness. Unfortunately they then bring in all sorts of woo to explain things and there we part ways.
 
A stumbling block these "consciousness" threads never seem to face is what I'd call awareness vs. consciousness.

Self-referential information processing should allow every instance of awareness to be programmed using any computing substrate, whether mineral, plant, or animal. The thermostat exhibits awareness; so do plants and animals.

Add another layer of self-referential programming, and decisions based on awareness, as provided by sensors interacting with the environment, become possible. Lifeforms with neural systems seem to do this. This level of programming should also allow computers to use language-level communication with humans and, for most of us, to appear to pass Turing test criteria.

The strong AI proponents seem to have faith that this level of behavior is prima facie evidence of consciousness. Others like westprog and Fontwell appear to disagree.

Is this the Hard Problem?

I think so
 
Which tells us nothing about what it feels like to be a mouse. This is the big question which has been solved by the Strong AI protagonists by ignoring it, or pretending that what we "feel like" is simply a matter of itemising all the switches and links in the brain, or saying that we don't really feel like anything, or claiming that everything feels like itself, and a domestic heating system has its own sense of purpose.

Well it is quite simple in my opinion.

Take what it feels like to be yourself. Now remove any aspect of that experience that comes from physical structure that a mouse does not have.

What remains is probably very close to what it feels like to be a mouse.
 
Well if I admitted to knowing nothing about this, then I was referring to the process by which the structure of the brain can give rise to consciousness.

On the other hand, I have done an inordinate amount of exercises like this, including 5 years of meditation. In fact it is the experience of exercises promoting the awareness of self, surroundings, thoughts and feelings that have led me to my understanding of what it feels like to be conscious. Interestingly, when I talk to people with a philosophical system which involves consciousness and who have practised for some time (Hindus, Buddhists), we generally seem to agree on what we mean by consciousness. Unfortunately they then bring in all sorts of woo to explain things and there we part ways.

No, you haven't done any exercises like this at all. Otherwise you would already have the answers to your questions.

Meditation is not anything close to what I asked you to do.
 
I understand that (and said so). I think your use of "I" this way points to a flaw in your understanding, though.

It's not that consciousness is "closely associated with our bodies", it's that it's a property or function of our bodies. You keep talking of it as if it's a separate and separable thing, even though it's not. (Again, not one single empirical example of that rarified consciousness--just conscious but not conscious of anything or a disembodied consciousness.)

You may well be correct. Confusingly, I do think that consciousness is somehow an emergent property of our brains in the same way that wetness is an emergent property of H2O molecules. However, the experience of what it is like to have perceptions seems to be very different to the physical thing that is a body.

Several times there have been points when I've said 'this is the heart of it' and this is another. I can't reconcile my ability to experience perception, thoughts, emotions and existence, with the physical machine that is my body and brain. This seems to be something that you either get or you don't get. No amount of detail in processing or wiring can explain to me the qualitative leap from physical goings on to what I feel, or more precisely, that I even can feel.

And yet I do think consciousness must just be the result of physical stuff doing processes.

I have already said I don't think you can have disembodied consciousness. Maybe this might help: modelling clay. It has to have a shape to exist. Something imposes a shape on the clay and it stays in that shape until something else turns it into another shape. The shapes are very interesting but the shapes are not the clay because it's the same clay whatever the shape happens to be. Similarly, the clay is not defined in terms of having a shape. But it always does have a shape. It can't exist without a shape but it isn't the shape.

I think I give up, I can't even follow my own explanations now :)
 
No, you haven't done any exercises like this at all. Otherwise you would already have the answers to your questions.

Meditation is not anything close to what I asked you to do.

Well I'm afraid I have to differ with you on every point there. There's not much more I can say. I have done many exercises observing the processes of my mind. I still don't have answers that fill the gaps between the physical and the experiential. If anything, my exercises only served to broaden the gap. And I think meditation is related to what you asked me to do, even if it isn't the same. I actually feel insulted by your reply.

Also, I don't have convenient access to a leafy tree right now.

ETA Bollox to the lot of you (apart from westprog), I come here trying to reconcile my understanding with that of the experts and then get told I'm a liar.
 