
Hard consciousness: binary? cline? else?

JeanTate

Many of you reading this thread will have heard of the “hard problem of consciousness”; WP is as good an intro as any.

In this thread, I’d like to discuss a somewhat orthogonal question: is consciousness, of the “hard problem” kind, binary?

It’s a question I’ve not seen discussed; if you know of a good discussion, please cite it.

Often, consciousness of the hard problem kind seems to refer to fully conscious, fully functioning adult humans, as well as AIs, philosophical zombies, etc.

But does whatever consciousness we have when dreaming, with brains ravaged by disease or drugs, etc count as consciousness?

In a recent discussion elsewhere - on an unrelated topic - a participant talked about his repugnance at torturing dogs, but zero qualms about “torturing” azalea bushes. He said this had to do with a recognition that dogs have consciousness (of the hard problem kind), but azaleas do not. And torturing computers - AI or not - is impossible. IOW, he was, briefly, sketching boundaries.

So, where does consciousness begin, and end? When did it first evolve? Any convergent evolution (e.g. Octopoda vs Primate)?
 
Good question, but I have no idea how to answer.

Personally I am not so sure that an AI cannot have a consciousness in principle. Nor even maybe an azalea bush, although I rather doubt it. Do babies have consciousness? I assume they do, and yet I cannot remember being a baby at all. My earliest memories begin around 2 years of age and have all but faded away into a few images. Our perspective may be biased. We don't know what it is like to be another creature or another entity, so how can we really say with any certainty?
 
I view it like most physical properties of living creatures - there is a range but no clear division. When I'm using the word "consciousness", all I mean is the processing of stimuli; we tend to reserve it for the apparently more complex processing - so we don't say a plant has consciousness because it reacts to the direction of the sun as it grows. I'd be happy to say a nervous system with a central processing collection of neurons is the minimum required for any consciousness; after that it's all a matter of degree.
 
When philosophers get involved with consciousness, they tend to get stuck in endless arguments over defining “qualia” (experience).
Modern neuroscience tends to look at what’s actually going on in the brain.

At one level, consciousness is just being aware of the environment and being able to respond to it. A housefly would be “conscious” in that regard. But we normally use the term to describe the multiple cognitive functions of much more-advanced brains, like humans and a few other animals.
We know that there are many discrete structures in the brain, and that they are all cross-wired very extensively. Our primitive “reptile” brain communicates with the most-advanced structures like the pre-frontal cortex.
So it’s felt that consciousness is the sum total of brain function: the ability not only to take in sensory input but to analyze and correlate it with memory, emotion, reasoning, and so on.
As to exactly how that all works, and how such correlations and analysis lead to new ideas and creativity and such... Not well understood to put it mildly. But it’s only been for the last few decades that we’ve been able to image brain function in real time with modern scanning technology.
 
What is it like to be an electron? An interview with Galen Strawson

In recent years more and more philosophers seem to have embraced panpsychism—the view that consciousness pervades the universe and so is present, in however simple a form, in every little speck of matter. It’s a view that’s hard to wrap your mind around, so I’m glad I got to have a conversation with Galen Strawson, a noted philosopher who is one of its most articulate proponents (and who, as a bonus, is charmingly offbeat). I interviewed Galen on the Wright Show (available on both meaningoflife.tv and as an audio podcast) more than a year ago. Below is an extended excerpt.

Perhaps consciousness is a property of all baryonic matter, not just organic lifeforms.
 
When philosophers get involved with consciousness, they tend to get stuck in endless arguments over defining “qualia” (experience).
Inner experiences, such as the pain of a headache, the taste of wine, the redness of the evening sky ... that sort of thing, right?

Taking just the first: can dogs experience the pain of a headache? How about mice? A fruitfly? A nematode worm (e.g. C. elegans)?

Do philosophers concern themselves with the qualia of dogs?

Modern neuroscience tends to look at what’s actually going on in the brain.
AFAIK, in C. elegans the "what's actually going on" is now pretty well understood down to the level of each individual nerve cell (all ~300 of them).

At one level, consciousness is just being aware of the environment and being able to respond to it. A housefly would be “conscious” in that regard.
By that criterion, at the limit, even live bacteria could be conscious (not sure about viruses)! :)

But bacteria do not have qualia, do they?

And it should be a piece of cake to build a gadget, out of silicon and metal, which is aware of its environment and able to respond to it ... would that gadget be conscious?
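For illustration only, here is a hypothetical sketch (in Python; the read_temperature_sensor and respond functions are made up for this post, not taken from anywhere) of just such a gadget's control loop: it senses its environment and responds to it, and nothing more.

```python
# Toy "gadget" that is aware of its environment and responds to it:
# a thermostat-style sense-and-respond loop. The sensor is faked with
# a random number; the whole thing is a hypothetical illustration.
import random


def read_temperature_sensor():
    """Stand-in for real hardware: return a temperature in degrees C."""
    return random.uniform(10.0, 35.0)


def respond(temperature_c, setpoint_c=21.0):
    """Map the stimulus to a behaviour - stimulus in, response out."""
    if temperature_c < setpoint_c - 1.0:
        return "heater ON"
    if temperature_c > setpoint_c + 1.0:
        return "heater OFF"
    return "no change"


if __name__ == "__main__":
    for _ in range(5):
        t = read_temperature_sensor()
        print(f"sensed {t:.1f} C -> {respond(t)}")
```

Whether anything like that deserves to be called conscious is, of course, exactly the question.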

But we normally use the term to describe the multiple cognitive functions of much more-advanced brains, like humans and a few other animals.
We know that there are many discrete structures in the brain, and that they are all cross-wired very extensively. Our primitive “reptile” brain communicates with the most-advanced structures like the pre-frontal cortex.
So it’s felt that consciousness is the sum total of brain function: the ability not only to take in sensory input but to analyze and correlate it with memory, emotion, reasoning, and so on.
Hence my question about Octopoda.

More advanced brain? Check.

Multiple cognitive functions? Check.

Many discrete structures? Check.

Extensively cross-wired? Check.

But, evolution-wise, as invertebrates, not in any way related to reptiles (other than, perhaps, convergent evolution).

As to exactly how that all works, and how such correlations and analysis lead to new ideas and creativity and such... Not well understood to put it mildly. But it’s only been for the last few decades that we’ve been able to image brain function in real time with modern scanning technology.
Maybe at least some qualia will be well-understood enough that my experience of the redness of red sunsets can be simulated in your brain (even if you are color-blind)? ;)

Or will qualia be forever ineffable? :rolleyes:
 
You might enjoy Dennett's Kinds of Minds and Consciousness Explained.

Unless you're a p-zombie, in which case you'll only think you enjoy them.
 
Qualia don't exist, or at least not in the way philosophy uses the term. There is no requirement for them to exist, apart from making us something more than self-ambulatory bags of chemicals and water.
 
I tend to think that consciousness is not binary, but a continuum ranging from bacteria to humans, and I think that qualia is an artificial concept designed to make consciousness mysterious and dualistic.
 
Qualia don't exist, or at least not in the way philosophy uses the term. There is no requirement for them to exist, apart from making us something more than self-ambulatory bags of chemicals and water.
Indeed.

It would seem impossible to engage a dog in a discussion of the inner experience of the pain of a headache, much less a C. elegans :D

Such a discussion might tax some four-year-old Homo sapiens too, whether they are fully awake or dreaming.

Even the boldest philosophers would surely quail at the idea of elucidating the nature of qualia in quails :p
 
You might enjoy Dennett's Kinds of Minds and Consciousness Explained.

Unless you're a p-zombie, in which case you'll only think you enjoy them.

Thanks.

If you have read these, do you know if Dennett discusses the nature of qualia in quails? ;)

Would a p-zombie be confused by the idea of qualia? :p
 
I tend to think that consciousness is not binary, but a continuum ranging from bacteria to humans, and I think that qualia is an artificial concept designed to make consciousness mysterious and dualistic.
If an organism - bacterium or human - is alive, does the amount of consciousness it has change during its life?

For example, the same when you are trying to write down a taxonomic scheme for qualia as when you’re intubated and deeply sedated in an ICU?
 
Imagine red in your mind's eye. That is meant to be the qualia of experiencing red. Meant to be totally separate from seeing a red apple and experiencing red.
 
Would a p-zombie be confused by the idea of qualia? :p
And yes, us p-zombies are confused by the idea. I only experience the experience of red when I see something red; I have no "mind's eye", so I have no qualia of experience of red.
 
Thanks! :)

What is it like to be an electron? An interview with Galen Strawson



Perhaps consciousness is a property of all baryonic matter, not just organic lifeforms.
That may be an important contribution to philosophy, but not to science I feel.

If an object contains ~10^24 electrons, does it have ~10^24 times the level of consciousness an electron does?

I wonder what the taxonomy of qualia is, in Strawson's panpsychism?

What is the pain of a headache for an electron? Is that quale the same for all electrons?
 
And yes, us p-zombies are confused by the idea. I only experience the experience of red when I see something red; I have no "mind's eye", so I have no qualia of experience of red.
Ah, but do you - sometimes at least - dream in color? :p :D
 
You might enjoy Dennett's Kinds of Minds and Consciousness Explained.

Unless you're a p-zombie, in which case you'll only think you enjoy them.
I own Daniel Dennett's Consciousness Explained, but I have never met him.

I recommend Antonio Damasio's The Feeling of What Happens: Body and Emotion in the Making of Consciousness. My copy is autographed to me.
 
Good question, but I have no idea how to answer.

Personally I am not so sure that an AI cannot have a consciousness in principle. Nor even maybe an azalea bush, although I rather doubt it. Do babies have consciousness? I assume they do, and yet I cannot remember being a baby at all. My earliest memories begin around 2 years of age and have all but faded away into a few images. Our perspective may be biased. We don't know what it is like to be another creature or another entity, so how can we really say with any certainty?
I am perplexed as to how AI can have consciousness without having personal and selfish motivations.
 
