
Has consciousness been fully explained?

Status
Not open for further replies.
You're being facetious, but Pixy actually believes toasters are conscious (the higher end models, I think).

But I'm sure nobody considers a toaster alive, so you're just as facetious.

Besides, my question has some validity. If consciousness is tied to life, then why isn't my toe conscious? As far as all the ingredients for life are concerned, all cells in our body have an awful lot in common.
 
I've never fully grasped what Daniel Dennett is saying. He claims qualia don't exist. But what are qualia defined as exactly? As far as I can tell, qualia are a nebulous, difficult-to-define sense of what-it's-like-ness. As some philosopher says, examine consciousness and it all dissolves into processes. But still, there is an "inner light", a subjective state of experience. From what I can tell, Dennett seems to think this subjective experience doesn't exist. This I don't understand.
It may not be objectively measurable. OK. But to some extent, you're pulling out the foundation of all observational inference. Without a Cogito ergo sum, or just Cogito for you super skeptics, the house of reality modeling comes crumbling down.

I have no problem with the notion that the objective functions of mental processing can be basically explained with our current states of thinking. What remains unclear to me is why there is subjective experience at all. As it's been pointed out, that subjective experience does not seem "necessary." Biologically, the system would seem to work as well if we were p-zombies (except maybe it wouldn't exist, since no one could "observe" it). Dennett says the functions of consciousness are consciousness. But how? In other words, what makes consciousness emerge? Is it a property of dynamic information complexity? A result of the physical biological systems? At what threshold does subjectivity emerge? It seems to me that rather than take these questions seriously, Dennett just says the questions don't exist.
We don't have a large conceptual problem with explaining how the properties of water molecules lead to the emergent behavior of water. Similarly, we don't have a big conceptual problem with how the biological function of the brain leads to the emergent behavior of complex organisms. But we still have no model for how the emergent property of consciousness comes about.

One thing I find frustrating is that often, in discussing this with people who favor Dennett's arguments, they will treat your belief in the existence of the hard problem as akin to believing in leprechauns. Or assume you are a dualist. I think it is very likely that consciousness can eventually be explained based on the physical properties of the brain, or by information theory or something. I just think we're not there, and frankly, not even close.
 
As it's been pointed out, that subjective experience does not seem "necessary." Biologically, the system would seem to work as well if we were p-zombies (except maybe it wouldn't exist, since no one could "observe" it)

From these observations, it follows logically that we are, in fact, p-zombies. Evolution has selected us purely on external function, and couldn't care less about subjective feelings that have no effect.
 
I've never fully grasped what Daniel Dennett is saying. He claims qualia don't exist. But what are qualia defined as exactly? As far as I can tell, qualia are a nebulous, difficult-to-define sense of what-it's-like-ness. As some philosopher says, examine consciousness and it all dissolves into processes.
Right. Physical processes.

But still, there is an "inner light", a subjective state of experience. From what I can tell, Dennett seems to think this subjective experience doesn't exist.
No, not at all. Qualia are defined as what is left over once you've excluded all that is explained by physical processes. Dennett is pointing out that there's no reason to believe that anything is left over.

I have no problem with the notion that the objective functions of mental processing can be basically explained with our current states of thinking. What remains unclear to me is why there is subjective experience at all. As it's been pointed out, that subjective experience does not seem "necessary." Biologically, the system would seem to work as well if we were p-zombies (except maybe it wouldn't exist, since no one could "observe" it).
No, you can't build a P-Zombie. It's a physical impossibility; you can't get human-complexity behaviour without the internal reflection of consciousness.

Dennett says the functions of consciousness are consciousness. But how? In other words, what makes consciousness emerge? Is it a property of dynamic information complexity? A result of the physical biological systems? At what threshold does subjectivity emerge? It seems to me that rather than take these questions seriously, Dennett just says the questions don't exist.
No, he doesn't say that. Are you reading Dennett, or reading what others are saying about Dennett? If the latter, they're wrong.

We don't have a large conceptual problem with explaining how the properties of water molecules lead to the emergent behavior of water. Similarly, we don't have a big conceptual problem with how the biological function of the brain leads to the emergent behavior of complex organisms. But we still have no model for how the emergent property of consciousness comes about.
Yes, yes we do. Read Dennett. That's what he talks about. Or Hofstadter.

One thing I find frustrating is that often, in discussing this with people who favor Dennett's arguments, they will treat your belief in the existence of the hard problem as akin to believing in leprechauns.
No, not at all. The so-called "hard problem" is far less believable than leprechauns.

Or assume you are a dualist.
"Hard problem" consciousness is intrinsically a dualistic concept. It is utterly incoherent under physicalism; it's a denial of physicalism. And, of course, baseless.

I think that it is very likely that consciousness can eventually be explained based on the physical properties of the brain, or by information theory or something. I just think we're not there, and frankly, not even close.
But you say that without apparently being aware of where we are.
 
Is a conscious toaster alive?
Not unless you have an odd definition of "alive".

You're being facetious, but Pixy actually believes toasters are conscious (the higher end models, I think).
Sure, at least potentially. Washing machines are probably a better example, and many models of modern cars are certainly conscious.

Why do you think there is any conflict here?
 
I have no problem with the notion that the objective functions of mental processing can be basically explained with our current states of thinking. What remains unclear to me is why there is subjective experience at all. As it's been pointed out, that subjective experience does not seem "necessary."


Welcome to the forums.

Perhaps the problem is with the way we think about subjective experience, with what we think it is?

We used to think rainbows were something very different from what they are, but rainbows still exist. Subjective experience will still exist despite a new view of what it might be, but we need to think about it differently if we are going to move ahead with this problem.
 
Absolutely. No-one here is denying that subjective experience happens. We're just pointing out that it has no magical properties, and hence, requires no magical explanation.
 
But I'm sure nobody considers a toaster alive, so you're just as facetious.

LOL, before I got here, I thought nobody would consider a toaster conscious. If you stick around, you'll meet people who won't even admit to being conscious.

Besides, my question has some validity. If consciousness is tied to life, then why isn't my toe conscious?

Since a cell performs Self-Referential Information Processing™, and your toe is made of cells, then your toe is....

See the absurdity?

As far as all the ingredients for life are concerned, all cells in our body have an awful lot in common.

Right, they're not conscious.
 
1. No, you can't build a P-Zombie. It's a physical impossibility; you can't get human-complexity behaviour without the internal reflection of consciousness.



2. No, he doesn't say that. Are you reading Dennett, or reading what others are saying about Dennett? If the latter, they're wrong.

3. Yes, yes we do. Read Dennett. That's what he talks about. Or Hofstadter.

4. No, not at all. The so-called "hard problem" is far less believable than leprechauns.

"Hard problem" consciousness is intrinsically a dualistic concept. It is utterly incoherent under physicalism; it's a denial of physicalism. And, of course, baseless.

But you say that without apparently being aware of where we are.


1. Why?

2. Both. Perhaps I'm misinterpreting him.

3. I've also read Hofstadter. I loved the "Strange Loop" book. But I still don't think he really addresses the actual nature of subjective experience. He talks about why we are so attached to the concept, and his argument makes some sense.

4. Why is it a denial of physicalism? It seems you are reading more into the problem than I am.
 
I've never fully grasped what Daniel Dennett is saying. He claims qualia don't exist. But what are qualia defined as exactly? As far as I can tell, qualia are a nebulous, difficult-to-define sense of what-it's-like-ness. As some philosopher says, examine consciousness and it all dissolves into processes. But still, there is an "inner light", a subjective state of experience. From what I can tell, Dennett seems to think this subjective experience doesn't exist. This I don't understand.
It may not be objectively measurable. OK. But to some extent, you're pulling out the foundation of all observational inference. Without a Cogito ergo sum, or just Cogito for you super skeptics, the house of reality modeling comes crumbling down.

Computationalists hate qualia so therefore qualia don't exist. You can actually punch one in the face if you ever meet one because they don't have subjective experiences (i.e., pain).

I have no problem with the notion that the objective functions of mental processing can be basically explained with our current states of thinking. What remains unclear to me is why there is subjective experience at all. As it's been pointed out, that subjective experience does not seem "necessary." Biologically, the system would seem to work as well if we were p-zombies (except maybe it wouldn't exist, since no one could "observe" it). Dennett says the functions of consciousness are consciousness. But how? In other words, what makes consciousness emerge? Is it a property of dynamic information complexity? A result of the physical biological systems? At what threshold does subjectivity emerge? It seems to me that rather than take these questions seriously, Dennett just says the questions don't exist.
We don't have a large conceptual problem with explaining how the properties of water molecules lead to the emergent behavior of water. Similarly, we don't have a big conceptual problem with how the biological function of the brain leads to the emergent behavior of complex organisms. But we still have no model for how the emergent property of consciousness comes about.

One thing I find frustrating is that often, in discussing this with people who favor Dennett's arguments, they will treat your belief in the existence of the hard problem as akin to believing in leprechauns. Or assume you are a dualist. I think it is very likely that consciousness can eventually be explained based on the physical properties of the brain, or by information theory or something. I just think we're not there, and frankly, not even close.

The ones here are like Christian fundies. They even have a Bible of sorts, that they'll tell you you must read. It's very funny.
 
Right, they're not conscious.

Well, the cells in our brain form a consciousness, which makes me think that their property of being alive has little to do with that (except for the fact that metabolism is needed for the energy this process requires).
 
Is a conscious toaster alive?

PixyMisa said:
Not unless you have an odd definition of "alive".

So a toaster is a not-alive conscious thing in the same respect that a cadaver is a not-alive conscious thing: Neither of them are conscious.


You're being facetious, but Pixy actually believes toasters are conscious (the higher end models, I think).

PixyMisa said:
Sure, at least potentially. Washing machines are probably a better example, and many models of modern cars are certainly conscious.

Why do you think there is any conflict here?

And there you have it, folks. You can't make this stuff up.
 
A P-Zombie is defined as something that behaves exactly like a conscious human without having any internal self-awareness. That means that if you ask it how it feels, what it is thinking about, and so on, it will answer appropriately, just like a conscious human.

In order to implement that behaviour without the reflective loop that is the basis of consciousness, you need a simpler mechanism like a static lookup table or decision tree. However, a static representation of all possible human behaviours would be so large as to make the term "astronomical" feel inadequate.
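To get a feel for the scale being claimed here, a quick back-of-envelope sketch helps. All the numbers below are invented purely for illustration (vocabulary size, utterance length, and conversation depth are assumptions, not measurements):

```python
# Rough sketch of why a static lookup table over conversational
# histories explodes combinatorially. Numbers are illustrative only.

VOCABULARY = 10_000        # assumed distinct words a speaker might use
WORDS_PER_UTTERANCE = 10   # assumed length of a single utterance
TURNS = 5                  # assumed conversation depth the table must cover

# One utterance has VOCABULARY ** WORDS_PER_UTTERANCE possible forms;
# a history of TURNS utterances raises that to the TURNS-th power.
utterances = VOCABULARY ** WORDS_PER_UTTERANCE   # 10^40 possible utterances
histories = utterances ** TURNS                  # 10^200 possible histories

print(f"table entries needed: 10^{len(str(histories)) - 1}")
```

Even with these toy numbers, the table needs around 10^200 entries, against roughly 10^80 atoms in the observable universe, which is the sense in which "astronomical" feels inadequate.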

3. I've also read Hofstadter. I loved the "Strange Loop" book. But I still don't think he really addresses the actual nature of subjective experience.
When you say the nature of subjective experience, what do you mean? Gödel, Escher, Bach and I Am a Strange Loop are all about how subjective experience arises from physical systems.

Subjective experience is the strange loop of the title. That's it. That's the point. All it is, is a computer (an information processing system) examining its own operation (self-reference).

4. Why is it a denial of physicalism? It seems you are reading more into the problem than I am.
David Chalmers invented the term, and he is quite explicit that the so-called "hard problem" is what is left over when the physical is accounted for:

Wikipedia said:
In considerations by David Chalmers, this is contrasted with the "easy problems" of explaining the ability to discriminate, integrate information, report mental states, focus attention, etc. Easy problems are easy because all that is required for their solution is to specify a mechanism that can perform the function. That is, their proposed solutions, regardless of how complex or poorly understood they may be, can be entirely consistent with the modern materialistic conception of natural phenomena. Chalmers claims that the problem of experience is distinct from this set, and he assumes that the problem of experience will "persist even when the performance of all the relevant functions is explained".
Dennett points out that Chalmers' claim has no basis in evidence.
 
1. A P-Zombie is defined as something that behaves exactly like a conscious human without having any internal self-awareness. That means that if you ask it how it feels, what it is thinking about, and so on, it will answer appropriately, just like a conscious human.

In order to implement that behaviour without the reflective loop that is the basis of consciousness, you need a simpler mechanism like a static lookup table or decision tree. However, a static representation of all possible human behaviours would be so large as to make the term "astronomical" feel inadequate.


2. When you say the nature of subjective experience, what do you mean? Gödel, Escher, Bach and I Am a Strange Loop are all about how subjective experience arises from physical systems.

Subjective experience is the strange loop of the title. That's it. That's the point. All it is, is a computer (an information processing system) examining its own operation (self-reference).


3. David Chalmers invented the term, and he is quite explicit that the so-called "hard problem" is what is left over when the physical is accounted for:


Dennett points out that Chalmers' claim has no basis in evidence.

1. Could you explain what you mean by "reflective loop" a little more precisely?

2. While I enjoyed the book, Hofstadter lost me a bit with all the metaphors. (I still don't exactly see what the Incompleteness Theorem has to do with anything.) If you understood what he meant, and could express it a bit more concisely, that would be greatly appreciated.

3. Fair enough. I don't exactly agree with Chalmers there. It might be a case of sloppiness of language. I am using "hard problem" in the more colloquial sense of a problem with how subjective experience arises. I guess you could call me a hard problem agnostic, in the sense that I just don't know. I do treat the problem as a real one, rather than a metaphysical or semantic one.
 
I guess you could call me a hard problem agnostic, in the sense that I just don't know. I do treat the problem as a real one, rather than a metaphysical or semantic one.

I agree it's a real problem, but not in the sense that Chalmers claims it is. The real problem is realizing that his p-zombie has the same subjective experiences as we do. This is hard, because we're not used to thinking that way.

It's like dealing with the fact that the earth revolves around the sun, when it's so obvious that the sun is going around the earth.

ETA: In addition to books by Dennett and Hofstadter, I recommend reading The Man Who Mistook His Wife for a Hat by Oliver Sacks. Reading about what happens when a brain doesn't work right can be very enlightening. There are also some YouTube videos about split-brain patients, who appear to have a separate consciousness in each half of their brain.
 
1. Could you explain what you mean by "reflective loop" a little more precisely?
It's a computer programming term: http://en.wikipedia.org/wiki/Reflection_(computer_programming)

Basically, it's the situation where a program is examining and reporting on and/or modifying its own behaviour. This is common in modern computer applications, including embedded systems like washing machines and cars (which fact has sent Malerin into a tizzy, but that's not unusual).

That's what consciousness is.
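As a toy illustration of reflection in this sense, here is a sketch in Python, whose introspection facilities make the idea concrete. The `wash_cycle` function is a made-up stand-in, not real washing-machine firmware:

```python
import inspect

def wash_cycle(temperature: int, spin_rpm: int) -> str:
    """A made-up stand-in for an embedded controller routine."""
    return f"washing at {temperature}C, spinning at {spin_rpm} rpm"

# Reflection: the running program examines and reports on its own
# structure, not just the data it was handed.
sig = inspect.signature(wash_cycle)
report = {
    "name": wash_cycle.__name__,
    "parameters": list(sig.parameters),
    "doc": inspect.getdoc(wash_cycle),
}
print(report["name"])        # -> wash_cycle
print(report["parameters"])  # -> ['temperature', 'spin_rpm']
```

A program could go further and change its own behaviour based on what it finds, which is the "modifying" half of the definition above; this sketch only does the examining-and-reporting half.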

2. While I enjoyed the book, Hofstadter lost me a bit with all the metaphors. (I still don't exactly see what the Incompleteness Theorem has to do with anything.) If you understood what he meant, and could express it a bit more concisely, that would be greatly appreciated.
Well, they're not metaphors, exactly. They're different perspectives on the same thing. Gödel's Incompleteness Theorems (there are two related proofs) turn mathematics in on itself, self-referentially.

Self-reference is the key insight. Like hexapodia.
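Perhaps the smallest concrete taste of self-reference in code is a quine, a program whose output is its own source. This sketch is mine, not something from Hofstadter's books; the string `s` is both data the program consumes and a description of the program itself:

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Run it and the output is, character for character, the two lines above. Gödel's construction pulls a structurally similar trick inside arithmetic: a formula that, via an encoding, talks about itself.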

3. Fair enough. I don't exactly agree with Chalmers there. It might be a case of sloppiness of language. I am using "hard problem" in the more colloquial sense of a problem with how subjective experience arises. I guess you could call me a hard problem agnostic, in the sense that I just don't know. I do treat the problem as a real one, rather than a metaphysical or semantic one.
Okay. Well, this is essentially what Dennett and Hofstadter are addressing: Chalmers (and others like him, such as John Searle and Frank Jackson) assert that there is an unbridgeable divide between the physical and subjective experience.

Hofstadter and Dennett respond: No, here's how the divide could be bridged; and what's more, that appears to be how it is bridged.
 
It's a computer programming term: http://en.wikipedia.org/wiki/Reflection_(computer_programming)

Basically, it's the situation where a program is examining and reporting on and/or modifying its own behaviour. This is common in modern computer applications, including embedded systems like washing machines and cars (which fact has sent Malerin into a tizzy, but that's not unusual).

That's what consciousness is.

I still don't get why this necessarily leads to subjective experience.
 