
Has consciousness been fully explained?

Status
Not open for further replies.
No, it's not the pattern of neural activation. It's your feeling of mild amusement, which doesn't show up on the ECG.
The ECG can only measure the gross external, objective correlates of my experiences. The map is not the territory.

The neural activation and the feeling of mild amusement are clearly linked, but equally clearly they aren't the same thing.
They are the same thing from the inside, when I am that brain.
 
Yes, I understand what the Hard Problem of Consciousness is, and yes, I've read Chalmers' articles.

I think Chalmers provides a reasonably comprehensive overview of the issues and approaches - although he doesn't cover one that, for a while, I thought might be interesting - Emergence:

[Jeffrey Goldstein on emergence:
"the arising of novel and coherent structures, patterns and properties during the process of self-organization in complex systems"

"The common characteristics are:
(1) radical novelty (features not previously observed in systems);
(2) coherence or correlation (meaning integrated wholes that maintain themselves over some period of time);
(3) A global or macro "level" (i.e. there is some property of "wholeness");
(4) it is the product of a dynamical process (it evolves); and
(5) it is "ostensive" (it can be perceived)."

See Emergence & Creativity for an interesting read.
]

However, I'm not quite so enthusiastic about the emergence hypothesis these days, for various reasons, although I think emergence may be useful conceptually in bridging the subjective-objective gap.

Basically, Chalmers hits the nail on the head quite early on, when he says:

"The hard problem is about explaining the view from the first-person perspective."
and:
"..whatever account of processing we give, the vital step - the step where we move from facts about structure and function to facts about experience - will always be an extra step, requiring some substantial principle to bridge the gap."
and:
"... any neurobiological or cognitive account, will be incomplete"

This seems key. What 'substantial principle' can bridge that gap? Only some arbitrary and axiomatic contingency. The problem is one of trying to find an objective explanation for something that requires a subjective explanation. AFAICS, ultimately a completely satisfying objective explanation isn't possible, because there will always be that ugly join.

All the other arguments and approaches to an objective description of consciousness founder on the same rock - the metaphysical chasm between objective & subjective. It's all speculative hand-waving that goes nowhere.

I particularly discount quantum explanations as based on misunderstanding or misapplication of QM. I also discount pan-psychism as a futile attempt to hide or ignore the problem by making it ubiquitous.

Chalmers says "if it turns out that it cannot be explained in terms of more basic entities, then it must be taken as irreducible, just as happens with such categories as space and time", which seems as much a cop-out as the Type A Materialism (nothing to explain) he dislikes so much. He also criticises the Type-B materialism of Clark and Hardcastle, who postulate an empirical identity between conscious experiences and physical processes, because it makes that identity fundamental (irreducible) - perhaps too much like his own conclusion about irreducibility (above)...?

Again, he talks of examining physical process and phenomenology to "find systematic regularities between the two; work down to the simpler principles which explain these regularities in turn; and ultimately explain the connection in terms of a simple set of fundamental laws". Quite how these "underlying brutely contingent fundamental laws", that define contingent relations between the principles of function & process and of consciousness, are substantially different from the Clark and Hardcastle approach, he doesn't clarify. Nevertheless, despite some apparent contradictions, I think he does a good job overall.

If anything, I prefer a version of original Identity Theory (a pre-Type B Materialist approach) which proposes that the sensation of a thing is the sort of state caused by that thing (with the caveat that this occurs in a complex system structured like the brain).

But ultimately, there's no way to objectively explain the subjective position/experience of being the complex system under consideration. Experience and conscious awareness is what it is like to be an active brain, of sufficient complexity, in the awake state. That's why my starting point is the waves of activation that sweep across the brain when our conscious focus of awareness changes. When you are that brain, those patterns of activation are your experiences.

It seems to me that the best we can do is to examine the function of the brain to narrow down the subsystems necessary for consciousness, and to investigate how these subsystems contribute to consciousness. Once we have a better understanding of how the system is put together, we may get a better understanding of why consciousness is the result.


Yes, right. That is why I kept harping on the issue of descriptions and actions in the other thread with Cornsail.

Objective descriptions will always remain as descriptions. They will never be the thing they describe. When it comes to consciousness we are dealing with a very complex set of actions that *are* us. We, our minds, are that complex pattern of neuron activations. The best any neuroscience will be able to do is describe how those actions occur; but those descriptions will never be the actions themselves. But I don't see that as a problem since that is the case with all descriptions and actions. The big difference with subjectivity is that it is a type of action that constitutes experiencing, which is frankly a different sort of action than a horse running or a bird flying.
 
The ECG can only measure the gross external, objective correlates of my experiences. The map is not the territory.


They are the same thing from the inside, when I am that brain.


I didn't mention this earlier because I didn't think anyone would reply to Westprog, but you guys have the wrong acronym.

ECG = Electrocardiogram
EEG = Electroencephalogram

You want EEG. And EEG is a very gross measure of large areas of neuronal activity, working by comparing potentials from one area against either a common reference or potentials from another brain area. As you mention, it shouldn't even be a part of this discussion*.

*Except historically since Berger, who invented EEG, wanted it to be a graphical representation of thought. Turns out it is much better as a gross way of measuring whole-brain functioning, a very gross tool for localizing focal brain dysfunction, and a diagnostic tool for epilepsy.
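The referencing described above (each channel compared against a common reference, or against a neighbouring channel) can be sketched in a few lines. This is only an illustration: the channel names and voltage values below are made up, not real data.

```python
import numpy as np

# Hypothetical raw scalp potentials (channels x samples), in microvolts.
raw = np.array([
    [10.0, 12.0, 11.0],   # channel "Fz" (hypothetical)
    [ 8.0,  9.0,  7.0],   # channel "Cz" (hypothetical)
    [ 5.0,  4.0,  6.0],   # channel "Pz" (hypothetical)
])

# Common average reference: each channel measured against the mean
# of all channels, so the referenced channels sum to zero per sample.
car = raw - raw.mean(axis=0, keepdims=True)

# Bipolar montage: potentials from one area compared directly
# against potentials from an adjacent area.
bipolar = raw[:-1] - raw[1:]

print(car)
print(bipolar)
```

Either way, the output is a difference of potentials between large scalp regions, which is why EEG is such a gross measure of underlying neuronal activity.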
 
I think that what you've just posted is the equivalent of trying to explain the chemical properties of water in terms of plumbing. I don't think you understand what it is I'm even asking :-/
So melodramatic. I believe I do understand what you're trying to ask, it's just that the actual questions you ask don't address the point - as my literal answers demonstrate.

I honestly cannot fathom how that lack of understanding doesn't spark even a little itch of curiosity in you...
Really? Your incredulity is amusing as ever. So remind me again where I said I wasn't curious.

I'm asking how the components of consciousness in-and-of-itself [what are referred to as 'qualia'] are related to the -physics- of brain activity and -physics- in general.
Why do you think the -physics- of brain activity is more relevant than the functional architecture and organization of the brain? And what makes you think that 'qualia' are related to physics in general?

This is a huge scientific frontier -- a gaping hole in our understanding of ourselves and reality -- and you're writing it off as insignificant????
Are you sure? I don't recall saying that. Perhaps you didn't read carefully enough. This is what I said:

I don't think we have sufficient understanding of how the organisation of the brain gives rise to the components of consciousness, awareness and perception, and I do think it's worth bothering to gain a deeper grasp of it.
(emphasis added)

Not sure where lack of curiosity and 'writing it off as insignificant' can be read into that...

I'm genuinely flabbergasted...
I can't say I'm surprised.

But enough about me - what, if anything, do you have to say about the physics of 'qualia'?
 
I didn't mention this earlier because I didn't think anyone would reply to Westprog, but you guys have the wrong acronym.

ECG = Electrocardiogram
EEG = Electroencephalogram.
Oops - I didn't even notice, and me with a degree in human biology and all...:o
 
I think that what you've just posted is the equivalent of trying to explain the chemical properties of water in terms of plumbing. I don't think you understand what it is I'm even asking :-/
So melodramatic. I believe I do understand what you're trying to ask, it's just that the actual questions you ask don't address the point - as my literal answers demonstrate.

No, no, no... Your answers aren't addressing the point of my questions, leading me to conclude that you don't understand the questions :p

I honestly cannot fathom how that lack of understanding doesn't spark even a little itch of curiosity in you...
Really? Your incredulity is amusing as ever. So remind me again where I said I wasn't curious.

My bad. I read that "do think" of yours as a "don't think". I need to get muh eyes checked x-P

I'm asking how the components of consciousness in-and-of-itself [what are referred to as 'qualia'] are related to the -physics- of brain activity and -physics- in general.

Why do you think the -physics- of brain activity is more relevant than the functional architecture and organization of the brain? And what makes you think that 'qualia' are related to physics in general?

For the same reasons why I'd say that the underlying physics of plant metabolism are more relevant to understanding photosynthesis than canopy structure. The computational architecture of the brain is only relevant to how our experiences are organized; what we currently lack is a physical understanding of the experiences in themselves. Subjective experiences (and their dynamics) are inextricably tied into the cognition, general behavior, and physiological functions of an individual. They are concrete phenomena [i.e. not merely abstractions] with physical consequences and we need to understand, in physical terms, how they relate to and interact with physical phenomena that are modeled in our current theories.

As I've already mentioned numerous times, 'qualia' is just a term for what our conscious experiences IAOT reduce to [for instance, various qualia, such as colors, sounds, tastes, emotions, etc. are woven together to form a single experience in the awareness of an individual]. I'm curious as to how such things relate to what we know of physics and the most direct way, IMO, is to find the ontological relationship between qualia [the basic elements of subjective experience] and quanta [the fundamental units of physical objects]. If we can do that we'll not only have a suitable answer for the HPC but also a theoretical framework that would allow us to design conscious systems and have some understanding of what those systems may be experiencing or are capable of experiencing.

This myopic obsession with mapping the functional architecture of the brain does little to nothing in terms of giving us the physical understanding of consciousness required to scientifically propose how to reproduce it synthetically. Do you understand where I'm coming from on this?
 
I would suggest that once we understand all the processing underlying mirror neurons we will be able to formalize those issues better.
Yes, mirror neurons indicate recognition of others like self, on which Theory Of Mind presumably rests, but for the subtleties Frank is after, we need some sense of group membership and cooperative benefit. In animals this presumably evolved from the family group, where dependent young need parental support/protection. Extending this to larger groups, basic 'moral' behaviours such as reciprocal altruism, etc. (as well as 'immoral' behaviours such as 'free-riding'), seem to appear quite naturally. Trust, sincerity, and truthfulness begin to become useful concepts where the capability for deliberate deception and bluff has developed (this seems to be roughly where certain corvids are at).

I guess in an artificial consciousness, drivers for those basic behaviours would need to be provided.
 
This myopic obsession with mapping the functional architecture of the brain does little to nothing in terms of giving us the physical understanding of consciousness required to scientifically propose how to reproduce it synthetically. Do you understand where I'm coming from on this?

If we understand the functional architecture of the brain, we have everything we need to reproduce it synthetically (assuming sufficiently advanced technology).

Compare it to producing a faithful replica of a beautiful painting. If done well enough, the replica will also be beautiful, without ever requiring an objective understanding of "beauty".
 
If we understand the functional architecture of the brain, we have everything we need to reproduce it synthetically (assuming sufficiently advanced technology).

Compare it to producing a faithful replica of a beautiful painting. If done well enough, the replica will also be beautiful, without ever requiring an objective understanding of "beauty".

Building a functional replica of a circulatory system does not produce blood. The assumption that merely emulating the computational architecture of the brain will produce consciousness is a form of cargo cult thinking. We absolutely must understand the -physics- of what the brain is doing with regard to consciousness before we can ever seriously propose instantiating it artificially. This is a critical scientific hurdle that must be overcome and I don't think a lot of individuals in these discussions are prepared to even acknowledge it :-/
 
No, no, no... Your answers aren't addressing the point of my questions, leading me to conclude that you don't understand the questions :p
I answered the questions you asked. Perhaps you're asking the wrong questions for your point.

For the same reasons why I'd say that the underlying physics of plant metabolism are more relevant to understanding photosynthesis than canopy structure.
But the so-called Hard Problem is essentially different from that. If you find that a relevant or useful analogy, I don't think you understand the Hard Problem.

... Subjective experiences (and their dynamics) are inextricably tied into the cognition, general behavior, and physiological functions of an individual. They are concrete phenomena [i.e. not merely abstractions] with physical consequences and we need to understand, in physical terms, how they relate to and interact with physical phenomena that are modeled in our current theories.
Well good luck with that. Incidentally, how do you suggest that might be achieved? After all, if subjective experiences are concrete phenomena, shouldn't they be objectively observable and measurable...?

... I'm curious as to how such things relate to what we know of physics and the most direct way, IMO, is to find the ontological relationship between qualia [the basic elements of subjective experience] and quanta [the fundamental units of physical objects].
Good luck with that too. What's your plan of attack?

This myopic obsession with mapping the functional architecture of the brain does little to nothing in terms of giving us the physical understanding of consciousness required to scientifically propose how to reproduce it synthetically. Do you understand where I'm coming from on this?

Yes, I understand where you're coming from, and I think you're wrong - but I'm quite prepared to change my mind if you can describe and explain a plausible plan of attack to gain a physical understanding of consciousness that ignores the functional architecture of the brain.
 
And while you're at it, please explain why some qualia appear to be reducible to more than one process if they are supposed to be the quanta of mental life.
 
The assumption that merely emulating the computational architecture of the brain will produce consciousness is a form of cargo cult thinking.

If we have a replica of the brain, hook it up to all the proper input and output interfaces, then we can compare the functionality with a human brain. We have 3 possible outcomes:

  1. A detectable difference in behavior. In this case, there must be a flaw in the emulation that can be fixed by zooming in on the exact difference.
  2. A p-zombie, which you stated earlier don't exist.
  3. A replica of human consciousness. We're done.

Or do you propose a 4th alternative ?
 
Building a functional replica of a circulatory system does not produce blood.
No-one said it would. These analogies are straw men.

The assumption that merely emulating the computational architecture of the brain will produce consciousness is a form of cargo cult thinking.
Another straw man analogy. Cargo cult thinking involves imitating the appearance and ritual of the technology without the function. If the cults had copied the airstrips, aircraft, and radios using functional components, things would have been different. We're talking about copying the functional components, structure, and architecture of the brain, not making something that looks like a brain.

We absolutely must understand the -physics- of what the brain is doing with regard to consciousness before we can ever seriously propose instantiating it artificially.
Nature generates and assembles 100 billion neurons and connects them together to give a conscious brain without understanding anything at all...
 
No, no, no... Your answers aren't addressing the point of my questions, leading me to conclude that you don't understand the questions :p
I answered the questions you asked. Perhaps you're asking the wrong questions for your point.

I specifically asked you questions regarding the physical properties that underlie subjective experiences and, instead of simply saying "I don't know", you proceeded to give me functional answers. The fact is, no one knows the answers to the questions I just asked, and if anyone does, they aren't sharing it with the rest of humanity.

For the same reasons why I'd say that the underlying physics of plant metabolism are more relevant to understanding photosynthesis than canopy structure.
But the so-called Hard Problem is essentially different from that. If you find that a relevant or useful analogy, I don't think you understand the Hard Problem.

The analogy is relevant insofar as it demonstrates where the understanding of a thing is not reducible merely to issues of structure and computational functionality.

... Subjective experiences (and their dynamics) are inextricably tied into the cognition, general behavior, and physiological functions of an individual. They are concrete phenomena [i.e. not merely abstractions] with physical consequences and we need to understand, in physical terms, how they relate to and interact with physical phenomena that are modeled in our current theories.
Well good luck with that. Incidentally, how do you suggest that might be achieved? After all, if subjective experiences are concrete phenomena, shouldn't they be objectively observable and measurable...?

Maybe, maybe not. What I AM proposing is that there should be a way to understand what the necessary physical conditions of consciousness are, the sufficient physical indicators of conscious experience(s), and to have a theoretical structure that ties these understandings together in a meaningful way with the rest of physics. Incidentally, I think the only -direct- way to 'observe' observation [short of a direct telepathic connection to another conscious individual] is via introspecting one's own consciousness.

In any case, the only way such a thing can be achieved is to consider the issue of consciousness in terms of biophysics and physics in general -- not just information processing. If anything, the question of consciousness extends into the realm of metaphysics as well.

... I'm curious as to how such things relate to what we know of physics and the most direct way, IMO, is to find the ontological relationship between qualia [the basic elements of subjective experience] and quanta [the fundamental units of physical objects].
Good luck with that too. What's your plan of attack?

Like I said, examining the relevant data in terms of the physics of what's going on in the nervous system of conscious individuals, introspection and meditation on what one's own consciousness is doing from 'the inside', and brainstorming on ways to integrate that with current physical theories. Can I give a detailed map of how to get there? Of course not, otherwise we'd already have the answers we need. But I CAN say one thing for certain: we're not going to get the understanding required by simply equating consciousness with computation and then calling it a day.

This myopic obsession with mapping the functional architecture of the brain does little to nothing in terms of giving us the physical understanding of consciousness required to scientifically propose how to reproduce it synthetically. Do you understand where I'm coming from on this?

Yes, I understand where you're coming from, and I think you're wrong - but I'm quite prepared to change my mind if you can describe and explain a plausible plan of attack to gain a physical understanding of consciousness that ignores the functional architecture of the brain.

I specifically stated that the computational architecture of the brain is only relevant insofar as understanding how our experiences are organized. I never said that it was completely irrelevant. My point is that, by itself, it's a dead end.
 
If we have a replica of the brain, hook it up to all the proper input and output interfaces, then we can compare the functionality with a human brain. We have 3 possible outcomes:

  1. A detectable difference in behavior. In this case, there must be a flaw in the emulation that can be fixed by zooming in on the exact difference.
  2. A p-zombie, which you stated earlier don't exist.
  3. A replica of human consciousness. We're done.

Or do you propose a 4th alternative ?

A 'replica' of human consciousness is not a structural replica of the human brain. Consciousness is a phenomenon tied in with the activity of our organic brains -- and possibly other biological systems. We need a physical understanding of the phenomenon in and of itself, not just the structural plumbing of the nervous system.
 
Pixy, from what I recall of the conversations over the years, only claims SRIP as conscious given a particular definition of consciousness. He doesn't claim that this would be anything like what a human experiences. It may not even meet the criteria for what most people would consider conscious. His ultimate point was that if you don't think SRIP is conscious, then provide a definition that we can work with.

How about "consciousness is the workings of neurons in a human brain". It's more accurate than Pixy's definition, and more useful. It's not accurate or useful enough, but it shows how easy improving on SRIP is.
 
Building a functional replica of a circulatory system does not produce blood.
No-one said it would. These analogies are straw men.

No, these analogies are metaphors to help convey a conceptual problem that you're consistently side-stepping.

The assumption that merely emulating the computational architecture of the brain will produce consciousness is a form of cargo cult thinking.
Another straw man analogy. Cargo cult thinking involves imitating the appearance and ritual of the technology without the function. If the cults had copied the airstrips, aircraft, and radios using functional components, things would have been different. We're talking about copying the functional components, structure, and architecture of the brain, not making something that looks like a brain.

Such emulations ARE merely imitating appearance if they do not incorporate the relevant physical processes. We do not know what it is about the physics of organic brains that produces conscious experience, yet we have people assuming that by merely imitating the computational features of the brain we can reproduce consciousness. This is just blatant cargo cult thinking in another guise.

We absolutely must understand the -physics- of what the brain is doing with regard to consciousness before we can ever seriously propose instantiating it artificially.
Nature generates and assembles 100 billion neurons and connects them together to give a conscious brain without understanding anything at all...

Nature IS the physics of those neurons and what they are doing. You must understand that physics in order to artificially reproduce the same physical results -- which includes conscious experiences. This is not a difficult concept to grasp; why are you going to such lengths to resist acknowledging the obvious?
 
How about "consciousness is the workings of neurons in a human brain". It's more accurate than Pixy's definition, and more useful. It's not accurate or useful enough, but it shows how easy improving on SRIP is.


Unfortunately, unconsciousness (sleep and coma) is also produced by the workings of neurons in a human brain.

That really only provides a location. SRIP provides one type of action that is probably necessary for human consciousness; and it is the particular types of actions, the architecture, that we need to pin down. I think most of us agree on the location of it.
 