Has consciousness been fully explained?

Status
Not open for further replies.
We can build pumps because we understand the principle by which the heart pumps blood. We know that a mechanical pump and the heart are doing the same thing. But knowing what goes on in a neuron - even with absolute and total knowledge of it - does not mean that we understand how the neuron creates consciousness.

This is true, which is why I use toy mechanical models to outline the principles of brain function. Without understanding the principles, no blueprint is going to help.

I also think the preoccupation with the Church-Turing Thesis can be misleading. It's not invalid, just too narrow a window into the principles our brains take advantage of. Sort of like saying starting a fire is equivalent to doing a calculation, which is true in some sense.

The salient features of interesting brain function are emergent characteristics. Thus a linear calculation going from A to B will not suffice. Yet identifying these characteristics as emergent does not mean we can't understand the principles from the underlying parts alone. Temperature is a particularly simple emergent phenomenon, yet it can be fully understood in terms of the parts alone. Ant colony intelligence is another comprehensible emergent quality: in some sense the colony is smarter than any individual ant, and in fact the majority of an ant's computational intelligence is emergent only at the colony level. Very interesting if you wish to study it. I used the emergence of self-syncing metronomes in the toy model of neurons and brain function earlier.
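The self-syncing metronome toy model can be made concrete numerically. Below is a minimal Kuramoto-style sketch of my own (all variable names and parameter values are made up for illustration): each oscillator only feels local phase pulls, yet global synchrony - an ensemble-level property no single oscillator has - emerges on its own.

```python
import math
import random

def kuramoto_step(phases, natural, coupling, dt=0.05):
    # Each oscillator drifts at its own natural frequency and is pulled
    # toward the rest of the ensemble via pairwise sine coupling.
    n = len(phases)
    return [phases[i] + dt * (natural[i]
            + coupling * sum(math.sin(phases[j] - phases[i]) for j in range(n)) / n)
            for i in range(n)]

def order_parameter(phases):
    # Coherence of the ensemble: 0 = incoherent, 1 = perfectly synced.
    # This quantity exists only at the group level, not for one oscillator.
    re = sum(math.cos(p) for p in phases) / len(phases)
    im = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(re, im)

random.seed(0)
n = 50
phases = [random.uniform(0, 2 * math.pi) for _ in range(n)]       # random start
natural = [1.0 + random.gauss(0, 0.05) for _ in range(n)]          # near-identical clocks

r_before = order_parameter(phases)
for _ in range(2000):
    phases = kuramoto_step(phases, natural, coupling=2.0)
r_after = order_parameter(phases)
```

Starting from random phases the coherence is low, and after a couple of thousand steps the population has pulled itself into near-perfect sync, illustrating how an orderly ensemble property emerges from simple local rules.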

In brain function a calculation is not a linear progression from A to B. The massive associative parallelism of the brain sifts through all the associative content of experiences and pays the most attention to the most persistent associations. However, we can divert our attention to narrower aspects of the associative data in an open-ended manner. Treating a particular association as a calculation, and modeling it alone as a brain model, misses all the other associative content inherent in the same brain process, which we could just as well have selected and discussed. The calculation model tends to treat each of these associations as a separate process, when from the perspective of actual brain function they are not.
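One way to see how a single physical substrate can hold many associations at once, rather than one calculation per process, is a toy Hopfield-style associative memory. This is a minimal sketch of my own (not a claim about real neurons): several patterns are superimposed on the same weight matrix, and a corrupted cue settles back into the nearest whole association.

```python
import random

def train(patterns):
    # Hebbian learning: every association is superimposed on the SAME weights.
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, sweeps=5):
    # Sequential threshold updates settle a noisy cue into a stored pattern.
    state = list(state)
    n = len(state)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

random.seed(1)
n = 30
# Three different "associations" stored in one and the same network.
patterns = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(3)]
w = train(patterns)

cue = list(patterns[0])
for i in range(0, n, 8):        # corrupt a few bits of the first pattern
    cue[i] = -cue[i]
recalled = recall(w, cue)
matches = sum(1 for a, b in zip(recalled, patterns[0]) if a == b)
```

The same weight matrix would complete a partial cue of any of the three patterns; which association gets expressed depends only on which fragment is presented, not on separate machinery per association.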

You might still assume that a given decision or association is a neural sequence between a sensory input and a response output. But that is dead wrong if the sensory inputs of many of the neurons are defined not by external sensory inputs but by internal global patterns of the more direct sensory inputs, leading to what we call "Theory of Mind". Thus lower animals have a primary mind, which learns through selection to navigate its environment. The higher animals develop a secondary mind, which learns to model and navigate the primary mind. The primary mind is then the environment the secondary mind lives in, learning to navigate it and provide meaning to its content.
 
In your response to me, even though I'm saying the same thing as PixyMisa, qualified against long-term consequences, you aren't telling me I'm wrong, but sort of waffling. When your response specified "during development", PixyMisa stated: "Which in no way contradicts what I just said".
I was lampooning Pixy’s posting style. Lampooning.
This effectively proves PixyMisa's statements and mine are essentially identical. Yet in narrowing down the distinctions with you, your agreement with me points to a wishy-washy yes and no. So what are the distinctions?
Dude, I disagreed with an absolute statement. I can have opinions that vary from yours. I have stated that sometimes perceptions fall under the label of consciousness.
Neither PixyMisa nor I am denying that sensory input is needed for the development of our sense of consciousness. Neither of us denies that when the stored data about past sensory input degrades, consciousness will degrade accordingly.
As you will see, since I responded to this post from the bottom up, I have said this many, many times. I said that what is often described under the consciousness label amounts to perceptions. I did not say that that was all there was under that label.
Now here's the kicker, everything required to maintain consciousness in the here and now is fully contained in the physical brain contained in the skull, without any sensory inputs or connections to the world external to the brain.
Look, buddy, I said that sometimes perceptions are conflated under the term consciousness. I did not say at any time that they were solely consciousness.
That consciousness would degrade over time, due to a lack of maintenance provided by these inputs, is immaterial to that fact. Degradation is merely a mechanistic consequence of the way our brain is constructed, not an absolute condition from loss of sensory inputs.
I agreed with that. I stated what I stated; you seem to have gone places from things I did not say.
What you appear to be saying here is that if I look toward my keys on the desk and don't see them, some of my consciousness is gone.
That isn't what I said at all. I said that some parts of what would be labeled consciousness would at times not be present. Do you have a problem with that?
You have really taken the straw ball and run down the field.
Taken from a 'toy' model perspective, this indicates that a tank of compressed air that doesn't have a gauge reading the pressure (sensory data) isn't compressed.
Are you having a bad day?
By conscious mind two things can be indicated. One is working memory, which is limited to 3 or 4 bits of information.
I sort of agree, but it depends on the definition of 'bits' there.
Even though those bits can be representations of a much larger set of data, which longer-term memory must be accessed to obtain information about. The second is the world model, stored in memory: the model used in your head to make sense of the sensory data and the world around you. In fact, illusions arise when your world model creates expectations that match neither reality nor what your senses are actually telling you.
I may know more about perceptions than you think. What is your issue? Are you mistaking me for someone else?
You then see your world model instead of what your senses actually see, and call this memory contained in your brain external sensory data.
Excuse me, why are you lecturing me? Your statements are accurate to some degree and not to others.
This has nothing to do with what I actually said.
This world model was modeled from sensory data of the past, but does not disappear just because ALL sensory data from the external world disappears.
I never said that it did. I said that parts of what are labeled as consciousness are perceptions.
And so long as it persists, it can feed working memory bits to define your consciousness, even with a complete and total blackout of ALL sensory data external to the brain's grey matter itself. Consciousness IS fully contained in the grey matter of the brain, irrespective of what external data was involved in its development.

And that is why PixyMisa said:
This proves PixyMisa's statements were meant in the same sense mine were, with the only difference being the qualification of the precise sense that was.
I made a joke. That is what it was. I can agree with people partly and disagree with people partly.
It appears the breadth of empirical science you are making presumptions from is intensely limited. In fact we have a far better gauge than any "behavioral criteria" can ever dream of conceiving.
You are being rude. I made a certain statement; you have perceived me disagreeing with you in ways that I did not intend nor imply.
We can watch your thoughts and know which predefined choices you are going to make before you do.
Sort of, and sort of not. We cannot watch thoughts, though there are some interesting studies.
You are disagreeing with things I never said.

It's even making its way into the gaming market. Buy yourself a brain wave game controller for $99.99. Its primary limitation is signal quality issues without implanted sensors inside the brain. "Behavioral criteria" makes it sound like we're stuck in the 1940s.
You are being rude. How else will you define consciousness? You didn't ask before going off on your rant.
We haven't done brain transplants, but we have done head transplants.
I am sorry that my humor has triggered this response in you.
From a consciousness perspective the only role the body plays is to keep the brain alive and feed it sugar. We have sliced off pieces of rat brains, grown them on a substrate, and used them as a robot control mechanism connected through Bluetooth. We know exactly what to do to your brain so that everything is normal, except that you will not recognize your mother while looking at her - even though you'll agree it looks like her, and you'll recognize it's really her on the phone. I'll not even get into the terabytes of more detailed empirical data.
Keep at these straw arguments; you have over-interpreted what I said.
If you want to hang onto the notion that consciousness is some kind of whole body phenomena, base it on something more than 1940s style "behavioral criteria".

I did not say that it was ‘some kind of whole body phenomena’, I said that the perceptions are part of what is often conflated under the label of consciousness, and therefore much of what is labeled as consciousness comes under that category.

Wow, MyWan, I did not mean a flip statement to trigger this in you.

You are also very wrong about behavioral criteria. You are the one who is seemingly limited in perspective and understanding; I am knowledgeable about neuroscience, biological models of the brain, and modern psychology.
I made no statements about your level of knowledge or understanding, and you have gone off in some sort of snit.
I am sorry for whatever role I may have had in that.

There are no ways at all of describing consciousness that are not behavioral criteria.

So here is the deal, we have not defined consciousness in this thread, I have stated that I tend to rely upon the medical definition. Pixy made a statement that all things labeled as consciousness occur within the brain. I disagreed and stated so in a fashion meant to lampoon Pixy’s posting style.

I have stated repeatedly that sometimes under the label of the word ‘consciousness’ there are perceptive events and things that require sensory input. I did not state that they were the sole basis of what we call consciousness.

I even stated that we could re-embody a disembodied brain to ask if it remembered conscious states during that time. As I used the phrase 'self report', I then also stated that a disembodied brain would have a hard time meeting the common medical criteria for consciousness.

At many times, MyWan, I have appreciated your posts and your knowledge, especially in the areas of medical science. If I disagree with you and do not just say "Yes" to you, that is my prerogative.
 
I don't think so. It's really no different from saying that the heartbeat is not in the muscle cell. You can't see it at that level. The heartbeat is a pattern of contraction.

Each cell is doing exactly the same thing during a normal heartbeat as it is doing during fibrillation. The former keeps you alive, the latter kills you. But you cannot detect fibrillation at the cellular level. And if the heart were made of something different which had the same aggregate behavior, it would still keep us alive when contracting properly and kill us if allowed to fibrillate too long.

You misunderstand my comment. Suppose consciousness is something separate from the physical units that comprise the brain (i.e., neurons), and is best considered as a coordinated pattern of the activity/communication between those units. Then the pattern could occur in systems that we don't typically think of as conscious. Computers are one clear example of such a system, but an ant colony might be another. Or the universe as a whole (i.e., pantheism).
 
I also think the preoccupation with the Church-Turing Thesis can be misleading. It's not invalid, just too narrow a window into the principles our brains take advantage of. Sort of like saying starting a fire is equivalent to doing a calculation, which is true in some sense.
I only bring up the Church-Turing thesis when someone insists that it's impossible for a machine to be conscious, since the Church-Turing thesis establishes that it is impossible for the brain to do something that cannot be done by a computer.

The thesis itself doesn't tell us much about how the brain works.
 
Dancing David,
I went back and reread the sequence of posts leading to this. I don't see the absolutes you attributed to PixyMisa, as what could be interpreted as such was disavowed by agreeing with certain qualifications, including her statement of "Which in no way contradicts what I just said." when you specified "development". Thus nothing exists to "lampoon", which is why I got involved. Yet I remain confused over the justification of the lampooning. Forgive me if attempting to understand led to erroneous assumptions about why it was thought to be justified.

Yet the fact remains: irrespective of what constructs outside the brain were required for development, the present-state capacity for consciousness is fully contained in the brain, even if effectively removed from the body or any sensory input.
 
No-one ever said it was.

Really, read Godel, Escher, Bach.

I wasn't responding to your post there.

Please stop plugging GEB. It's not like I can go off and read it this afternoon anyway.
 
You misunderstand my comment. Suppose consciousness is something separate from the physical units that comprise the brain (i.e., neurons), and is best considered as a coordinated pattern of the activity/communication between those units. Then the pattern could occur in systems that we don't typically think of as conscious. Computers are one clear example of such a system, but an ant colony might be another. Or the universe as a whole (i.e., pantheism).

That's too broad a brush.

It's not just any old coordinated pattern.

Once we understand what the pattern is -- if we can call it that -- then (and only then) we will be able to look at objects other than the brain and come to a conclusion about their capacity for consciousness.

But right now, we have no reason to believe that structures in the brain happen to be recreated elsewhere by chance.
 
I only bring up the Church-Turing thesis when someone insists that it's impossible for a machine to be conscious, since the Church-Turing thesis establishes that it is impossible for the brain to do something that cannot be done by a computer.

The thesis itself doesn't tell us much about how the brain works.

It's the last bit that's being objected to: "the Church-Turing thesis establishes that it is impossible for the brain to do something that cannot be done by a computer".

I freely admit that conscious machines are possible, but I also contend that this statement above is incorrect, since it leads to absurd conclusions, such as your "pen and paper brain" scenario, among other reasons.
 
I wasn't responding to your post there.

Please stop plugging GEB. It's not like I can go off and read it this afternoon anyway.

But if we would all just read it we'd see that PM was right all along. About everything.
 
So... describing what goes on in your heart is enough to satisfy you as to how your heart works, but describing what goes on inside your brain is not enough to satisfy you as to how your brain works?

It would be, if we could do it. We're working on that, but it's difficult and tricky and there's a long, long way to go yet.
 
I only bring up the Church-Turing thesis when someone insists that it's impossible for a machine to be conscious, since the Church-Turing thesis establishes that it is impossible for the brain to do something that cannot be done by a computer.

The thesis itself doesn't tell us much about how the brain works.
Yes, that I understand. Sometimes trying to put a point on solid ground can narrow its conception in unwarranted ways.

I've been thinking about experimental neural designs based on using RFID interrogator-tag sets to represent neurons. Neural connectivity would be defined by the range of tags a given interrogator would listen to and/or sync with. A sensor, such as a web cam, would then assign each pixel of the image to a neuron. Randomness of individual neuron locations would be immaterial, since learning occurs relative to whatever initial distribution exists.

I've also considered BB-style resonators with a conductive surface, where connectivity would be defined by how many others were in contact with each other. Syncing would be defined by an open-ended frequency-range table each would sync with. Connectivity would be far more limited here, and a bucket of BBs would become untrained if you bumped it too hard.

The RFID approach appears more promising, but has a bandwidth problem. Given the shocking slowness at which real neurons work, I've considered greatly increasing effective bandwidth by using a timing signal so the same bandwidth serves multiple bands, separated by their timing. However, the most promising option is to use signal strengths so low that only a limited but significant number of RFID tags around a given tag can even detect it. In this way, the same band being used at some sufficient distance wouldn't have any effect unless it had a linkage through nearer tags that synced with it.

The point of this is that we need real artificial neural nets to play with, where we can define the degree of connectivity on the fly. Hardwiring them as traditionally done would require millions of years for a sufficiently large network to be built, and even then the degree of connectivity would be hardwired in. With variable connectivity, we might even have initial conditions compete in an evolutionary algorithm, to evolve the initial conditions best suited for certain learning skills.
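As a software stand-in for those RFID tags, here is a minimal sketch of my own (all numbers are made up for illustration) of connectivity defined on the fly by signal range rather than hardwiring: each "tag" hears only the tags within its radius, so turning one knob re-defines the whole network topology without rebuilding anything.

```python
import math
import random

def make_positions(n):
    # Tags scattered at random in a unit square; per the idea above,
    # randomness of placement is immaterial.
    return [(random.random(), random.random()) for _ in range(n)]

def neighbours(positions, radius):
    # Connectivity is defined by hearing range (a proxy for signal
    # strength), not by hardwired links: change radius, change the net.
    n = len(positions)
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            if math.hypot(dx, dy) <= radius:
                adj[i].append(j)
                adj[j].append(i)
    return adj

random.seed(42)
n = 200
positions = make_positions(n)

sparse = neighbours(positions, radius=0.05)   # weak signal: few neighbours
dense = neighbours(positions, radius=0.2)     # strong signal: many neighbours

avg_sparse = sum(len(v) for v in sparse.values()) / n
avg_dense = sum(len(v) for v in dense.values()) / n
```

In an evolutionary setup, the radius (or a per-tag radius vector) would be exactly the kind of initial condition one could let compete, since it can be varied per generation without touching the tags themselves.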
 
It's the last bit that's being objected to: "the Church-Turing thesis establishes that it is impossible for the brain to do something that cannot be done by a computer".

I freely admit that conscious machines are possible, but I also contend that this statement above is incorrect, since it leads to absurd conclusions, such as your "pen and paper brain" scenario, among other reasons.

She more or less answered this in her response to my issues on the Church-Turing thesis, but it does need more clarification in my view.

The "pen and paper brain" couldn't be considered intelligent, in the same sense that an individual air molecule doesn't have a temperature. You can assign it some temperature by assigning some random frame a unique status, but that's simply a coordinate choice without fundamental physical meaning. Note that the atmosphere is hotter relative to the space shuttle reentering than it is to you for a reason. Similarly, the notion that an abacus is intelligent because it performed a calculation is tantamount to assigning ensemble properties to individual parts of that ensemble, like the temperature issue.
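The temperature analogy can be made concrete with a toy numerical gas (a sketch with made-up units, not real physics constants): the mean kinetic energy of the ensemble is sharply defined and reproducible, while any single molecule's energy is essentially arbitrary.

```python
import random
import statistics

random.seed(7)

# Toy gas: "temperature" here stands in for the mean kinetic energy of
# the whole ensemble. Units and distribution are made up for illustration.
speeds = [random.gauss(0, 1.0) for _ in range(100000)]
kinetic = [v * v / 2 for v in speeds]

ensemble_ke = statistics.mean(kinetic)       # well-defined ensemble property
half_a = statistics.mean(kinetic[:50000])    # two halves of the gas agree
half_b = statistics.mean(kinetic[50000:])
single = kinetic[0]                          # one molecule: just a number
spread = statistics.pstdev(kinetic)          # individual energies scatter widely
```

Any two macroscopic subsamples report nearly the same mean energy, yet the per-molecule spread is larger than the differences between those means by orders of magnitude: the property belongs to the ensemble, not to any part of it.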

As she indicated in response to me, the "pen and paper brain" is merely a demonstration of part mechanics, and does not address how to get global ensemble properties. Yet it does demonstrate that none of the parts required for those ensemble characteristics have mysterious properties. Assigning emergent properties of a system to the parts of that system is a common tactic theists use in proofs of God; it doesn't work.
 
Please stop plugging GEB. It's not like I can go off and read it this afternoon anyway.
Oh, man. Said a mouthful there, you did. One doesn't read GEB, exactly; one slogs one's way through it. I've slogged my way through it several times over the last couple of decades or so, and it still makes my head hurt. GEB was a watershed for me. Along with The Mind's I (which Hofstadter co-authored with Dennett) and Metamagical Themas, it was my introduction to the (missing!) science of consciousness, and my first exposure to the ideas of Searle, and Nagel, and Pinker, and others (and thence to a bunch more, including the somewhat ambitious efforts of Roger Penrose). Most of the people out there, you try to talk to them about the underlying concepts put forth in GEB, and first their eyes glaze over, and then they sort of start slowly backing away...

Apologies for plugging and all. I've been lurking this thread all along, and while I'll probably continue to do so, I doubt if I'll be contributing anything further; you guys seem to have it pretty well covered -- but I just had to respond to that.
 
I've just been informed PixyMisa is male. I was sure I had female confirmation in a post somewhere :confused:

Oh well, changes nothing. Back to topic...
 
Dancing David,
I went back and reread the sequence of posts leading to this. I don't see the absolutes you attributed to PixyMisa, as what could be interpreted as such was disavowed by agreeing with certain qualifications, including her statement of "Which in no way contradicts what I just said." when you specified "development".
Thus nothing exists to "lampoon",
Had
I
done
it
like
this
? :D
Yet I remain confused over the justification of the lampooning.
It was a lark. There is none.
Forgive me if attempting to understand led to erroneous assumptions about why it was thought to be justified.
Just goofing and pedantry on my part. No real justification, I am prone to capricious acts.
Yet the fact remains: irrespective of what constructs outside the brain were required for development, the present-state capacity for consciousness is fully contained in the brain, even if effectively removed from the body or any sensory input.

I respectfully say, okay.

I agree with your second paragraph.
 
The point of this is that we need real artificial neural nets to play with, where we can define the degree of connectivity on the fly. Hardwiring them as traditionally done would require millions of years for a sufficiently large network to be built, and even then the degree of connectivity would be hardwired in. With variable connectivity, we might even have initial conditions compete in an evolutionary algorithm, to evolve the initial conditions best suited for certain learning skills.

Amen to that! I don't even know how many processors and array packs it would take to simulate one trillion neurons and their connections (and then the states of potentiation and attenuation, and the differentials). And I think that we have some understanding of neurotransmission, but there are many variables still unknown.
 
She more or less answered this in her response to my issues on the Church-Turing thesis, but it does need more clarification in my view.

The "pen and paper brain" couldn't be considered intelligent, in the same sense that an individual air molecule doesn't have a temperature. You can assign it some temperature by assigning some random frame a unique status, but that's simply a coordinate choice without fundamental physical meaning. Note that the atmosphere is hotter relative to the space shuttle reentering than it is to you for a reason. Similarly, the notion that an abacus is intelligent because it performed a calculation is tantamount to assigning ensemble properties to individual parts of that ensemble, like the temperature issue.

As she indicated in response to me, the "pen and paper brain" is merely a demonstration of part mechanics, and does not address how to get global ensemble properties. Yet it does demonstrate that none of the parts required for those ensemble characteristics have mysterious properties. Assigning emergent properties of a system to the parts of that system is a common tactic theists use in proofs of God; it doesn't work.

My disagreement w/ PM regarding the pen-and-paper brain is specifically over the issue of whether it would generate an instantiation of conscious awareness.

Pixy and I agree on many things, but not that.

Also, Pixy has several times taken the stance that consciousness is a true emergent property, like the whiteness of clouds, which requires no specific mechanism (and I use that term extremely broadly, btw) but which merely arises as a result of the presence of self-referential information processing in any form.

Ergo, thermostats may be conscious, and for all we know your computer is conscious right now.
 
GEB was a watershed for me.

I have no objection to reading GEB, and I may get around to it eventually, but the thing is, if I want to read about cognitive neuroscience, I'd rather pick up Gazzaniga.
 