Has consciousness been fully explained?

The number of neurons isn't important by itself.

The answer will lie in the configuration of large-scale neural structures, and other supra-neural (if I may coin a term) activities.

At the end of the day, the neurons themselves don't matter, strange as that may sound, because if you could build a machine that did the same thing with some other base unit, you'd get the same result.

It's like, if you get two fronts of different pressure and temperature interacting, they form the same sorts of dynamic shapes, regardless of what the underlying components are.

Swirls of stars, swirls of air, swirls of water... these share some qualities despite being composed of very different sorts of components.

Consciousness is not in the neuron.
I agree.

Absolutely. And if we start looking at connections, roughly 7,000 on average per neuron, that is a really big number; it is the interaction that matters.


If we have a trillion elements, each interacting with 7,000 other elements, the numbers are mind-boggling!
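Just to put a rough figure on that, here's a back-of-the-envelope count in Python, taking the numbers above at face value (the trillion is the figure quoted here, not a measured neuron count):

# Back-of-the-envelope count of pairwise connections,
# taking the figures above at face value.
elements = 1_000_000_000_000      # one trillion elements
links_per_element = 7_000         # average connections per element

directed_links = elements * links_per_element
undirected_links = directed_links // 2   # each connection counted from both ends

print(f"{directed_links:.2e} directed links")      # ~7.00e+15
print(f"{undirected_links:.2e} undirected links")  # ~3.50e+15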
 
But what is the mechanism for the heartbeat?

http://en.wikipedia.org/wiki/Cardiac_cycle#Regulation_of_the_cardiac_cycle
The rhythmic sequence of contractions is coordinated by the sinoatrial (SA) and atrioventricular (AV) nodes. The sinoatrial node, often known as the cardiac pacemaker, is located in the upper wall of the right atrium and is responsible for the wave of electrical stimulation that initiates atrial contraction by creating an action potential.
:)
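If it helps, the pacemaker behaviour described there can be caricatured as a leaky integrator that drifts up to a threshold, fires, and resets. A toy sketch in Python, with made-up constants rather than physiological values:

# Toy pacemaker: potential drifts up, fires at threshold, resets.
# Constants are illustrative, not physiological.
def pacemaker(steps=5000, dt=0.001, drift=4.0, threshold=1.0):
    v = 0.0
    beats = []
    for i in range(steps):
        v += drift * dt * (1.05 - v)   # slow spontaneous depolarisation
        if v >= threshold:
            beats.append(i * dt)       # record time of the "contraction" trigger
            v = 0.0                    # reset after firing
    return beats

beats = pacemaker()
if len(beats) > 1:
    intervals = [b - a for a, b in zip(beats, beats[1:])]
    print(f"{len(beats)} beats, mean interval {sum(intervals) / len(intervals):.3f} s")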
 

I found the mechanism for consciousness!

http://en.wikipedia.org/wiki/Neuron
All neurons are electrically excitable, maintaining voltage gradients across their membranes by means of metabolically driven ion pumps, which combine with ion channels embedded in the membrane to generate intracellular-versus-extracellular concentration differences of ions such as sodium, potassium, chloride, and calcium. Changes in the cross-membrane voltage can alter the function of voltage-dependent ion channels. If the voltage changes by a large enough amount, an all-or-none electrochemical pulse called an action potential is generated, which travels rapidly along the cell's axon, and activates synaptic connections with other cells when it arrives.
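For what it's worth, the behaviour in that quote, voltage creeping up until an all-or-none pulse fires, is roughly what the standard leaky integrate-and-fire model abstracts. A minimal sketch in Python, with made-up constants:

# Leaky integrate-and-fire neuron: input current charges the membrane,
# leak pulls it back toward rest, and crossing threshold emits a spike.
def lif(current, dt=1.0, tau=20.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    v = v_rest
    spikes = []
    for t, i_in in enumerate(current):
        dv = (-(v - v_rest) + i_in) / tau
        v += dv * dt
        if v >= v_thresh:            # all-or-none: a full spike or nothing
            spikes.append(t)
            v = v_reset
    return spikes

# Constant drive of 20 (arbitrary units) for 200 ms produces regular spiking.
print(lif([20.0] * 200))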
 
If we have a trillion elements, each interacting with 7,000 other elements, the numbers are mind-boggling!

Well, that number is a bit deceptive, because the connections aren't made willy-nilly. Rather, they are "bundled" in predictable ways into larger-scale structures.
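To illustrate the difference between willy-nilly wiring and "bundled" wiring, here is a rough sketch in Python that builds two toy networks with the same number of links, one wired uniformly at random and one where most links stay inside small modules (all numbers are made up, purely illustrative):

import random

def wire(n_nodes=1000, links_per_node=20, n_modules=10, p_within=0.9, modular=True):
    """Return a list of (a, b) links; if modular, most links stay inside a module."""
    module_of = [i % n_modules for i in range(n_nodes)]
    members = {m: [i for i in range(n_nodes) if module_of[i] == m] for m in range(n_modules)}
    links = []
    for a in range(n_nodes):
        for _ in range(links_per_node):
            if modular and random.random() < p_within:
                b = random.choice(members[module_of[a]])   # local, "bundled" link
            else:
                b = random.randrange(n_nodes)              # long-range, random link
            links.append((a, b))
    return links, module_of

for flag in (False, True):
    links, module_of = wire(modular=flag)
    within = sum(module_of[a] == module_of[b] for a, b in links) / len(links)
    print(f"modular={flag}: {within:.0%} of links stay within a module")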
 
I'm sorry, but I don't see anything analogous to the heartbeat mechanism here.

So... describing what goes on in your heart is enough to satisfy you as to how your heart works, but describing what goes on inside your brain is not enough to satisfy you as to how your brain works?
 
I was also poking fun at PM and the

No
Wrong
Not even wrong
Absurd

sort of responses.

In your response to me, even though I'm saying the same thing as PixyMisa, qualified against long-term consequences, you aren't telling me I'm wrong, but sort of waffling. When your response specified "during development", PixyMisa stated: "Which in no way contradicts what I just said". This effectively proves PixyMisa's statements and mine are essentially identical. Yet in narrowing down the distinctions with you, your agreement with me points to a wishy-washy yes and no. So what are the distinctions?

Neither PixyMisa nor I am denying that sensory input is needed for the development of our sense of consciousness. Neither of us is denying that when the stored data about past sensory input degrades, consciousness will degrade accordingly.

Now here's the kicker: everything required to maintain consciousness in the here and now is fully contained in the physical brain inside the skull, without any sensory inputs or connections to the world external to the brain. That this consciousness would degrade over time, for lack of the maintenance those inputs provide, is immaterial to that fact. Degradation is merely a mechanistic consequence of the way our brain is constructed, not an absolute condition imposed by the loss of sensory inputs.

As I have stated many times, perception is a large part of what is conflated under the rubric of consciousness. Perception is dependent upon sensation, so much of consciousness, though not all, would be gone.

What you appear to be saying here is that if I look toward my keys on the desk and don't see them, some of my consciousness is gone. Taken from a 'toy' model perspective, this implies that a tank of compressed air that doesn't have a gauge reading the pressure (sensory data) isn't compressed.

By "conscious mind" two things can be indicated. One is working memory, which is limited to 3 or 4 items of information, even though each of those items can stand in for a much larger set of data that longer-term memory must be accessed to retrieve. The second is the world model, stored in memory: the model used in your head to make sense of the sensory data and the world around you. In fact, illusions arise when your world model creates expectations that are not so, nor even what your senses are actually telling you. You then see your world model instead of what your senses actually report, and call this memory contained in your brain external sensory data.

This world model was built from sensory data of the past, but it does not disappear just because ALL sensory data from the external world disappears. And so long as it persists, it can feed working memory the items that define your consciousness, even with a complete and total blackout of ALL sensory data external to the brain's grey matter itself. Consciousness IS fully contained in the grey matter of the brain, irrespective of what external data was involved in its development.
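Just to make that concrete, here is a toy sketch in Python (the names, sizes, and structure are my own, purely illustrative): working memory as a handful of pointers into a much larger stored world model, running with no new sensory input at all.

from collections import deque

# Long-term "world model": a large store built from past sensory data.
world_model = {f"concept_{i}": f"details about concept {i}" for i in range(100_000)}

# Working memory: only a few slots, each holding a pointer (key), not the data itself.
working_memory = deque(maxlen=4)

def attend(concept):
    """Bring a concept into working memory; the oldest item falls out when full."""
    if concept in world_model:
        working_memory.append(concept)

# With sensory input cut off entirely, this loop still cycles stored concepts
# through working memory, because everything it touches is already in the store.
for concept in ("concept_7", "concept_42", "concept_99", "concept_3", "concept_512"):
    attend(concept)

print(list(working_memory))                        # at most 4 pointers
print([world_model[k] for k in working_memory])    # details fetched from the long-term store

Nothing in that sketch needs a live sensory feed once the store exists.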

And that is why PixyMisa said:
Which in no way contradicts what I just said.
Which proves PixyMisa's statement was meant in the same sense mine were, the only difference being the qualification of precisely what sense that was.

yes.

Yes, part of the issue is the confusing morass of what we label consciousness.

No and yes, the two are part and parcel.

It would be hard to gauge any sort of behavioral criteria for it. Certainly not for the medical definition. I suppose if you reconnected the brain then you could ask for a self-report.

It appears the breadth of empirical science you are making presumptions from is intensely limited. In fact we have a far better gauge than any "behavioral criteria" could ever dream of providing. We can watch your thoughts and know which predefined choices you are going to make before you make them. It's even making its way into the gaming market: buy yourself a brain-wave game controller for $99.99. Its primary limitation is signal quality without sensors implanted inside the brain. "Behavioral criteria" makes it sound like we're stuck in the 1940s.

I was just disagreeing; I do not think that there is any part of the brain that is not part of the body. Many parts of 'consciousness' are dependent upon sensation. Some things I agree with, others I don't.


I am fairly certain you haven't disconnected a brain yet. :) And I am fairly certain that sensation and perception are part of developing consciousness. And part of what gets labeled as consciousness.

I am not making an absolute statement.
We haven't done brain transplants, but we have done head transplants. From a consciousness perspective, the only role the body plays is to keep the brain alive and feed it sugar. We have sliced off pieces of rat brain, grown them on a substrate, and used them as a robot control mechanism connected through Bluetooth. We know exactly what to do to your brain so that everything is normal, except that you will not recognize your mother while looking at her, even though you'll agree it looks like her and you'll recognize that it's really her on the phone. I'll not even get into the terabytes of more detailed empirical data.

If you want to hang onto the notion that consciousness is some kind of whole-body phenomenon, base it on something more than 1940s-style "behavioral criteria".
 
Sorry for jumping in late. There are a few posts I wanted to respond to while reading through the thread.

It has not been fully explained, yet, but you might be surprised at just how much of it has been explained to some reliable degree.
I'd say the opposite. You might be surprised at how little progress AI, neuroscience and psychology have made. Not none, but there's a long long way to go.

There are clear patterns of brain activity leading up to, for example, awareness of visual stimuli. We are mapping these in ever-greater detail.
"Clear patterns" in the sense that there clearly are patterns. But not in the sense that the patterns are clear.

Do you agree that we have a fair understanding of visual perception?

I, for one, don't.

Everything that the brain does can be done by a Turing machine. No exceptions. Established by the Church-Turing thesis.

The Church-Turing thesis does not establish this. Some would argue that it implies it and it certainly raised a debate over the issue.

My adviser, who studied under Church and holds doctorates in both math and cognitive science, has certainly never given such an interpretation of Turing and Church's work; I've discussed the material with him and seen him cover it in classes a few times. He would say it raised the debate.

The wiki article you linked to doesn't seem to support your claim either.

Oh and Hofstadter writes well about the work of others, but his ideas on consciousness are what I'd call speculative philosophy.

Consciousness is defined in terms of information.

What do you mean by information? As a colloquial term it's not well defined, and it seems to be simply a useful idealization in some sense, unless we define it to include virtually everything.

In cognitive science there is not a formal definition of information that I'm aware of, except for maybe Shannon's information theory. Is that what you mean?
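If it's Shannon's sense, that at least is precise: the entropy of a source is H = -sum over x of p(x) * log2 p(x). A quick sketch of that definition in Python (my own illustration, not anything PM has endorsed):

from collections import Counter
from math import log2

def shannon_entropy(message):
    """Bits per symbol: H = sum p * log2(1/p) over the symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return sum((c / total) * log2(total / c) for c in counts.values())

print(shannon_entropy("aaaa"))   # 0.0 bits: no uncertainty at all
print(shannon_entropy("abab"))   # 1.0 bit per symbol
print(shannon_entropy("the brain is not a bit string"))  # somewhere in between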

We deliberately build computers that behave differently because we already have plenty of people and we want something that can think, but is good at thinking in precisely the areas we are bad at. So we build computers that are as rigorous and deterministic and pre-defined as we can.
(bolding added)
No, it's because we're currently incapable of building a computer that functions like a human.

That's simply what it does. You can trace the activity from sensory nerves firing in the retina through the visual cortex and all over the brain as the response to what you are looking at is processed in various ways.

We can't trace neuronal firings in the brain directly or to any precise degree. We can trace general brain activity by monitoring oxygen distribution, although fMRI is overrated.
 
So... describing what goes on in your heart is enough to satisfy you as to how your heart works, but describing what goes on inside your brain is not enough to satisfy you as to how your brain works?

We can build pumps, because we understand the principle by which the heart pumps blood. We know that a mechanical pump and the heart are doing the same thing. That we know what goes on in a neuron - even if we had absolute and total knowledge - does not mean that we understand how the neuron creates consciousness.
 
The Church-Turing thesis does not establish this. Some would argue that it implies it and it certainly raised a debate over the issue.

To claim that everything the brain does can be duplicated by a Turing Machine is so obviously wrong that one can only suppose that PM means something else which he hasn't thought worth stating. Clearly a Turing Machine can't bleed, for example. If he means that any data processing the brain does can be done by a Turing machine, then he should say so.
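To be clear about what the "data processing" reading would mean: a Turing machine is just a finite table of (state, symbol) -> (write, move, next state) rules acting on a tape. A minimal simulator in Python, here running a toy machine of my own that inverts a string of bits (nothing from PM's links):

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10_000):
    """rules: {(state, symbol): (write, move, next_state)}; halts on state 'halt'."""
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Toy machine: walk right, flipping 0 <-> 1, and halt at the first blank cell.
invert_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("0110100", invert_bits))   # -> 1001011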

My adviser, who studied under Church and holds doctorates in both math and cognitive science, has certainly never given such an interpretation of Turing and Church's work; I've discussed the material with him and seen him cover it in classes a few times. He would say it raised the debate.

The wiki article you linked to doesn't seem to support your claim either.

IIRC the wiki article explicitly contradicts the claim.

I've just done a Google search for "anything a brain can do can be done by a Turing machine", along with a number of rephrasings, but cannot find it said by anyone except PM on this forum. I'd suggest that he provide a reference showing that this is a common position, but I suspect that would just produce a huge reading list that supposedly supports the position, and no actual quotes.

Oh and Hofstadter writes well about the work of others, but his ideas on consciousness are what I'd call speculative philosophy.

It's an area where the qualifications of the so-called experts seem to be dependent on their opinions.

What do you mean by information? As a colloquial term it's not well defined, and it seems to be simply a useful idealization in some sense, unless we define it to include virtually everything.

In cognitive science there is not a formal definition of information that I'm aware of, except for maybe Shannon's information theory. Is that what you mean?

In a previous incarnation of this thread, PM referred to Physical information. That at least has some kind of meaning, though I don't think it supports the contention.

 
I have not had the time to pore over all 19 pages of this thread, so apologies in advance if this has been pointed out, but...

I think the original question is flawed. I mean, can anything be "fully explained"?

Something as fundamental as gravity still holds mysteries, so why shouldn't consciousness?

Most often this comes up in debates putting forth a soul or spirit or something, and takes the form of, "You can't fully explain consciousness, therefore..."

My take is that consciousness is an emergent property of complexity. As such, when a system gets complex enough, consciousness may arise.

I've often wondered at what point the internet itself might become conscious, and how we might recognize that fact!
 
I found the mechanism for consciousness!

http://en.wikipedia.org/wiki/Neuron

That is the mechanism for neural transmission, sort of. It involves a weak voltage, but really it is a polarity shift across an osmotic membrane. The cell signals that it is firing by opening ion channels, letting sodium flow in and potassium flow out. This then creates signals at the direct dendritic connections and causes the release of neurotransmitter at the synapse, acting on the postsynaptic face of the next neuron.

But it is apparently not ‘consciousing’ unless they do it in concert. :D

ETA: Osmotic Phase Shift Across A Bilipid Layer now at an arena near you!
 
Well, that number is a bit deceptive, because the connections aren't made willy-nilly. Rather, they are "bundled" in predictable ways into larger-scale structures.

Yes and no: the connections in many cases are sort of random (seemingly so at the fine scale), while the large- and medium-scale structures are shaped by chemical gradients during development.

It truly is a mystery at the fine-scale level, and I am wondering about the focus on neurotransmitters; the dendritic connections are just as frequent (if not more so, I have to look that up), so it is likely they play differential regulatory roles.

There is some research into how this develops at the fine scale, but it is very early on; I shall have to see what I can find. I know there was some work on learning and connections in older individuals.
 