• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you still see problems, let me know.

Explain consciousness to the layman.

I think Pixy is simply saying that the difference we find between the two systemic conditions, when using the same perspective on the same system, is in principle generalizable, by way of mechanism, to many other systems. That’s the minimal definition; consciousness as a general mechanism rather than a vague umbrella-definition of everything it allows for in a particular system when “online”. Obviously, explaining the variety of experiencing in the human system, or how the 1st person perspective in that system is produced in the first place, is still ahead. I’m pretty sure that requires a more detailed and specialized explanatory level. Hence why I think the actual substrate must be taken into account here. I.e., we are not off the hook when it comes to explaining the details of the human system.
Yes. ;)
 
To be fair: how most people define consciousness may not stand up to scientific standards, or indeed correspond to reality.

Any definition of consciousness should be vague and unsatisfactory, because that reflects our knowledge and understanding. A precise comprehensible definition is entirely pointless, since it avoids all the challenging issues. Pixy's definition has nothing to do with human consciousness. It's a term relating to computer engineering.
 
That is a more appealing definition for sure. It resonates with what most of us intuitively feel about consciousness. However, it is not a workable definition because it doesn't describe the "process".

I.e., "inadequate".

I agree. Piggy's definition tells us where to look for an adequate, expanded more objective definition.

I must admit that I have to keep pulling my eyebrows back down when I read what PixyMisa is posting on this thread. Of course I can't really criticize him until I have a better definition of "consciousness" (not that there is any shortage of critics anyway).

Pixy's definition is a dead end. There's no point in a definition that blocks off investigation.
 
I see plenty of selling the "more promising model" but not so much of the "awful lot of groundwork".
That's because there's an awful lot of groundwork. If you want it, Gödel, Escher, Bach is a better treatment than anything I could hope to accomplish here.

But I'm open to questions.

But don't mind me. This is your baby and you can run with it any way you like.
That's not correct. It's not my baby, and these ideas don't remotely originate with me. Read Hofstadter and Dennett for background. Then read the people they cite.

And I can't, and don't want to, "run with it any way I like". It's useful and worthwhile only insofar as it is an accurate model. If it's wrong, I want to know.
 
Any definition of consciousness should be vague and unsatisfactory, because that reflects our knowledge and understanding. A precise comprehensible definition is entirely pointless, since it avoids all the challenging issues.
What challenging issues? Since you can't even define the term, how can you even know that these challenging issues exist, much less discuss them?

Pixy's definition has nothing to do with human consciousness.
Complete nonsense. If you examine it, it clearly represents the everyday use of the word.

It's a term relating to computer engineering.
Computer science. And since the brain is a computer, and since the brain does exactly what I am talking about, if you want to claim it's unconnected you have a hell of a lot of work in front of you... Of which you have so far done nothing at all.
 
First, my computer is not a sentient being, though some programs running on my computer are conscious.
Have you found anyone you can convince of this? A cite would be helpful.

"Read GEB and I Am a Strange Loop" isn't helpful advice for most of us.

Second, why would that step be necessary? If one has an operational definition of consciousness, then one can determine whether a system is conscious by functional inspection.
I.e., your operational definition doesn't match what conscious humans (other than you and perhaps Hofstadter, apparently) accept consciousness to entail.

Note that 'functional inspection' does introduce at least one consciousness into the picture, that being you or another actual consciousness.

Why would we wonder any of that? If consciousness is something simple, then clearly there's no award due, and all those research dollars are going to fund other aspects of cognitive science.
Again, a citation is needed demonstrating that a fair number of people agree with you that 'consciousness is something simple'. Few here, including myself, appear willing to accept that idea.

And if someone asks you a question, you're more likely to contribute to the discussion if you answer the question than if you launch into an ad hominem diatribe.
Perhaps.

Since no one has yet guessed an on-point question to you, what question would you ask yourself?
 
I.E. Your operational definition doesn't match what conscious humans (other than you and perhaps Hofstadter apparently) accept consciousness to entail.
Could you please list a few features you would like to have in a definition of consciousness?
 
It all started here

Leumas said:
[snip lots of theistic tomfoolery]

Leumas, how many transistors and interconnections are there in a modern computer? Roughly? Order of magnitude?



How many cells and interconnections are there in a neuron? Roughly? Order of magnitude?


And which do you reckon is more complex .... a cell.... or a transistor?


And how many computers are there on the internet?


You tell me..... I thought I already told you..... go back and READ the post you tried to respond to.


Leumas asked you about the number of components that make up a neuron, mistakenly referring to the components as "cells".


PixyMisa said:
How many cells are there in a neuron and interconnections? Roughly? Order of magnitude?
A neuron is a cell, so, one.

And which do you reckon is more complex .... a cell.... or a transistor?
A neuron is more structurally complex. A transistor is several orders of magnitude faster, so for dynamic complexity it may be a wash.

You tell me..... I thought I already told you..... go back and READ the post you tried to respond to.
You said users. Users aren't relevant.

Count the computers, and the components that make up the computers. It's simple arithmetic; we only need to deal with orders of magnitude.


You did not answer his real and obvious question and instead stuck with his mistaken word choice.

You did admit that neurons were structurally more complex than a transistor though.
Why do you think this is irrelevant to the function of a neuron?
Why would a faster transistor have any relevance to consciousness?
Is the speed of SRIP (which you claim causes consciousness) relevant when it comes to transistors but not when we simulate your SRIP by moving rocks around in the desert or writing equations on paper?
What is dynamic complexity?
I also want to touch briefly on what your theory misses: neurochemistry. A neuron does not just transmit electrical signals; it also responds to chemicals in its environment. More on brain structure and neurochemistry later.



Leumas asked again about the number of components/cells in a neuron, even providing a diagram of a neuron with some of its obvious components. You disingenuously responded to his language instead of to the content of his question.


Leumas said:
Count the computers,


So why don't you tell us what this number is? I have already asked you 4 times and every time you evade the question but yet tell me to count them and when I give you a number you dismiss it as irrelevant. So please enlighten us o knowing one.

Do you or do you not have the number?


Another question you evaded..... is the internet conscious?


Or are you going to call the questions drivel again?


A neuron is a cell, so, one.

A neuron is more structurally complex. A transistor is several orders of magnitude faster, so for dynamic complexity it may be a wash.



[Image: complete neuron cell diagram]



Now you claim you gave a number, but you did not, still relying on Leumas's mistaken wording to justify your lack of response.

PixyMisa said:
So why don't you tell us what this number is? I have already asked you 4 times and every time you evade the question but yet tell me to count them and when I give you a number you dismiss it as irrelevant. So please enlighten us o knowing one.
If you give an irrelevant number, you can expect to be told that it's irrelevant. Complaining about that seems pointless.

Do you or do you not have the number?
On the order of a billion. That's precise enough for our purposes.

Now, how many transistors are there in the average modern computer?

Another question you evaded..... is the internet conscious?

Or are you going to call the questions drivel again?
Sure.

picture of neuron
Yes? I saw that the first time. Did you read the file description when you posted it? Neuron cell diagram.

As I said, a neuron is a cell. So the number of cells in a neuron is, by definition, one.

Now this is where I come in, to point out how you use the same argument you say is wrong to defend your theory.

On one hand you claim this below.


!Kaggen said:
All the brain does is process and store coded nerve pulses and return new coded nerve pulses.

Yes, all a brain does is what a brain does.
Thanks for enlightening us with your amazing wisdom.
Mu

I point out that what you're saying above about what a brain does is actually how the brain does it: "processing and storing coded nerve pulses and returning new coded nerve pulses". And don't come up with the lame excuse about the level of abstraction; that is metaphysics. We keep pointing out to you that we do not know how the brain does consciousness, but somehow you manage to get away with defining what a brain does (consciousness) by how the brain does it, SRIP or the above. This is the problem with your SRIP definition of consciousness, which I have pointed out on numerous occasions: it is circular.

Now returning to neurochemistry, you are wrong about this:
All the brain does is process and store coded nerve pulses and return new coded nerve pulses.
The brain also does neurochemistry. The blood supply to the brain carries chemicals which not only feed the neurons with nutrients and remove the waste products of metabolism, but also change the way the neurons work, and this can have drastic effects on consciousness. This is one major component of the brain missing from the computational theory of consciousness: neurochemistry. More later on consciousness and neurochemistry.


I further enlighten you on your little game of ignoring Leumas's intention, and emphasize the point he is trying to make by highlighting the complexity of a neuron.

!Kaggen said:
Neuron cell diagram.

As I said, a neuron is a cell. So the number of cells in a neuron is, by definition, one.

Yeah, sure, attack the messenger.
Let's translate the picture for you then.
You see all the different structures making up a neuron cell?
Let us start with the DNA in the cell nucleus which holds sufficient information to participate rather dominantly in building a human body with 100 trillion cells.

You get the picture?


So now you get defensive about your obvious tactic of ignoring Leumas's intention.
PixyMisa said:
Yes, all a brain does is what a brain does.
Thanks for enlightening us with your amazing wisdom.
That is not what I said. If intended as a response to my post it is either hopelessly confused or deeply dishonest.

Yeah sure attack the messenger.
Again, hopelessly confused or deeply dishonest.

I did no such thing.

You do highlight a point below which I often use to explain, to those ignorant of genetics, that although the DNA is the same in every cell, each cell certainly does not look or function the same. Take cancer cells, for instance, which we cannot trace to faulty DNA. There is more to biology than DNA, correct. However, this does not address the DNA code required in a neuron to make it a neuron. So add this to your silly equation of complexity instead of playing games.
PixyMisa said:
Lets translate the picture for you then.
You see all the different structures making up a neuron cell?
Yes, of course.

Let us start with the DNA in the cell nucleus which holds sufficient information to participate rather dominantly in building a human body with 100 trillion cells.
The number of cells is irrelevant here, since there are a relatively small number of cell types all with identical DNA.

You get the picture?
I understand that you are trying to make a point, but have failed to do so.


When I point out your ignorance of biology below, you still fail to recognize your shortcoming.
!Kaggen said:
The number of cells is irrelevant here, since there are a relatively small number of cell types all with identical DNA.
Yes all the DNA in every cell is identical.
All the atoms in adenine are identical.
All the electrons in every atom are identical.
Etc Etc

You have a point?


I try to get you to see the arbitrary and biased selection process you are using for your exercise in measuring complexity. All you end up doing is stubbornly sticking to your simplistic theory of what the brain does at the neuron level, and this, as I have pointed out, is even simplistically wrong.
!Kaggen said:
And see, if you post facts and ask questions, you will get responses.

So, basically, the brain has 100,000,000,000 neurons.

The internet has (conservatively) 10,000,000,000,000,000,000 transistors.

Line the two numbers up:

100,000,000,000 neurons
10,000,000,000,000,000,000 transistors

You can see just how vast and complex the internet is when considered as a system. Even though transistors are much simpler than neurons, there are just so many more of them that the numbers swamp everything else.

Wrong: you arbitrarily choose to stop at the neuron level of a brain's complexity.

Let's start with the chemical compounds that make up a neuron compared to those that make up a transistor.

Get the picture yet?
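As an aside, the order-of-magnitude comparison being argued over is trivial to check. A minimal sketch, using the thread's own rough estimates (~10^11 neurons, ~10^19 transistors) rather than measured figures:

```python
# The thread's own order-of-magnitude estimates -- assumptions, not data.
NEURONS_IN_BRAIN = 10**11          # ~100,000,000,000 neurons
TRANSISTORS_ON_INTERNET = 10**19   # ~10,000,000,000,000,000,000 transistors

# On these numbers the internet has about 10^8 times as many transistors
# as the brain has neurons.
ratio = TRANSISTORS_ON_INTERNET // NEURONS_IN_BRAIN
print(f"transistors per neuron: 10^{len(str(ratio)) - 1}")  # -> 10^8
```

Whether a raw component count translates into comparable "dynamic complexity" is, of course, exactly what the rest of the thread disputes.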


You slipped up a bit below and talked about quality, which you claim does not exist. So, in this context, would you mind answering my question below? I am very curious why you suddenly use this word.
!Kaggen said:
What I'm saying is that consciousness is not unique to life, but to a certain class of information processing system. Until the last century, the only working examples of such systems were indeed living brains, but that's no longer the case.

All those behaviours by which we distinguish consciousness are now replicated in computers. Still for the most part in a more basic way than in humans, but the difference is quantitative and not qualitative.

So what would a qualitative difference be?

Belz also does not get the picture, looking through his neuron = transistor spectacles.
Belz... said:
Yeah sure attack the messenger.
Lets translate the picture for you then.
You see all the different structures making up a neuron cell?
Let us start with the DNA in the cell nucleus which holds sufficient information to participate rather dominantly in building a human body with 100 trillion cells.

You get the picture?

I don't think the complexity of the neuron is important. What matters, IMO, is its function.



So I point out the implications of this mistake and why it does not lead to any better understanding of consciousness.
!Kaggen said:
I don't think the complexity of the neuron is important. What matters, IMO, is its function.

I see, so all the behaviors of all the chemical compounds that make the neuron behave are not important?

This is the problem with behaviorism: it assumes the importance of one level of behavior over another. This assumption has no basis in behaviorism itself, and therefore behaviorism is simply a tool which is meaningless on its own.
It needs a human interpreter.
Somehow, though, when this is pointed out, it's taken to mean there must be a magic bean.
We have no choice but to recognize our own consciousness's contribution without needing to describe its behavior using a behavioral descriptor such as an algorithm.
And as Westprog and I have said on numerous occasions, the way of describing consciousness is through artistic endeavors. No computational theory or magic beans required.
I would also add now that a brain is not just neurons; it is also blood vessels and connective tissue. These perform an essential function in the brain's activity, and ignoring them in any model of the brain, such as the computational model, is just silly.

This is where it gets interesting, since you claim to have a functional definition of consciousness, but when any brain function is raised, such as the biochemistry inside a neuron, you claim it's irrelevant because it is not on your arbitrary level of description. How very convenient. Part of your master plan to prevent any possible testing of your functional theory: it works on the level of neurons, but only if you define neurons by one of their activities, ignore the rest, and don't try to use it to describe neurochemistry.

I have news for you, Pixy Misa. Neurochemistry is arguably the most relevant aspect of consciousness. The signals from our sensory organs are, as you rightly describe, informational. The thing sensed does not actually travel to the neuron, which then "experiences" it. What I mean by "experience" is a chemical reaction or an electromagnetic signal. The thing's unique chemical or electromagnetic signature gets translated by the sensory system into a signal which carries information about its uniqueness. What the neuron "experiences" from another neuron is exactly the same whether it was the colour red we saw, or yellow, or the smell of a rose, or the sound of a harp. However, the information in the signal is different.

Now, this is where the computationalists stop. It is convenient. All the "experiences" are the same; only the information is different. It fits perfectly into information theory. More later.
PixyMisa said:
I see, so all the behaviors of all the chemical compounds that make the neuron behave are not important?
Correct. The behaviour and not the mechanism is what's relevant.

This is the problem with behaviorism it assumes the importance of one level of behavior over another.
It assumes nothing of the sort. We are talking about consciousness, not biochemistry, so that is the appropriate level to examine.

PixyMisa said:
This assumption has no basis in behaviorism itself and therefore behaviorism is simply a tool which is meaningless on its own.

It needs a human interpreter.

Somehow though when this is pointed out its taken to mean there must be a magic bean.
Yep. It's pure dualism.
No, it is not dualism, because we are remaining within consciousness when we describe it. You want to somehow look at consciousness from an abstract distance. It is impossible. Your stance is dualism, because looking at consciousness abstractly requires a belief that there is some kind of abstract platonic world where even consciousness is objective. No: consciousness creates abstractions. That is what it does. They are not out there waiting to be found. Maths was invented.

PixyMisa said:
We have no choice but to recognize our own consciousness's contribution without needing to describe its behavior using a behavioral descriptor such as an algorithm.
And as Westprog and I have said on numerous occasions the way of describing consciousness is through artistic endeavors. No computational theory or magic beans required.
If you assert that this is the only way to describe consciousness, then you're asserting that consciousness is a magic bean.
No, we are being consistent with monism.



I honestly appreciate your input, believe it or not :), to the study of consciousness. I would call it the atheism of consciousness studies: the gate through which all must pass in order to move along in this pursuit of freedom. It certainly eliminates the hubris in this important debate. However, it is only the start. We need to get to grips with neurochemistry. This area of study has been lagging drastically behind because of the draconian laws preventing psychoactive drug research. It is obvious to anyone who has experimented with these things, and to anesthetists, that sensory inputs (nerve signals) are not as important for consciousness as chemicals (neurochemistry). Even pain transmitted through the nervous system to the brain as a signal can be annulled through chemical effects on the neurons. What is fascinating about neurochemistry and consciousness, for instance, is that it is the only way for a thing (a chemical) to enter the brain through the blood system and interact directly with a brain neuron. A brain neuron experiences chemicals firsthand. Take oxygen and glucose, for instance: too much or too little of these substances has profound effects on consciousness.

In a way, consciousness can be almost totally independent of sensory inputs. Perhaps the idea of reducing sensory inputs, because they are simply the noise we mistakenly believe is consciousness, as practiced in Buddhism, has some merit. They even take this further by not only reducing sensory inputs (closing eyes, quiet environments, no odours) but also reducing outside chemicals through slowing the breath and heart rate (oxygen) and fasting (glucose and other psychoactive chemicals in food). This way the brain neurons have minimal chemical influence, and only the basic metabolism needed to stay alive is "being experienced". The layers of conscious experience are peeled away until it is just neurons. Others use psychoactive chemicals to affect and "heighten" aspects of consciousness which they regard as true consciousness.
Ultimately, though, what I am saying is that consciousness has these layers of experience, and what is missing from your theory is not a magic bean but simple chemicals, which objectively add significantly to the layers of consciousness. Ignoring neurochemistry, and claiming you can create consciousness with a simple model of neurons as switches with electrical inputs/outputs carrying information back and forth, is what is wrong with your theory/definition. It selects one layer and assumes its importance over all others.


You see my argument now Belz?

Belz... said:
I see, so all the behaviors of all the chemical compounds that make the neuron behave are not important?

I don't see a reason why it should. Is the complexity of individual atoms important ?

For example, in terms of data storage, what matters is the actual information being stored, be it on paper, in bytes, etc. The atomic composition of the information is irrelevant.

This is the problem with behaviorism it assumes the importance of one level of behavior over another.

Perhaps. I'd like to see a good argument for the reverse, however.



!Kaggen said:
I don't see a reason why it should. Is the complexity of individual atoms important ?

For example, in terms of data storage, what matters is the actual information being stored, be it on paper, in bytes, etc. The atomic composition of the information is irrelevant.



Perhaps. I'd like to see a good argument for the reverse, however.

No, I said the chemical compounds' behavior.
Have you ever done chemistry or biochemistry?
There are innumerable reactions taking place in a cell, many we do not even know of yet. This is the complexity I am talking about. These are all essential for the functionality of the cell and contribute to its overall function.





PixyMisa said:
No, I said the chemical compounds behavior.
You ever done chemistry or biochemistry?
There are innumerable reactions taking place in a cell, many we do not even know yet. This is the complexity I am talking about. These are all essential for the functionality of the cell and contribute to its overall function.
The question is, what is the relevant level of behaviour: What the neuron does, or how it does it?

Since it's perfectly possible to replicate the function of a neuron with none of those chemical processes, we say it's what the neuron does that matters.
This is simply false, for the reasons I mentioned above.
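For readers unfamiliar with what "replicating the function of a neuron" means in the computationalist argument above, a minimal sketch: a McCulloch-Pitts-style threshold unit mimicking a neuron's fire/don't-fire behaviour with no chemistry at all. The weights and threshold are invented purely for illustration:

```python
def artificial_neuron(inputs, weights, threshold):
    """Fire (return 1) iff the weighted sum of inputs reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two active inputs push the unit over threshold; one alone does not.
print(artificial_neuron([1, 1], [0.6, 0.6], 1.0))  # -> 1 (fires)
print(artificial_neuron([1, 0], [0.6, 0.6], 1.0))  # -> 0 (silent)
```

Whether such a unit captures everything relevant about a biological neuron is precisely what is in dispute here; the sketch only illustrates the claim, not its truth.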


Pixy Misa, your abstract idea that the rules are what give the meaning does not apply to neurons, I see.
!Kaggen said:
The question is, what is the relevant level of behaviour: What the neuron does, or how it does it?
I thought semantics is syntax?


Since it's perfectly possible to replicate the function of a neuron with none of those chemical processes, we say it's what the neuron does that matters.
Not when what the neuron does is the same thing as how the neuron does it.



PixyMisa said:
I thought semantics is syntax?
Indeed it is.

Not when what the neuron does is the same thing as how the neuron does it.
Where did that premise come from?
Your claim that semantics is syntax.




Belz... said:
No, I said the chemical compounds behavior.

I didn't say you had. I asked you a question.

There are innumerable reactions taking place in a cell, many we do not even know yet. This is the complexity I am talking about. These are all essential for the functionality of the cell and contribute to its overall function.

Assuming we build a circuit board made out of cars, I'm not sure whether the cars run on gasoline or not is important so, again, why do you think it matters ?
Is your SRIP different when your blood sugar is low?


!Kaggen said:
Indeed it is.


Where did that premise come from?

Well, if consciousness is what the brain does and not how the brain does it, then neuron function is not important, only brain function.

You see the problem yet?




PixyMisa said:
Well if consciousness is what the brain does and not how the brain does it then neuron function is not important only brain function.

You see the problem yet?
Nope. Just as you can replace neurons with artificial neurons that work by different means but produce the same results, you can replace the brain with, for example, a simulated brain, and get the same results.

Still not sure what you're talking about or why you think it's a problem.
You did not address the levels problem I am highlighting; you are just sidestepping it.
You have said

All the brain does is process and store coded nerve pulses and return new coded nerve pulses.
This is your "how the brain does what it does" definition of the brain again.
You may deny it, but it would be the same semantic argument you use to argue that your consciousness definition is not circular.
And no, it is not a matter of the level of abstraction; that is metaphysics.
One way or another you are arbitrarily choosing the level of importance, and as I have established, this is based on a hopelessly incomplete theory of consciousness.


!Kaggen said:
Nope. Just as you can replace neurons with artificial neurons that work by different means but produce the same results, you can replace the brain with, for example, a simulated brain, and get the same results.

Still not sure what you're talking about or why you think it's a problem.

When you simulate your brain, what inputs are you going to use? I hope not the functionality of the neurons, since brain function is not dependent on the functionality of the neurons, right?


I see you start the apologetics for your metaphysics below.
PixyMisa said:
When you simulate your brain what inputs are you going to use?
You mean sensory data? That can be recorded or simulated.

I hope not the functionality of the neurons since the brain function is not dependent on the functionality of the neurons right?
That's not correct. The brain is dependent on the functionality of the neurons, because that's what it's made of. You're free to substitute parts, as I said, as long as they're compatible.


So now that you're backpedaling, I highlight the conflict between your arbitrary levels of importance.
!Kaggen said:
You mean sensory data? That can be recorded or simulated.
No, I mean what is going to do the work of the brain?

That's not correct. The brain is dependent on the functionality of the neurons, because that's what it's made of. You're free to substitute parts, as I said, as long as they're compatible.

You sure?

PixyMisa said:
No, I said the chemical compounds behavior.
You ever done chemistry or biochemistry?
There are innumerable reactions taking place in a cell, many we do not even know yet. This is the complexity I am talking about. These are all essential for the functionality of the cell and contribute to its overall function.
The question is, what is the relevant level of behaviour: What the neuron does, or how it does it?

Since it's perfectly possible to replicate the function of a [neuron → brain] with none of [those chemical processes → the neuron's functionality], we say it's what the [neuron → brain] does that matters.



PixyMisa said:
No, I mean what is going to do the work of the brain?
The brain is being simulated.
Ah, the "jumping up a level of abstraction" fallacy is becoming your forte.

I thought you said
That's not correct. The brain is dependent on the functionality of the neurons, because that's what it's made of. You're free to substitute parts, as I said, as long as they're compatible.
So the brain works by the functionality of the neurons, but the neuron does not work by the functionality of its cell components? Yes, it is convenient for your SRIP that the brain consists only of neurons receiving nerve-pulse inputs and giving nerve-pulse outputs, but it is not true.



PixyMisa said:
You sure?
Yes, I am sure. I was very precise in what I said. There is no contradiction.

The brain is made up of neurons. That's how it works. That's not the only way for it to work, but that is how it works.
One thing I like about Pixy Misa: when he is wrong, he is precisely wrong. :D



!Kaggen said:
The brain is being simulated.
And the brain is what exactly?
Oh forget it you answered below.
The brain is made up of neurons.

Now let's try this again:

Yes, I am sure. I was very precise in what I said. There is no contradiction.

The [brain → neuron] is made up of [neurons → chemical compounds]. That's how it works. That's not the only way for it to work, but that is how it works.


Now the level-of-abstraction fallacy is used without any shame.
PixyMisa said:
And the brain is what exactly?
On what level of abstraction?

Oh forget it you answered below.
Yes, I did.

Now lets try this again
Why? Do you actually have a point you want to make, or do you just want to keep repeating your error?

So it would appear, looking through SRIP glasses.
However, your SRIP glasses need a prescription upgrade.
 
lupus_in_fabula said:
!Kaggen said:
The content of anything describes the thing best.
Okay. But I don’t think it explains it in the best way. For instance, I can hear the content of my mp3 file, but it doesn’t seem to explain to me how it works. (Although there might be such an mp3 file as well.)

Well, if you reduce the content to whatever you think might make your case, then I suppose you can say this.

lupus_in_fabula said:
!Kaggen said:
The human neuron is a neuron because of what it's made of - cell organelles and cell membranes

The human brain is a human brain because of what it's made of - neurons, blood vessels and connective tissue.

Consciousness is consciousness because of what it's made of - thoughts and emotions

This is true of course. But it’s not saying much in terms of explanations for how they function. When it comes to your last line here, about consciousness being thoughts and emotions, I consider that definition to be operationalized into many specific components in order to be useful in our endeavor to try to eventually understand the whole system again.
Not sure what you're saying.

lupus_in_fabula said:
If we’re really successful in understanding it, it means we’re able to generalize certain aspects of its functioning and draw parallels to other systems; i.e. “finding” general principles. (The general principles would of course be models (abstractions), not the thing itself, but that's the nature of knowledge anyhow.)
Yes, we use abstractions, but we never say our abstractions are what we abstracted from. That is reserved for Pixy Misa's theory of consciousness.
Also, his abstraction of consciousness is missing a key component: the main, and quite possibly the only, component of consciousness, neurochemistry.
 
You certainly need to drop your preconceptions regarding consciousness to follow what I'm saying. The best starting point for this is Hofstadter's Godel, Escher, Bach. He takes 800 pages to work up to the same point I'm making, and he's since written a 400 page epilogue because so many people missed the point of the first book.

I'm not going to write 1200 pages of explanation every time a consciousness thread shows up; I expect people to think. I'm usually disappointed.

Oh no, I thought you would be strong and not bring up GEB in this thread.
Oh well, as you said it helps drop those preconceptions.
Unfortunately not the ones about computation and consciousness.
But hey no book is perfect.
 
I see that my modbox failed to have the desired effect. Thread temporarily closed until a member of the Mod Team has a chance to review it. As ever, do not attempt to restart the discussion elsewhere.
Replying to this modbox in thread will be off topic  Posted By: jhunter1163
 
All right, I've gone through and sent a lot of posts to AAH (mostly stuff from the last couple of days.) Some had a good deal of content and only a little bickering, but hey, that's the breaks. Now since repeated requests have not seemed to work in improving the civility of the thread, you force me to use magic:
(Tricky pulls his wand out of his sleeve and intones ...)

Bickery Barkery
Posts getting Snarkery
To AAH where they'll
Rot in the Darkery.

Billower Bellower
Better get Mellower
'Cause if they don't
I see cards getting Yellower.

Now if that doesn't work I don't know what will.
Replying to this modbox in thread will be off topic  Posted By: Tricky
 
Guess I missed all the excitement.

Gotta say !Kaggen…that was an epic post. Most informative.

A brief aside re: the number of ‘operational cells’ (in the neural sense). Pixy referred to the brain having 100B neurons (a somewhat metaphorical measure of operational complexity). Up until recently it was believed that neurons did the real brain work….but it has now been all but conclusively established that what are referred to as glial cells (which apparently make up 90% of brain cells…as compared to the 10% comprised of neurons) also play a fundamental (though still little understood) role in brain ‘work’.

http://www.ottawacitizen.com/techno...ential+change+neuroscience/6267343/story.html

and here (if you can get past the subscription requirements): http://www.cell.com/abstract/S0092-8674(12)00142-0

....so it would seem Pixy's metaphorical measure of operational complexity was slightly conservative (by multiple orders of magnitude...especially if we include the operational complexity of the cells themselves).
 
The youtube video that was posted earlier in this topic was the closest I've ever come to an explanation of consciousness. I wish there were more programs on like that!

Some of these replies are very impressive, and scholarly but I just don't understand them.
 
It all started here

On the contrary, this was all just a minor digression.

Leumas asked you about the number of components that make up a neuron mistakenly referring to the components as "cells"

Okay.

You did admit that neurons were structurally more complex than a transistor though.
Of course. Since they are.

I want to know why you think this is irrelevant to the function of a neuron?
I don't. I never said that. I never said anything like that. I explicitly and repeatedly said that neurons are functionally more complex than transistors.

Why would a faster transistor have any relevance to consciousness?
I addressed that point in detail when I mentioned the speed difference. An array of transistors equivalent in functional complexity to a neuron will operate several orders of magnitude faster than a neuron. That means that with appropriate buffering logic, one transistor-based neuron could replace thousands of biological neurons.

That's the difference between structural and dynamic complexity: the complexity of the state of the system vs. the rate of change of that state.

Is the speed of SRIP (which you claim causes consciousness) relevant when it comes to transistors but not when we simulate your SRIP by moving rocks around in the desert or writing equations on paper?
Neither.

What is dynamic complexity?
Complexity with a time component.
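The speed argument above (that one fast unit with buffering logic could stand in for thousands of slow neurons) can be sketched in a few lines. This is an illustrative toy of time-multiplexing, not anyone's actual design; the speed ratio and threshold are assumptions:

```python
# Illustrative sketch of time-multiplexing: a single fast unit updates many
# emulated "neurons" within one biological timestep. Circuits switch in ~ns,
# neurons update in ~ms, so one fast unit can service thousands of slow ones.

SPEED_RATIO = 1_000_000  # rough ns-circuit vs ms-neuron gap (assumed figure)

def multiplexed_update(states, weights, inputs):
    """Update every emulated neuron once, reusing one fast unit in a loop."""
    new_states = []
    for i in range(len(states)):
        # weighted sum of the shared input vector for emulated neuron i
        total = sum(w * x for w, x in zip(weights[i], inputs))
        new_states.append(1 if total > 0.5 else 0)  # simple threshold "spike"
    return new_states
```

Because the loop body runs about SPEED_RATIO times faster than a biological update, all the emulated updates fit inside a single biological timestep, which is the "dynamic complexity" point: the same structure, cycled faster, does more work per unit time.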

I also want to touch briefly now on what your theory misses - neurochemistry. A neuron does not just transmit electrical signals; it also responds to chemicals in its environment. More on brain structure and neurochemistry later.
My model doesn't miss this at all. Neurochemistry is a bias input to the neuron. You can do the same thing with a single transistor. Clearly the different biases to neuron activity are more complex, so you'd need a number of transistors.
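The "bias input" idea can be illustrated with a toy threshold unit (my sketch, not Pixy's model; the names are invented): a slowly varying chemical signal enters the sum exactly like a synaptic input, shifting when the unit fires:

```python
def neuron_output(inputs, weights, chem_bias):
    """Toy threshold unit. `chem_bias` stands in for slow neurochemical
    modulation (hormones, blood sugar) shifting the unit's excitability."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + chem_bias
    return 1 if activation > 0 else 0

# Same synaptic input, different chemical state, different behaviour:
assert neuron_output([1], [0.4], chem_bias=0.0) == 1   # fires
assert neuron_output([1], [0.4], chem_bias=-0.5) == 0  # silenced
```

The chemistry changes the behaviour without changing the wiring, which is the sense in which a bias term can model it.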

Now you claim you gave a number, but you did not, still relying on Leumas' mistaken wording to justify your lack of response.
Completely untrue. In fact, I've already addressed his point about neuron vs. transistor complexity, and continue to do so throughout the discussion.

Now this is where I come in, to point out how you use the same argument you say is wrong to defend your theory.
And again you are wrong.

On one hand you claim this below.
I point out that what you're saying above defines what a brain does by how the brain does it: "processing and storing coded nerve pulses and returning new coded nerve pulses".

What I said was a functional description. Your assertion is entirely false.

And don't come up with the lame excuse about the level of abstraction, that is metaphysics.
No, it's science.

We keep pointing out to you that we do not know how the brain does consciousness
Your assertion, unsupported.

but somehow you manage to get away with defining what a brain does (consciousness) by how the brain does it, SRIP or the above.
Which is of course a perfectly valid approach.

This is the problem with your SRIP definition of consciousness which I have pointed out on numerous occasions, it is circular.
Nope.

Now returning to neurochemistry - you are wrong about this.
Already addressed.

I further enlighten you on your little game of ignoring Leumas' intention and emphasize the point he is trying to make by highlighting the complexity of a neuron.
Already addressed.

So now you get defensive about your obvious tactic of ignoring Leumas' intention.
No, your assertions there were simply incorrect.

You do highlight a point below which I often use to explain to those ignorant of genetics: although the DNA is the same in every cell, each cell certainly does not look or function the same. Take cancer cells, for instance, which we cannot always trace to faulty DNA. There is more to biology than DNA, correct. However, this does not address the DNA code required in a neuron to make it a neuron. So add this to your silly equation of complexity instead of playing games.

Sorry, the problem is entirely yours. Complexity is not a vague notion. There are different measures, but each is well-defined.

There are relevant measures of complexity, those that are minimally required for function, and there are irrelevant degrees of complexity. The information in the DNA of a neuron is not relevant to its functional complexity.

When I point out your ignorance of biology below you still fail to recognize your shortcoming.
I'm not ignorant of this; you're simply wrong.

I try to get you to see the arbitrary and biased selection process you are using for your exercise in measuring complexity. All you end up doing is stubbornly sticking to your simplistic theory of what the brain does at the neuron level, and this, as I have pointed out, is even simplistically wrong.
It's not arbitrary and there's no bias involved; you're attempting to introduce features that are clearly not relevant.

You slipped up a bit below and talked about quality, which you claim does not exist, so in this context would you mind answering my question below? I am very curious why you suddenly use this word.
Sorry, but you are entirely wrong yet again.

I have explicitly and repeatedly pointed out that self-referential information processing systems have access to an entire class of behaviours that are not available to information processing systems that are not self-referential. They are qualitatively different.

Once you introduce self-reference, though, there are no new structures left to introduce. Beyond that the differences are strictly quantitative.
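The qualitative difference being claimed here can be caricatured in a few lines (a deliberately tiny sketch; the class names and rules are mine, not anyone's model): a purely reactive system maps inputs to outputs, while a self-referential one also reads its own record and rewrites its own rule, a class of behaviour the first simply has no access to:

```python
class Reactive:
    """Maps input to output; has no access to its own state or rule."""
    def step(self, x):
        return x * 2

class SelfReferential:
    """Also inspects its own past behaviour and rewrites its own rule."""
    def __init__(self):
        self.rule = lambda x: x * 2
        self.history = []

    def step(self, x):
        y = self.rule(x)
        self.history.append((x, y))
        if len(self.history) > 2:        # self-reference: examine own record...
            self.rule = lambda x: x + 1  # ...and modify own processing
        return y

# The reactive system always answers 6 to input 3; the self-referential one
# changes its own behaviour after observing its own history.
```

Fed the constant input 3, Reactive produces 6 forever, while SelfReferential produces 6, 6, 6, then 4 once it has rewritten itself: same inputs, a behaviour unreachable without access to its own state.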

So I point out the implications of this mistake and why it does not lead to any better understanding of consciousness.
The mistake, again, is yours.

I would also add now that a brain is not just neurons; it is blood vessels and connective tissue too. These perform an essential function in the brain's activity, and ignoring them in any model of the brain, such as the computational model, is just silly.

Sorry, but no. You need a power supply and wires to run a computer, but it's not the power supply or the wires that perform the computation. They are the first things to be abstracted out when constructing a model.

This is where it gets interesting, since you claim to have a functional definition of consciousness, but when any brain function is raised, such as the biochemistry inside a neuron, you claim it's irrelevant because it is not on your arbitrary level of description.

Entirely incorrect.

Biochemistry inside a neuron only propagates in the form of neural signalling. Thus it can and should be abstracted out of any higher-level model of brain function, without affecting the accuracy of the model at all.

No it is not dualism because we are remaining within consciousness when we describe it. You want to somehow look at consciousness from an abstract distance. It is impossible.
That's dualism.


This is simply false for the reasons I mentioned above

The problem is, the reasons you mentioned above are all false.

Pixy Misa, your abstract idea that the rules are what give the meaning does not apply to neurons, I see.
Of course it does. Your statement is still incorrect, or at best, very poorly worded.

Your claim that semantics is syntax

Correct.

Is your SRIP different when your blood sugar is low?
Certainly. Blood sugar is a bias input to the circuitry of the brain.

You have said
This is your "how the brain does what it does" definition of the brain again.

It's a functional description of the brain. It's an accurate functional description of the brain.

You may deny it but it would be the same semantic argument you use to argue that your consciousness definition is not circular.
My definition of consciousness rests on well-defined terms that do not in turn refer to consciousness in any way. It is not, and cannot be, circular.

Whatever valid objections there may be to my definition, this will never be one of them.

And no, it is not a matter of the level of abstraction, that is metaphysics.
Science.

One way or another you are arbitrarily choosing the level of importance, and as I have established, this is based on a hopelessly incomplete theory of consciousness.
As already noted, it's not arbitrary, and doesn't rest on any theory of consciousness. It's physics.

I see you start the apologetics for your metaphysics below.
Nope.

So now that you're backpedaling, I highlight the conflict between your arbitrary levels of importance.
I've already shown that every part of that assertion is untrue. I do not backpedal, there is no conflict, and the distinction is not arbitrary.

I thought you said


So the brain works by the functionality of the neurons but the neuron does not work by the functionality of its cell components.

That is entirely incorrect in every possible way.

You can replace a neuron with an electronic substitute that produces the same results but operates entirely differently. Or indeed with a computationally simulated neuron. You can do this with one neuron, or with all the neurons in the brain, until there's nothing of the original left, without changing the function of the brain at all.

An existing biological neuron is dependent on its internal chemistry, but that is not necessary to replicate its function. An existing biological brain is dependent on its neural activity, but that's not necessary to replicate its function.
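The substitution argument above looks like this in code (my illustration; the class names are invented): a network that sees only the `fire()` interface cannot distinguish implementations, so units can be swapped one by one without changing the network's behaviour:

```python
class BiologicalNeuron:
    """Internals stand in for membrane chemistry reaching threshold."""
    def fire(self, total_input):
        return total_input >= 1.0

class ElectronicNeuron:
    """Entirely different mechanism; identical input/output mapping."""
    def fire(self, total_input):
        return not (total_input < 1.0)

def network_step(neurons, inputs):
    """The network depends only on each unit's function, not its substrate."""
    return [n.fire(x) for n, x in zip(neurons, inputs)]

# Swapping every BiologicalNeuron for an ElectronicNeuron leaves every
# network_step output unchanged -- function, not substrate, carries the model.
```

This is only the functionalist claim made concrete, not a demonstration that real neurons are this simple; the point is the interface, not the two-line internals.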

Yes, it is convenient for your SRIP if the brain consists only of neurons receiving nerve pulse inputs and giving nerve pulse outputs, but it is not true.
Nor is it required.

Now the level of abstraction fallacy is used without any shame
That's not a fallacy. Sorry.

 
Guess I missed all the excitement.

Gotta say !Kaggen…that was an epic post. Most informative.

A brief aside re: the number of ‘operational cells’ (in the neural sense). Pixy referred to the brain having 100B neurons (a somewhat metaphorical measure of operational complexity). Up until recently it was believed that neurons did the real brain work….but it has now been all but conclusively established that what are referred to as glial cells (which apparently make up 90% of brain cells…as compared to the 10% comprised of neurons) also play a fundamental (though still little understood) role in brain ‘work’.

http://www.ottawacitizen.com/techno...ential+change+neuroscience/6267343/story.html

and here (if you can get past the subscription requirements): http://www.cell.com/abstract/S0092-8674(12)00142-0

....so it would seem Pixy's metaphorical measure of operational complexity was slightly conservative (by multiple orders of magnitude...especially if we include the operational complexity of the cells themselves).
I think you are wrong on at least three levels: the function of astrocytes, their contribution to brain computation, and Pixy's assumptions about our current level of knowledge regarding brain computation. To sum them all up, that's a very interesting but not altogether surprising result to come from the brain's support architecture, which doesn't really matter because I don't believe Pixy ever claimed to know all the answers to begin with.

For a stronger argument, look up some papers about microglia taking active roles in synaptic homeostasis. Little bastards just bite spines off sometimes.
 
PixyMisa said:
Wondering how, if the computational encodings only use/borrow representational content that is already available, this computational "consciousness" deals with what would have to be undefined/unspecified: to be exact, external, "new" representational content?

What am I missing?
You're missing that neither our senses nor our brain works like that. Our senses reduce everything to coded nerve pulses. All the brain does is process and store coded nerve pulses and return new coded nerve pulses.

When something new arrives, it is new only in that that precise pattern of pulses hasn't been previously encountered. It can still be classified and correlated by its similarities to other patterns.
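The classify-by-similarity step can be sketched as nearest-neighbour matching over pulse patterns (my toy version; Hamming distance stands in for whatever similarity measure the brain actually uses):

```python
def hamming(a, b):
    """Count positions where two pulse patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def classify(pattern, memory):
    """Label a pattern by its closest stored pattern.
    memory: dict mapping label -> stored pulse pattern (tuple of 0/1)."""
    return min(memory, key=lambda label: hamming(pattern, memory[label]))

memory = {"edge": (1, 1, 0, 0), "blob": (0, 0, 1, 1)}
# A never-before-seen pattern still lands near something known:
# (1, 0, 0, 0) differs from "edge" in one position, from "blob" in three.
```

Here a pattern never encountered before is still assigned a label by proximity to stored patterns, which is the sense in which "new" input can be classified and correlated rather than left undefined.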


With no possibility of error? Or should I say with no way to explain how the classification or correlation could possibly be in error?

If any part of this classification and correlation by virtue of its similarities to other patterns is deficient or absent, then this classification and correlation through similarities to other patterns, likewise, would be deficient or absent, i.e. the correspondence would not exist.


Sorry to have to point this out.
 
Guess I missed all the excitement.

Gotta say !Kaggen…that was an epic post. Most informative.
Epically wrong, as we've seen, but epic in its own way.

A brief aside re: the number of ‘operational cells’ (in the neural sense). Pixy referred to the brain having 100B neurons (a somewhat metaphorical measure of operational complexity). Up until recently it was believed that neurons did the real brain work….but it has now been all but conclusively established that what are referred to as glial cells (which apparently make up 90% of brain cells…as compared to the 10% comprised of neurons) also play a fundamental (though still little understood) role in brain ‘work’.
Actually, the fact that some glial cells contribute to brain function is neither new nor surprising. Most glial cells, though, are purely structural.

....so it would seem Pixy's metaphorical measure of operational complexity
It is not metaphorical.

was slightly conservative (by multiple orders of magnitude...especially if we include the operational complexity of the cells themselves).
Not even remotely true. In my comparison I accounted not only for the number of neurons, but also for the number of synapses.

There are between 10,000 and 100,000 times as many transistors in the internet as there are synapses in the human brain; between 100 million and 1 billion transistors for each neuron. A neuron is more complex than a transistor, but the function of a synapse is quite similar. (Edit: And again, transistors operate several orders of magnitude faster than neurons, so the dynamic complexity of the internet is several additional orders of magnitude higher than that of the human brain.)

Even adding in all the glial cells only shifts the comparison by 1%.

In short: The internet is structurally four or five orders of magnitude more complex than the human brain, and dynamically nine to twelve orders of magnitude more complex (there's more leeway in the dynamic model for differences of opinion). There's simply no contest: We're second best.
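The exponents in this comparison can be checked with back-of-envelope arithmetic (all figures are the rough round numbers from the post, not measurements; only the orders of magnitude carry the argument):

```python
# Rough figures from the post; the exponents, not the exact values, matter.
neurons_in_brain = 1e11        # "100B neurons"
synapses_in_brain = 1e15       # upper end of the usual 10^14-10^15 estimate
transistor_ratio = 1e4         # "10,000 to 100,000 times as many transistors"

internet_transistors = synapses_in_brain * transistor_ratio   # ~10^19

transistors_per_neuron = internet_transistors / neurons_in_brain
# ~10^8, matching "between 100 million and 1 billion transistors per neuron"
```

Taking the lower end of the ratio already puts the internet four orders of magnitude ahead structurally; the claimed speed gap then adds the further "dynamic" orders of magnitude on top.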
 
With no possibility of error? Or should I say with no way to explain how the classification or correlation could possibly be in error?
Of course there's a possibility of error. Why do you even ask that question?

If any part of this classification and correlation by virtue of its similarities to other patterns is deficient or absent then this classification and correlation through similarities to other patterns, likewise, would be deficient or absent... ie the correspondence would not exist.
All or nothing fallacy.

Sorry to have to point this out.
That's okay. You're just wrong.
 