
Has consciousness been fully explained?

Status
Not open for further replies.
Of course it does. Code is physical. If it weren't, it wouldn't do anything.

What do you define as "code"?

I agree that there's a physical analog to what we normally think of as code, for precisely the reason you mention here.

But it's certainly not the same thing that goes through your average person's mind when looking at the syntax we use to write code.

That syntax tells a "story", so to speak, which does not describe the actual physical mechanism of the computer.
 
That being said, consciousness is not a word, symbol, or abstraction. It's a concrete reality for which we don't have a scientific understanding. Therefore, merely abstracting operational byproducts and phenomena associated with consciousness is not sufficient to produce consciousness. A proper physical 'definition' of consciousness would not be a "logic gate".


Who said it was?
 

Any particular instance of the code might be the same, but whatever is in common between the idea in the mind of the person writing the code, the markings on paper, the program typed into the computer, the machine code generated by the compiler, and the physical actions of the computer running the program is not physical.

Of course each individual item is physical - but if it's claimed that they are all instances of the same thing, then that thing is clearly not physical.
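The point above can be sketched in code itself. This is a minimal illustration (the function names and table are my own): one abstract truth table, and two realizations that share nothing but their behaviour.

```python
# The abstract specification: a NAND truth table.
NAND = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def nand_arithmetic(a, b):
    """One realization: integer arithmetic."""
    return 1 - a * b

def nand_logical(a, b):
    """A different realization: Boolean operators."""
    return int(not (a and b))

# Both instances agree with the abstract table on every input,
# yet the two implementations share no mechanism, only behaviour.
for inputs, expected in NAND.items():
    assert nand_arithmetic(*inputs) == expected
    assert nand_logical(*inputs) == expected
```

Whatever the two functions "have in common" is the truth table, which is an abstraction, not either piece of mechanism.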
 
I'm sorry, but that is not even a worthwhile issue to bring up in such a discussion. We all know Wittgenstein. Definitions are word games; all definitions share family resemblances. That tells us nothing about the nature of the world but only about how we use language and define words. It will not help you.


The proper physical 'definition' of a logic gate is a logic gate. Look at what it does. That is its definition. Words are words.

Funnily enough, physicists manage to get along well enough without constant reference to Wittgenstein.

The issue is not whether a logic gate can be defined. It's whether a logic gate and a neuron can have a definition that covers them both. That's the trick which hasn't quite been managed yet.
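For what it's worth, the closest thing on offer to a definition covering both is the McCulloch-Pitts-style threshold unit. This is a hedged sketch, not a claim that it settles the issue: the unit is only very loosely modelled on a neuron, and the weights below are chosen by hand for illustration.

```python
def threshold_unit(inputs, weights, bias):
    """Fire (1) iff the weighted sum of inputs exceeds the bias.

    This is the McCulloch-Pitts abstraction: a caricature that a
    logic gate satisfies exactly and a neuron only approximately.
    """
    return int(sum(w * x for w, x in zip(weights, inputs)) > bias)

def nand(a, b):
    # Weights (-1, -1) and bias -2: the unit fires unless both inputs are 1.
    return threshold_unit((a, b), (-1, -1), -2)
```

With those weights the unit reproduces NAND exactly; whether the same abstraction genuinely "covers" a biological neuron is the open question in this thread.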
 


Funnily enough, what you are now asking for has nothing to do with what you earlier said we needed -- an unambiguous physical definition of a logic gate. For that issue Wittgenstein is important.


Why do you think it is difficult to arrive at a definition that covers both logic gates and neurons? If the definition is not unambiguous it only shows the nature of definitions.
 

It's possible to define "logic gate" in terms that just encompass doped silicon, but that's not a particularly useful definition.

This is relevant because there's a claim, implicit in computationalism, that the operation of a logic gate, whatever its basis of operation, will produce, if properly organised, the phenomenon of consciousness.

Either the concept of a "logic gate" has to be physically defined, or else it has to be accepted that consciousness is not a physical phenomenon.

There's no need for a complete, fundamental understanding. We have sufficient physical definitions of electricity and magnetism that we can form physical theories about them. It doesn't mean that we have total knowledge of what they consist of. However, in order to say that if we rotate conducting wires in a magnetic field we produce electric current, we need all these concepts to be sufficiently precisely defined so that we objectively know what all our terms actually mean. This isn't dependent on what we are using the device for.

When we are talking about logic gates, we need a definition that ensures that we know when something is a logic gate, and when it isn't.


I want a definition that's as solid as that for any physical object or quantity. I want one that's sufficiently objective that the rule can be applied independently of what answer people want to get.
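A definition of the kind being asked for would be purely behavioural, applied the same way no matter what the device is made of or what answer anyone wants. A minimal sketch of such a rule (the names are my own):

```python
from itertools import product

# A gate's "definition" as a truth table.
NAND_TABLE = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def implements_gate(device, truth_table, n_inputs=2):
    """Return True iff the black-box device reproduces the truth table
    on every possible input. Nothing about the device's substrate is
    consulted, only its observed behaviour."""
    return all(device(*bits) == truth_table[bits]
               for bits in product((0, 1), repeat=n_inputs))

# Judged purely by behaviour:
assert implements_gate(lambda a, b: int(not (a and b)), NAND_TABLE)  # is a NAND
assert not implements_gate(lambda a, b: a and b, NAND_TABLE)         # is not
```

Whether a rule like this can be applied to a neuron, whose inputs and outputs are not cleanly binary, is exactly the sticking point.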
 
My two objections are these: asking for an unambiguous definition is an impossible standard for most 'things' (that is just the nature of definitions); and suggesting that a single logic gate equals a single neuron is not a requirement (nor is it remotely likely).
 

The standard I'm looking for is not some unattainable perfection. I just want the same standards used in defining other physical quantities. Physical theories are always formed in terms of physical quantities.
 
A bridge is a physical object. However, I don't think it would be possible to frame a physical theory using words like "bridge" without serious redefinition.


OK, so what physical objects do you have in mind that do have good physical definitions?


ETA:
Are you talkin' 'bout water = H2O?

That sort of definition comes at the end of an analysis, not the beginning. The definition would be the program controlling the logic gates, the one that has already been shown to act just like a neuron; so why would we care about that kind of definition except to examine it?
 

Ask westprog about crystallization -- "is a 'crystal' something that has a good physical definition?"
 
That's ok.

... snip ...

I doubt I'll post tomorrow, but I'll try to chime back in over the weekend, and maybe we can explore some common ground, and perhaps that will help illuminate what our real differences are.

This might have to wait for the new year, actually. I need to work 12 hour days just to get my work completed before I go on vacation in 2 weeks, so there isn't much room for writing lengthy posts. After Xmas, though, there will be time.
 

Ok, cool. I'm in almost the same boat.

I've been thinking about the commonalities b/t the computationalists and the physicalists -- will have to leave out the organicists, like Al Bell, who deem all non-organic machines incapable of consciousness (for some reason) -- and trying to work up a set of premises.

More to come....
 
A bridge is a physical object.

Yeah, but it's kinda slippery to move from there to some kind of universal definition, b/c a bridge isn't defined by its physical attributes, but rather by what it does &/or an agreement that it fits in that category even when not performing that function.
 
We are beginning to decode neural firings. There is the cochlear implant, and the first artificial eye implant was created in the late 1970s. In 2000, researchers at Berkeley managed to decode neural firings in a cat's lateral geniculate nucleus and translate them into images on a computer screen. It may only be a matter of time before we can figure out the neural basis for consciousness.
 

'Decoding' optical sensory neurons is impressive and all, but that's extremely simple in comparison to substantial brain activity like perception, cognition, consciousness and memory. What you're describing is the sensory level, which is how information from the outside gets sent to the brain. How the brain processes that information is more difficult.

I'd like to think we can figure out how the brain works in great detail, but I don't think it's something we'll be able to do any time soon.
 
Operative word.

This isn't some far out assumption. The foundation of the technology is already there. We pretty much know that it is possible.


The researchers at Berkeley didn't decode neural firings in the cat's retina; they decoded firings in its lateral geniculate nucleus, which processes information received from the eyes. So they were already at the place where sensory information becomes perception, and this was ten years ago.
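The general idea of that kind of decoding can be sketched in a deliberately tiny toy. This is not the Berkeley method (which fitted linear filters to recordings from many cells); it is a single simulated cell whose firing rate is assumed, for illustration, to be a linear function of stimulus intensity, plus a "decoder" that simply inverts that map. All parameters here are invented.

```python
# Assumed response parameters for the toy cell (not measured values).
GAIN, BASELINE = 4.0, 2.0

def encode(intensity):
    """Encoding model: firing rate as a linear function of the stimulus."""
    return GAIN * intensity + BASELINE

def decode(rate):
    """Decoder: invert the assumed encoding model to recover the stimulus."""
    return (rate - BASELINE) / GAIN

# With a known encoding model, the stimulus is recoverable from the rate.
for stimulus in (0.0, 0.5, 1.0):
    assert abs(decode(encode(stimulus)) - stimulus) < 1e-9
```

The hard part in practice is that the real encoding model is unknown, noisy, and spread across thousands of cells; the toy only shows why an invertible model makes decoding possible at all.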
 
It may only be a matter of time before we can figure out the neural basis for consciousness.

There's no reason to believe that it will prove impossible to figure out how consciousness is done.

But so what?

That (perfectly rational) presumption doesn't help us figure it out any faster.
 