Robot consciousness

You don't see a difference between being consciously aware of something and being conscious? Perhaps if stuff falls below your threshold, some algorithm or other decides not to bother the explicitly consciously aware subsystems with it? Does this tell us that the clock tick of consciousness, if there is such a thing, is the same as the threshold at which visual input gets ignored?

Well, it's difficult to imagine being conscious of nothing and still being conscious, actually. But perhaps that's not what you meant.

Anyway, yes, it does seem that subliminal input -- whether we're talking about very short events or subtle odors -- is processed by the brain but is not, or cannot be, used by the systems that generate conscious experience.

I don't think the "refresh rate" of consciousness -- if there turns out to be such a thing (although I think there is) -- is the same as the subliminal threshold. In fact, it appears to be variable, slowing down as we age and speeding up during moments when attention to the external world is extremely important.
 
Any child using words is clearly using symbols, and I wouldn't say that consciousness is generated by manipulating symbols. Consciousness is mainly about learning.

You'll have to explain "habit machine". Clearly we (and children) have more than habitual behaviors.

Maybe not.

It certainly seems to us that we're using symbols when we use words.

But the brain doesn't appear to actually operate that way.

The brain, of course, is predisposed to focus on certain patterns, including human faces and language, for example. A baby's brain begins learning the language(s) in its environment well before it has any clue what the words might mean. (It is not, by contrast, hard wired to learn music theory.)

The brain is looking for patterns, and building associations with those patterns through repeated use.

Over time, words become part of clusters of associations. Any pattern in the cluster triggers the brain to retrieve other associated patterns, although not every pattern in the cluster is equally strongly associated with each of the others, and context (other patterns present) also influences the retrieval process.

And these clusters are not discrete -- they all overlap.

So, for example, various but similar configurations of associations can be called up by hearing the sound "cat", seeing the characters "CAT" written on paper or a screen, hearing a meow, seeing a housecat or a lion, seeing an image of these or a cartoon of them or even a person in a cat or lion suit, or even viewing things associated with cats such as pet toys or brushes or food dishes.
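The cluster-retrieval picture described above can be sketched as a toy model (my illustration; the patterns, weights, and the `retrieve` helper are all made up for the example, not taken from the posts):

```python
# Toy association clusters: each pattern maps to other patterns with a
# strength; retrieval returns strongly associated patterns, and context
# (other patterns present) nudges the scores.
ASSOC = {
    "sound:cat":  {"image:housecat": 0.9, "sound:meow": 0.8, "object:food dish": 0.4},
    "sound:meow": {"image:housecat": 0.9, "sound:cat": 0.7},
    "image:lion": {"image:housecat": 0.6, "text:CAT": 0.5},
}

def retrieve(trigger, context=(), threshold=0.5):
    """Return patterns associated with `trigger` scoring above `threshold`."""
    scores = dict(ASSOC.get(trigger, {}))
    for c in context:
        for pattern, weight in ASSOC.get(c, {}).items():
            # Context influences retrieval without fully determining it.
            scores[pattern] = scores.get(pattern, 0.0) + 0.2 * weight
    return sorted(p for p, s in scores.items() if s >= threshold)

print(retrieve("sound:cat"))  # ['image:housecat', 'sound:meow']
```

Note that the model is "blind to meaning" in exactly the sense described: retrieval is driven entirely by stored association strengths and whatever context happens to be present.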

That's what I mean about the brain being a habit machine. It is blind to meaning. All it cares about is what it's predisposed to attend to and the associations built up by experience.

Even supposedly symbolic manipulations such as multiplying numbers probably do not involve any actual manipulation of symbols, but rather a rapid triggering of associations formed by predisposition and habit.

Ditto for using words.
 
Any physical system that changes can be classified as an information processor if you like.

If you like. That's up to you. If you classify it as an information processor, however, you need to define a set of inputs and outputs.


So it seems that when you say "information processor", you are referring specifically to Turing machines (TMs).

In that case, we need to see some convincing argument that the human brain is -- and is only -- a TM.

The argument is quite simple and quite compelling. Every information processor that we have analyzed, either physically or theoretically, is at most as powerful as a TM. We not only know of no physical object that is more powerful than a TM, but we know of no mathematical formulation that doesn't explicitly violate the laws of physics that is more powerful than a TM.

Therefore, we infer that every information processor is at most as powerful as a TM. If the brain is an information processor, it must therefore be at most as powerful as a TM. In the same way that we infer that no iPhone can exceed the speed of light, by virtue of the fact that no physical object studied can do so -- and an iPhone is a physical object, even if we've not actually tested its hyperspace capacity.
 
The argument is quite simple and quite compelling. Every information processor that we have analyzed, either physically or theoretically, is at most as powerful as a TM. We not only know of no physical object that is more powerful than a TM, but we know of no mathematical formulation that doesn't explicitly violate the laws of physics that is more powerful than a TM.

Therefore, we infer that every information processor is at most as powerful as a TM. If the brain is an information processor, it must therefore be at most as powerful as a TM. In the same way that we infer that no iPhone can exceed the speed of light, by virtue of the fact that no physical object studied can do so -- and an iPhone is a physical object, even if we've not actually tested its hyperspace capacity.

Ok. I'm really trying to understand how this seals the deal for the timing question.

But I'm afraid I can't connect the dots just yet.

After all, a computer can't play a movie or run a laser at any processing speed. But perhaps that's totally irrelevant to the process of generating consciousness.

So my first question is: When you say that the brain can't be more powerful than a TM, you mean that it can't... do what exactly? I want to be sure I understand precisely what you mean.
 
So if we slow down the operating speed dramatically, does the logic of the TM apply? Must we conclude that, because a TM can perform all its operations at any arbitrary speed, the consciousness function of this machine must operate to sustain the phenomenon of conscious experience regardless of operating speed?

Or, would it fail to sustain conscious experience during periods of extremely slow operation?

It seems dubious to me, because the high-level operations which generate conscious experience no longer resemble the operations of a TM.

If we allow that computers can't play movies or run lasers at any arbitrary operating speed (and, for that matter, that pressure washers won't work at any arbitrary pressure), then, given the nature of what's required to perform this task, does the fact regarding a TM and processing speed even matter?
Putting aside differences I see in the operation of consciousness: if it's fully part of the brain's functionality, and the brain's functions are all slowed down in proportion to one another and to the environment being interacted with, then there can't be any change in the experiences recorded, assuming no information about the length of the intervals between experienced events is being recorded.

The sensing of "wall time" has to be slowed down here too, or the experiences will clearly be different.

For your examples to not work, there must be something physical that is not being slowed down along with the information processing. A pressure washer depends on the inertia of water, for example. That means pressure washing isn't a proper analogy for a single-steppable program, as described in the OP.
 
Putting aside differences I see in the operation of consciousness: if it's fully part of the brain's functionality, and the brain's functions are all slowed down in proportion to one another and to the environment being interacted with, then there can't be any change in the experiences recorded, assuming no information about the length of the intervals between experienced events is being recorded.

What do you mean, "recorded"?
 
For your examples to not work, there must be something physical that is not being slowed down along with the information processing. A pressure washer depends on the inertia of water, for example. That means pressure washing isn't a proper analogy for a single-steppable program, as described in the OP.

So what is responsible for consciousness?
 
What do you mean, "recorded"?
The usual meaning: information stored for possible later retrieval. The reported recollection could be directly from working memory. How else do we decide what to label as "conscious experience" other than that sort of information that we can recall?
 
So what is responsible for consciousness?
The conscious-labeled subset of pathways and functions of the brain.

We can see this mathematically, which I think is simpler if we consider the brain to be equivalent to a Finite State Machine instead of a TM. Then the entire state of the brain at time t[i] can be represented by one large number S[i], and the next state at t[i+1] by:

S[i+1] = F(S[i], I[i])

where F is the state transition function (a fixed lookup table), and I[i] is a number representing all inputs at time t[i]. The outputs can be taken from another function of S[i].

Now we can see that, no matter what time increments we have between the t[i], as long as the input values remain in the same sequence, the states (and thus the outputs) must also follow the same sequence.
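The sequence-invariance claim can be illustrated with a toy sketch (my example, not from the thread; the transition rule is arbitrary): stepping the same finite state machine through the same input sequence at two different clock speeds yields the identical state trace.

```python
import time

def transition(state, inp):
    # Fixed lookup: the next state depends only on (state, input), never on
    # the wall-clock time at which the step happens.
    return (state + inp) % 8

def run(inputs, delay):
    """Step the FSM through `inputs`, waiting `delay` seconds between steps."""
    state = 0
    trace = [state]
    for inp in inputs:
        time.sleep(delay)  # vary the interval between t[i] and t[i+1]
        state = transition(state, inp)
        trace.append(state)
    return trace

inputs = [3, 1, 4, 1, 5]
assert run(inputs, delay=0.0) == run(inputs, delay=0.01)  # same trace either way
```

Because F consults only the current state and input, the spacing between steps never enters the computation; anything timing-sensitive would have to smuggle a clock reading into the inputs themselves.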
 
After all, a computer can't play a movie or run a laser at any processing speed.

You still seem to be missing a point here. And I think that if you understand this particular point, you may understand more about information processing in general.

You claim that a computer running a movie at an arbitrarily slow speed is not really "playing a movie" -- presumably because it wouldn't make sense to you while you watched it. But this does not demonstrate an inadequacy in the computer. It only points out a mismatch between the computer's operating speed and your brain's operating speed. It's exactly the same mismatch that would exist if you were to slow down the brain's (any brain's) operations compared to the outside world's speed. But it doesn't follow that your brain is not conscious at that speed, merely that it wouldn't be able to make much sense of the inputs.

They don't "process information". That's a metaphorical abstraction.

You've made the claim repeatedly that information processing is a metaphor. But a metaphor has two parts. What's the other part? I ask because I think you are totally misusing the term "metaphor" and it's leading to fuzzy thinking. I would call "information processing" a description of what the brain does.

When you say that the brain doesn't do information processing, and what it really does is fire neurons and shuttle neurotransmitters around, then you are being deliberately myopic. It's the same as if you said that your body doesn't engage in something called "living", it merely pumps blood and contracts muscles.
 
After all, a computer can't play a movie or run a laser at any processing speed. But perhaps that's totally irrelevant to the process of generating consciousness.

So my first question is: When you say that the brain can't be more powerful than a TM, you mean that it can't... do what exactly? I want to be sure I understand precisely what you mean.

There is no input/output mapping (or stimulus/response pair) that the brain can produce that cannot be computed by an appropriately programmed Turing machine.
 
So are we arguing that the speed threshold of consciousness is affected by how bright the light is, or that our visual systems are affected by this stuff and our consciousness sits on top and has to deal with it?

An event that is too short and filling too little of the visual field will likely be lost in the noise of random firing of the visual receptors. I say likely because the noise is part of the signal, so it has to be passed along. On the way, the neural pathways process the flashes from the receptors, triggering on correlations that represent primitive patterns -- motion, shapes, objects, events -- building up higher-level abstractions. At some point, the processing starts using predictive correlation, where an abstract model of the external event is constructed to help track the evolution of the event and fill in missing detail. The event evolution builds into sequence chains. And the correlation of sequence chains results in a playback/recall mechanism that we associate with memory. The playback loops are used to reinforce and build the predictive correlators used earlier that form the basis of long-term memory. The same predictive feedback loop can also be used to interpret the consequences of actions to evaluate choices.

So, does our consciousness somehow sit on top of all this, observing and directing? I don't think so. The consciousness needs to be an integral part of the feedback loop and nature has a tendency to reutilize existing parts. The same would apply to building a conscious robot; consciousness would be just another pattern in the neural network.

Timing is part of the event recognition. A subliminal flash may trigger some of the primitive pattern recognizers, but since it doesn't form a complete event, those patterns would likely be merged into other events. If we were subjected to flashed images over and over, we would start to recognize them as the flash itself forms its own event pattern in the brain.
 
The usual meaning: information stored for possible later retrieval. The reported recollection could be directly from working memory. How else do we decide what to label as "conscious experience" other than that sort of information that we can recall?

Conscious experience is not the same as recollection.
 
Conscious experience is not the same as recollection.
True (one is a verb), but they're related: how else do we decide what to label as "conscious experience" other than that sort of information that we can recall?
 
The conscious-labeled subset of pathways and functions of the brain.

We can see this mathematically, which I think is simpler if we consider the brain to be equivalent to a Finite State Machine instead of a TM. Then the entire state of the brain at time t[i] can be represented by one large number S[i], and the next state at t[i+1] by:

S[i+1] = F(S[i], I[i])

where F is the state transition function (a fixed lookup table), and I[i] is a number representing all inputs at time t[i]. The outputs can be taken from another function of S[i].

Now we can see that, no matter what time increments we have between the t[i], as long as the input values remain in the same sequence, the states (and thus the outputs) must also follow the same sequence.


Are you sure?

Let's do the same thing for my desktop machine here. We'll take the same inputs and run the machine at normal operating speed, and then again at an extremely slow operating speed.

If the task is purely logical -- say, adding numbers -- then I expect we'll get the same outputs.

But what if the task is to play a DVD? Will we then get the same outputs?

Obviously not, because in that case the time increments make a difference.

This scenario begs the question because it is based on the assumption that no high-level physical tasks that feed back into the system are involved.

Once you have that kind of setup, then everything changes.

What's clear is that the task "generate and maintain conscious experience" is quite dissimilar to the task "respond reflexively to stimulus".

Again, the real-world instantiation of consciousness is not a calculation. It is not an execution of logic. Rather, it is a performance, a physical process, and one which feeds back into the system and alters it.

The representation you propose appears too simple to capture the actual dynamics.
 
True (one is a verb), but they're related: how else do we decide what to label as "conscious experience" other than that sort of information that we can recall?

It's not a matter of what we choose to label.

Recollection is a process that occurs independently of conscious/unconscious processing.

Recollection is used by the brain all the time, for both conscious and unconscious processes.
 
It's not a matter of what we choose to label.

Recollection is a process that occurs independently of conscious/unconscious processing.

Recollection is used by the brain all the time, for both conscious and unconscious processes.
Again, true, but that's still not answering my question. Can you give me an example of something in particular that you label as an experience, but without recalling it? I don't limit "recall" to long-term storage, but include even working storage, such as recalling the sensation you had a split-second ago.
 
We could use the Pixxy definition...

Consciousness: Something a Pixxy has that makes pixxies special.


Of course, pixxies are special because they have DVDs spinning in their head and laser beams protruding from their eyes. But that definition doesn't help us understand what consciousness is or help us recognize consciousness in anything other than a pixxy.
 
