
Robot consciousness

The problem is in where to draw the line between real-time and stored. Is information held in the global workspace stored or not? And how many milliseconds have to pass before it's not real-time? These are judgement calls, and it's hard to make arguments one way or the other based on them.

That's not difficult at all.

In the cocktail party effect, the voices in the room are real-time, while the associations that make your name jump out at you are stored.

To take a striking example of what happens when the two sets are disconnected, consider the cases of certain brain-damaged patients who have lost the ability to recognize faces. They get the real-time data, but the associations are never connected with it -- even though the memories are there -- so the brain never formats the experience of seeing the face in a way that makes it recognizable.

As a result, a man might have to have his wife wear a ribbon in her hair at parties so that he will know who she is.

The brain sees the face. The brain has the memories of his life with his wife. But the brain cannot connect the real-time input with the memories. As a result, seeing his wife's face does not result in his brain being consciously aware of "this is my wife's face".
 
Regarding the hypothesis that information processing generates consciousness....

Suppose I were to claim that the replication of DNA was not a function in any way distinct from the general operation of the cell. Rather, it simply arises as a result of overall "cell functioning".

Therefore, we don't need to concern ourselves with any thought about specialized functions or structures designed for that purpose. If we built an artificial cell that did everything else a cell does, then if it were robust enough it would automatically perform that function.

If I said something like that, others would be right to point out that the claim is not only absurd on its face, but also directly contradicted by evidence.

The same is true for the claim that consciousness somehow (inexplicably) arises from "information processing" generally, and that we can theorize about it without any reference to any specialized functions designed to perform the task.

The claim is both absurd on its face and contradicted by everything we know about the brain.
 
Since there has never been computer-generated consciousness, we have no examples of that.
I was referring to researching the consciousness we do have examples of-- in brains.

As for what consciousness is, I've already offered a working definition.

Take the subliminal experiments -- those define a boundary between conscious experience on the one hand and input that is processed by the brain but never experienced consciously on the other.

There's also the cocktail party effect. Streams of nearby (but un-attended-to) speech are processed by the brain but not experienced consciously, until the string of sounds that is your name is part of the input, at which point it triggers enough strong associations that the brain feeds it into conscious awareness and it "jumps out at you".

People who view the "gorilla on the basketball court" film and don't report seeing the gorilla have not been consciously aware of it.

On the other hand, when we look at a wallpaper pattern, we are consciously aware of seeing much more than our eyes actually perceive -- the rest is filled in by schema. We can be consciously aware (and usually are) of things which we're not actually perceiving.
Ok, these are descriptions of some aspects of consciousness, all of which could be captured in a program. If this was all there was to it, we'd call that program conscious. I'm not sure where you're going with this.
 
rocketdodger:
What you seem to want to say is that if you slow some particles, and not others, that things might fail. Well, obviously, and nobody has said anything to the contrary!
piggy:
Oh, but they have. Repeatedly. Because we're not talking about putting Joe on a rocket ship.

Piggy, you're either being disingenuous or you fail reading comprehension. I know I've stated in at least 3 separate posts exactly the point that rocketdodger is making, and you've failed to acknowledge it. Once right here on this page:

Piggy, no one here--as far as I can tell--is disputing that a human brain has a minimal functional speed (Piggy's Constant?) to produce consciousness.

To be clear, I am disputing two points:

1. that the minimal speed is an inherent limitation of any system that produces consciousness.

2. that consciousness is some process distinct from information processing.

The burden of proof is on you to show evidence for either of these conjectures, and I for one am not convinced by the study you cite and your attempts to wrangle a handful of others into the mix. In the edifice of scientific research on consciousness, your claims are the extraordinary ones.
 
There's nothing fringe about this. But of course, if we want to put it into sharp relief we have to probe the edges.

As for blindsight, your description does not make sense. What do you mean by "failure to recognize"? If you're referring to the obstacle course experiment, you must simply mean that the brain is using information which is not made available to consciousness -- which demonstrates that conscious awareness is qualitatively distinct from non-conscious processing.
Correction: failure to be aware of the visual information but still able to unconsciously make use of it. The information doesn't make it into memory where it could be recalled as a conscious experience. The failure isn't in the recall process itself.

I basically agree about conscious awareness: it selects among all potentially-accessible information for what is to be stored, i.e., what is a candidate for later recall as conscious experience.
 
Of course I have.
Conscious experience is clearly formed from a blend of (slightly delayed) real-time input from our internal and external senses associated with stored patterns (memory, schema).

Take a look again at the experiments done with the brain probes which clearly show specialized brain activity coordinated across the brain for events that are consciously experienced but which is absent for events that are not consciously experienced, even though we know from other experiments that the latter (subliminal) events are processed by the brain, recorded into memory, and used for subsequent decision-making.

Consciousness is a function, a behavior, with its own dedicated physiological processes that differentiate it from other types of processing.
We keep returning to your apparent unfamiliarity with what can be done by programs. Everything you've described above can be, and that program can be single-stepped without dropping or distorting any information. It's a technique commonly used for debugging.

The slightly-delayed real-time input would be implemented by storing that input in an array and reading out earlier-stored values. When the program is single-stepped, the reads and writes will take place one at a time, but will all still take place, creating the exact same delay relative to other internal events. In other words, if a delay of 600 clock cycles is needed, then the value will be read out of the array 600 clock cycles after it was stored, no matter what the spacing of those clock cycles is in wall time.
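To make that concrete, here is a minimal sketch of such a delay line (in Python, with illustrative names): a value written at one step is read back a fixed number of steps later, and the delay is counted in steps rather than wall-clock time, so single-stepping leaves it unchanged.

```python
class DelayLine:
    """A fixed delay measured in program steps, not wall-clock time.

    A value written at step t is read back at step t + delay, no matter
    how slowly the steps themselves are executed."""

    def __init__(self, delay):
        self.delay = delay
        self.buffer = [None] * delay  # circular buffer of pending values
        self.step = 0

    def tick(self, value):
        slot = self.step % self.delay
        delayed = self.buffer[slot]   # the value stored `delay` steps ago
        self.buffer[slot] = value
        self.step += 1
        return delayed

line = DelayLine(delay=3)
outputs = [line.tick(v) for v in range(6)]
# Each input reappears exactly 3 steps later: [None, None, None, 0, 1, 2]
```

Whether the six ticks happen in a microsecond or over six days, the read-outs line up with the writes in exactly the same way.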
 
Ok, these are descriptions of some aspects of consciousness, all of which could be captured in a program. If this was all there was to it, we'd call that program conscious. I'm not sure where you're going with this.

Ditto.
 
Just a thought, but there seem to me to be two ways of stating this speed limit problem:

1. Every information processing system that has ever been produced has physical limitations that prevent its operation once it is slowed down beyond a certain limit.


False.

Just about everyone has a three way switch somewhere in their home where a single light is controlled by two or more switches. What is the lower operating limit of those switches?
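As a toy illustration of why such a circuit has no lower operating limit: the light's state is a pure function of the two switch positions (equivalent to XOR, up to wiring convention), and nothing in it decays between flips. A hypothetical sketch:

```python
def light_on(switch_a, switch_b):
    # In a three-way circuit, flipping either switch toggles the light,
    # so the light state is the XOR of the two switch positions
    # (or its inverse, depending on how the travelers are wired).
    return switch_a != switch_b

# The result is the same whether the switches are flipped once a
# millisecond or once a decade -- there is no minimum operating speed.
```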
 
That's not difficult at all.

In the cocktail party effect, the voices in the room are real-time, while the associations that make your name jump out at you are stored.
But "real-time" comes down to a judgement call. I'm including working memory when I talk about recall and you appear not to be. That's clearly going to change the effect of our arguments, so it's best to leave out these non-clear-cut cases.
 
To be clear, I am disputing two points:

1. that the minimal speed is an inherent limitation of any system that produces consciousness.

2. that consciousness is some process distinct from information processing.

The burden of proof is on you to show evidence for either of these conjectures, and I for one am not convinced by the study you cite and your attempts to wrangle a handful of others into the mix. In the edifice of scientific research on consciousness, your claims are the extraordinary ones.

Are you kidding me?

What more evidence can I possibly provide?

Regarding #2, I'm not saying that you can't put the IP label on the process of generating conscious experience. Obviously, you can. But so what?

I can put the "cell functioning" label on the process of DNA replication, along with everything else a cell does. That doesn't mean it isn't a distinct function with specialized processes and structures to handle it, or that, if we want to discuss the construction of an artificial cell and the effect that tweaks to it will have on DNA replication, we can somehow ignore those processes and structures.

Sure, you can call the generation of conscious awareness "information processing". And in doing so, you've accomplished zip.

The fact is, all evidence -- which I've already cited, but I can't tell that you've read -- demonstrates that the function of generating consciousness relies on a particular set of physiological functions, and that it is distinct from (but of course connected with) non-conscious processes.

Which brings us to 1.

The most recent research shows that the processing of input which is made available to conscious experience, in contrast with the processing of input which is not, is marked by a sustained and coherent brain-wide coordination of pre-processed data in real time.

Given that, together with the cinematic model of conscious awareness supported by numerous studies (on subliminal thresholds, blindsight, simultaneity thresholds, etc.), it is reasonable to conclude that spacing the brain's neuronal firings too widely in time will result in a loss of cohesion that makes the ignition and maintenance of the processes necessary for conscious awareness unsustainable.

It is roughly analogous to the failure of a computer to play a DVD if the operating speed is slowed down too much.
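To make the analogy concrete, here is a toy model (with made-up numbers) of that real-time constraint: playback fails when the wall-clock work per frame exceeds the frame budget. The failure comes from a coupling to external time, which pure step-counting does not capture.

```python
def playback_ok(frame_interval_ms, step_time_ms, steps_per_frame):
    # A real-time task succeeds only if the wall-clock time spent per
    # frame fits inside the frame's deadline; slow the steps down and
    # the deadline is missed even though every step still executes.
    return step_time_ms * steps_per_frame <= frame_interval_ms

playback_ok(40, 0.001, 10_000)  # True: fast steps meet the 25 fps deadline
playback_ok(40, 1.0, 10_000)    # False: slowed steps blow the budget
```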

Now, there may be other ways to generate consciousness, but so far, nobody knows of any.

So if you're going to speculate that our hypothetical robot has a brain which is conscious by some means as yet unknown and unimagined, the question of what happens when you slow its operating speed is unanswerable.

Therefore it is ludicrous to attempt to assert that the robot will remain conscious when the operating speed is slowed down drastically.

The best we can say is that if it uses a method analogous to the brain's method for producing consciousness, a drastic slowdown will likely result in insufficient data coherence in real time for consciousness to be maintained, and if it uses some other method then there's no way to even speculate what the effect will be.
 
But "real-time" comes down to a judgement call. I'm including working memory when I talk about recall and you appear not to be. That's clearly going to change the effect of our arguments, so it's best to leave out these non-clear-cut cases.

What in the world are you talking about?
 
Correction: failure to be aware of the visual information but still able to unconsciously make use of it. The information doesn't make it into memory where it could be recalled as a conscious experience. The failure isn't in the recall process itself.

This is a hopeless jumble.

The brain does not store everything it perceives, regardless of whether this data is fed to the functions that produce conscious awareness or not. It is selective in all cases.

And recall is imperfect, whether the brain is recalling information for use in non-conscious decisions, or recalling what it has been consciously aware of.

The sentences above just make no sense.

I basically agree about conscious awareness: it selects among all potentially-accessible information for what is to be stored, i.e., what is a candidate for later recall as conscious experience.

Sorry, but there's no evidence that conscious awareness selects for what is to be stored in memory.

Conscious awareness also does not select for what it's aware of, either in real time or as memory. That's done upstream.
 
We keep returning to your apparent unfamiliarity with what can be done by programs. Everything you've described above can be, and that program can be single-stepped without dropping or distorting any information. It's a technique commonly used for debugging.

The slightly-delayed real-time input would be implemented by storing that input in an array and reading out earlier-stored values. When the program is single-stepped, the reads and writes will take place one at a time, but will all still take place, creating the exact same delay relative to other internal events. In other words, if a delay of 600 clock cycles is needed, then the value will be read out of the array 600 clock cycles after it was stored, no matter what the spacing of those clock cycles is in wall time.

Would you care to discuss how processes analogous to those required for conscious experience can be sustained at very low operating speeds?
 
Pulvinar, I'm sorry I got snappy there. I'm being run from pillar to post lately and I'm tired and just let myself lapse into rudeness. My apologies.
 
This is a hopeless jumble.

The brain does not store everything it perceives, regardless of whether this data is fed to the functions that produce conscious awareness or not. It is selective in all cases.

And recall is imperfect, whether the brain is recalling information for use in non-conscious decisions, or recalling what it has been consciously aware of.

The sentences above just make no sense.
I didn't mean "store everything", but should have said "attentional selection". The point I was trying to make is that the failure is in or before that selection, in not even providing the option for visual sensations to be selected for storage. But we are getting far off the track here.
 
Specifically, the most recent research suggests that it is a "self-sustained reverberant state of coherent activity".

As such, it does not resemble anything like the output of a calculation or logical manipulation, and even though we can describe the low-level activity of the brain in terms of a series of inputs and outputs, and we can observe that the series should not change simply because we lengthen the time interval between these inputs and outputs, we have to recognize that high-level behavior is transparent to such a model.

This article, which seems to be the keystone of your whole argument, only paints a picture of how consciousness happens in a human brain. It makes no claim, nor does it provide warrant for such a claim, that any and all forms of consciousness will have the features thus described. This is the same point we've been making over and over again. But if you want to simply define consciousness as "self-sustained reverberant state of coherent activity" then the whole discussion is moot. I mean, who's to argue with the latest research? Certainly not 50+ years of scientists and philosophers attending to the subject.

On your contention that consciousness is not information processing:
We close this issue with a general consideration, initially put forward by phenomenological philosophers [87]: whenever a subject is conscious, he is necessarily conscious of a given mental content. Consciousness is a transitive or “intentional” process (it is “about” a certain content), and therefore it may be illusory to look for a “pure” form of consciousness independent of its particular contents and of the tasks that it affords.
Feel free to hammer away at the words "may be" in the quote. Or feel free to try and make the case that "mental content" is not information. I seriously and whole-heartedly doubt that even the researchers you cite would defend your interpretation.
 
Would you care to discuss how processes analogous to those required for conscious experience can be sustained at very low operating speeds?
See which do you disagree with:

1) We're given (in Paul's OP) that the conscious process is a running program.

2) A running program is composed of discrete steps.

3) The output of the program could include reports of its conscious experience.

4) As long as the input and step execution order doesn't change, the output of that program doesn't change.

5) We would get the same reports from it of its conscious experience at any non-zero speed (when played back to us at full speed).
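Points 4 and 5 can be sketched directly. In this hypothetical toy program, a sleep between steps stands in for an arbitrary slowdown; the reported output is identical at any non-zero speed.

```python
import time

def run(program_steps, inputs, pause=0.0):
    """Execute the same steps over the same inputs, optionally pausing
    between them; the output sequence never depends on the pacing."""
    state, output = 0, []
    for x in inputs:
        for step in program_steps:
            state = step(state, x)
        output.append(state)   # the program's "report" at this point
        time.sleep(pause)      # arbitrary slowdown between steps
    return output

steps = [lambda s, x: s + x]
fast = run(steps, [1, 2, 3])              # full speed
slow = run(steps, [1, 2, 3], pause=0.01)  # drastically slowed
# fast == slow == [1, 3, 6]
```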
 
