On Consciousness

Is consciousness physical or metaphysical?


Sure. A neuron is a lot more complex than a transistor, having multiple inputs and outputs, while on the other hand a transistor is several orders of magnitude faster. So a transistor can also act as a component in multiple overlaid networks, just via temporal overlays rather than spatial ones.

The human brain has about 10^11 neurons. My desktop PC has about 3 x 10^11 transistors, not counting the SSDs, which would increase it to around 4 x 10^12. Neurons switch at less than 1 kHz; transistors switch at rates on the order of 1 GHz, a million times faster. But a lot of the transistors in a typical computer are purely memory, whereas all neurons have logical function as well, so the comparison is not simple.

Still, the point stands: We can easily build a computer with the storage capacity of the brain; with a little more effort, we can build one with the processing capacity of the brain. We could even build one with the parallelism of the brain and the switching rate of a modern computer if we really wanted to. That would be expensive, though.
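For a rough feel of the numbers, here is a back-of-envelope sketch in Python using the order-of-magnitude figures quoted above (they are guesses, not measurements, and the caveat about memory-only transistors applies in full):

```python
# Back-of-envelope comparison using the rough figures from the post above.
# All numbers are order-of-magnitude guesses, not measurements.
neurons            = 1e11   # human brain, ~10^11 neurons
neuron_rate_hz     = 1e3    # neurons switch at < 1 kHz
transistors        = 3e11   # desktop PC, excluding SSDs
transistor_rate_hz = 1e9    # ~1 GHz switching

brain_switches_per_s = neurons * neuron_rate_hz          # ~1e14
pc_switches_per_s    = transistors * transistor_rate_hz  # ~3e20

print(f"raw switching throughput ratio: {pc_switches_per_s / brain_switches_per_s:.0e}x")
# Prints ~3e+06x, but most PC transistors are memory cells with no logical
# function, so this overstates the PC's useful throughput.
```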

Thank you.

Brainwaves are just electromagnetic noise generated by the switching of large numbers of neurons in phase. You can do that on a computer by simply running a program with a fixed loop. Tune your radio in, and voilà, computer waves.
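As a toy illustration of the "fixed loop" idea, here is a minimal Python sketch (my own, not anything from the thread) that alternates bursts of busy work with idle time at a fixed rate. The periodic swing in CPU activity is the sort of thing that modulates a machine's electromagnetic emissions; whether a nearby radio can actually pick it up depends entirely on the hardware, so treat it purely as a sketch:

```python
import math
import time

def emit_tone(freq_hz=1000.0, seconds=5.0):
    """Alternate busy and idle half-cycles at freq_hz, so overall CPU
    activity (and hence power draw / EM emission) swings at that rate."""
    half_period = 0.5 / freq_hz
    t_end = time.perf_counter() + seconds
    x = 0.0
    while time.perf_counter() < t_end:
        t_burst = time.perf_counter() + half_period
        while time.perf_counter() < t_burst:   # busy half-cycle
            x += math.sin(x) + 1.0
        time.sleep(half_period)                # idle half-cycle (sleep
                                               # granularity is OS-dependent)

if __name__ == "__main__":
    emit_tone()
```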

The reason I asked is that I think the production of such integrated noise may point us towards the sort of programming that's required, i.e. lots of feedback loops running simultaneously and in phase with each other.

Right, that's a better question.

I see no reason why a non-human consciousness needs to respond at the same speed as human consciousness. Respond in real-time with respect to some class of external events - yes, that's a reasonable position.

But almost everything in the Universe is either too fast or too slow for us to notice, so the simple argument would be that we're not conscious either.

I may as well have another drink, then... ;)
 
What is "real time"? (In the context you are using?)

Two things: 1) Fast enough to respond appropriately to significant (i.e. likely to impinge on the individual's survival success) external events occurring in the environment. 2) Fast enough for all the various loops and sub-routines to communicate what they're doing (in sufficient detail) to each other as they're doing it.
 
The reason I asked is that I think the production of such integrated noise may point us towards the sort of programming that's required, i.e. lots of feedback loops running simultaneously and in phase with each other.
Yes, I think that's correct. I'm not sure that the loops strictly need to be in phase in quite the same way as they are in a conscious brain, but it does point the way to one mode of implementation.
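As a minimal sketch of what "lots of feedback loops running in phase" could look like, here is a toy Kuramoto-style model in Python (the model choice and all parameter values are my own assumptions, not anything established in the thread). Each loop nudges its phase toward the others, and with enough coupling they fall into step:

```python
import cmath
import math
import random

N, K, DT, STEPS = 20, 2.0, 0.01, 5000
freqs  = [1.0 + random.gauss(0, 0.1) for _ in range(N)]       # natural frequencies
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]   # random initial phases

for _ in range(STEPS):
    new_phases = []
    for i in range(N):
        # Each "loop" is pulled toward the phase of all the others.
        coupling = sum(math.sin(phases[j] - phases[i]) for j in range(N)) / N
        new_phases.append(phases[i] + DT * (2 * math.pi * freqs[i] + K * coupling))
    phases = new_phases

# Order parameter r: 0 = incoherent loops, 1 = all loops perfectly in phase.
r = abs(sum(cmath.exp(1j * p) for p in phases)) / N
print(f"phase coherence r = {r:.2f}")
```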

Two things: 1) Fast enough to respond appropriately to significant (i.e. likely to impinge on the individual's survival success) external events occurring in the environment. 2) Fast enough for all the various loops and sub-routines to communicate what they're doing (in sufficient detail) to each other as they're doing it.
The second point is particularly interesting - how much bandwidth is required for a human-style integrated consciousness, and how much latency can it tolerate before it disintegrates again? (The answer seems to be 25 milliseconds; if you skew one of the senses vs. the others by that amount, things go seriously awry.)
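As a toy sketch of that 25 ms figure, assuming the question is simply whether events from two senses still fall inside a common binding window (the function and timestamps below are made up for illustration):

```python
BINDING_WINDOW_S = 0.025   # ~25 ms, the skew beyond which integration breaks down

def still_integrated(t_vision, t_hearing, window=BINDING_WINDOW_S):
    """True if the two sensory timestamps still bind into one percept."""
    return abs(t_vision - t_hearing) < window

print(still_integrated(0.100, 0.110))   # 10 ms skew -> True, still one event
print(still_integrated(0.100, 0.140))   # 40 ms skew -> False, it falls apart
```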
 
Yes. If you took a brain and made it run at 80% of normal speed, would it be unconscious? A tenth speed? One millionth speed?

It makes no sense at all to implicate speed in the production of consciousness.

The essential question is whether the difference between machine and animal consciousness is qualitative or quantitative. Speed is purely quantitative.

Have to be careful with this, as it could be that, because consciousness is a property of the physical stuff, it relies on a certain "clock rate".

But if we run a simulation of a human brain, it could be slowed down or sped up relative to our "real time".
 
Yes. If you took a brain and made it run at 80% of normal speed, would it be unconscious? A tenth speed? One millionth speed?

It makes no sense at all to implicate speed in the production of consciousness.

The essential question is whether the difference between machine and animal consciousness is qualitative or quantitative. Speed is purely quantitative.

It does make sense. Without sufficient speed to provide an organism with a survival advantage, it's unlikely consciousness would ever have evolved in the first place.
 
At what speed does the real world operate?

No need to respond to the entire real world, which operates at an infinite number of different speeds, just that subset of real world conditions that is relevant to an individual's likely survival/success.
 
Considering that our consciousness is made aware of our own body's decisions a bit after they are made, do you consider that real time?

Close enough for government work, yes. It seems to be enough for our consciousness to provide some degree of evolutionary utility, anyway.
 
No need to respond to the entire real world, which operates at an infinite number of different speeds, just that subset of real world conditions that is relevant to an individual's likely survival/success.
Well, with artificial consciousness the success will depend on the patience of its creators, and extraterrestrial consciousness that moved at glacial speeds would only need to survive or have success in its own natural environment.

So speed similar to human speeds is not really a necessary ingredient for consciousness; the only requirement for such a consciousness is to exist, right?
 
Well, with artificial consciousness the success will depend on the patience of its creators, and extraterrestrial consciousness that moved at glacial speeds would only need to survive or have success in its own natural environment.

So speed similar to human speeds is not really a necessary ingredient for consciousness; the only requirement for such a consciousness is to exist, right?

I guess that really depends on a specific definition of 'consciousness'. To me it seems consciousness is an evolutionary adaptation which, among other things, increases the speed and accuracy with which an individual is able to navigate and respond to a complex environment. I'm not sure a raw 'consciousness-in-a-box' can exist at all without the context of an environment which makes its existence useful. Are there environments in which the ability to respond to changes at glacial speed, but with much more than simple stimulus-response 'bounce-back', would yield a survival advantage?
 
I guess that really depends on a specific definition of 'consciousness'. To me it seems consciousness is an evolutionary adaptation which, among other things, increases the speed and accuracy with which an individual is able to navigate and respond to a complex environment.

That's an interesting definition, but what do you mean by complex? I ask because houseflies react very quickly to my hand.
 
That's an interesting definition, but what do you mean by complex? I ask because houseflies react very quickly to my hand.

It's not a complete definition. Only one necessary (in my view) component of such. Can you think of a route by which consciousness might have evolved if it didn't actually confer any sort of advantage to its owner?
 
I suspect most of us take it to be what your wall clock and wristwatch depict, and what TV broadcast schedules are based on.
Whoops? Forgot time dilation; so far, though, all we need to consider is Earth's gravity. I'll agree that consciousness could/should work in local wall-clock time.
 
I guess that really depends on a specific definition of 'consciousness'. To me it seems consciousness is an evolutionary adaptation which, among other things, increases the speed and accuracy with which an individual is able to navigate and respond to a complex environment.
By which you mean only an environment on Earth?

I'm not sure a raw 'consciousness-in-a-box' can exist at all without the context of an environment which makes its existence useful.
Off-hand, I can easily come up with live people whose existence is not useful, and if somebody succeeds in creating an artificial consciousness, you would only rule it to be "conscious" if its existence is useful?

Are there environments in which the ability to respond to changes at glacial speed, but with much more than simple stimulus-response 'bounce-back', would yield a survival advantage?
A survival advantage only makes sense in connection with biological life on Earth. Artificial consciousness only survives as long as it is kept alive, and if we go sci-fi, we can imagine life that knows no death except by accident, and where survival could be something entirely different than the struggle we know from Earth.

Besides, we are all being hypothetical here, and a rope computer would be an example where the only speed would be glacial. It only survives as long as its mechanisms are kept running, and who knows what input it will get? Will the lack of a possibility for humans to communicate meaningfully with it really determine whether it is conscious?
 
It's not a complete definition. Only one necessary (in my view) component of such. Can you think of a route by which consciousness might have evolved if it didn't actually confer any sort of advantage to its owner?
It could be considered the result of something which does confer an evolutionary advantage. Though that depends on the exact definition of consciousness you prefer.

Take the tale of the Sphex wasp and the evil experimenter. No matter how many times the experimenter moves the wasp's food, the wasp never catches on - it simply doesn't have the circuitry to monitor its own mental processes.

You can catch that sort of thing mechanically, unconsciously, but it is actually simpler and less expensive to do it by adding the feedback loop we call consciousness, because it generalises the problem such that a single process can monitor all such cases.

So we can see consciousness not as the benefit in itself, but merely the most efficient means of implementation.
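A minimal sketch of that "single process monitoring all such cases" idea, in Python; the Monitor class and the threshold are made up for illustration. One generic watcher flags any routine that keeps being re-triggered without making progress, which is exactly the check the Sphex wasp lacks:

```python
from collections import Counter

class Monitor:
    """One generic watcher over many routines: flag any routine that is
    re-triggered too many times without progress."""
    def __init__(self, limit=3):
        self.limit = limit
        self.repeats = Counter()

    def observe(self, routine, made_progress):
        if made_progress:
            self.repeats[routine] = 0
        else:
            self.repeats[routine] += 1
        return self.repeats[routine] >= self.limit   # True = "step back and rethink"

# The Sphex-style loop: drag prey to burrow, go inspect, find prey moved, repeat.
monitor = Monitor()
for attempt in range(10):
    if monitor.observe("drag_prey_to_burrow", made_progress=False):
        print(f"attempt {attempt}: noticed the loop - try something else")
        break
```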
 
By which you mean only an environment on Earth?

No.


Off-hand, I can easily come up with live people whose existence is not useful, and if somebody succeeds in creating an artificial consciousness, you would only rule it to be "conscious" if its existence is useful?

I didn't mean to say that the whole entity needs to be useful to others, I meant that its consciousness needs to be useful to it.


A survival advantage only makes sense in connection with biological life on Earth. Artificial consciousness only survives as long as it is kept alive, and if we go sci-fi, we can imagine life that knows no death except by accident, and where survival could be something entirely different than the struggle we know from Earth.

Besides, we are all being hypothetical here, and a rope computer would be an example where the only speed would be glacial. It only survives as long as its mechanisms are kept running, and who knows what input it will get? Will the lack of a possibility for humans to communicate meaningfully with it really determine whether it is conscious?

I know I am not being clear here. Hell, what I mean isn't even clear to myself. Let me think about it some more and try again later. :o
 
It could be considered the result of something which does confer an evolutionary advantage. Though that depends on the exact definition of consciousness you prefer.

Take the tale of the Sphex wasp and the evil experimenter. No matter how many times the experimenter moves the wasp's food, the wasp never catches on - it simply doesn't have the circuitry to monitor its own mental processes.

You can catch that sort of thing mechanically, unconsciously, but it is actually simpler and less expensive to do it by adding the feedback loop we call consciousness, because it generalises the problem such that a single process can monitor all such cases.

So we can see consciousness not as the benefit in itself, but merely the most efficient means of implementation.

That is a good point. In fact, somewhere upthread I mused about the possibility that consciousness may not be directly useful but may only serve to distract some of our brain processes while the ones that are immediately important work in the background.
 