
Explain consciousness to the layman.

Status
Not open for further replies.
Because it's an unproven hypothesis.

Does that translate as "I don't accept it" ?

Maybe you've done a pretty poor job of understanding it.

What a fine, clever way to avoid having to make any effort. Congratulations, westprog.

One could produce an overly-restrictive definition of respiration that would require biological lungs, but that would lead to less rather than more understanding of the phenomenon.

Funny that you don't see how this also applies to consciousness and brains.
 
I hate this red herring about whether a Turing Machine is a general purpose computation machine or not. Off topic and irrelevant. Maybe, like qualia, we can sidestep it with a synonym. How about GPCM (general purpose computation machine)?

Thought experiment time!

1) We wire a sufficiently advanced GPCM to a spider after removing its brain, programmed to do exactly what a spider's brain does in handling input and output and confirm that the spider does everything a real spider does in every way.

2) We do the same thing with a human. We get the same result. It even spontaneously sings rhapsodically about how the subjective experience of redness must be somehow immaterial and incomputable, even though it was not specifically programmed to do that.

Can #2 happen? If not, why not? If it happened, would it be conscious?
 
I hate this red herring about whether a Turing Machine is a general purpose computation machine or not.
It's not a red herring, it's the central theorem of computer science.
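To make the universality point concrete, here is a minimal sketch in Python (the table format and the example machine are invented purely for illustration): a single general-purpose interpreter runs any particular machine handed to it as data.

```python
# Minimal Turing-machine interpreter: one short program can run ANY
# machine supplied to it as a data table -- which is the universality point.
def run_tm(table, tape, state="start", pos=0, max_steps=1000):
    cells = dict(enumerate(tape))          # sparse tape; "_" means blank
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = table[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# One particular machine, expressed purely as data: invert a binary string.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(flip, "1011"))  # -> 0100
```

The interpreter never changes; only the data table does. That is the sense in which one machine is "general purpose."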

1) We wire a sufficiently advanced GPCM to a spider after removing its brain, programmed to do exactly what a spider's brain does in handling input and output and confirm that the spider does everything a real spider does in every way.

2) We do the same thing with a human. We get the same result. It even spontaneously sings rhapsodically about how the subjective experience of redness must be somehow immaterial and incomputable, even though it was not specifically programmed to do that.
With you so far.

Can #2 happen? If not, why not? If it happened, would it be conscious?
Yes, n/a, and yes.
 
I don't think you get it, Pixy. You're supposed to simply accept westprog's claims.
Well, yes, after all this time I realise that.

But I will always point out to him when he's spouting nonsense. Which is not quite all the time. Close, but not quite.

It's a service I provide.
 
Damn straight.

When you are arguing a precise technical definition, and you pull out a general purpose dictionary, you automatically lose.

Not if your opponent offers no argument why the standard definitions are inadequate, especially when the thread title included the word "layperson".

So are you going to enlighten us with the technical difference between compute and calculate, or are you just going to claim authority without explanation, like always?

Also, don't forget to answer the rest of my last post that you snipped.
 
You want to define a computer as a self-referential information processor so that if you define a brain as a self-referential information processor then it follows that the brain is a computer.

... and consciousness is self-referential information processing. It just is.
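As a skeletal toy of what "self-referential information processing" could mean (Python; purely illustrative, and nobody is claiming this toy is conscious): each update consumes external input plus a description the system generates about its own state.

```python
# Toy self-referential processor: every update reads external input
# AND the system's own report about itself.
def update(state, stimulus):
    self_model = f"I have processed {state['n']} inputs"  # refers to itself
    return {
        "n": state["n"] + 1,
        "log": state["log"] + [(stimulus, self_model)],
    }

state = {"n": 0, "log": []}
for s in ["red", "green", "blue"]:
    state = update(state, s)
print(state["n"])        # -> 3
print(state["log"][-1])  # -> ('blue', 'I have processed 2 inputs')
```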
 
Does that translate as "I don't accept it" ?

Just like you don't "accept" God. You want a leap of faith from me. I want actual proof.

What a fine, clever way to avoid having to make any effort. Congratulations, westprog.

You can play the insult game if you want. If you do, don't be surprised when I want to play as well. Don't be surprised if it doesn't lead to your greater understanding.

Funny that you don't see how this also applies to consciousness and brains.

It's an excellent fit, especially when we look at the history of physiology and see the theories that people used to have as an absolute certainty.
 
Just like you don't "accept" God. You want a leap of faith from me. I want actual proof.

I'll take that as a "yes".

You can play the insult game if you want. If you do, don't be surprised when I want to play as well. Don't be surprised if it doesn't lead to your greater understanding.

Greater understanding ? You REFUSED to explain yourself. THAT doesn't lead to my greater understanding. How is it an insult to simply point out what you're doing ?

It's an excellent fit, especially when we look at the history of physiology and see the theories that people used to have as an absolute certainty.

Word salad.
 
Not if your opponent offers no argument why the standard definitions are inadequate, especially when the thread title included the word "layperson".

You're having a tangential conversation, I'm afraid, which means you all get to pull out all that awful vocabulary until you reach a conclusion.

The thread question has pretty much been answered, at least to my satisfaction. "Consciousness is whatever the brain does." Everything else has been an argument that there is a gap to fit a god into.
 
You're having a tangential conversation, I'm afraid, which means you all get to pull out all that awful vocabulary until you reach a conclusion.

The thread question has pretty much been answered, at least to my satisfaction. "Consciousness is whatever the brain does." Everything else has been an argument that there is a gap to fit a god into.



:bigclap
 
I don't think you get it, Pixy. You're supposed to simply accept westprog's claims.

I get it. You're having a laugh.

If I were contending that consciousness is, as a matter of certainty, tied to a particular physical process, then I would certainly be expected to prove that, or at least provide some evidence that that is the case. If I claimed this to be a matter of absolute fact, then indeed, I should be expected to support my contention.

However, I have not made any such claim. I've suggested that it is at least possible that consciousness is tied to a particular physical process, or set of processes, as a possible alternative explanation. I haven't even insisted that this is a more probable explanation than the computation theory. I've said that it's possible.

There cannot possibly be a burden of proof on me to demonstrate that my alternative, possible explanation is in fact true. As to evidence that it may be true - well, since consciousness has only developed in association with a particular set of physical processes, it is at least possible that it is tied to one, or a combination of, those processes. I have no obligation in argument to do more than demonstrate that an alternative to the theory being propounded as a certain fact can be found.

As a sideline to this - it is of course implicit in the physicalist theory that if the physical processes necessary to consciousness could be identified, and if an artificial device could be created which duplicated said physical processes, then the phenomenon of consciousness would be present in said device. What the necessary elements might be we do not know. I'm inclined to think that any such artificial mind would need to operate on the same time scale as its environment, for example - but that's merely an inclination, not an essential contention.

There's nothing inherently absurd in claiming that a phenomenon need not be tied to a particular physical process, but it is a large claim, and if it is to be accepted as certainly true, then it needs to be proven. I'm entitled to demand proof before I accept this theory as a fact. I'm entitled to demand that the process of computation be given a physical definition if it is to be accepted as part of a physical theory. I am in no way obliged to prove that some alternative theory is certainly true in order to be entitled to doubt this specific theory. It is the essence of critical thinking that one be allowed to doubt that something is true until it is demonstrated to be true.
 
You're having a tangential conversation, I'm afraid, which means you all get to pull out all that awful vocabulary until you reach a conclusion.

The thread question has pretty much been answered, at least to my satisfaction. "Consciousness is whatever the brain does." Everything else has been an argument that there is a gap to fit a god into.

By George, he's got it!

:th:
 
rocketdodger, I will respond to your responses to me later. I have to get my references straight.

It's a simulation.
Does that really make a difference?

And also: the computations aren't done by the room, or the pebbles. Computations are done using the room, or the pebbles.

The room and the pebbles are datastores without a processor.

That's a good point that goes along with what I was saying before, about how the pebbles would have to move themselves in such a manner as to be considered a conscious entity.
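That data/processor split can be sketched directly (Python; the sorting rule is an arbitrary stand-in for whatever rules the room's operator applies): the pebbles are inert state, and nothing computes until an external agent repeatedly applies a rule to them.

```python
# Pebble positions are inert data; computation only happens when an
# external agent repeatedly applies a transition rule to that data.
pebbles = [3, 1, 4, 1, 5]  # the "room": pure state, no behaviour of its own

def step(state):
    """One rule application by the 'processor' (the person moving pebbles):
    here, a single pass of bubble-sort swaps."""
    state = list(state)
    for i in range(len(state) - 1):
        if state[i] > state[i + 1]:
            state[i], state[i + 1] = state[i + 1], state[i]
    return state

# Left alone, the data never changes; driven by the rule, it computes a sort.
for _ in range(len(pebbles)):
    pebbles = step(pebbles)
print(pebbles)  # -> [1, 1, 3, 4, 5]
```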

2) We do the same thing with a human. We get the same result. It even spontaneously sings rhapsodically about how the subjective experience of redness must be somehow immaterial and incomputable, even though it was not specifically programmed to do that.

Can #2 happen? If not, why not? If it happened, would it be conscious?
No. 2 could happen. You're simply substituting the medium by which consciousness takes place.

This is clearer if we imagine replacing each and every neuron in a person's brain with an artificial electrochemical circuit that does exactly what the biological neuron does, one section of the brain at a time.

You will eventually have an entirely electronic brain, one in which, according to the thought experiment, consciousness would still be maintained.
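The replacement argument only assumes that the artificial unit reproduces the biological unit's input/output behaviour. A toy sketch (Python; the threshold model and class names are invented for illustration, not a real neuron model):

```python
# Toy neuron-replacement sketch: if the artificial unit computes the same
# input -> output function, the surrounding circuit cannot tell the difference.
class BioNeuron:
    def __init__(self, threshold):
        self.threshold = threshold
    def fire(self, inputs):
        return sum(inputs) >= self.threshold  # crude integrate-and-fire

class SiliconNeuron:
    """Drop-in replacement: same interface, same function."""
    def __init__(self, threshold):
        self.threshold = threshold
    def fire(self, inputs):
        return sum(inputs) >= self.threshold

def circuit_output(neurons, inputs):
    return [n.fire(inputs) for n in neurons]

brain = [BioNeuron(2), BioNeuron(3), BioNeuron(5)]
before = circuit_output(brain, [1, 1, 1])

brain[1] = SiliconNeuron(3)  # replace one unit at a time
after = circuit_output(brain, [1, 1, 1])
print(before == after)  # -> True
```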
 
I hate this red herring about whether a Turing Machine is a general purpose computation machine or not. Off topic and irrelevant. Maybe, like qualia, we can side step it with a synonym. How about GPCM (general purpose computation machine)?

Thought experiment time!

1) We wire a sufficiently advanced GPCM to a spider after removing its brain, programmed to do exactly what a spider's brain does in handling input and output and confirm that the spider does everything a real spider does in every way.

2) We do the same thing with a human. We get the same result. It even spontaneously sings rhapsodically about how the subjective experience of redness must be somehow immaterial and incomputable, even though it was not specifically programmed to do that.

Can #2 happen? If not, why not? If it happened, would it be conscious?

I've previously given the example of catching a ball as something that can't be done as a computational process, because it is not sufficient to accurately calculate the trajectory of the ball - a signal has to be sent soon enough that a hand can reach out and grab the ball. Clearly, a system that cannot guarantee that it will send the signal in time is not plug-compatible. This is why I point out that the Turing model does not describe what the brain actually does. Rocketdodger insists that the sequential, deterministic, time-independent, sealed Turing model is able to describe the asynchronous, non-deterministic, time-dependent, interactive functions of the human brain. I dispute that.

It's certainly true that a GPCM could simulate the catching of the ball. Perhaps if the GPCM were extremely powerful, it could perform the simulation as quickly as if it really happened. However, it's a principle of the computational hypothesis that the conscious experience produced by the computation is entirely independent of how long it takes to run.

I'm aware that when people talk about replacing a brain with a GPCM, or computer, or artificial brain, they are thinking in terms of a machine that will actually allow the person to catch the ball. Such a machine would not be a pure computational device - it would be highly interactive. I consider it quite possible that some such device, able to control an actual human body, or a precise simulacrum of one, might be conscious where a pure simulation, running only on computer hardware, might not. This is not the computational view, however.
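The timing objection can be stated concretely: the question isn't whether the trajectory is computable, but whether the answer arrives before the deadline. A rough sketch (Python; the physics is idealized and the numbers arbitrary):

```python
import math
import time

def predict_landing(v0, angle_deg, dt=0.001):
    """Numerically integrate an idealized ballistic trajectory (no drag)."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        vy -= 9.81 * dt
        x += vx * dt
        y += vy * dt
        if y <= 0.0:
            return x  # landing distance in metres

def catch(deadline_s, v0, angle_deg):
    """A correct answer that arrives after the deadline is useless to the hand."""
    start = time.monotonic()
    landing = predict_landing(v0, angle_deg)
    elapsed = time.monotonic() - start
    return landing if elapsed <= deadline_s else None

# With a generous deadline the (approximate) answer arrives in time;
# with an impossible one, the very same computation is worthless.
print(catch(5.0, 20.0, 45.0))   # ~40.8 m for v0 = 20 m/s at 45 degrees
print(catch(-1.0, 20.0, 45.0))  # -> None
```

Note the computation itself is identical in both calls; only the deadline differs, which is exactly the distinction being argued about.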
 
I hate this red herring about whether a Turing Machine is a general purpose computation machine or not. Off topic and irrelevant. Maybe, like qualia, we can side step it with a synonym. How about GPCM (general purpose computation machine)?

Thought experiment time!

1) We wire a sufficiently advanced GPCM to a spider after removing its brain, programmed to do exactly what a spider's brain does in handling input and output and confirm that the spider does everything a real spider does in every way.

2) We do the same thing with a human. We get the same result. It even spontaneously sings rhapsodically about how the subjective experience of redness must be somehow immaterial and incomputable, even though it was not specifically programmed to do that.

Can #2 happen? If not, why not? If it happened, would it be conscious?

The only reason I harp on it, Mr. Scott, is because certain people play semantic games to avoid giving an honest answer to your question #2.

For example, I suspect something like "but a Turing machine can't account for real time events, so if it was hooked up to the human body it couldn't be a Turing machine. So clearly computation alone isn't responsible for consciousness," could be a counter-argument. Which is just another way of stating that a GPCM has some aspects that an idealized Turing machine does not -- well, durr, because a GPCM is real and a Turing machine is not.

Except, that is just a semantic triviality. It doesn't even begin to address the serious and honest question you bring up of "if you can replace the brain with a computer, what are the implications?" It is simply a sidestep to avoid having to even think about "what if."

Many people here have already had discussions about your #2, along with various more or less radical variations of it.

For instance, what if you didn't even use GPCMs? What if you simply took real biological neurons and constructed a human brain from scratch -- exactly to the blueprints of an existing one, mind you -- and then hooked it up inside a body's head. Would that be conscious? Curiously, a few posters here have refused to give a response to even this seemingly non-contentious hypothetical.

A little more radical: instead of replacing the brain wholesale with the GPCM, what if you started at a lower level and just replaced some of the interactions between ion channels (or any other proteins, for that matter) with the GPCM analog? Or what about the entire neuron? Then you have a brain made of artificial neurons run by GPCMs, one that is still physical and still the size/shape of the normal brain. Is that more acceptable, for the sake of argument, to some people?

Then we have a situation like you are asking about -- just replacing the whole brain with a single GPCM. But why stop there? How about just replacing the whole person with one? Or why not the whole environment of the person as well?

Which leads to the very interesting question -- that cannot be answered, by the way -- of whether or not we are in a simulation ourselves, being nothing more than information processed by some GPCM.

All of these are very good things to think about. The problem is, human emotion gets in the way, and invariably people stop and cling to irrational positions.

For example, it took me quite a while to agree that stepping into the teleporter, and having the source destroyed while the destination is merely a copy, is mathematically and therefore physically no different from what occurs to us every instant of our normal lives. Now that I understand it, I would have no hesitation in stepping into the teleporter. However, I know that many people who are supporters of the computational model of consciousness still refuse (irrationally, in my opinion).

I think that most intelligent people, thinking about these issues, will eventually arrive at what seems logically evident. For example, if you replace certain parts of neurons with computers, the person is still gonna be conscious. But to then make the jump to having the whole brain replaced with a computer ... well, that takes more logical override of emotion than many people have. And to be honest it still seems strange, even to me. However I can't just discount logic and mathematics because it seems strange.
 
