
My take on why the study of consciousness may not be as simple as it seems

Really? How do you know those experiences exist at all? That's one reason why solipsism is BS.

If my experiences don't exist, then nothing exists.

You infer your own consciousness from your behaviour, compared to that of other people. I don't know about you, but I noticed, even back then, a marked difference in my "consciousness" when I was about four. So, what happened? Did my qualia get an upgrade? Or did my perception of my behaviours simply get sharper through experience?

I might decide what my consciousness is based on external factors, but the fact of its existence doesn't depend on them, because those external factors are only experienced through my consciousness.
 
It would not have beliefs or experiences. It would claim to have beliefs or experiences.

A person knows itself to be a person. An inanimate object does not. A p-zombie doesn't believe itself to be a person, because it doesn't believe anything.

Now that is interesting, I thought that the p-zombie behaved exactly as though it had all the behaviors of consciousness.

So you are saying the p-zombie does not have events of perception, or memories of perception?

That is okay, I am taking it as having all the behaviors of consciousness, including the internal ones.

So I looked, and it still confuses me. The p-zombie, if poked with a stick, feels no pain but says "Ouch". But if it has no qualia, it can't see the stick or tell it was poked with a stick. So either qualia are defined as something other than perceptions, or I am totally confused.

Philosophical Zombie (Wikipedia)

I was thinking along these lines:
"A behavioral zombie is behaviorally indistinguishable from a human and yet has no conscious experience."
 
Me of course - the consciousness created by the millions of unconnected devices with insufficient information for even an instant of consciousness on any one.
To you?

You've always been conscious. You can't remember a time before you were conscious.

You are just a replay of your earlier self. You only know what your earlier self knew. You only experience what your earlier self experienced. To you, not only does this consciousness created by millions of unconnected devices not appear at any given point in time, not only does that not even have a referent, it doesn't appear at all. To you, you are you. The same you you always were. That is, after all, how you set up the hypothetical.

It might come as a nasty shock when we gather up all the components and plug you back together.

Are you reading these words?

They are appearing to you.

That is how you are able to read them.
That doesn't actually answer the question, but the previous answer did, so it's not a problem.
 
Now that is interesting, I thought that the p-zombie behaved exactly as though it had all the behaviors of consciousness.

So you are saying the p-zombie does not have events of perception, or memories of perception?

That is okay, I am taking it as having all the behaviors of consciousness, including the internal ones.

So I looked, and it still confuses me. The p-zombie, if poked with a stick, feels no pain but says "Ouch". But if it has no qualia, it can't see the stick or tell it was poked with a stick. So either qualia are defined as something other than perceptions, or I am totally confused.

Philosophical Zombie (Wikipedia)

I was thinking along these lines:
"A behavioral zombie is behaviorally indistinguishable from a human and yet has no conscious experience."

This is the problem if you call experience behaviour.
 
So I was doing okay with the probes-across-the-universe thought experiment, but now I'm having trouble with the teleported-brains experiment.

Let's say we make a copy of my brain every millisecond for a few seconds and array them across the same field with the horses. Let's assume the brains are halted in time right down to every electrochemical process. Are the brains conscious?

I don't see how. Certainly each one alone isn't, because they are halted. Are they conscious as a group? It doesn't seem like they would be.

What is the difference between the brains and the probes?

~~ Paul
I didn't realise the brains were halted. Then no, there would not be consciousness.

As I said before, there would at least have to be a minimal experienceable slice of time before they were annihilated.

And I could not be more than one of them.
 
westprog said:
Constructed out of what? How does one construct a subjective experience?

The "subjective" part is superflous - one only needs worry about the "experience" part.

If by "superfluous" you mean "grammatically redundant", then sure. This does change the fact that humans still have no knowledge of how to generate it.
 
If by "superfluous" you mean "grammatically redundant", then sure. This does change the fact that humans still have no knowledge of how to generate it.

You say this as if there is something to generate rather than something you can't prevent happening.
 
So I looked, and it still confuses me. The p-zombie, if poked with a stick, feels no pain but says "Ouch". But if it has no qualia, it can't see the stick or tell it was poked with a stick. So either qualia are defined as something other than perceptions, or I am totally confused.
There is no end to the problems with the concept of p-zombies, unless you are happy with a Universe that is itself logically inconsistent.

Read the Criticism section of the Wikipedia article. It covers all that I've been saying. The concept of p-zombies is just not meaningful under most metaphysical systems.
 
To you?

You've always been conscious. You can't remember a time before you were conscious.
But my consciousness might have started right now if I am a simulation.
You are just a replay of your earlier self. You only know what your earlier self knew. You only experience what your earlier self experienced. To you, not only does this consciousness created by millions of unconnected devices not appear at any given point in time, not only does that not even have a referent, it doesn't appear at all. To you, you are you. The same you you always were. That is, after all, how you set up the hypothetical.
No, you are still missing it.

I don't know whether or not I always was conscious or just got created this moment complete with replayed memories and I don't know that any moment was my first and will be the last.

All of that would be fine, but how am I reading this whole sentence when none of the information for this sentence is connected in any way?

It would be fine if I just got it in bits and pieces, but I wouldn't even get that. I would never get enough for a full sentence.

And again - what is the mechanism to produce this experience that I have?
 
Where is the metaphysics in saying that a system that looks and behaves like a computational system is a computational system?

Where is the metaphysics in appreciating that the reported experience of conscious decisions does not tie up with the measured times at which apparently "responsive" signals are sent?

Where is the metaphysics in the application of one of the very many ways in which consciousness can be altered, and noting that it affects the normal, very computational-looking, operation of the brain?

I am not seeing the extravagance - I see a very simple assertion.

It looks like, it sounds like and it moves like a computer - it is a computer.

Clearly it doesn't. As one can easily tell by looking at a computer, and then looking at a person. Try it, it's very easy to tell the difference.
 
As I say, others have understood what I meant by Run3 so I know that it is your problem that you don't understand it.
I understand Run3. Run3 is an unrolled version of Run2 that uses a hard drive instead of memory sticks for both program data and data segments.
In my conscious experience. Unless my conscious experience is happening 50-100 years after the mechanism that created it had completed its last step, then all this stuff I am seeing right now would have to have come together faster than the speed of light if I am Run4.
No. And again, I'm going to ignore "conscious", because as far as I care, any disagreement you have with PixyMisa and rocketdodger is simply a difference in dictionaries, until demonstrated otherwise.

In Run4, you're duplicating Run2's results. Run2 is what produced the information. You've got that entire run recorded. Then you're breaking that run out into pieces--individual processor calculations. You're taking each piece and putting it on a different processor, complete with all requisite state information. Then you're launching these multiple processors out into space.

Now these processors just go off and perform these steps. Part X performs its step at time t. Part Y performs its step at time t+1 second. Part Y+1 performs its step at time t+2 seconds. And X and Y and Y+1 are farther apart than 1 light second from each other--say, they are one light year apart. X did "write 12 at address 40". Y did "read from address 40". And somehow when Y+1 is starting to process its instruction, R2 already has 12 in it.

Magical faster than c?

No.

Here's how Y+1 got 12 in it.

Run2 executed step X, 20 years ago. That stored 12 at address 40. Run2 then executed step Y, which read what was at address 40 and put it into R2. Run2 then executed step Y+1, recording the starting state first, because that's what you're doing in Run2.

And guess what it recorded as the state of R2? YES! It recorded that R2 had 12 in it.

Then, you put "R2 has 12 in it" on ship Y+1. Then, you launched ship Y+1 into space in some direction.

Then, 20 years pass. And a cesium clock signal says "go". That 12 that you put on ship Y+1's processor's R2 register then gets (potentially) used in a calculation.

And nothing traveled faster than c.
So you are saying that the consciousness could have occurred within the completion time of the algorithm.
No. No no no, and again no. PixyMisa and rocketdodger are saying that.

I'm saying that step Y+1's R2 register on ship Y+1 has 12 in it because step Y read 12 from address 40, which was left there when step X put 12 into address 40, and that this whole thing happened without violating any laws of physics.
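The record-and-replay scheme being argued here can be sketched in a few lines of Python. This is a toy illustration, not anything from the thread itself: the names `run2`, `run4`, `execute`, and the three-step program are all made up for the example. Run2 executes normally while logging each step's starting state; Run4 then replays each step in isolation using only its recorded snapshot, so step Y+1's "ship" already carries R2 = 12 with no communication at all.

```python
# Toy sketch of the Run2 -> Run4 argument. All names are hypothetical.

def execute(step, state):
    """Apply one instruction to a machine state."""
    op, *args = step
    if op == "write":          # ("write", addr, value)
        addr, value = args
        state["mem"][addr] = value
    elif op == "load_R2":      # ("load_R2", addr): read memory into register R2
        (addr,) = args
        state["R2"] = state["mem"].get(addr, 0)

def run2(program):
    """Execute the program normally, recording each step's STARTING state."""
    state = {"mem": {}, "R2": 0}
    log = []
    for step in program:
        # Snapshot the full state before the step executes.
        log.append((step, {"mem": dict(state["mem"]), "R2": state["R2"]}))
        execute(step, state)
    return log

def run4(log):
    """Replay each recorded step on its own isolated 'ship'.
    No ship talks to any other; each already carries the state it needs."""
    results = []
    for step, snapshot in log:
        execute(step, snapshot)   # replayed in isolation, years apart if you like
        results.append(snapshot["R2"])
    return results

program = [("write", 40, 12),   # step X: write 12 at address 40
           ("load_R2", 40),     # step Y: read address 40 into R2
           ("write", 41, 0)]    # step Y+1: its snapshot ALREADY holds R2 = 12

log = run2(program)
print(run4(log))  # → [0, 12, 12]
```

The point the sketch makes is that the 12 on ship Y+1 was placed in the log by Run2 before the ships were ever launched, so nothing needs to travel between ships, faster than light or otherwise.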
But I am not the observer - I am the conscious mind that it is creating.
The term "observer" is a technical term under the theory of relativity that you are saying is violated, and it doesn't mean conscious entity, except possibly as a helper analogy.

If you say "same time", you're assuming a particular context. That requires an observer--otherwise you don't mean anything coherent under the very theory of relativity that you say is being violated.

You can't just handwave a violation of relativity, Robin.
And you are ignoring the main problem I put, that the modules are unconnected.
No. You're ignoring that Run4 took Run2's results and put them onto a bunch of different ships.
By what mechanism am I having this conscious experience if the mechanism of my brain is in millions of unconnected parts?
Not my argument. Take that one up with PixyMisa and rocketdodger, and stop confusing them with me.

What I'm doing is challenging your assertion that the laws of physics have been violated somewhere. If you're hiding a part of the argument, all you need to do is show your cards. But if your dictionary disagrees with PixyMisa and rocketdodger's dictionaries, then you're committing a straw man.

I'm only concerned here with the specifics of your violation, and the fact that you're handwaving things.
 
How do you tell?

Well, the construct is defined that way; the Philosophical Zombie (Wikipedia) does not have all the behaviors of 'consciousness', it only has the apparent behaviors of an external nature. But that still raises the question of how the p-zombie knows you poked it with a stick.

So it does not have the internal behaviors of consciousness. Now, does the neurological zombie?

"A neurological zombie has a human brain and is otherwise physically indistinguishable from a human; nevertheless, it has no conscious experience."
 
No, you are still missing it.

I don't know whether or not I always was conscious or just got created this moment complete with replayed memories and I don't know that any moment was my first and will be the last.
Right.

All of that would be fine, but how am I reading this whole sentence when none of the information for this sentence is connected in any way?
By lots of separate computations.

Where's the problem?

It's a recording, Robin. Let's say I record a TV show. Only I record each pixel on a different recorder and shoot them all into space. Now I have - assuming 1080P - 2,073,600 recorders, all set to start playing in precise synchronisation one year after they leave Earth.

Where's the TV show? Does it still exist? Certainly, because we can gather up the recorders and reassemble it. When all the recorders start to play, is the TV show playing? The answer is both yes and no; there is not necessarily any single frame of reference that can observe the TV show playing in proper order.
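The reassembly claim in that analogy is easy to check in code. Here is a minimal sketch, with made-up helper names and a tiny four-pixel "frame" standing in for 1080p: each pixel position gets its own recorder (an independent stream of that pixel's values over time), and playing all the recorders back in sync rebuilds the original frames exactly.

```python
# Toy version of the per-pixel recorder analogy. Names are illustrative.

def split_recording(frames):
    """Give each pixel position its own 'recorder': a list of that
    pixel's value in every frame, in order."""
    num_pixels = len(frames[0])
    return [[frame[i] for frame in frames] for i in range(num_pixels)]

def reassemble(recorders):
    """Play all recorders back in sync and rebuild the original frames."""
    num_frames = len(recorders[0])
    return [[rec[t] for rec in recorders] for t in range(num_frames)]

# A 3-frame "show" with 4 pixels per frame.
show = [[1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12]]

recorders = split_recording(show)      # 4 independent single-pixel streams
assert reassemble(recorders) == show   # nothing is lost by splitting it up
```

Note the assertion only holds if the recorders are played back in sync; shuffle the time indices on one recorder and the reassembled show is scrambled, which is the same caveat made below about reassembling the mind's components.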

The same goes for your compartmented mind. "Now" is no longer defined for you with respect to the Universe, because the components that make you you have significantly different frames of reference. But you don't know that, because you are unconcerned with the Universe.

To the rest of the Universe, you're a consciousness in potentia. But to you, you're real. If we reassemble the components and plug you in, you will experience no loss of continuity. We recorded you, played you back on components drifting through space, reassembled you, plugged you in to a new body, and off you go. To you, one single, continuous, conscious experience.
 
Of course, if when we reassemble the components, they're not in sync, your mind ends up scrambled. Best not to do that.
 
Now that is interesting, I thought that the p-zombie behaved exactly as though it had all the behaviors of consciousness.

So you are saying the p-zombie does not have events of perception, or memories of perception?

That is okay, I am taking it as having all the behaviors of consciousness, including the internal ones.

The entire point of the p-zombie thought experiment is that the zombie has no "internal behaviors" of consciousness.

So I looked, and it still confuses me. The p-zombie, if poked with a stick, feels no pain but says "Ouch". But if it has no qualia, it can't see the stick or tell it was poked with a stick. So either qualia are defined as something other than perceptions, or I am totally confused.

You're totally confused.

Philosophical Zombie (Wikipedia)

I was thinking along these lines:
"A behavioral zombie is behaviorally indistinguishable from a human and yet has no conscious experience."

Hence, if you were a hypothetical p-zombie you would not feel, believe or know anything. If one were to pinch a p-zombie and it said "ouch!" its response would be no more conscious than a pinched rubber duck saying "squeak!".

Keep in mind, though, that p-zombies are simply a thought device. In practice, I think there would also be some external behavior differences between a conscious human and an unconscious automaton. IMO, the only way a p-zombie could work in practice is if there were a conscious person piloting some of its behavioral outputs.
 
