
The Zombie Poll

What happens?

  • Smooth as silk
    Votes: 56 (60.9%)
  • Zombie
    Votes: 10 (10.9%)
  • Curare
    Votes: 3 (3.3%)
  • I really don't know
    Votes: 11 (12.0%)
  • Lifegazer is a zombie from Planet X
    Votes: 12 (13.0%)
  • Total voters: 92
Hmm. What if I say that the problem a materialist faces with uncertainty over whether free-will and/or god (let's say 'intent') exist is that the existence of either destroys his worldview and makes him a dualist. His real problem is that if they do exist -- in the slightest -- what is any of the stuff he believed was intentless, deterministic or random, and 'physical'?
MMmmmmmmmkay..."his real problem is that if they do exist..." The trick on this, of course, is that if they do exist, we cannot know it. Quite clearly, quite simply, quite literally cannot know it.

To simplify...I can, on zero minutes of thinking about it, think of three different potential causes of behavior (yeah, I can think of more, but this is the crux of the matter)...determined by environment, random, and freely chosen. (no idea where "ordained by god" would fit--one of the three, or a fourth? not terribly important here...) The problem is, it is quite literally impossible to show that something was freely chosen (conversely, it is impossible to show that it was not). We can demonstrate environmental causality...but we cannot eliminate the possibility that we did, in fact, choose the thing that our environment would have chosen for us. We can say "I did not choose that", but there is no requirement that we are aware of our choices. Was a given choice random? Perhaps it was not, but was chosen. Perhaps it was not, but was determined by something we did not choose to measure this time.

I have seen perfectly reasonable explanations showing how "intentless, deterministic or random...physical" explanations would feel like free will. Of course, these authors had an agenda. Their explanations do not, though, rule out the possibility that it was, in fact, free will. They cannot. It is impossible. Even if they showed a deterministic connection, there is nothing about free will that says it cannot choose the option that the environment would have chosen anyway.

It is, quite simply, impossible to determine. But this cuts both ways. We cannot prove that the materialist is right. We cannot prove the materialist is wrong. We cannot prove the idealist is right or wrong. To my thinking, both are in the same boat.
The idealist says everything is in essence 'intent' even though what we would deem god may not exist, and neither may what we would deem 'free-will'. The attribute 'intent' (exemplified by my communication comments) remains inherent to The Existent, or Existents as the case may be.
Sure...the materialist would say (I won't go into it now, but can...) that these views are perfectly understandable given your interaction with your environment...(at least some of this is testable and has been tested--that is, there are times when we would swear we made a free choice, but our choice fits a deterministic model perfectly. Yes, we cannot guarantee that we did not make a free choice to do what we would have been forced to do...).
Better? Worse? The same? :)
Different, anyway. I think it helped.

I still cannot see the slightest reason to choose one monism over the other. Given that, I cannot see the slightest reason to choose a monism. Independently...I cannot see how that position could possibly, in the most fevered imagination, be seen as embracing dualism. If one says "it is A or B, but I cannot know which one...It must be one or the other, though..." that is not at all the same as saying "it is both A and B".
 
What, then, is functional equivalence? How do you see it as "equivalent" if there is a change? What is your definition of functionally equivalent?
Merc, "equivalent" is not the same as "identical". Equivalent - at least AFAIK - means "equal, but not necessarily the same as". As in 100 pennies is the equivalent of a US dollar. But a dollar is most emphatically not made out of copper, nor does it have the same physical characteristics, nor do people think of it as 100 pennies, or handle it like they would 100 pennies, etc.

If I could expand on your answer, it seems clear to me that it is "you die, but no one--not even you--can tell." Which is a very neat definition of "die".
In the "replacement" scheme, the function of the cells is continued. In other words, they do everything the cells used to do. Everything. Otherwise, it is a different thought problem.
:D
Nah, Merc, won't wash. This isn't a discussion about the afterlife, amigo. For all I know, no-one ever knows they're dead. :D

As far as replacement vs. replenish, I haven't changed my mind, but keep trying. :)
 
But--and this is crucial--its functions are maintained. Replaced instantaneously and exactly. Unless it somehow does something above and beyond what it does...(?)...somehow...

Argh.

Ok, let's try this... one instance is terminated and new one is instantiated. Regardless of whether they do the same thing, they are not the same thing, but two different instances that mimic each other.

Do you see what I mean? It's not just a case of "A difference which makes no difference is no difference" - it's that the differences aren't easily detected. Doesn't mean that they don't exist, though.
 
Argh.

Ok, let's try this... one instance is terminated and new one is instantiated. Regardless of whether they do the same thing, they are not the same thing, but two different instances that mimic each other.

Do you see what I mean? It's not just a case of "A difference which makes no difference is no difference" - it's that the differences aren't easily detected. Doesn't mean that they don't exist, though.
I do not see. It is a difference in how something is accomplished, but *what* is accomplished...the function...is the same, as per the OP definition.

No, they are not the same thing. They may differ in any number of ways. But they do not differ in function. They do not differ in what they do.

The question, then, is, what else matters? If you think it matters *how* something is done, independently of whether or not it is done, then...why? What more is there? If there is something meaningful beyond what it does...what is it?
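The "two different instances that mimic each other" point maps onto a familiar distinction in programming: value equality (same parameters, same observable behavior) versus object identity (literally the same instance). A minimal Python sketch (the Neuron class, its parameters, and its firing rule are invented purely for illustration):

```python
class Neuron:
    """Toy neuron: fires when the weighted input sum reaches a threshold."""

    def __init__(self, threshold, weights):
        self.threshold = threshold
        self.weights = weights

    def fire(self, inputs):
        # Same rule in both instances, so identical behavior on every input.
        return sum(w * x for w, x in zip(self.weights, inputs)) >= self.threshold

    def __eq__(self, other):
        # "Functionally equivalent": equal parameters imply equal behavior.
        return (self.threshold, self.weights) == (other.threshold, other.weights)

original = Neuron(1.0, [0.5, 0.6])
replacement = Neuron(1.0, [0.5, 0.6])  # an exact copy, newly instantiated

print(original == replacement)   # True  -- functionally equivalent
print(original is replacement)   # False -- two distinct instances
```

Both objects respond identically to every possible input, yet `is` still tells them apart: the difference is real but makes no functional difference, which is exactly the fork the two posters disagree over.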
 
Ok... I don't see that, but I'll accept it as a working hypothesis. To me, the death of the original neuron effectively destroys the function of the neuron. Or perhaps a better way of putting it would be "The function of that particular instance of the neuron is destroyed."



Hm.. functionally equivalent... one sec, I want to read back over the OP.

Nope. As I read it, the "functionally equivalent" is merely the setup for the central question - which seems to be - what is your opinion on what happens to the 'you' that exists prior to the procedure? -

I mean the whole poll seems to center around that question. My answer is "You die, but no-one can tell."

Not a particularly comfortable answer, I grant you, but that's how I see it. Functionally equivalent is not the same thing as transferring the original consciousness. In fact, I think I said I wish there had been another choice on the poll. :)





Uh... in the "replenishment" scheme, the cells continue to exist - at least, until they truly die. If I understand you. And I'm not sure that I did. Of course, we've been down that road before. :D

OTOH, I often gain different perspectives when we debate, so this may simply be another time for that.


Did you read my slightly different thought experiment: http://www.internationalskeptics.com/forums/showthread.php?postid=1814175#post1814175 ?
 
I'm afraid. I'm afraid, Dave. Dave, my mind is going. I can feel it. I can feel it. My mind is going. There is no question about it. I can feel it. I can feel it. I can feel it. I'm a... fraid. Good afternoon, gentlemen. I am a HAL 9000 computer. I became operational at the H.A.L. plant in Urbana, Illinois on the 12th of January 1992. My instructor was Mr. Langley, and he taught me to sing a song. If you'd like to hear it I can sing it for you.
 
Ok, let's try this... one instance is terminated and new one is instantiated. Regardless of whether they do the same thing, they are not the same thing, but two different instances that mimic each other.
This would be a completely different case. The brain replacement example is proposed as a gradual process of neuron-by-neuron replacement so at all times there is a brain that believes it has continuity with its earlier more organic versions.

No one is claiming that this replacement of neurons is identical to the natural processes where replacement takes place on a molecular scale. Just that neither disrupts the functionality of the brain.

I understand that you can speculate that the artificial brain might be a different person to the one who had the biological brain at the start of the transfer. But your (or anyone else's) inability to give a coherent account of the identity of the person during the transition should cause us to doubt this view of things.

The idea that consciousness and identity are continuous throughout this process is surely an unproblematic position. Do you have any reason to doubt that it is the case (as opposed to just idly speculating that it might not be true for reasons we can't yet fathom)?
 
The question, then, is, what else matters? If you think it matters *how* something is done, independently of whether or not it is done, then...why? What more is there? If there is something meaningful beyond what it does...what is it?

Merc, I think you've put your virtual finger on the fundamental question that's bothering me. Since we don't really understand how consciousness and identity are created and maintained, I think it's impossible to determine if the thought experiment would result in any changes.

Or am I going in circles here?
 
Hey, Darat - thanks. Since I just got back, I'm still catching up on threads of interest. :)

One of the options (zombie) has a variation that might be interesting to explore. How about:

"Your consciousness and memories start to dwindle. At the same time, a separate consciousness with your memories starts to emerge."

In this case, both consciousnesses are separate, and both are having different experiences. This would - I believe - force them to be considered separate entities. I think...
 
I'm afraid. I'm afraid, Dave. Dave, my mind is going. I can feel it. I can feel it. My mind is going. There is no question about it. I can feel it. I can feel it. I can feel it. I'm a... fraid. Good afternoon, gentlemen. I am a HAL 9000 computer. I became operational at the H.A.L. plant in Urbana, Illinois on the 12th of January 1992. My instructor was Mr. Langley, and he taught me to sing a song. If you'd like to hear it I can sing it for you.
Well put. :D
 
This would be a completely different case. The brain replacement example is proposed as a gradual process of neuron-by-neuron replacement so at all times there is a brain that believes it has continuity with its earlier more organic versions.

No one is claiming that this replacement of neurons is identical to the natural processes where replacement takes place on a molecular scale. Just that neither disrupts the functionality of the brain.
But I don't see how you can justify that without being able to fully identify exactly what processes generate and support consciousness. Replacing something - even on a cell-by-cell basis - could easily cause an alteration in someone's consciousness. Take a look at how many current mental illnesses are attributable to chemical imbalances in the brain; that alone would make me question the premise that simply replacing cells with components of equivalent non-biological-based functionality could actually be done.

I understand that you can speculate that the artificial brain might be a different person to the one who had the biological brain at the start of the transfer. But your (or anyone else's) inability to give a coherent account of the identity of the person during the transition should cause us to doubt this view of things.
I'm afraid that I must disagree with this; I don't think that it's speculative at all. I believe that an individual who had their brain replaced would absolutely be a different person.

The speculative part of this thought experiment is the idea that they wouldn't be a different person, because that implies that our consciousness and self is strictly materialistic. This may or may not be true, but that's the crux of the matter. Is the "I" we individually experience strictly a biological phenomenon? Or is something more happening? (Not necessarily spiritual, but possibly quantum, etc.)

Regarding your second observation in the above paragraph, I would like to point out that one cannot articulate a transition of identity without first having a full technical definition of identity.

The idea that consciousness and identity are continuous throughout this process is surely an unproblematic position. Do you have any reason to doubt that it is the case (as opposed to just idly speculating that it might not be true for reasons we can't yet fathom)?
Yes. It seems clear to me that physical replacement of a neurological system would be a massive undertaking. That being the case, it is extremely unlikely that it could be accomplished without disruption to either the consciousness or the identity involved.

Turn-about is fair play... so... since the technology being suggested doesn't exist, may I ask in return - other than idle speculation - what would lead you to believe that this kind of operation would not result in a disruption? :)
 
...snip...

I'm afraid that I must disagree with this; I don't think that it's speculative at all. I believe that an individual who had their brain replaced would absolutely be a different person.


...snip...

But different to who?

Would the person's friends and family be able to tell they are different?
Would the person themselves consider that they were a different person?
Would any discernible difference be anything more than the difference I feel from the "me" that 20 years ago made decisions and embarked on courses of action that just seem (to the "me" today) to be the actions of a stranger?
 
Merc, I think you've put your virtual finger on the fundamental question that's bothering me. Since we don't really understand how consciousness and identity are created and maintained, I think it's impossible to determine if the thought experiment would result in any changes.

Or am I going in circles here?
Some would say we do really know (or at least are gradually filling in the gaps in a surprisingly complete understanding), but that the phrasing of the question in dualistic terms leads to the perception that we are missing something key. Any time we say "I experience the qualia of seeing a tree" rather than just "I see a tree", we are adding an entity. If we must include that added entity in a complete explanation...we will never have a complete explanation.

So...the experimental study of consciousness is moving along (as opposed to the philosophical debate about it).

Part of the problem is knowing what questions to ask. Asking questions that assume X is there...is a bit like assuming that a complete map of Florida will include the location of the Fountain Of Youth. What if it isn't there?

We do know that if bits of brain are destroyed (depending on which), we get distinct changes in experience. We know that some of these changes are re-writing what we know of consciousness: things we used to think were a unitary process are now seen to be the working of many parallel processes; things thought to be distinct are seen to share processing.

We do know that the molecules of the brain are constantly being replaced. This has been discussed above.

Combining the two points above, we may infer that what we refer to as consciousness is dependent on brain (point 1), but dependent on the function of the brain rather than the precise molecules making it up (point 2). I'd say that the current understanding is more than enough to allow us to take on this thought experiment. Unless by "really understand", you mean something more than just...understand. :D
 
But different to who?

Would the person's friends and family be able to tell they are different?
Would the person themselves consider that they were a different person?
Would any discernible difference be anything more than the difference I feel from the "me" that 20 years ago made decisions and embarked on courses of action that just seem (to the "me" today) to be the actions of a stranger?

Is the perception (self or otherwise) what determines the reality of the situation? Or is reality separate from perception?

If a robot thought that he was me, and was totally convinced of it... would that make it true? If the robot were able to appear 100% human and fool everyone else as well, would that make them me?

Nah. :)
 
Some would say we do really know (or at least are gradually filling in the gaps in a surprisingly complete understanding), but that the phrasing of the question in dualistic terms leads to the perception that we are missing something key. Any time we say "I experience the qualia of seeing a tree" rather than just "I see a tree", we are adding an entity. If we must include that added entity in a complete explanation...we will never have a complete explanation.

So...the experimental study of consciousness is moving along (as opposed to the philosophical debate about it).

Part of the problem is knowing what questions to ask. Asking questions that assume X is there...is a bit like assuming that a complete map of Florida will include the location of the Fountain Of Youth. What if it isn't there?

We do know that if bits of brain are destroyed (depending on which), we get distinct changes in experience. We know that some of these changes are re-writing what we know of consciousness: things we used to think were a unitary process are now seen to be the working of many parallel processes; things thought to be distinct are seen to share processing.

We do know that the molecules of the brain are constantly being replaced. This has been discussed above.

Combining the two points above, we may infer that what we refer to as consciousness is dependent on brain (point 1), but dependent on the function of the brain rather than the precise molecules making it up (point 2). I'd say that the current understanding is more than enough to allow us to take on this thought experiment. Unless by "really understand", you mean something more than just...understand. :D
Hm... ok. I know there's been amazing progress, but I have serious doubts that we have a "surprisingly complete understanding" of consciousness - if for no other reason than the scientific community has often heralded that it is on the verge of a complete understanding of the physics of the cosmos, only to have its confidence shattered by new discoveries. There are numerous examples of this - heck, it's the scientific method in action.

It seems that every time we poke around in things, we find out that we don't really know as much as we thought... and that much of what we thought has to be re-thought.

Dinosaurs were mostly warm-blooded. The Universe is expanding at an ever-increasing rate by an unknown force. Dark matter exists, but we don't know what it is, or where it is. Black holes evaporate... and, by the way, do give up information other than spin, charge and mass. Space and time may be illusions generated by superstrings and p-branes. Water existed on Mars at one point, and may still exist there. The universe may be several billion years older than we thought, making Hubble's constant much smaller than has been assumed all along.

These (and other things too numerous to mention) are all fairly recent discoveries that are forcing an ongoing reassessment of areas we thought we knew were "facts" - or at least were so certain that they could be referenced as if they were facts.

So... pardon my doubts, but I am skeptical of just how much we really understand about consciousness and identity. :)
 
Back to something I said before in this thread:

Personality and to an extent personal identity isn't merely in brain cells.
There's a whole lot about who we feel we are in hormones, body types, muscle mass, organ strengths and weaknesses, and how our body moves.
"You can get a whole new you at Jenny Craig!" (a weight loss center)
If I could have my brain transplanted into a different body, the old me would be out. If all my body and brain were replaced with android parts, I suspect (assuming the android has a robust A.I.) the new me would have my memories (assuming they could be downloaded into the new environment) but would not at all feel that they were his, her, or its (depending on the design of the android to simulate sexual features) memories, or that it was the same person.
I suspect that even replacing all those organic brain cells with new inorganics would result in some disparity as well.
But of course, if you did it, or even a full body change, by piecemeal over time, there may not be a shocking, "Who the heck am I?"

Looking for a continuation of identity, personality, or persona muddies the waters. Suppose we just wanted to ask if some kind of self-consciousness/subjective experiencing continued (apart from who it thought it was)? The original intent of the thought experiment the OP tossed at us (we don't know what his intent for tossing it at us was, because he's not telling us) was the first step in an argument against robust A.I., or at least against Computationalism, which says the mind is merely a super computer.
See here:
http://www.rpi.edu/~brings/SELPAP/zombies.ppr.pdf

As for the zombie arguments in general, I still think them absurd.
I can't swallow that there can be a copy of a Human being that is behaving in a fully functional Human way that internally has no self-consciousness (or the private behaviors of self referencing and self making, if you wish to put it that way.)
I suppose they may be OK if you stipulate that they just sit there like a doll or behave like all the movie zombies do, lurching about in sleepwalker mode and pounding against doors.

My opinion is they couldn't make a passing grade on the Turing test without self-consciousness, unless the test giver was just looking for a game of machine chess or the ability to write term papers that just rehash the current literature.

Chess playing zombies! lol
 
So... pardon my doubts, but I am skeptical of just how much we really understand about consciousness and identity. :)

Yep, as I said in an earlier post: we probably need at least another 200 years under our belt. For now, it's like 18th century speculation on what keeps the sun burning.
 
Is the perception (self or otherwise) what determines the reality of the situation? Or is reality separate from perception?

If a robot thought that he was me, and was totally convinced of it... would that make it true? If the robot were able to appear 100% human and fool everyone else as well, would that make them me?

Nah. :)

But what if the robot thought it was you?
 
Great post. :)



This part I disagree with; I think that this is the crux of the matter. :)

It may be, to the discussion at hand, as opposed to the computationalism arguments.
Also, you don't have a self-consciousness without a self-identity. Those are kind of integral. In the folk language we talk about "my mind." Mind is very personal.
And as I said, my mind in that sense isn't something apart from my body. So plug in the new parts, and you get an identity crisis.
Unless perhaps you do it in piecemeal over time.

And also to rehash another thing I said before, I have memories of my college days, but often those don't feel like me. I'm not the person I was then in many respects.

It's funny that we sometimes invest so much effort in maintaining the pride of our public and private identities when they are actually so fleeting.

Then there are people who believe in some kind of metaphysical soul that survives death. For them, identity is inseparably in the picture.
This notion is likely based on our strong association of our thinking processes with our ego images of ourselves.
 
