• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you still see problems, let me know.

The Zombie Poll

What happens?

  • Smooth as silk

    Votes: 56 60.9%
  • Zombie

    Votes: 10 10.9%
  • Curare

    Votes: 3 3.3%
  • I really don't know

    Votes: 11 12.0%
  • Lifegazer is a zombie from Planet X

    Votes: 12 13.0%

  • Total voters
    92

UndercoverElephant

Pachyderm of a Thousand Faces
Joined
Jan 17, 2002
Messages
9,058
This is an offshoot of the thread called "materialists....."

http://www.internationalskeptics.com/forums/showthread.php?t=57934

I am not going to take part in this thread.

If anyone wants some background on this question and the implications then the thread above has plenty, spread over 3000 posts. For a more concise version go here:

http://philosophy.fas.nyu.edu/docs/IO/1172/conceiving.pdf

for an overview

and here:

http://www.rpi.edu/~brings/SELPAP/zombies.ppr.pdf

for the main zombie stuff

Imagine you are taking part in a revolutionary experiment. Science has discovered how to electronically replicate the exact behaviour of an individual neuron. In the experiment, each of your neurons is going to be replaced by an electronic component. Eventually, the surgeons will entirely replace your brain with electronics which carries out precisely the same function.
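The premise of the thought experiment is purely functional: the replacement must reproduce the neuron's input/output behaviour exactly. As a toy illustration only (a real neuron is vastly more complex, and the leaky integrate-and-fire model and all names here are hypothetical, not anything from the experiment itself), here is a sketch of what "precisely the same function" means:

```python
class BiologicalNeuron:
    """Toy leaky integrate-and-fire model: integrate input, leak, spike at threshold."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold
        self.leak = leak
        self.potential = 0.0

    def step(self, current):
        # Decay the membrane potential, then add the incoming current.
        self.potential = self.potential * self.leak + current
        if self.potential >= self.threshold:
            self.potential = 0.0   # reset after firing
            return 1               # spike
        return 0                   # no spike

class ElectronicReplacement(BiologicalNeuron):
    """Implements exactly the same update rule, so its observable
    behaviour (spike train) is identical for any input sequence."""
    pass

inputs = [0.3, 0.4, 0.5, 0.1, 0.9, 0.2, 0.6]
original = BiologicalNeuron()
replacement = ElectronicReplacement()
spikes_a = [original.step(i) for i in inputs]
spikes_b = [replacement.step(i) for i in inputs]
assert spikes_a == spikes_b  # behaviourally indistinguishable from outside
```

The whole philosophical question, of course, is whether matching the input/output behaviour in this way preserves anything beyond behaviour.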

One of three things will happen. Which do you think it is?

A) Nothing. As your neurons get replaced with electronics you remain fully conscious, and nothing of note changes. At the end of it, you are a human with an electronic brain and nobody would notice the difference from the outside, and you yourself notice no difference from the inside.

B) All goes dark inside.

C) You .....

From the second text

A) The Smooth-as-Silk Variation: The complete silicon replacement of your flesh-and-blood brain works like a charm: same mental life, same sensorimotor capacities, etc.

B) The Zombie Variation: As the silicon is progressively implanted into your dwindling brain, you find that the area of your conscious experience is shrinking, but that this shows no effect on your external behavior. You find, to your total amazement, that you are indeed losing control of your external behavior. [You have become blind, but] you hear your voice saying, in a way that is completely out of your control, "I see a red object in front of me." We imagine that your conscious experience slowly shrinks to nothing, while your externally observable behavior remains the same.

C) The Curare Variation: Your body becomes paralyzed and the doctors, to your horror, give you up for dead.
 
Last edited:
Geoff said:
Imagine you are taking part in a revolutionary experiment. Science has discovered how to electronically replicate the exact behaviour of an individual neuron. In the experiment, each of your neurons is going to be replaced by an electronic component. Eventually, the surgeons will entirely replace your brain with electronics which carries out precisely the same function.
Precisely the same? Including any quantum mechanical effects that may turn out to be relevant? Including any yet-to-be-discovered fifth Mental Force?

Okay.

~~ Paul
 
First off, I don't know. Enter the grand world of A.I. speculation. I've read pro and con, and like any sci-fi lover, I prefer the pro.

If we were making this speculative inquiry, gedanken or whatever, 100 years ago, it would be "your brain cells are replaced by tiny machines." Fortunately we're not, because machines just wouldn't hack it. I'm doubtful that ickle electronic components would do the job either. We may well be looking for something beyond the electronics age here. Perhaps we should wait 100 years, except we don't have that luxury. But since I'm used to stories with scientific-sounding future-tech words in them, I'll posit some kind of future tech that can successfully replace neurons, and not just neurons but hormones, and not just hormones and neurons but organs functioning organically. My self isn't just the grey matter of my brain and the central nervous system. There are a whole lot of other biochemical chains of communication going on. It is my whole body that selfs.

OK, given this robust, science-fictional future tech that knows how intelligence works at the subcellular level, my preference is that the replacements could be made and all would be business as usual, including my self-awareness and my subjectivity.

My preference, of course, because I'm like someone 200 years ago speculating on how stars shine and having no clue whatsoever.
 
Whole body? So if I cut off your hand, you would lose a bit of your consciousness? How about a foot?

I only mention this because I've had a tooth removed before, and I'm as much "me" as I ever was.
 
Last edited:
Whole body? So if I cut off your hand, you would lose a bit of your consciousness? How about a foot?

I only mention this because I've had a tooth removed before, and I'm as much "me" as I ever was.

Not to say that my big toe is as crucial a player as my brain, but that my being a person is not so simple as to be attributed to only portions of my central nervous system.

Let's not make things complicated, though. Self-consciousness is in the pre-frontal lobes of the brain, so we'll start mucking about there.
And if, 200 years from now, we understand what consciousness is and have the right replacement tech, perhaps our patient won't notice any loss to his selfhood.

But again: the Sun keeps shining because asteroids full of coal keep falling into it.
 
Last edited:
I don't think this is so much a question of A.I., or zombies, as of the nature of consciousness, of which, as yet, we have no idea.

I don't really want to contribute my opinion on a topic of total speculation, but I don't have any problem pointing to others'.

http://en.wikipedia.org/wiki/Daniel_Dennett
&
http://www.everything2.com/index.pl?node=Consciousness Explained&lastnode_id=49522
&
http://en.wikipedia.org/wiki/Consciousness_Explained

Yep! Let's reconvene in 200 years.
 
Smooth as silk.

The way I see it, what I call my "self", my "consciousness", is rebuilt every morning when my brain "reboots" and I wake up.

OH! Maybe I am already a P-zombie! But wait, if I can recognize I am one, then...

Lifegazer is the real zombie?
 
I thought today I might go for B, or the unincluded D (I die because there's no way to interface life with the artificial replacement parts). (Maybe in 200 years we'll catch up with Dr. Frankenstein.)

Any more takers for the poll?
Burned out on zombies et al?
UndercoverElephant's not posting in this thread, so that takes the fun out of it for those who want to argue and rant.
I wonder if anyone is avoiding this thread just because of where the thought experiment is bound: an unwanted destination.
 
I tried joining that debate by asking for some evidence for Undercover Elephant's statement that the brain and mind are obviously not the same thing. Since he refused to answer this question I can only assume that there is no basis for this argument.

The original point was that it is obvious that everyone perceives the colour red in the same way, yet no-one has isolated the brain function that correlates to the experience of seeing red, therefore brain and mind are not the same. I can think of many cases where brain damage has changed people's consciousness, which strongly points to the mind having a physical origin. In addition, red-green colourblindness proves that not all people experience red in the same way, and therefore the initial statement is clearly wrong.
 
Instead of neurons (because people have suggested we don't know everything about, say, their connections), or the dependence on brain structure, let's generalize a bit:


Energy >---- Black Box Function ----< Conscious Self :D

Seriously, if that turns out to be the ultimate model for animal consciousness, then any function which has the identical inputs and outputs will suffice. As Hyparxis said, I first ran across this in relation to A.I.
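The black-box claim above (that any function with identical inputs and outputs would suffice) can be sketched very simply. The two "boxes" below are internally completely different, yet indistinguishable to any outside observer; all names are mine for illustration, not from the thread:

```python
def computed_box(stimulus):
    """Applies a rule: responds with the square of the stimulus."""
    return stimulus * stimulus

# A lookup table that merely memorises the same mapping over its domain.
_table = {s: s * s for s in range(10)}

def lookup_box(stimulus):
    """No computation at all, just rote recall, yet identical behaviour."""
    return _table[stimulus]

# From the outside, the two boxes cannot be told apart on this domain.
assert all(computed_box(s) == lookup_box(s) for s in range(10))
```

Whether behavioural equivalence of this kind is enough for consciousness is, of course, exactly what the poll is asking.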

I've also seen claims that a genuine demonstration of A.I. would have serious implications for the lack of necessity for the soul and maybe for God. :eek:

Unless and until we build a functioning A.I. that fools everybody, this is going to remain a philosophical debate. After that point, it will be merely technological.

ETA: I voted smooth as silk. I guess this makes me a materialist or summat :confused:
 
Last edited:
I tried joining that debate by asking for some evidence for Undercover Elephant's statement that the brain and mind are obviously not the same thing. Since he refused to answer this question I can only assume that there is no basis for this argument.

The original point was that it is obvious that everyone perceives the colour red in the same way, yet no-one has isolated the brain function that correlates to the experience of seeing red, therefore brain and mind are not the same. I can think of many cases where brain damage has changed people's consciousness, which strongly points to the mind having a physical origin. In addition, red-green colourblindness proves that not all people experience red in the same way, and therefore the initial statement is clearly wrong.

Forget that thread. It was a train wreck. What's your vote on the above?
 
Instead of neurons (because people have suggested we don't know everything about, say, their connections), or the dependence on brain structure, let's generalize a bit:


Energy >---- Black Box Function ----< Conscious Self :D

Seriously, if that turns out to be the ultimate model for animal consciousness, then any function which has the identical inputs and outputs will suffice. As Hyparxis said, I first ran across this in relation to A.I.

I've also seen claims that a genuine demonstration of A.I. would have serious implications for the lack of necessity for the soul and maybe for God. :eek:

Unless and until we build a functioning A.I. that fools everybody, this is going to remain a philosophical debate. After that point, it will be merely technological.

If we can build what is, or can to all intents and purposes become, a soul, it snatches away another exclusively claimed property of the "Invisible Gardener." Another "gap" he could be hiding in would be closed.

Yes, in the meanwhile this is a philosophical discussion with a devious thought problem!
 
Instead of neurons (because people have suggested we don't know everything about, say, their connections), or the dependence on brain structure, let's generalize a bit:


Energy >---- Black Box Function ----< Conscious Self :D

Seriously, if that turns out to be the ultimate model for animal consciousness, then any function which has the identical inputs and outputs will suffice. As Hyparxis said, I first ran across this in relation to A.I.

Yes, but there's more to it than that. Inside the black box must be something more than just data processing. Your experiences are a physical phenomenon, and would not arise (as they are physical) purely out of information. This was Searle's argument, which I did not fully appreciate in my salad days.

We presume that the conscious experience arises somehow out of the particular interaction of atoms in our brains. Electrical wires, even though they transmit the same info, are not that type of configuration. Now, they may cause consciousness to arise, but that's far from guaranteed. Searle himself said that machines could be conscious the way humans are -- indeed, we are one such machine. There's no reason there couldn't be others.

But merely aping the electrical signals would not be enough.

Furthermore, I would suggest that, since we can talk about the "greenness of green", that, obviously, there is two-way feedback from this "mind" into the world "out there" (which makes sense since the mind is instantiated, we presume, from normal physics, even though we don't know how, yet.)

Hence evolution, in molding the brain/mind, used the perceptual experience in developing a decently-working brain/mind to deal with the world.

So if we replaced all the neurons in the brain with exactly equivalent electrical wires, and those wires did not give rise to a perceptually experiencing mind, the result would quite probably not function correctly, like a car engine with a key gear missing. It'll churn and clank, if it starts at all, and probably not drive the wheels.

Would it be able to talk about the redness of red? I doubt it. Would it get upset if it couldn't, wondering what was missing? The answer to these questions would depend on how much of the thinking happened as pure information processing, and how much happened as a result of the physical (never forget) phenomenon of conscious experience. Does the conscious experience of pain drive the organism? Or are there subconscious processes that drive the organism, while the "mind" just perceives the pain? Note that the latter would be pointless; hence we may assume evolution latched onto these conscious experiences to drive and motivate the organism.
 
You left out an obvious possibility in #2 -- namely that a new consciousness may be born in the replacement of the neurons with silicon. While you may notice that your "mind" dwindles to nothing while your behavior remains "normal", who is to say that the reason your behavior remains "normal" isn't that the silicon creates a new consciousness? So it isn't even clear that a zombie would necessarily result. I think you need four options.
 
You left out an obvious possibility in #2 -- namely that a new consciousness may be born in the replacement of the neurons with silicon. While you may notice that your "mind" dwindles to nothing while your behavior remains "normal", who is to say that the reason your behavior remains "normal" isn't that the silicon creates a new consciousness? So it isn't even clear that a zombie would necessarily result. I think you need four options.

Do you mean a new identity? I thought of that one also. Bob isn't Bob these days. He's Bobiod.
 
Forget that thread. it was a train wreck. What's your vote on the above?

I voted smooth as silk. I think the main problem with these arguments is that until someone either builds an exact replica of a human brain or proves one is impossible all arguments can only be based on personal belief.

Yes, but there's more to it than that. Inside the black box must be something more than just data processing. Your experiences are a physical phenomenon, and would not arise (as they are physical) purely out of information. This was Searle's argument, which I did not fully appreciate in my salad days.

We presume that the conscious experience arises somehow out of the particular interaction of atoms in our brains. Electrical wires, even though they transmit the same info, are not that type of configuration. Now, they may cause consciousness to arise, but that's far from guaranteed. Searle himself said that machines could be conscious the way humans are -- indeed, we are one such machine. There's no reason there couldn't be others.

But merely aping the electrical signals would not be enough.

Furthermore, I would suggest that, since we can talk about the "greenness of green", that, obviously, there is two-way feedback from this "mind" into the world "out there" (which makes sense since the mind is instantiated, we presume, from normal physics, even though we don't know how, yet.)

Hence evolution, in molding the brain/mind, used the perceptual experience in developing a decently-working brain/mind to deal with the world.

So if we replaced all the neurons in the brain with exactly equivalent electrical wires, and those wires did not give rise to a perceptually experiencing mind, the result would quite probably not function correctly, like a car engine with a key gear missing. It'll churn and clank, if it starts at all, and probably not drive the wheels.

Would it be able to talk about the redness of red? I doubt it. Would it get upset if it couldn't, wondering what was missing? The answer to these questions would depend on how much of the thinking happened as pure information processing, and how much happened as a result of the physical (never forget) phenomenon of conscious experience. Does the conscious experience of pain drive the organism? Or are there subconscious processes that drive the organism, while the "mind" just perceives the pain? Note that the latter would be pointless; hence we may assume evolution latched onto these conscious experiences to drive and motivate the organism.

I don't think he was referring to replacing neurons with wires. The idea of the black box is that we don't know what is inside; it could be a brain, or something that duplicates the workings of the brain. If we assume that it exactly mimics the brain then the output must be identical, which implies that at least the appearance of consciousness must be there to observers. Occam's Razor says that consciousness must therefore exist, since any scheme to exactly duplicate consciousness without consciousness actually existing would be exceedingly complicated, if not actually impossible.

The question then becomes "Is it possible to duplicate the workings of the human brain?". I am tempted to answer "yes" to this, but I can't really believe that any structure can be exactly imitated without being an exact copy, which would not answer the question. I do think that it will be possible to make a close enough approximation that consciousness will arise, even if that consciousness is not human. Of course, this then degenerates into a discussion of what may be possible in the future, which again is based on personal beliefs.
 
...snip...

The question then becomes "Is it possible to duplicate the workings of the human brain?". I am tempted to answer "yes" to this, but I can't really believe that any structure can be exactly imitated without being an exact copy, which would not answer the question. I do think that it will be possible to make a close enough approximation that consciousness will arise, even if that consciousness is not human. Of course, this then degenerates into a discussion of what may be possible in the future, which again is based on personal beliefs.

I've voted "I don't know" for pretty much the reasons you've stated.
 
