• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you still see problems, let me know.

The Zombie Poll

What happens?

  • Smooth as silk
    Votes: 56 (60.9%)
  • Zombie
    Votes: 10 (10.9%)
  • Curare
    Votes: 3 (3.3%)
  • I really don't know
    Votes: 11 (12.0%)
  • Lifegazer is a zombie from Planet X
    Votes: 12 (13.0%)

  Total voters: 92
You left out an obvious possibility in #2 -- namely, that a new consciousness may be born in the replacement of the neurons with silicon. While you may notice that your "mind" dwindles to nothing but your behavior remains "normal", who is to say that the reason your behavior remains "normal" isn't that the silicon creates a new consciousness? So it isn't even clear that a zombie would necessarily result. I think you need four options.

Do you mean a new identity? I thought of that one also. Bob isn't Bob these days. He's Bobiod.

Added thought:

Would having all the memories of Bob constitute having the identity of Bob?

This gets back to what I was saying about hormones and the like. I kinda doubt a totally rebuilt Bob, body and brain, with inorganic parts, would be the same old Bob we know and love. And he, himself, might find a rift between what he was and is now, so much that he wouldn't be able to think of himself as the old Bob.

But this is probably outside the simple intent of the thought experiment. It's about self-consciousness or subjective experience and whether it would translate.
 
Another question that I just thought of (sorry if I'm slow) is: would it be ethical? As we understand it, a newborn brain (in a baby) has a mind. This would imply that if we created an artificial brain, it would have its own consciousness already. This brings the problem that even if we can transfer the human mind, anything we can transfer it to will already be a separate entity.
 
....snip...

And he, himself, might find a rift between what he was and is now, so much that he wouldn't be able to think of himself as the old Bob.

...snip...

An interesting point - at least for me this has already happened - I do not consider myself to be the same "I" as I was say 20 years ago. I have many memories (or think I do ;) ) of that time but they are not what "I" of today would have done, and in some of them I cannot understand at all why I reacted as I did - my memories may as well be of a stranger.
 
Another question that I just thought of (sorry if I'm slow) is: would it be ethical? As we understand it, a newborn brain (in a baby) has a mind. This would imply that if we created an artificial brain, it would have its own consciousness already. This brings the problem that even if we can transfer the human mind, anything we can transfer it to will already be a separate entity.

I got the impression from the OP that this was to be a piecemeal replacement process. You're right: if you download into a fully finished and functional brain, might it already have a self-consciousness of its own?
 
An interesting point - at least for me this has already happened - I do not consider myself to be the same "I" as I was say 20 years ago. I have many memories (or think I do ;) ) of that time but they are not what "I" of today would have done, and in some of them I cannot understand at all why I reacted as I did - my memories may as well be of a stranger.
Indeed, the individual molecules composing the cells of your nervous system are very unlikely to be the same ones as 20 years ago...you are obviously Darat V.2.0 or more...
 
As we understand it, a newborn brain (in a baby) has a mind.
Who is this "we" you speak of?

I remember showing off my brand new baby son to a colleague of mine, a neurobiologist. He remarked something (I wish I could remember) about how developed the brain was, how my son was running on about goldfish level. Soon, very soon, he would be at lizard level, then rat...

Anyway, my point is that there is not at all a clear consensus that a newborn baby has a mind at all, let alone one that is recognizable as the same sort of mind as adults have. This is mostly because there is not a clear consensus on what a "mind" is and what it does, plus the limits of our understanding of the experiential capabilities of newborns. The latter knowledge base grows tremendously each year; the former (what is a mind?)... the arguments there generate more heat than light.
 
Do you mean a new identity?

I'm not sure that I would use that word, but basically, yes. One possibility is a different "mind".

In reality, I don't think it would work. You couldn't replace neurons with silicon chips because of the radically different ways that they function. There would simply be no way for the information from the "old brain" to interact with the "new brain" since silicon chips do not interact through neurotransmitters.

As far as the memory issue, I think that identity has more components than simply memory, but memory is certainly a very big part of it. One of the big things missing in computers, and why they do not seem to have an identity, is emotion and motivation. A computer with a motivational and emotional system would not act like us, but it might be interesting to see how such an entity would interact with humans and whether we would see consciousness in there.
 
Indeed, the individual molecules composing the cells of your nervous system are very unlikely to be the same ones as 20 years ago...you are obviously Darat V.2.0 or more...

No that can't be true because then that would mean there was nothing "special" about each of the water molecules in my body compared to the water molecules in my glass of water I am about to drink. And that can't be true since I am alive and life is something that is totally different to everything else!!!!
 
I'm not sure that I would use that word, but basically, yes. One possibility is a different "mind".

In reality, I don't think it would work. You couldn't replace neurons with silicon chips because of the radically different ways that they function. There would simply be no way for the information from the "old brain" to interact with the "new brain" since silicon chips do not interact through neurotransmitters.

This would be the unstated option D. You just can't replace the brain of a living organism with inorganic parts. The process of replacement would, in the end, just be a dismantling. Dead brain and no consciousness.

How many takers would we have for the option D?
 
Wasp said:
In reality, I don't think it would work. You couldn't replace neurons with silicon chips because of the radically different ways that they function. There would simply be no way for the information from the "old brain" to interact with the "new brain" since silicon chips do not interact through neurotransmitters.
I think we're supposed to assume some new electrosiliconeurobioglop technology solves this problem.

~~ Paul
 
I think we're supposed to assume some new electrosiliconeurobioglop technology solves this problem.

~~ Paul

I'd agree. I don't think the underlying implementation is what is necessary for this thought experiment. That's why I generalized the problem. It's the philosophical problem of whether consciousness arises merely from physical processes. If so, it is merely a technological challenge to replace the brain.

Beerina:

Sadly, I am not well read on this subject. I also feel that computer geeks have probably taken the information-processing model of the universe too far, akin to the clockwork universe model. However, since information processing occurs in physical systems, how does my generalized black box require something more? I don't care if the computer model is binary digital, or quantum mechanical, or exchanged proteins, or whatever! I'm saying that for some finite energy input (food, electrical, etc.) into a properly designed physical system (animal, computer, etc.), self-consciousness results.

That's the hypothesis, as far as I understand it, of this thought experiment. Philosophy if you ask me...
 
I'd agree. I don't think the underlying implementation is what is necessary for this thought experiment. That's why I generalized the problem. It's the philosophical problem of whether consciousness arises merely from physical processes. If so, it is merely a technological challenge to replace the brain.

Look for a subtler use as well.
 
Look for a subtler use as well.

Oh, sorry. I meant, "If your objective is to replace the brain, then if this hypothesis is true, the problem is reduced from philosophical to merely technological (no matter how complex)."

I don't use this for the basis of my stance on the soul, or anything else. Moreover, I don't buy the lack of a material soul (should it be proven) to be direct evidence against God.
 
No that can't be true because then that would mean there was nothing "special" about each of the water molecules in my body compared to the water molecules in my glass of water I am about to drink. And that can't be true since I am alive and life is something that is totally different to everything else!!!!
Strange... I am, in a very real sense, a different person than I was when I clicked on this thread to check up on it. In part, I am different in that now I know I am different because of the new molecules in my body I acquired by eating ice cream. Who would have thought that it was that simple?



...besides Darat, I mean.
 
I think we're supposed to assume some new electrosiliconeurobioglop technology solves this problem.

Oh, I know. I was just offering my opinion of the technical possibility in the real world, which I think is nil.

If you could get some magical substance to function exactly like a neuron (it couldn't be anything like a silicon chip as used in computers currently), then my guess would be "no sweat". Essentially that would amount to switching to a new building material while maintaining the same form -- the old "Ship of Theseus" problem, as you guys are currently discussing (though I like the ice cream wrinkle). It is the relationships that are important rather than the precise building material.

There is simply no way to rule out the other possibilities on logical grounds, so we are left with competing possibilities that cannot be resolved on logical grounds alone. Searle starts to veer off into weird territory with his "ontological subjectivity" argument, which he builds off of this scenario.
 
Merc said:
Strange... I am, in a very real sense, a different person than I was when I clicked on this thread to check up on it. In part, I am different in that now I know I am different because of the new molecules in my body I acquired by eating ice cream. Who would have thought that it was that simple?
Come on man, everyone knows that ice cream is a life-changing experience. That's why I eat some every evening from May through September.

~~ Paul
 