Resolution of Transporter Problem

That isn't what I said. I am not going to play into your debate style of "agreeing" to something and then redefining what you are agreeing with as if I said it in the first place.

I said that if you interfere with the input from sensory neurons you change what someone sees. You can do that without changing the environment at all.

Well, I think Merc raises an interesting point if one considers, say, O'Regan's "Sensorimotor Theory of Vision." If we take the position that the "stream of consciousness" concept simply does not exist in actuality, but rather that the brain creates constantly updated, fleeting impressions which appear to take the form of a stream, then I think one might validly ask whether this makes a difference to the outcome of the Teletransporter thought experiment.

Personally, I don't consider that it does, though I think one needs to reflect on whether there are precedents for having such a total and instantaneous shift in phenomenology take place. Could it be too great a shock for short-term memory, for example, given that its contents would be recreated on teleportation? But if you're in a dark telepod with constant noise and are recreated in the same, I don't see it as a real problem. There may be other, more subtle environmental factors that would need attention paid to them.

Nick
 
Again, I gave my conditions for stepping into the machine. Sounds not that far from yours. But my conditions recognize that, yes, the scenario results in killing a person. If by "resolving", you are arguing that you are not killing a person since another copy exists, then your resolution does not answer any interesting questions, and is just one more exercise in circularity.

It involves killing a body but not someone. There is a difference here, and assuming you appreciate that difference, it seems to me manipulative for you to play this "killing a person" card.

Nick
 
Nick227 said:
However, there is no actual "I." The word is merely a socially-necessary referent and there is nothing that it actually references. Thus it is not possible to "destroy an I" because the word does not refer to anything. So no "I's" are being destroyed because there are none anyway.
Why doesn't it refer to anything? Once we decide what constitutes an independent living organism, "I" refers to those attributes.

~~ Paul
 
A couple of personal problems with this scenario:

1) The idea of non-continuity in sleep. I know it's rare - probably very rare - but it is possible to stay lucid while falling asleep and dreaming. Conscious awareness for most people pauses (or, more appropriately, tunes out the senses) every night; but the few cases where it doesn't tune out should be enough to make people understand that the self remains continuous and dynamic even through sleep.

2) In spite of the popular materialist mantra that we are physically replaced in our entirety every X years, that's actually untrue. Apparently, the vast majority of our brain cells are, in fact, NOT replaced over our lifetimes, and there may even be apparent and significant effects upon our personas in those cases where brain cells ARE replaced. So, while I doubt we should read too much into this fact, it is possible that some element in those non-replaced brain cells is vitally responsible for our self-awareness, and we may not be able to duplicate (or replace) brain cells without losing self-awareness.

That's a bit of a stretch, but there it is.

3) In spite of thought experimenters (like Blackmore) trying to convince you to consider this experiment without considering the failure-case scenarios, the fact is that you have to consider those very scenarios to get a full picture of what is going on. Their mediocre attempts to get you to 'ignore the man behind the curtain' should tell any critical thinker that the position being expounded is weak in some way. I had this out with Interesting Ian.

The fact is, if the 'transporter' fails and two of you suddenly exist, or even if the potential for that to happen exists (which, obviously, it does), this proves that, no matter how you want to define it, you2 is emphatically NOT the same person as you, and you would walk into the transporter not to enjoy life on Saturn, but to die. Period, point blank, end of story.

The transporter is a duplicator with a built-in illusion - nothing more.

----

I can't really enumerate the remaining points, as I'm losing focus... :D

The twin argument should be pretty damning, too. Twins don't share experiences. They don't feel what each other feels (except in some unconfirmed, anecdotal cases). They don't (often) feel themselves to be two instances of the same person. Of course, even very similar identical twins are not fully, absolutely identical; but we've never seen evidence that a person shares more 'sense of self' the closer they are to another genetically similar person, after all.

The 'transporter' makes a twin for you at another spacetime location. How is that 'you' transporting? It's not. Congratulations, you've got a brother - not a new you.

If you have sex with your duplicate, is it masturbation? Homosexuality? Incest? Could you just imagine the laws that would appear governing that behavior?

Suppose the 'teletrans' did its job except - oops! Substituted one gender for another. Sorry, Bill, you're now a Betty. Try explaining that one to the wife.

I'm not claiming to be a materialist - I kind of have to support Ian's contention that the transporter would do nothing but create a dead lump of flesh at the receiving end. But from a materialist POV, it's still idiotic to think of this thing as anything but a murderous duplication ray.
 
A couple of personal problems with this scenario:

1) The idea of non-continuity in sleep. I know it's rare - probably very rare - but it is possible to stay lucid while falling asleep and dreaming. Conscious awareness for most people pauses (or, more appropriately, tunes out the senses) every night; but the few cases where it doesn't tune out should be enough to make people understand that the self remains continuous and dynamic even through sleep.

Except we know for a fact that there are discontinuities. Not everyone sleeps lucidly, and even those that do don't sleep lucidly every time they sleep. Consciousness is not continuous for anyone -- period.

2) In spite of the popular materialist mantra that we are physically replaced in our entirety every X years, that's actually untrue. Apparently, the vast majority of our brain cells are, in fact, NOT replaced over our lifetimes, and there may even be apparent and significant effects upon our personas in those cases where brain cells ARE replaced. So, while I doubt we should read too much into this fact, it is possible that some element in those non-replaced brain cells is vitally responsible for our self-awareness, and we may not be able to duplicate (or replace) brain cells without losing self-awareness.

That's a bit of a stretch, but there it is.

In other words, the assumption that consciousness is information might not be true. This doesn't change the validity of the argument in the OP, only its possible soundness.

3) In spite of thought experimenters (like Blackmore) trying to convince you to consider this experiment without considering the failure-case scenarios, the fact is that you have to consider those very scenarios to get a full picture of what is going on. Their mediocre attempts to get you to 'ignore the man behind the curtain' should tell any critical thinker that the position being expounded is weak in some way. I had this out with Interesting Ian.

The fact is, if the 'transporter' fails and two of you suddenly exist, or even if the potential for that to happen exists (which, obviously, it does), this proves that, no matter how you want to define it, you2 is emphatically NOT the same person as you, and you would walk into the transporter not to enjoy life on Saturn, but to die. Period, point blank, end of story.

The transporter is a duplicator with a built-in illusion - nothing more.

Yes, I agree -- except in the case when relevant information (D2) is identical at the source and the destination.

Think of it this way: if consciousness is information, then a consciousness can be "Gödelized" into a (probably extremely large) integer. So the teletransporter experiment becomes an exercise in transferring a number from a source to a destination.

If the original is allowed to experience for even an instant more, their "Gödelized" consciousness number will probably be very different from what was sent. And in that case, killing them would mean losing an unrecorded number for eternity.

But if the original is frozen, or destroyed in the same instant that the number is recorded, that one number is it. Killing the original doesn't entail losing anything at all -- there was no more conscious thought that could lead to any other numbers. The consciousness exists in the form of a number and can be stored or transferred in any way one could fathom.
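
Just to make the "Gödelizing" idea concrete, here is a minimal toy sketch of my own (the names godelize/degodelize are made up, and a small dictionary stands in for a brain state): encode a snapshot of state as a single integer, send only that integer, and rebuild an identical snapshot at the destination.

import json

def godelize(state: dict) -> int:
    """Encode a state snapshot as one (possibly huge) integer."""
    raw = json.dumps(state, sort_keys=True).encode("utf-8")
    return int.from_bytes(raw, "big")

def degodelize(number: int) -> dict:
    """Rebuild an identical snapshot from that integer."""
    raw = number.to_bytes((number.bit_length() + 7) // 8, "big")
    return json.loads(raw.decode("utf-8"))

# "Teleport": record the number, send it, rebuild the snapshot elsewhere.
original = {"memories": ["first day of school"], "mood": "curious"}
n = godelize(original)      # the whole snapshot is now just one integer
copy = degodelize(n)        # the destination rebuilds an identical snapshot
assert copy == original     # nothing is lost, provided nothing changed after encoding

The assert is the point made above: if the original is frozen at the instant the number is recorded, the number captures everything there was to capture; if the original keeps experiencing, snapshot and original immediately diverge.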
 
To totally duplicate a person, though, you would have to have a way to totally duplicate continuous first-person awareness of being - which would be impossible, since at the moment of destruction, that awareness ceases. It's a piece of information which cannot in any way be successfully transferred.

I'm not sure I'm using the right words here, but hopefully the point gets across.

As for the sleep/discontinuity issue, conscious awareness is not the only element of self - otherwise, we could just wipe out memory and claim to have the same self. Unconscious and/or subconscious activities are also vital to self - in fact, 'conscious awareness' may itself be an illusion (consider the discussion of whether or not we consciously make decisions). So while you may claim the consciousness is discontinuous, the underlying brain activity is not - and this may also be considered essential to 'self'. So in this state, as well, I remain unconvinced.

Show me someone who goes from absolutely no brain activity for a long time, back to a full and natural conscious state, with no loss or change of self, and I might be convinced otherwise.
 
Why doesn't it refer to anything? Once we decide what constitutes an independent living organism, "I" refers to those attributes.

~~ Paul

I figure if we wanted to designate "I" as the "independent living organism" or "the body" or whatever, we would refer to it as such and not need "I." To me, "I" is a mechanism by which identification with thought and feeling can be effected. It cements social bonds but it does not actually refer to anything. Dennett calls it the "benign user illusion."

Nick
 
Nick doesn't believe that the body equals the person.

I'm fine with the body equalling the person. However, neither the body nor the person equals "someone", and this shows up when one considers the teletransporter because the one bit is recreated.

Nick
 
To totally duplicate a person, though, you would have to have a way to totally duplicate continuous first-person awareness of being - which would be impossible, since at the moment of destruction, that awareness ceases. It's a piece of information which cannot in any way be successfully transferred.

You're missing the point. There is no persisting self anyway, so it doesn't need duplicating.

Nick
 
Show me someone who goes from absolutely no brain activity for a long time, back to a full and natural conscious state, with no loss or change of self, and I might be convinced otherwise.
I'm not sure I understand your point. Why did you include the caveat of "no loss or change of self"? Don't you alter your self every day? I'm not the person I was 40 years ago. Not at all.

So, is a person who wakes up from a coma not his or her self the way I am?

What about Clive Wearing? Isn't he more justified in declaring that he has been himself a hell of a lot longer than any of us since he never gains new memories? Forgive me but you seem confident about this thing called "self" in a way that I really don't think is justifiable. I used to. When I first came to this forum I used to argue passionately for the self. It was my inability to justify it that led me to accept that the self is an illusion. A damn good one to be sure.

I don't think the self exists. It's a label we attach to a collection of brain events and an evolutionary sense of our relationship to the world. I've not seen anything to suppose that there is much more than some thinking, sensing and feeling going on. I don't mind the label. It serves a purpose and I don't know what else to call it.

You mention the subconscious. So what? A computer functions roughly at the level of our subconscious in that it isn't conscious. Why is that significant at all?
 
Nick227 said:
I figure if we wanted to designate "I" as the "independent living organism" or "the body" or whatever, we would refer to it as such and not need "I." To me, "I" is a mechanism by which identification with thought and feeling can be effected. It cements social bonds but it does not actually refer to anything. Dennett calls it the "benign user illusion."
I agree: It's a mechanism that promotes self-preservation by identifying my thoughts and emotions with my body, so that my thoughts make plans about how to protect my body. And so "I" refers to that entire system.

~~ Paul
 
"If we give each copy a wife and children..."

OK.

Luckily, I don't know of anyone who thinks getting into the transporter would entail all their copies getting a wife and children.

Perhaps the person is a Mormon or Muslim who's decided that polygamy is a bad idea, and wants to spread himself around the marriages a bit. His duplicates will get a wife each.

Clearly in such a case, killing the duplicates has a real world consequence. The first one out gets the first wife. We kill the rest, and the remaining wives are abandoned.

This may be far fetched, but the principle is simply that killing people has consequences, even where an exact duplicate might exist.
 
I'm not sure I understand your point. Why did you include the caveat of "no loss or change of self"? Don't you alter your self every day? I'm not the person I was 40 years ago. Not at all.

Is the Mississippi River still the Mississippi River from day to day? It's the same thing - the 'self' is a continuous and dynamic system - one which does change, yes, but those changes are gradual (normally) and continuous.

If you were to change in one moment from RF@15 to RF@55, you would be two different people with similar attributes; but a gradual change over 40 years means that you are you from the last minute to the next.

So, is a person who wakes up from a coma not his or her self the way I am?

Depends on his mental state upon awakening. If, for him, no changes have occurred, and he continues to experience a gradual and continuous existence, then he's the same self; if he experiences a disruption, or some significant change, he's not the same self.

What about Clive Wearing? Isn't he more justified in declaring that he has been himself a hell of a lot longer than any of us since he never gains new memories?

It's an interesting case. I would suggest that his 'self' is damaged. It is non-dynamic, though continuous. And, apparently, he is capable of learning, so his self does actually change, though not within his sense of awareness.

Forgive me but you seem confident about this thing called "self" in a way that I really don't think is justifiable. I used to. When I first came to this forum I used to argue passionately for the self. It was my inability to justify it that led me to accept that the self is an illusion. A damn good one to be sure.

I think 'illusion' is too loaded a term. More like 'emergent property'. Without a sense of self, we'd all be some kind of p-zombie creature, and not one of us would possess any personal awareness.

I don't think the self exists. It's a label we attach to a collection of brain events and an evolutionary sense of our relationship to the world.

...?

I don't think 'x' exists. I think 'x' is a label we attach to the collection 'a,b,c'.

I don't think 'cars' exist. I think 'car' is a label we attach to a collection of mechanical events and various parts and fluids, which seem to work together to transport stuff here and there.

It's a silly thing to say, that way. Of course self exists - as a collection of brain events and an evolutionary sense of our relationship to the world. And since each of us has only a first person perspective of our self-state, it is from that perspective we must consider each such thought-experiment.

From that perspective, you step into the machine, the lights flash, and----

I've not seen anything to suppose that there is much more than some thinking, sensing and feeling going on. I don't mind the label. It serves a purpose and I don't know what else to call it.

Yet you deny it.

You mention the subconscious. So what? A computer functions roughly at the level of our subconscious in that it isn't conscious. Why is that significant at all?

It's a part of the self, and is continuous in nature. It contains, among other things, our memories, learning patterns, fear/threat response codes, etc.

Try duplicating only the conscious portion of a person without their subconscious, and see what happens!

Think of it as the O/S of the computer. Without it, you could plug software into it all day, and it's not likely to do squat. Further, think of how significant problems can be if, say, we wiped out the O/S on your computer (let's say, just to be rude, that you run Windows Vista), and replace it with Mac O/S X (without changing any of your other software or settings). Is that significant at all?
 
You're missing the point. There is no persisting self anyway, so it doesn't need duplicating.

Nick

Yes, there is a persisting self. I think you're confusing 'self' with 'momentary awareness.' Self is a much broader term than that. It includes memory, subconscious brain states, and a host of other things which are continuous and dynamic. Lose any part of that, and you lose self. In fact, memory is a vast part of self - see what happens when someone suffers total amnesia! Yet memory is, itself, rather persistent, under normal conditions.

In my case, persistently vague.
 
Assume that consciousness is information processing.

I.e. that the real phenomenon of consciousness arises purely from information processing, i.e. independent of the processing mechanism underlying it, be it neurons, computer chips, or buckets and pulleys.

I will accept it as a premise, but do not think it's viable per se. I am a converted Searle-ist. Consciousness is a real phenomenon, and thus arises (presumably) out of real-world atoms and energies somehow. Pure information processing is, in reality, just pushing around very different atoms, particles, and energies in one way (neurons) or a completely different way (chips) or yet another way (pulleys and buckets).

And I cannot believe that a real phenomenon arises from an interpretation of the motions of certain atoms and particles, as opposed to from the actual particles themselves, if you catch my drift.
 
My own take is that you're 99% right, but come to the wrong conclusion.

I liken human consciousness to a candle flame. When you go to sleep and lose consciousness (ignore dreams for the moment), the candle goes "out".

You are effectively dead.


Later on, the candle is re-lit, and your brain turns on the consciousness circuitry, which re-loads the necessary info, and your consciousness flickers back into existence.


Of course, if you copied this information to another set of candle hardware, it would "ignite" and report that it, itself, was you. And in that sense, it would be correct.

But the real question everybody wants to know is, are you dead?

And the answer is yes -- you are just as dead as if you had fallen asleep and never woken up.


So I would find a teleporter that only teleported info totally unacceptable. To me, anyway. If you wanna kill yourself so a clone can come alive, go for it.

After all, if you go to sleep and the clone wakes up, now what? If you leave the original "asleep", then it died. If you then wake it up, it'll take a look at the clone, then sigh, very relieved, "Oh Jesus, that was a close one. I almost died!"



So...ya. Go for it! :)
 
To totally duplicate a person, though, you would have to have a way to totally duplicate continuous first-person awareness of being - which would be impossible, since at the moment of destruction, that awareness ceases. It's a piece of information which cannot in any way be successfully transferred.

I'm not sure I'm using the right words here, but hopefully the point gets across.

As for the sleep/discontinuity issue, conscious awareness is not the only element of self - otherwise, we could just wipe out memory and claim to have the same self. Unconscious and/or subconscious activities are also vital to self - in fact, 'conscious awareness' may itself be an illusion (consider the discussion of whether or not we consciously make decisions). So while you may claim the consciousness is discontinuous, the underlying brain activity is not - and this may also be considered essential to 'self'. So in this state, as well, I remain unconvinced.

Show me someone who goes from absolutely no brain activity for a long time, back to a full and natural conscious state, with no loss or change of self, and I might be convinced otherwise.

If you want to say that subconscious brain activity is also essential for 'self' then yes the sleep problem isn't an issue for you.

However, it is irrelevant when it comes to the transporter problem because all brain activity could be transferred.

It seems like you just don't buy that consciousness could be information. That is a valid opinion and we will just have to agree to disagree -- but it doesn't invalidate the argument in the OP.
 
I.e. that the real phenomenon of consciousness arises purely from information processing, i.e. independent of the processing mechanism underlying it, be it neurons, computer chips, or buckets and pulleys.

I will accept it as a premise, but do not think it's viable per se. I am a converted Searle-ist. Consciousness is a real phenomenon, and thus arises (presumably) out of real-world atoms and energies somehow. Pure information processing is, in reality, just pushing around very different atoms, particles, and energies in one way (neurons) or a completely different way (chips) or yet another way (pulleys and buckets).

And I cannot believe that a real phenomenon arises from an interpretation of the motions of certain atoms and particles, as opposed to from the actual particles themselves, if you catch my drift.

Why not?

I don't know of any evidence that supports such a position a priori. Everything we know about mathematics and computation shows that information-based phenomena can be reproduced on any substrate meeting the constraints required for the phenomena.
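
A toy illustration of that substrate point (my own sketch, not anything from the thread): the same abstract computation, here XOR-ing two bit strings, realized once with native integer arithmetic and once with a crude lookup table. At the information level the two "substrates" are indistinguishable.

def xor_arithmetic(a: str, b: str) -> str:
    """Substrate 1: native integer arithmetic."""
    width = max(len(a), len(b))
    return format(int(a, 2) ^ int(b, 2), f"0{width}b")

# Substrate 2: a bit-by-bit truth table (think buckets and pulleys).
XOR_TABLE = {("0", "0"): "0", ("0", "1"): "1", ("1", "0"): "1", ("1", "1"): "0"}

def xor_table(a: str, b: str) -> str:
    """Look each bit pair up in the table instead of computing it."""
    return "".join(XOR_TABLE[(x, y)] for x, y in zip(a, b))

a, b = "101101", "011011"
assert xor_arithmetic(a, b) == xor_table(a, b) == "110110"

Whether consciousness is that kind of information-level phenomenon is, of course, exactly what's in dispute here; the sketch only shows what "reproducible on any substrate" means for things that are.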
 
Is the Mississippi River still the Mississippi River from day to day? It's the same thing - the 'self' is a continuous and dynamic system - one which does change, yes, but those changes are gradual (normally) and continuous.
It's funny you chose the river, as that is an age-old truism. You can never step into the same river twice. I guess it depends on what you mean by the river. Stating (asserting) that the self is a continuous and dynamic system isn't a compelling reason to accept your thesis. I agree with your premise, but I don't think it leads anywhere, and I'm not sure why you think it does.

If you were to change in one moment from RF@15 to RF@55, you would be two different people with similar attributes; but a gradual change over 40 years means that you are you from the last minute to the next.
From day to day? From month to month? Why is that important? It seems to me that the salient point is change and not the degree of change.

Depends on his mental state upon awakening. If, for him, no changes have occurred, and he continues to experience a gradual and continuous existence, then he's the same self; if he experiences a disruption, or some significant change, he's not the same self.
You seem to have an odd and fluid definition of self. It seems to conveniently fit the circumstances you need it to fit.

It's an interesting case. I would suggest that his 'self' is damaged. It is non-dynamic, though continuous. And, apparently, he is capable of learning, so his self does actually change, though not within his sense of awareness.
Of course it's dynamic. It's just not dynamic in the sense that we would think.

I think 'illusion' is too loaded a term. More like 'emergent property'. Without a sense of self, we'd all be some kind of p-zombie creature, and not one of us would possess any personal awareness.
Perhaps it is too loaded a term, but I'm not convinced. In any event, I reject your premise. It is the "illusion" of self that is the difference between us and P-Zombies. P-Zombies don't have self-awareness.

...?

I don't think 'x' exists. I think 'x' is a label we attach to the collection 'a,b,c'.

I don't think 'cars' exist. I think 'car' is a label we attach to a collection of mechanical events and various parts and fluids, which seem to work together to transport stuff here and there.
:) This is rather funny because this is along the lines of the argument I'm making in another thread. However, when I say that the "self" doesn't exist, I don't mean it in that sense. Of course I think the self exists. I've said that. I don't think that the "self" exists in any Cartesian or homunculus fashion. The "self", in the sense of our biological entity together with our self-awareness, exists, but it is also a sense, and that sense is quite misleading.

It's a silly thing to say, that way.
Only if you don't understand what I mean. You are going to have to work with me on this a bit. It will take a little bit of effort to understand my POV.

I don't care for arguing by link, and it is not my intent to do so, but I think Blackmore can explain it much better than I can.

The Grand Illusion:

First we must be clear what is meant by the term “illusion”. To say that consciousness is an illusion is not to say that it doesn’t exist, but that it is not what it seems to be―more like a mirage or a visual illusion. And if consciousness is not what it seems, no wonder it’s proving such a mystery.

...

You might want to protest. You may be absolutely sure that you do have such a stream of conscious experiences. But perhaps you have noticed this intriguing little oddity. Imagine you are reading this magazine when suddenly you realize that the clock is striking. You hadn't noticed it before, but now that you have, you know that the clock has struck four times already, and you can go on counting. What is happening here? Were the first three “dongs” really unconscious and have now been pulled out of memory and put in the stream of consciousness? If so were the contents of the stream changed retrospectively to seem as though you heard them at the time? Or what? You might think up some other elaborations to make sense of it but they are unlikely to be either simple or convincing.



Of course self exists - as a collection of brain events and an evolutionary sense of our relationship to the world. And since each of us has only a first person perspective of our self-state, it is from that perspective we must consider each such thought-experiment.
With all due respect I don't think you are saying much of anything here to help move me in any direction.

From that perspective, you step into the machine, the lights flash, and----
And?

Yet you deny it.
(see Blackmore above)

It's a part of the self, and is continuous in nature. It contains, among other things, our memories, learning patterns, fear/threat response codes, etc.
Agreed but this doesn't really tell us anything.

Try duplicating only the conscious portion of a person without their subconscious, and see what happens!
Take the engine out of the car and see what happens. So is the engine the car?

Think of it as the O/S of the computer. Without it, you could plug software into it all day, and it's not likely to do squat. Further, think of how significant problems can be if, say, we wiped out the O/S on your computer (let's say, just to be rude, that you run Windows Vista), and replace it with Mac O/S X (without changing any of your other software or settings). Is that significant at all?
I understand your point but I don't see how it advances anything. That consciousness is dependent on the "software" doesn't obviate my point. I fully accept your premises, but they don't, via inference, lead me to think that consciousness isn't simply an illusion as explained by Blackmore above. You need to come up with a compelling argument that there is something more to the self than an emergent property of brain states which isn't what it seems to be: an illusion (as explained by Blackmore).

One last thing: I was for years a programmer and consultant on both Windows/DOS and Macs. It's a long story, but the analogy of the computer to the mind has played a huge role in the evolution of my thinking on this subject. And I can assure you I've not left many stones unturned in my quest. From John Horgan's The Undiscovered Mind to Blackmore and Ramachandran and Pinker and many more, I've read a hell of a lot. I apologize if I'm poor at advancing my idea. I am open to changing my mind again, but you will need a bit more (perhaps a lot more) than what you've provided so far. I hope your mind is open also. :)

In any event, as far as the discussion so far, it's as if you are trying to convince me of the significance of multiplication tables as it relates to space-time being a physical thing. I understand all that but it doesn't go to the heart of the problem.

Thanks,

RandFan
 
