• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Resolution of Transporter Problem

I could be wrong, but I believe existing evidence suggests that if you block the input from sensory neurons, a person won't know they are standing in front of a mountain vista.
Yes, if you change the environment, you change what someone sees.
To me this implies that if you tricked sensory neurons into firing the way you want, the individual would think they were wherever you wanted them to think they were.
in the same sense that headaches are caused by a lack of aspirin.
But at any rate this is irrelevant because I am talking about keeping every atom of every neuron in sync (if need be -- I don't think the granularity needs to be that fine, but who knows). So you are right that I am assuming such a thing is possible.

If you don't want to make that assumption then clearly the thought experiment is not for you.
You did not answer about the thrown ball.

We are embedded in our contexts. Remove an action from its context, and you have changed the action. Running for a bus is not the same as running for exercise is not the same as running from a mugger is not the same as running in an athletic competition. Even if the action is identical. Zap somebody who is running from a mugger into a new situation, and everything is different. You are assuming that the environmental impact is zero.
So what? If you start with the assumption 1 + 1 == 2, you end up with the conclusion 2 + 2 == 4.

If you start with the assumption that the Earth is flat, you end up with the conclusion that sailing to the edge is a bad idea.

Both are logically valid arguments.
I agree; garbage in, garbage out. You really should examine your assumptions more closely.
It seems like you are really attacking the assumption that self is information only. OK. But that wasn't the point of this thread. The point of this thread was to discuss what the implications are if that assumption is true.
Well, yeah, I addressed that in my first post.
Sure. If what is important is that the work survives in some form, then clearly a 0.5 probability of being the last surviving work (should one of them randomly disappear) makes it much more important than a copy with a 0.001 probability of being the last surviving work but not as important as a copy with a probability of 1.0.
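
Here's a quick Python sketch, purely illustrative, to make the arithmetic behind those figures concrete. The copy counts of 2 and 1000 are assumed values, picked only to reproduce the 0.5 and 0.001 probabilities above; by symmetry, the chance of any given copy being the last survivor is just 1/n, and the simulation agrees.

import random

def last_survivor_probability(n_copies, trials=10_000):
    """Estimate the chance that copy #0 outlasts all the others
    when copies vanish one at a time in random order."""
    wins = 0
    for _ in range(trials):
        order = list(range(n_copies))
        random.shuffle(order)   # random order of disappearance
        if order[-1] == 0:      # copy #0 vanishes last, so it was
            wins += 1           # the last surviving copy of the work
    return wins / trials

print(last_survivor_probability(2))     # ~0.5: one of two copies
print(last_survivor_probability(1000))  # ~0.001: one of a thousand (noisy estimate)
print(last_survivor_probability(1))     # 1.0: a sole copy is always last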
If what is important is that the work survives in some form, then yes, it is only important that the work survives in some form. Can't argue with that. Would be boring.
Your argument here is severely flawed because you are ignoring the fact that humans assign dynamic value to things all the time. None of the coins minted in 1500 were that special in 1500. Now the remaining ones are special simply because they are the remaining ones. So are you saying all those coins were just as special as the current survivors when they were minted? If so, why didn't anyone treat them that way?
Then why are you assigning value to even the last person? It's arbitrary, after all, and we have plenty more people.
I don't care about any "many worlds" interpretations because in those interpretations neither I nor anyone I care about knows anything of the other copies. In the teletransporter example I am explicitly informed of what happens.
And in the last thread I gave my conditions for entering it.
Absolutely. But why is sentimentality so bad, Mercutio? If we are sentimental because those originals were touched by the great Shakespeare himself, so what? You are sounding like a theist who is deluded into thinking that value must be absolute. Why can't we assign sentimental value to objects?
Didn't say it was; didn't say we can't. Never said value was absolute. I have no problems being sentimental (ask anyone). And in one sense, given our surplus of people and the arbitrary nature of "rights", our attitude toward killing anyone, ever, is pure sentiment.
What you have to realize is that not everyone is as sentimental towards their physical bodies as you are towards the original works of Shakespeare. I, for one, don't give a hoot about this rotting piece of meat my mind resides in. I don't like that I am bald, I don't like my complexion, I don't like my genetic diseases, I don't like a whole heck of a lot. And I certainly don't attach any extra value to it simply on the basis that it has been through 30 years of my life. What I do like is my mind. So if I could keep the mind and swap out the cruddy body for a new one -- ideally, one that I could pick and choose at whim -- I would be very inclined to do so.
Again, I gave my conditions for stepping into the machine. Sounds not that far from yours. But my conditions recognize that, yes, the scenario results in killing a person. If by "resolving", you are arguing that you are not killing a person since another copy exists, then your resolution does not answer any interesting questions, and is just one more exercise in circularity.
 
Yes, if you change the environment, you change what someone sees.

That isn't what I said. I am not going to play into your debate style of "agreeing" to something and then redefining what you are agreeing with as if I said it in the first place.

I said that if you interfere with the input from sensory neurons you change what someone sees. You can do that without changing the environment at all.

in the same sense that headaches are caused by a lack of aspirin.

No.

You did not answer about the thrown ball.

Any information that is relevant must be transferred.

That is my canonical answer. If momentum is important, it must be transferred somehow. Same for distance above the ground. I am not sure what this is supposed to illustrate.

You are assuming that the environmental impact is zero.

No. In fact, I explicitly said, only a few posts ago, that what is important is the effect of the environment rather than the environment itself. Maybe I should have said "impact of the environment" instead, but as far as I know both phrases mean approximately the same thing. Do they not?

I agree; garbage in, garbage out. You really should examine your assumptions more closely.

Yes, I should, when the assumption is in question. Currently it isn't.

Not because I think it is true, but simply because it is one of the premises of the argument in the OP.

Well, yeah, I addressed that in my first post.

Err, not really. You haven't been specific about what you object to.

Then why are you assigning value to even the last person? It's arbitrary, after all, and we have plenty more people.

The last person is only "special" if the information isn't found anywhere else, i.e., if they are the last existing substrate.

Your question here is equivalent to "why is the last copy of Hamlet of value, after all we have plenty of other plays by Shakespeare?"

And in one sense, given our surplus of people and the arbitrary nature of "rights", our attitude toward killing anyone, ever, is pure sentiment.

Well, I think it could be argued that "not killing" is a very rational decision in the case of certain humans, namely ones that can be of use to us. But in general I agree with you here. Luckily I don't advocate killing anyone at all.

If by "resolving", you are arguing that you are not killing a person since another copy exists, then your resolution does not answer any interesting questions, and is just one more exercise in circularity.

I am arguing that if the "self" is nothing but information then the term "copy" has a vastly different meaning from what we normally attribute to it, the implications of which include being able to teletransport without actually killing.

I don't see why such an argument is circular -- I can assume self is nothing but information without ever referencing a teletransporter experiment.
 
You seem to have no problem understanding that if you kill off n-1 of them at random, the last one is a real person. Clearly, that means each of them is.

Suppose that using some futuristic super-quantum doubletalk device, we manage to create duplicates which even occupy the exact same space-time coordinates, and which (given that we have some future knowledge which we lack at present) have identical physical and mental states. Perhaps they are frozen in this state. Can we still dispose of any of them as we wish?

If we give each copy a wife and children, then we see that our actions are not without consequences. Even though they've been brought into synchronicity, their deaths have consequences. Even if we created each of them from a single copy, their deaths will have potential consequences in the future - for instance, in the example I gave earlier of the arranged marriages to each copy. The difference between killing all but one of the copies and leaving them alive is apparent in the difference it makes to the future.
 

"If we give each copy a wife and children..."

OK.

Luckily I don't know of anyone who thinks getting into the transporter would entail all their copies getting wives and children.
 
Mercutio said:
Again, I gave my conditions for stepping into the machine. Sounds not that far from yours. But my conditions recognize that, yes, the scenario results in killing a person. If by "resolving", you are arguing that you are not killing a person since another copy exists, then your resolution does not answer any interesting questions, and is just one more exercise in circularity.
If you define murder as merely the destruction of a particular configuration of atoms, without taking into account the possible continuation of the information/memory signature of that entity, I wonder if you run the risk of trivializing the entity. Since this is precisely the opposite of what you're trying to do, it raises a new ethical problem. In particular, if there isn't something special about sentient beings, then you have made it just as immoral to crush a rock.

I don't know one way or the other; just wonderin'.

~~ Paul
 
If you define murder as merely the destruction of a particular configuration of atoms, without taking into account the possible continuation of the information/memory signature of that entity, I wonder if you run the risk of trivializing the entity. Since this is precisely the opposite of what you're trying to do, it raises a new ethical problem. In particular, if there isn't something special about sentient beings, then you have made it just as immoral to crush a rock.
Only if you didn't destroy the information/memory signature and/or change the quality of life of that signature.

What is self? For me it is my experience of life. If my information/memory (and I would add sensory ability) were downloaded into a laptop computer, I might not have the same experiences I did before. Perhaps virtual life is better. No suffering from overindulgence, and chicks might really dig me, but there's no guarantee of that. If this "signature" were to be put into another biological entity, then wouldn't the signature be the important aspect? If we go back to my thought experiment of a person who is mortally sick and a brain transplant were performed, wouldn't we see the brain as the important aspect and not the body?

Don't get me wrong. I'm not taking sides but the argument is interesting.
 
In the original thread (and, it appears, continuing here), people have made a big deal out of the requirement that destruction be instantaneous; it seems that people are accepting the notion that Darat2 and Darat3 will diverge after time1, and be unique individuals. Of course, they should be mostly identical at that point, but that first second seems to be one that gives people fits. (For myself, of course, each person, though identical, is separate even at time0; the fact that they diverge is to be expected, of course, but as we already can tell the difference between them, it is just another way in which they differ, not the only way in which they differ.)

What (apart from sentimentality, perhaps) is your justification in seeing value in non-identical, but not in identical, people? Is it simply because we have a replacement for the non-identicals? My employers seem to think that people in my position are pretty much identical; they replace them (us) without a whole lot of grieving. Practically speaking, Darat and I share 99.9999% or so of our DNA, and quite a bit of our learning history--both speak English, both know a bit of Greek but not enough to get by, both have educations that led us to skepticism... if Mercutio got in the machine and Darat came out (rather like what happened to the Admin position on the forum, now that I think about it), few would notice, and probably none would complain. All us pasty white guys look alike. We are very similar in genes, very similar in learning, very similar in so many ways, but each of these things is very important to the people who are calling us "individuals". One way in which we are 100% different, of course, is in location. Even when we were both at TAM3, we were not occupying the same space at the same time. (That's right, Westprog--your scenario demonstrates the lengths to which one would have to go to overcome the one difference seen as irrelevant!)

Dodger, you say you do not advocate killing anybody at all, but your scenario is set up to define "somebody" so that you could look at a corpse at your feet and say "that doesn't count; I've got a spare." Well, when I was Admin, the JREF had a spare; a button was pushed, I was self-un-adminned, and out of the machine stepped Darat.

Would you get in the machine if you were assured that the person who got out of the machine would be "pretty much like you"?
 
If you define murder as merely the destruction of a particular configuration of atoms, without taking into account the possible continuation of the information/memory signature of that entity, I wonder if you run the risk of trivializing the entity.
Boy, am I glad I haven't suggested anything remotely like doing that. (bonus: "information/memory signature"? How do you square that with the plasticity of memory? The assumptions on this thread are not supported by empirical data--yeah, I know, this is R&P.)
Since this is precisely the opposite of what you're trying to do, it raises a new ethical problem. In particular, if there isn't something special about sentient beings, then you have made it just as immoral to crush a rock.
Again, shades of Interesting Ian! Throwing together wheat, yeast, hops, and water gives you beer. Or, gives you louloudiou pswmi. We are not our ingredients; we are what happens to them. You are defining people structurally; I am defining them functionally. Contextually. Not as slabs of meat, but as people. They are a particular configuration of atoms, yes. That is a sufficient distinction between them. That is not every difference between them in normal life, but it is sufficient.
I don't know one way or the other; just wonderin'.

~~ Paul
 
Dodger, you say you do not advocate killing anybody at all, but your scenario is set up to define "somebody" so that you could look at a corpse at your feet and say "that doesn't count; I've got a spare."

Yes, exactly. If you have a spare, then destroying one doesn't count as killing.

Except, I am not saying just anyone can define what constitutes a "spare" -- only the self in question can do that. If you want to define "spare" such that Darat is a spare Mercutio, then so be it. I wouldn't consider any other human a spare of myself, though.

Would you get in the machine if you were assured that the person who got out of the machine would be "pretty much like you"?

If the machine destroyed the original? Absolutely not.

I am assuming that self is information. Specifically, I am assuming that this thing we call "conscious experience" is a fairly continuous process of changing information. If the continuity of the process is broken then so is the "conscious experience."

If the original and the copy differ past a threshold (which I am assuming would be dependent upon the way neurons act as a substrate for information but must surely be crossed after even an extremely small amount of time) then continuity is broken -- the self at the source is no longer the self at the destination. If the process is instantaneous, or if the information of the self is frozen during the process, then continuity is preserved.

That is why I claim you could put a person in stasis, extract their D2, destroy the body, preserve the D2 for a thousand years on any substrate you wish, rebuild the body, and finally instantiate D2 in the new body and it would indeed be the very same self. Assuming self is nothing but information, of course.
 
In the original thread (and, it appears, continuing here), people have made a big deal out of the requirement that destruction be instantaneous; it seems that people are accepting the notion that Darat2 and Darat3 will diverge after time1, and be unique individuals.

For me the big deal about instantaneous death is only that I wouldn't use the device if death was slow and painful. A slight delay between transfer and death would make it seem a little weird, though. But that problem could be solved with proper device and interface design.

Looking at the screen and seeing "Transfer successful." "Initializing termination protocol." would be too creepy. I would prefer an alternative sequence of events: scanning -> death -> transfer. My death being a prerequisite for the transfer event. I note, though, that there is no practical difference between these two cases other than that the latter one would be more comfortable. Maybe because it would seem a lot like scanning -> death -> resurrection. Instead of scanning -> copying -> death.

Would you get in the machine if you were assured that the person who got out of the machine would be "pretty much like you"?

Would you have brain surgery if the person coming out of the operating room was "pretty much like you"?

Let's say I have a brain tumor. It's operable with a 100% success rate (same as the teleporter). Should I just spend my money on booze and loose women, since it's not me that's coming out of the operating room? Should I even bother with the surgery?
 
Mercutio said:
Boy, am I glad I haven't suggested anything remotely like doing that. (bonus: "information/memory signature"? How do you square that with the plasticity of memory? The assumptions on this thread are not supported by empirical data--yeah, I know, this is R&P.)
Then how are you defining murder with respect to the transporter issue? I don't seem to be able to keep it straight.

Again, shades of Interesting Ian! Throwing together wheat, yeast, hops, and water gives you beer. Or, gives you louloudiou pswmi. We are not our ingredients; we are what happens to them. You are defining people structurally; I am defining them functionally. Contextually. Not as slabs of meat, but as people. They are a particular configuration of atoms, yes. That is a sufficient distinction between them. That is not every difference between them in normal life, but it is sufficient.
You know I am not suggesting anything about an immaterial mind. But what are the specific functional attributes of people that make them sacred, whereas a rock is not? A functional definition of a person would seem to allow for mere swapping of ingredients (structure), yet you are concerned that the two separate structures are sacred. Perhaps I don't know what you mean by functional.

~~ Paul
 
Paul - I think you are looking for a profound explanation; there isn't one. People are people and rocks are rocks and we (people) treat them as quite distinct when it comes to what you are allowed to do to them*. But in fact the same "objection" does apply: if I copy a rock and then destroy the original I have still destroyed a rock, the original rock is no more, it is a deceased rock.



*I'm in the UK and it is likely that there is a rock somewhere that is considered legally a person due to some parliamentary shenanigans in the 16th century.
 
Would you get in the machine if you were assured that the person who got out of the machine would be "pretty much like you"?
I don't know before going to bed that the person who wakes up will be me. I don't cling to my sense of self that tightly. I try to make reasonable choices, but in the end I only have incomplete knowledge with which to make decisions, so I do the best I can and don't worry too much about the rest.

I'd step into the transporter.
 
Darat said:
Paul - I think you are looking for a profound explanation; there isn't one. People are people and rocks are rocks and we (people) treat them as quite distinct when it comes to what you are allowed to do to them*. But in fact the same "objection" does apply: if I copy a rock and then destroy the original I have still destroyed a rock, the original rock is no more, it is a deceased rock.
And yet you don't give a rock's butt that you have done so. Why? There has to be more to the story.

I suppose it could simply be that we arbitrarily hold certain configurations of matter and associated processes sacred, while not caring about all the other configurations. I have no problem with that, but it sounds like Mercutio has something more.

~~ Paul
 
And yet you don't give a rock's butt that you have done so. Why? There has to be more to the story.

I suppose it could simply be that we arbitrarily hold certain configurations of matter and associated processes sacred, while not caring about all the other configurations. I have no problem with that, but it sounds like Mercutio has something more.

~~ Paul

I don't think he has, but of course he'll have to confirm that himself.

As for your question "why". Well the shortest answer is "evolution". We consider kin and, by extension, other people to be more important than rocks (on the whole, many people are after all willing to kill to gain possession of some types of rocks) and therefore we give more value to a human existence than the existence of a rock.

Actually thinking about the idea of people being willing to kill to gain possession of rocks: consider the experiment in which we do duplicate a diamond to effect our transport. How many people would be willing to destroy the original rather than keeping the original and the "transported" copy?
 
Darat said:
As for your question "why". Well the shortest answer is "evolution". We consider kin and, by extension, other people to be more important than rocks (on the whole, many people are after all willing to kill to gain possession of some types of rocks) and therefore we give more value to a human existence than the existence of a rock.
Oh, absolutely. I'm not wondering why we behave as if other people are something special. I'm just asking about our intellectual philosophizing concerning this particular contrived example of the transporter.

Actually thinking about the idea of people being willing to kill to gain possession of rocks: consider the experiment in which we do duplicate a diamond to effect our transport. How many people would be willing to destroy the original rather than keeping the original and the "transported" copy?
That would depend on whether they had the same RFID chip ID. :D

~~ Paul
 
I don't know before going to bed that the person who wakes up will be me.

Thank you for bringing this up -- I seem to have completely ignored this important point.

The self is destroyed upon unconsciousness. We know this because during sleep we have no experience of self. All that is left is the neural hardware that will "reboot" upon waking, creating a fresh self -- which shares the memories of any previous selves -- for another day.

Is the "you" of tomorrow the same "you" of today? Why or why not?

If you were to be killed and replaced with an identical doppelganger tonight in your sleep, what possible effect on anything, including your consciousness, could that have?
 
But in fact the same "objection" does apply: if I copy a rock and then destroy the original I have still destroyed a rock, the original rock is no more, it is a deceased rock.

I guess the question from my side of the fence is "why is this an objection?"

Mercutio is correct that I can redefine "somebody" so that nobody is killed, and likewise he can keep his (outdated, in my opinion) definition so that somebody is killed, but this is all semantics. So the meat of the issue is "why is it so bad to destroy an aggregation of molecules in the form of a human if all the behaviors considered important to us other humans are saved?"
 
Right, so you aren't denying that multiple "I"s are created, and you agree that each "I" that is destroyed is a person being killed; you just don't think it's a big deal.

However, there is no actual "I." The word is merely a socially necessary referent, and there is nothing that it actually references. Thus it is not possible to "destroy an I," because the word does not refer to anything. So no "I"s are being destroyed, because there are none anyway.

Nick
 
