I don't see many materialists regularly committing suicide in order to avoid the everyday pains and suffering of living, so the evidence seems to be against that statement.
Do you deliberately ignore the things I later said on that? Suffering is not the only issue here. Dying, however, involves suffering. Death, on the other hand, involves nothing to a person, personally. There are reasons besides the person themselves to mind being killed, and there are reasons of their own to continue living.
Oh, so now you're saying that divergence IS an issue.
When was it not an issue? It's always been the issue of whether or not the person can be killed morally.
Please re-read the scenario I posted. We are assuming that killing the original at the time of "teleportation" is only an option, and it is just as easy (in fact easier) to leave the person alive. Imagine that the entire room that the person is sitting in is copied to a new location (not just the person). Let's assume the room has no windows and is sealed so that it will remain exactly the same as the original room indefinitely until the door is opened. Now, the two people inside cannot diverge until they leave their respective rooms.
1) Are these two considered the "same person" until such time as they leave their rooms?
2) Which one of them becomes a different person the moment they open the doors to their rooms, and why?
3) Is it OK to kill one of them before they leave their rooms?
4) Would it be OK to kill them after they leave their rooms?
5) Do you think either of them might not want to die?
6) Do you think it would make a bit of difference to either of them whether or not they have left their room?
1) If the rooms are deterministically the same, then yes, they are the same person in every meaningful sense. I don't think that would actually happen (QM and whatnot); however, they would at least be trivially different from each other.
2) They both become different people afterwards. Exactly as Ian says, the materialist definition of a person is essentially a continuously, infinitesimally changing set of information. No meaningful test could define a person better than the information represented by the arrangement of atoms in their brain.
3) I'd say yes, provided suffering is avoided.
4) No, although there is going to be some wiggle room. An hour afterwards, I'd definitely say killing them is a problem. A nanosecond after, I don't think the difference would matter, but it could, at least in the legal definition.
5) Irrationally, yes: assuming they used it of their own free will as a transporter, neither will just slit their own throat. However, before duplication, I can say I would not mind the fact that one iteration of me will be killed to make this work. That is why the person being killed should be killed quickly and painlessly, so they do not have time to suffer, either because they are being killed painfully or because they have time to consider the fact that they will be killed.
6) Like I just said, giving them time to think will cause suffering in a person. (Note that I do not care that this person is an iteration of a single "person"; this extends to anything with the capacity to suffer.)
The copy won't be indistinguishable from the original as soon as he leaves the room. Obviously the lives of the family would be vastly different between the scenario where the original lives and where the original dies. How do you know the family wouldn't care? Also, how do you know they're OK with having the corpse of their loved one left behind? How do you know they wouldn't prefer the original to live? How do you know the original wouldn't prefer to live?
For the same reasons that families don't grieve about children they chose not to have. That represents the possibility of a new person, not an actual person at the time.
Then you'll need to think carefully about the above questions. Why would you allow one of them to be killed the moment before they leave the room, but not the moment after? Which one of them suddenly becomes a different person when they leave the room than they were when they were still inside the room (or do both of them become different people)?
They both become different people slightly afterwards. Neither is exactly the person who walked in. People continuously change, always.
Very different issue. The right to live has to do with an already living person, not with a potential person who doesn't exist yet. Taking a purposeful action to end an already existing life is considered murder, and is a crime in most countries.
Do you really believe that either "copy", even if exactly the same, doesn't feel that they have anything to lose by dying? By virtue of them being in two different places, they have potentially very different futures (as do any people who might come in contact with them). Which one would you kill?
The one who, beforehand, wanted to be killed. If this is being done for no reason (or just as a test), say the copy is 2 feet away and both are unconscious, I suppose it could be random.
Why would you mind being killed after teleportation but not during teleportation? Does the above scenario with the duplicate rooms change anything? Why or why not?
After, if I am not made to suffer in any way, and the only divergence is trivial, I can't say I'd really mind. However, suffering would be inevitable if I know beforehand that one person is to be killed, and "I", at that moment, could be either one of them.
The very act of killing someone causes divergence if you consider death to be that the brain ceases to change based on input from the outside world. It would be impossible to kill someone without causing them to be two different people (albeit one of them dead). Sounds like murder to me.
This just completely misses the point of divergence. Yes, that happens, but it's not meaningful. The divergence as a person is trivial; the functioning of a body is not a good way of defining a person.
Interesting, the same can be said about both copies.
Yes, but I don't consider it enough justification to create a life, essentially. It's the same as not having a baby just so it can experience the joys of life.
I agree completely. Therefore, if the choice were left up to the people involved, neither would likely want to die. At the same time, neither would want to have two copies of himself wanting to sleep with one wife. An excellent reason to take the bus, in my opinion.
Except if they are not allowed to diverge, they are not losing a "self". I see no problem with copying and then deleting myself, as opposed to just cutting and pasting myself, to use a computer metaphor.
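To make the computer metaphor concrete, here is a minimal Python sketch; the file paths and the "me.dat" file are purely hypothetical stand-ins for "me". The point is only that a copy followed by deletion of the original ends in exactly the same state as a single cut-and-paste (move).

    import shutil
    from pathlib import Path

    # Hypothetical stand-ins for "me"; create a small file to play the original.
    src = Path("original_room/me.dat")
    dst = Path("new_room/me.dat")
    src.parent.mkdir(parents=True, exist_ok=True)
    dst.parent.mkdir(parents=True, exist_ok=True)
    src.write_bytes(b"the information that makes up the person")

    # "Copy and then delete": duplicate first, then remove the original.
    shutil.copy2(src, dst)
    src.unlink()

    # "Cut and paste" (a move) reaches the identical end state in one step:
    # shutil.move(src, dst)

    # Either way, exactly one copy of the data exists at the new location.
    assert dst.exists() and not src.exists()

Only the intermediate steps differ between the two operations; the end state is identical.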
If you smash one, it has the same material in a different form, so that is clearly not the case. That said, to a materialist (and it's interesting that you use the term "material") there is nothing beyond the material in a person. Therefore, the "value" of a person is entirely his or her material, and two people are more valuable than one. Given the potential of markedly different futures, killing one would indeed be a loss.
Alright, the vase has value in its information and material. Destroying it causes changes in the information.
However, there is a difference between a "person" and a "human". A "person", as I'd use the term, only exists in a "human", a human being an animal of such flesh and such organs. However, a "person" could be an alien, or a computer program, or any number of things. The "person" aspect, the part that I'd have a funeral for if it were lost (or at least appeared to be, to the best of my knowledge, etc.), is only the information, which, incidentally, is stored in the brain of a "human" for everyone I know.
Information is a perfectly acceptable concept for a materialist. It's just always stored in material things. However, the material is irrelevant, so long as it exists.
That is a very dualist attitude, I'm afraid. For the materialist, there is no "mind" outside of the material brain. And two brains are certainly better than one.
-Bri
Not if they are exactly the same. Making a copy of a program does not increase its value on my computer. Making a copy of "me" does not increase my value.
Yes, having 2 iterations of a genius would be valuable. However, they would have to diverge before they could do anything.