The slow transporter, questions raised

rocketdodger
Philosopher · Joined Jun 22, 2005 · 6,946 messages
Suppose the teletransporter (TTP) is altered so that the teleportation takes place slowly, over time, as follows:

1) There is an identical body at the destination, that is in stasis "slave" mode. For every neuron at the source, Ns(n), there is an identical neuron at the destination, Nd(n).
2) At all times during the process, a neuron is either a master or a slave.
3) Only one of each pair can be a master neuron. That is, if Ns(n) is a master, Nd(n) is a slave.
4) Master neurons "run" normally, whereas slave neurons are, by some advanced technology, simply copying the behavior of the corresponding master at the other end. Slave neurons will, however, interact with other neurons normally, for instance by activating master neurons, etc.
5) One by one the masters at the source have their status swapped to slave with their counterparts at the destination. In other words, for n = 1 to n = <number of neurons in body>, we make Ns(n) a slave and Nd(n) a master (a rough sketch of this loop follows the list).
6) At the end of the process the body at the source is now in full slave mode and the body at the destination is the master that is actually determining behavior.
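To make the procedure concrete, here is a rough Python sketch of steps 4-6. Every name in it, and the toy update rule, is invented purely for illustration; it is not meant as a model of real neurons or of how the mirroring technology would actually work.

```python
# Purely illustrative sketch of the slow-transfer loop (steps 4-6 above).

MASTER, SLAVE = "master", "slave"

class Neuron:
    def __init__(self, role):
        self.role = role
        self.state = 0.0                 # stand-in for whatever a neuron's "state" is

    def run(self, inputs):
        # Master neurons compute their own behavior from local inputs.
        self.state = sum(inputs)

    def mirror(self, master):
        # Slave neurons simply copy the corresponding master at the other end.
        self.state = master.state

def tick(source, destination, inputs):
    """Advance both bodies one step; exactly one of each pair Ns(n)/Nd(n) is a master."""
    for ns, nd in zip(source, destination):
        if ns.role == MASTER:
            ns.run(inputs)
            nd.mirror(ns)
        else:
            nd.run(inputs)
            ns.mirror(nd)

def slow_transfer(source, destination, inputs, ticks_per_swap=1):
    """Step 5: flip master status one pair at a time, source to destination."""
    for n in range(len(source)):
        source[n].role, destination[n].role = SLAVE, MASTER
        for _ in range(ticks_per_swap):   # ticks_per_swap controls the duration
            tick(source, destination, inputs)

# Example: a five-"neuron" body being transferred.
source = [Neuron(MASTER) for _ in range(5)]
destination = [Neuron(SLAVE) for _ in range(5)]
slow_transfer(source, destination, inputs=[1.0, 0.5])
```

Shrinking ticks_per_swap toward zero is the "instantaneous" limit discussed below.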

I think most people would agree that this feels comfortable -- they would be willing to step into the teleporter and they would actually experience the transport. As their sensory neurons were "moved" to the destination, they would see and feel parts of both locations, until eventually the source was completely replaced with the destination.

Here is the problem this scenario raises -- what duration of transfer is OK?

It seems to me that there is no change in any physical behavior at either end if the duration is shortened -- including down to zero, or instantaneous (which is physically impossible, but assume we could do it for the sake of argument). As long as none of the slave neurons get out of sync with their masters -- ever -- the duration of the process is irrelevant.

Which means that if someone holds that consciousness is the result of physical processes, rather than some supernatural soul, etc., they should be willing to step into the machine if it simply copies them to the destination and destroys the original an instant later.

Thoughts?
 
The word "transporter" is creating confusion here. Because nothing is being transported. Something is being destroyed and then a copy is being created at a different location.

There is no self which is aware. There merely appears to be. No one is in actuality experiencing. And so this whole scenario that you're laying out, where there appears to be some continuity of subjectivity between a destroyed brain and its copy, is just nonsense. It is not consistent with science and materialism.

Nick

eta: the error in reasoning which you're apparently making, if I may be so bold, is to assume that the continuity of subjectivity that seems to be present is down to the presence of some self which is doing the experiencing. It isn't, not according to materialism. It's simply down to the relative continuity of the brain.

* How it seems is that the copied brain will create an identical subjectivity but that I will no longer be experiencing it. But this is just the illusion.

* How it is is that there is no one experiencing anyway!
 
The word "transporter" is creating confusion here. Because nothing is being transported. Something is being destroyed and then a copy is being created at a different location.

There is no self which is aware. There merely appears to be. No one is in actuality experiencing. And so this whole scenario that you're laying out, where there appears to be some continuity of subjectivity between a destroyed brain and its copy, is just nonsense. It is not consistent with science and materialism.

Nick

eta: the error in reasoning which you're apparently making, if I may be so bold, is to assume that the continuity of subjectivity that seems to be present is down to the presence of some self which is doing the experiencing. It isn't, not according to materialism. It's simply down to the relative continuity of the brain.

* How it seems is that the copied brain will create an identical subjectivity but that I will no longer be experiencing it. But this is just the illusion.

* How it is is that there is no one experiencing anyway!

blah blah blah

wut??

Let me put it in a way you can't totally sabotage with stupidity, Nick:

Suppose you have a block of metal. That block of metal is destroyed and a copy is made at another location. Most people think this second block is not the same as the first.

Now suppose we do it molecule by molecule -- a molecule is destroyed at the source and a copy is placed at the destination. Little by little. Until there is nothing at the source and a block of metal at the destination.

Do people think of this scenario differently? Is the block somehow more "same" than in the first scenario?

If so, why? And does the duration matter? etc.

Nothing to do with this "there is no self" nonsense you constantly try to hijack threads with.
 
Suppose you have a block of metal. That block of metal is destroyed and a copy is made at another location. Most people think this second block is not the same as the first.

Now suppose we do it molecule by molecule -- a molecule is destroyed at the source and a copy is placed at the destination. Little by little. Until there is nothing at the source and a block of metal at the destination.

Do people think of this scenario differently? Is the block somehow more "same" than in the first scenario?

If so, why? And does the duration matter? etc.

Nothing to do with this "there is no self" nonsense you constantly try to hijack threads with.

But you are not a block of metal, RD. If you did it neuron by neuron, all that would happen would be one subjectivity slowly diminishing to zero and another slowly being recreated somewhere else. From a subjective point of view it's pretty meaningless. Yet you write above...

RD said:
As their sensory neurons were "moved" to the destination, they would see and feel parts of both locations, until eventually the source was completely replaced with the destination.

...this implies to me that you clearly believe there will be a continuity of one subjectivity. Or, even more insane, a continuity of some observing self somewhere. That is not materialism. It's just idiotic.

Nick
 
But you are not a block of metal, RD. If you did it neuron by neuron, all that would happen would be one subjectivity slowly diminishing to zero and another slowly being recreated somewhere else. From a subjective point of view it's pretty meaningless. Yet you write above...



...this implies to me that you believe there will be a continuity of one subjectivity. That is not materialism. It's just idiotic.

Nick

You clearly didn't read the OP. I specifically state that there is a mechanism keeping the neurons at each end communicating with each other so that conscious perception is continuous.

Honestly, I don't think I can learn anything from arguing with you, I have felt that for years now, so you don't need to respond to this thread. It is for other people.
 
You clearly didn't read the OP. I specifically state that there is a mechanism keeping the neurons at each end communicating with each other so that conscious perception is continuous.

Well, OK. Fair enough. I didn't read your post clearly enough. Apologies.

Honestly, I don't think I can learn anything from arguing with you, I have felt that for years now, so you don't need to respond to this thread. It is for other people.

No one else seems much interested!

I don't see that there will be a continuity of subjectivity anyway, not according to materialism. Even if you had a means for copying neurons one at a time, there would still be neurons that are actually in the global workspace, and once they're destroyed that subjectivity is lost. That's how I see it. Have to admit it's a more interesting question now.

Nick
 
Even if you had a means for copying neurons one at a time, there would still be neurons that are actually in the global workspace, and once they're destroyed that subjectivity is lost.

So you are saying there might be single neurons that are just absolutely essential, and the discontinuity in one of them would imply discontinuity for the whole system?

I agree that this might be an issue. It would be interesting to know if there are such neurons or if there is no single neuron that the brain couldn't do without.
 
Why would I want to make a copy of myself and then animate that copy whilst killing myself?
 
RD, sorry to ask you again, but could you please define what you mean by the word "consciousness" as used in your OP?

Thanks
 
It seems to me you are arguing about historicity. You want to somehow give one portion of the present some status because it has a past that another portion of the present does not.

Further, it seems there is an element of personhood stuck on somewhere. Would the discussion be the same if I destroyed your 'new in the box' laptop and then paid for another of the same model? You wouldn't care. They are identical.

How is it any different with people? If you are going to make duplicates, no matter that you label them master and slave, on what basis is one any different than the other, save historically? And why should it matter? What is this property that is moving across time in the discrete sense, but somehow still has an extension through time and gains rights thereby?

If you make a copy of a human being and then destroy a human being -- either the original or the new one -- you have destroyed a human being. Now, whether or not the ability to make multiples drives the value of a human life down to near zero (which it could), we've still killed a human being.

That might have legs, actually. Let's do it with the Mona Lisa and see how we feel about it.
 
So you are saying there might be single neurons that are just absolutely essential, and the discontinuity in one of them would imply discontinuity for the whole system?

I agree that this might be an issue. It would be interesting to know if there are such neurons or if there is no single neuron that the brain couldn't do without.

I don't think we really know enough about the brain to answer that question authoritatively. If we take the GWT model, then neurons that happen to be in the global workspace in essence are the correlates of consciousness, as I understand it. They're not different or special in any way, compared to other neurons, but their location means they're the ones. Thus you can't shift the global workspace. You can feed it from different sources, but you can't move it. I hasten to add that this is just my understanding.

I recall that Dennett's earlier "Multiple Drafts Model" denied the existence of a "place in the brain" where consciousness happens. But he subsequently swapped sides and came out for GWT. And these days there do appear to be studies confirming that there are certain areas of neuronal activity which directly translate to conscious perception, meaning you can interpret their activity with machines and get rough images that correspond to what the individual is seeing.

Nick
 
Why would I want to make a copy of myself and then animate that copy whilst killing myself?

The idea is that instead of a clean break between the original and the copy, it is a gradual process and the person is aware of both locations at once and thus stays the same "person."
 
RD, sorry to ask you again, but could you please define what you mean by the word "consciousness" as used in your OP?

Thanks

Just the usual.

Something like a seemingly continuous perception of ourselves and our environment. What you lose when you go to sleep and get back when you wake up the next morning, etc.
 
I don't think we really know enough about the brain to answer that question authoritatively. If we take the GWT model, then neurons that happen to be in the global workspace in essence are the correlates of consciousness, as I understand it. They're not different or special in any way, compared to other neurons, but their location means they're the ones. Thus you can't shift the global workspace. You can feed it from different sources, but you can't move it. I hasten to add that this is just my understanding.

Well, the question is whether any individual neuron is essential to its functioning, or whether the global workspace is a network robust enough to tolerate the loss of a neuron here and there.
 
It seems to me you are arguing about historicity. You want to somehow give one portion of the present some status because it has a past that another portion of the present does not.

Further, it seems there is an element of personhood stuck on somewhere. Would the discussion be the same if I destroyed your 'new in the box' laptop and then paid for another of the same model? You wouldn't care. They are identical.

How is it any different with people? If you are going to make duplicates, no matter that you label them master and slave, on what basis is one any different than the other, save historically? And why should it matter? What is this property that is moving across time in the discrete sense, but somehow still has an extension through time and gains rights thereby?

If you make a copy of a human being and then destroy a human being -- either the original or the new one -- you have destroyed a human being. Now, whether or not the ability to make multiples drives the value of a human life down to near zero (which it could), we've still killed a human being.

That might have legs, actually. Let's do it with the Mona Lisa and see how we feel about it.

sorry but none of this has anything to do with the OP as far as I can tell
 
Well, the question is whether any individual neuron is essential to its functioning, or whether the global workspace is a network robust enough to tolerate the loss of a neuron here and there.

I'm sure it can tolerate the loss of many neurons, which are no doubt anyway being replaced on a regular basis.

GWT is location-specific, as I understand it. There is no qualitative or functional difference between conscious and unconscious networks (or conscious and unconscious neurons); it is simply how they're wired up. Thus if you destroy a group of neurons in the global workspace, say one toiling away creating visual representations, then you permanently destroy that aspect of vision. The fact that it is being recreated somewhere else doesn't make any difference.

I guess you might be able to transfer them over one by one though, or in discrete blocks, and so effectively replicate what happens when cells die and are replaced with others, with no attendant loss of subjectivity. Maybe your slow transporter would be less confrontational to the human mind! Perhaps you should patent the idea!

Nick
 
I guess you might be able to transfer them over one by one though, or in discrete blocks, and so effectively replicate what happens when cells die and are replaced with others, with no attendant loss of subjectivity. Maybe your slow transporter would be less confrontational to the human mind! Perhaps you should patent the idea!

Nick

Nick that is the whole point of this thread...
 
I don't understand the Master/Slave relationship in the OP.

Suppose we are at the half-way point numerically. I'll use capital for Master and lowercase for Slave.

Neuron A fires, so a fires. A affects b at the one end. a affects B at the other.

In what sense is B the Master of b if it is altered by a? And doesn't it get exponentially worse as the connections increase?

It seems to me the relationship cannot hold.
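For concreteness, here is one toy way the half-way point could be wired up, under my own reading of the OP's rules; the names and the update rule are entirely invented for the example.

```python
# Toy wiring of the half-way point: capitals are masters, lowercase are slaves.
# A (source master) is mirrored by a (destination slave);
# B (destination master) is mirrored by b (source slave).
# Connections in the body: A -> b at the source, a -> B at the destination.

state = {"A": 1.0, "a": 0.0, "B": 0.0, "b": 0.0}

def tick(s):
    # Masters compute their own next state from their local inputs.
    A_next = 0.9 * s["A"]              # A has no inputs in this toy; it just decays
    B_next = 0.9 * s["B"] + s["a"]     # B is driven by slave a (which mirrors A)
    # Slaves then unconditionally copy their masters, so A's direct input to b
    # is discarded locally -- the same signal reaches B via a instead.
    return {"A": A_next, "B": B_next, "a": A_next, "b": B_next}

for _ in range(3):
    state = tick(state)
    print(state)
```

On this reading, "master" only means "the one whose update is computed rather than copied"; whether that relationship can really be maintained once every neuron has inputs arriving at both ends is exactly what the question above is asking.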
 
You clearly didn't read the OP. I specifically state that there is a mechanism keeping the neurons at each end communicating with each other so that conscious perception is continuous.

Honestly, I don't think I can learn anything from arguing with you, I have felt that for years now, so you don't need to respond to this thread. It is for other people.

Consciousness is situated in the brain, and destroying that original brain, no matter how much you try to disguise it, is pretty much destroying the original person and making a copy of it somewhere else. From an external point of view nothing happened, the two persons are identical; but the second one came into existence with all the memories of the first one, and the first one, from his perspective, died.

The only way to get around that is to imagine something BESIDES the brain is holding our seat of consciousness. Like a soul. Good luck proving that.

The only way to "transport" which would work would be to physically transport the person, not destroy and reconstruct them.

*that* is the materialist point of view, and as far as I can see, the only one which can be backed up with evidence.

ETA: another way to view it is that transporting the "states" of neurons is de facto the same as copying the state onto the new brain and deactivating the old one; but if you omit the deactivation step, or duplicate the state onto more than one slave brain, you see immediately where the problem is.
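A toy way to see the ETA's point (invented names, nothing more than an illustration): if "transfer" is just copy-then-deactivate, then omitting the deactivation, or copying onto two slaves, leaves more than one running instance, and nothing physical privileges any of them.

```python
# Toy illustration only: a "brain" is a dict, "transfer" is copy-then-deactivate.

def transfer(source, target, deactivate=True):
    target["state"] = dict(source["state"])   # copy the full state across
    target["active"] = True
    if deactivate:
        source["active"] = False              # the step that makes it look like "moving"
    return target

original = {"state": {"memories": "all of them"}, "active": True}
slave_1 = {"state": {}, "active": False}
slave_2 = {"state": {}, "active": False}

transfer(original, slave_1, deactivate=False)  # omit the deactivation step...
transfer(original, slave_2)                    # ...or duplicate onto a second slave

running = [b for b in (original, slave_1, slave_2) if b["active"]]
print(len(running))   # 2 -- two identical continuations, neither one privileged
```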
 
Consciousness is situated in the brain, and destroying that original brain, no matter how much you try to disguise it, is pretty much destroying the original person and making a copy of it somewhere else. From an external point of view nothing happened, the two persons are identical; but the second one came into existence with all the memories of the first one, and the first one, from his perspective, died.

The only way to get around that is to imagine something BESIDES the brain is holding our seat of consciousness. Like a soul. Good luck proving that.

The only way to "transport" which would work would be to physically transport the person, not destroy and reconstruct them.

*that* is the materialist point of view, and as far as I can see, the only one which can be backed up with evidence.

Well, RD's method looks sound, and doesn't contravene materialism, as far as I can tell. Isn't it that neurons are anyway dying and being replaced all the time?

Theoretically, using this technique, you could transfer your consciousness onto silicon.

Nick
 
