The Star Trek Transporter Enigma

Huh. Where the hell did you get your definition of materialist? Or of self?

Sure, as a materialist I know (well, assume, in the absence of evidence to the contrary) that my consciousness is an emergent property of the functioning of the neurons in my brain. But duplicating my brain and killing *THIS* self will not recreate *ME*; it will recreate another one with memories identical to mine.

It will absolutely recreate *ME*. As you have just affirmed, *ME* is merely a process emerging from brain activity.

Now the body will be a different instance of the same body - a copy. But the *ME* will be identical.

The whole subjective field, including the notion that there exists a self which is experiencing it, will be perfectly recreated.

Nick
 
Emotions have their own logic too. For sure, if you look at it from the perspective that you are going to die when you push the button, then there will be an emotional response. But if you recognise that the copy lives and that there is not an experiencing self anyway under materialism, then the emotional response can be understood and let go of. Like !Kaggen says, if you won't travel you ain't a materialist. Simple as that.

Nick

I disagree. This is not a simple exercise -- understanding -- really understanding -- why the destination you is the same individual as the source you takes time and effort and background information that most people don't have.

It is much easier to understand the basic premise of monism and see that it must be true, and then to further understand that physicalism is true. That only requires a basic grasp of logic and mathematics.

So I don't think it would be uncommon for there to be materialists who simply don't fully understand how the TTP would not destroy their self, and because they don't "grok" it they wouldn't want to use it. Like I said, none of these people actually provide logical arguments against the TTP; they just admit that it "feels" wrong to them.
 
This is the way I put it in another thread:

My position is that human consciousness is a form of self referential information processing. It is an algorithm -- a series of computation steps -- that knows about itself.

My position is that the steps in the algorithm -- like any other algorithm -- can be thought of as a series of state transitions within the systems the algorithm is instantiated upon. Think about how programs are executed on a computer, how each step in a program represents a set of state transitions in the hardware. Well, my position is that the algorithm of consciousness is the same kind of thing in our brain -- the steps correspond to state transitions in our neural network.

My position is that these state transitions are deterministic, assuming quantum randomness is insignificant. This means the next state is determined by only the current internal state, the current external state, and a deterministic state transition function (which in the physical domain is simply the laws of physics).

My position, then, is that you can model consciousness (any algorithm, actually) as a series of state transitions in some system somewhere. That is, F(Si(t), Se(t), t) --> Si(t+1), where F( ) is the state transition function, Si( ) is the internal state, and Se( ) is the external state. If you looked at time slices of consciousness -- we can use Planck time as the duration, since then we know we captured any relevant events -- the algorithm would look something like this in the physical domain: S(1)-->S(2)-->S(3)--> ... -->S(current time).
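To make that concrete, here is a minimal sketch in Python. The transition rule, the toy integer states, and the names F and run are all invented for illustration; nothing here pretends to model a real brain, it only shows what "the next state is determined by the current internal state, the current external state, and a fixed transition function" means:

```python
# Toy illustration of F(Si(t), Se(t), t) --> Si(t+1) with a made-up rule and
# made-up integer states -- not a model of anything real.

def F(s_internal, s_external, t):
    """Deterministic transition rule: the same inputs always give the same output."""
    return (s_internal * 31 + s_external + t) % 1000  # arbitrary invented rule

def run(initial_state, external_inputs):
    """Produce the sequence S(1) --> S(2) --> ... by repeatedly applying F."""
    states = [initial_state]
    for t, ext in enumerate(external_inputs):
        states.append(F(states[-1], ext, t))
    return states

# Rerunning with the same inputs always reproduces the same sequence.
print(run(42, [3, 1, 4, 1, 5]))
```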

My position is that consciousness is those deterministic transitions between states, the "-->" you see above. It is the algorithm itself, not the physical stuff the algorithm is running on. It isn't your brain, it is the directed "movement" from one state of your brain -- or any brain -- to the next.

My position is that if you take a subsequence of this algorithm -- suppose S(10)-->S(11)-->S(12) -- and split it between multiple systems, or instances, it remains the same algorithm precisely because the deterministic state transitions are exactly the same. In other words, if F(Si(10), Se(10), 10) occurs on system A, and determines state 11 on system B, and if F(Si(11), Se(11), 11) occurs on system B and determines state 12 on system C, the algorithm and hence the consciousness is exactly the same as it would be if everything occurred in the same system.

So if your brain is in state 1, and the laws of physics combined with state 1 result in state 2 one Planck time later, then the system where state 2 is located should be irrelevant. State 2 is still part of the algorithm, the same algorithm, because it was determined by state 1.

And finally, my position is that if you somehow add an intermediate step in there between determining state 2 and the system actually being set to state 2 -- such as communicating across space to an identical system that it should be set to state 2 -- the algorithm and hence the consciousness is still the same, because state 2 is still determined by state 1. The fact that there was a middleman doesn't change that key element. Nor would it change if that communication took a very, very long time -- if the original was scanned, then destroyed, and the information took a billion years to reach the destination, and only then was the copy made -- it would still be the same algorithm and hence the same consciousness. Because state 2 was determined by state 1.
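Continuing the same toy sketch, here is what the "split across multiple systems, with a middleman and an arbitrary delay" claim looks like in code. Again, everything (the rule, the states, the delay) is invented; the point is only that the resulting state sequence is identical whether the transitions happen on one system or are handed off partway through:

```python
import time

def F(s_internal, s_external, t):
    return (s_internal * 31 + s_external + t) % 1000  # same toy rule as above

external = [3, 1, 4, 1, 5, 9]

# All six transitions on a single system.
single = [42]
for t, ext in enumerate(external):
    single.append(F(single[-1], ext, t))

# The first three transitions on "system A"...
system_a = [42]
for t, ext in enumerate(external[:3]):
    system_a.append(F(system_a[-1], ext, t))

# ...then the current state is transmitted (the "middleman"), with a sleep
# standing in for an arbitrarily long journey...
time.sleep(0.01)
transmitted = system_a[-1]

# ...and the remaining transitions continue on "system B".
system_b = [transmitted]
for t, ext in enumerate(external[3:], start=3):
    system_b.append(F(system_b[-1], ext, t))

# The "-->" transitions are exactly the same either way.
assert system_a + system_b[1:] == single
```

Of course the interesting claim is not about toy integers but about whether that invariance carries over to consciousness; the code only illustrates the determinism part of the argument.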
 
So I don't think it would be uncommon for there to be materialists who simply don't fully understand how the TTP would not destroy their self, and because they don't "grok" it they wouldn't want to use it. Like I said, none of these people actually provide logical arguments against the TTP; they just admit that it "feels" wrong to them.

"Just 'feels' wrong to them?!" What kind of an argument is that?

Look, what is it that feels wrong to you?

Nick
 
"Just 'feels' wrong to them?!" What kind of an argument is that?

Look, what is it that feels wrong to you?

Nick

Well, until one groks the true nature of consciousness, I think the discontinuity in bodies can be troubling. After all, there is a gigantic discontinuity...
 
You haven't removed the paradox. You have just asserted that it isn't an identical person that is created, but the same one.
I didn't say I removed the paradox, I just showed that the paradox only arises when one views the experiment in a certain way. I can't prove that this view is wrong, but I can show that people who take that view try to look at the issue from viewpoints that the experiment makes impossible.

Difference between identical and same.
I don't think there is a difference. They are the same/identical.

You can't get around that conundrum unless *you* presuppose that something else is saved (a soul, whatever) and is transported and reconstructed on the other side, so that you get the SAME individual instead of an identical one.
The thought experiment explicitly states that everything material about a person is transported and reconstructed at the other side. That includes the "soul or whatever" if this is a material thing or some emergent property of the material things that comprise you. And even if the "soul or whatever" is not a material thing that can be scanned and transmitted, there is still no reason to assume that it will stick to one place.
 
Well, until one groks the true nature of consciousness, I think the discontinuity in bodies can be troubling. After all, there is a gigantic discontinuity...

We take the notion of an experiencing self for granted, as a rule, I think. And so it certainly seems on the surface as though, well, the copy might be living it up on a Thai beach with Elin Grindemyr, but what about me? I'll be dead and gone! This is how it seems.

But materialism does assert that this notion of an experiencing self simply cannot be what it appears to be. It cannot have any separate existence and must simply be just another aspect of consciousness itself. Essentially it is just an idea. This is actually very clear, I think, just acutely counter-intuitive.

So, I don't see that you can in any way claim to be materialist and say you wouldn't push the teleport button. And to say that you won't do it because it "just feels wrong" is ridiculous! I mean, how would you regard someone who claimed that angels, water memory, and pets-who-know-when-their-owners-are-coming-home must all exist because they just "feel right?"

Nick
 
Darwin's Beard!
Can we have a subforum for transporter threads? If not, can we have posters look for previous threads before starting new ones about the same topic?

Thank you, thank you, thank you!

Damn it... all philosophy is mental masturbation, but this recent transporter crap has turned into a circle jerk. STOP IT!

Please. :)


OK, you! You're first! Stand right here.


If he won't I will. Enough.

FFS this is worse than CTs. Nerds.

(I R 1 2)




(Whew... that felt good.)
 
Darwin's Beard!
Can we have a subforum for transporter threads? If not, can we have posters look for previous threads before starting new ones about the same topic?
Well, here's post #50, which in my book means that this is a successful thread that's generating enough enthusiastic responses that I don't need to apologize to anybody for starting it.
 
Had I known there was an ST transporter discussion here, I would visit this section more often...

Anyway:

1) Maybe we can get around uncertainty (http://arstechnica.com/science/news...-topple-heisenbergs-uncertainty-principle.ars)

2) As for copies, one could use another theoretical favourite - wormholes... (but then we would be getting outside of ST)

3) I think that if all the properties of the particles matched, then the copy could contain the "same" soul as the original.

I suspect that for a resolution of this type of debate we will have to wait for a few more breakthroughs in our understanding of the brain and the soul (whatever the soul is).
 
Well, here's post #50, which in my book means that this is a successful thread that's generating enough enthusiastic responses that I don't need to apologize to anybody for starting it.

OK, let's leave apologies out of the discussion. My only question is "is there anything in this thread that could not have been discussed by adding it to the end of the transporter thread started last week, or to the end of the transporter thread started two weeks ago, or to the end of the transporter thread started two days before that?"
 
So, I don't see that you can in any way claim to be materialist and say you wouldn't push the teleport button.

I know, but that is because you are blind sometimes.

I could generate a list of thousands and thousands of things that people who believe in science and mathematics know *should* work a certain way but still refuse to put their lives in the hands of science and mathematics without seeing it work -- many, many times -- first.

If you think about it, that is a pretty solid strategy: although mathematics is infallible, the people who use mathematics are quite often wrong. Your lineage wouldn't last very long if it jumped into any machine whose blueprints you were sure you understood, without first making sure it really worked.
 
Let's suppose that the fictional Star Trek transporter works like this:

A computer records the identity and position of every particle of your body as the transporter disassembles it, and the particles are stored locally in some sort of storage space, or perhaps annihilated by conversion to energy.

Simultaneously, it recreates those particles at a remote location, or perhaps harvests them from existing matter at that location, and reassembles your body to form an exact duplicate of what it disassembled. The important point here is that the only thing actually transferred to the destination is information; no actual particles of matter travel across the gap. Thus, you've been transported.

Or have you?

Objectively, it seems so to everyone, including you. After you've been transported, you seem to be the same person with the same memories that you had before, but are you?

Or did the "you" that stood on the transporter actually get killed? Does the transporter actually execute people and replace them with duplicates? Is it suicide to step on a transporter platform?

This seems more like a philosophical than a scientific question, so I'm posting it here. Some might argue that the Star Trek transporter must be forever impossible because of the difficulty of resolving the paradox. Some might argue for the existence of a soul. What do you think?

By the way, a similar puzzle is a situation like the one that occurred in the movie The 6th Day, where the original "you" isn't annihilated but survives while another "you" is created. Which one is you? How do you divide the property?

As I understand "classic ST transporter theory", there is a brief but measurable time during which neither your body nor your consciousness exists. Therefore, you are, by definition, a construct made by a machine after the image of a dead person. If this bothers you, stay away from ST-type transporters. Only use transporters that don't involve discontinuities of your physical existence. You know, like planes, trains, buses, space shuttles etc.
 
Well, of course there is a paradox. Scotty is not looking forward to the onerous task of moving the transporter from A deck to B deck. After a few Romulan ales he foolishly decides to have the transporter transport itself between decks... what happens?

If we believe the materialist thesis that the transporter kills and clones, then its use can be simplified. We redesign the transporter to clone a duplicate at the distant coordinates and leave the original intact; then, at the end of its usefulness, we disintegrate the clone. The number of unnamed security officers lost on "away missions" drops dramatically. The phrase "beam us up" is rarely heard and most away missions must be terminated remotely. "Kirk" becomes the most common surname among several species.
 
We tend to think of ourselves as detached from our body and brain at times.

In a way, they appear to be a shell of the self or soul.

So when a thought experiment like this is presented, it can seem like a paradox to our naive intuition about the self.

Our shell was teleported. Did our self and soul follow? If two shells are teleported from the original shell, did we make 2 selves?

What many of us fail to grasp is that the shell is our self. This body, brain, etc. is all me. The real me is based on all of these things.

If 2 of me were teleported from the same me, then maybe initially we would share the same thoughts and everything. But it's not like our consciousness split into two parts and we lost anything. I have my own consciousness and my double has his.

I may be able to empathize with my double much easier, knowing that my double is exactly me at a different location, but I would still see me as myself and my double as not completely me.


Your consciousness, who you are, changes regarding your experience in this material universe. Me in the future is not me now, although it's nice to think of my consciousness and sense of self as somewhat eternal.


Adapting my sense of self to a materialistic worldview was one of the toughest parts of my transition from a dualistic worldview to how I think now. It was brain-racking at times, but necessary to remove cognitive dissonance.
 
I think this was discussed in the James Blish Star Trek novel, Spock Must Die. McCoy was discussing the moral implications of transporter use. Scott then reminded him that any difference that makes no difference is no difference at all.

Ranb
 
We take the notion of an experiencing self for granted, as a rule, I think. And so it certainly seems on the surface as though, well, the copy might be living it up on a Thai beach with Elin Grindemyr, but what about me? I'll be dead and gone! This is how it seems.

But materialism does assert that this notion of an experiencing self simply cannot be what it appears to be. It cannot have any separate existence and must simply be just another aspect of consciousness itself. Essentially it is just an idea. This is actually very clear, I think, just acutely counter-intuitive.

So, I don't see that you can in any way claim to be materialist and say you wouldn't push the teleport button. And to say that you won't do it because it "just feels wrong" is ridiculous! I mean, how would you regard someone who claimed that angels, water memory, and pets-who-know-when-their-owners-are-coming-home must all exist because they just "feel right?"

Nick

Can I try to clarify what I think you're saying here? Are you saying that it doesn't matter if the original consciousness is destroyed, because from moment to moment consciousness is created from brain activity and each person's sense of a continuous experiencing self is an illusion? In other words, are you saying that it doesn't matter if the original "me" dies and a copy carries on, because from moment to moment that's what happens anyway (despite our sensation of continuous awareness)?
 
I know, but that is because you are blind sometimes.

I could generate a list of thousands and thousands of things that people who believe in science and mathematics know *should* work a certain way but still refuse to put their lives in the hands of science and mathematics without seeing it work -- many, many times -- first.

If you think about it, that is a pretty solid strategy: although mathematics is infallible, the people who use mathematics are quite often wrong. Your lineage wouldn't last very long if it jumped into any machine whose blueprints you were sure you understood, without first making sure it really worked.

Well, that the machine might go wrong is a whole different subject. The usual caveat Susan Blackmore uses when giving her students the Teletransporter scenario is that you can't use this as a reason not to travel.

I assume she's trying to get them to actually deal with the thought experiment at a deeper level, since a lot of them try to avoid the inner situation and go off instead into "oh, but what if this..., what if that..."

Certainly, having participated in numerous transporter threads on the JREF, it's my experience that this happens very often. People who otherwise consider themselves to be highly rational, scientifically-minded individuals start saying "Oh, but it just doesn't feel right." Or "but what if this goes wrong, what if that?"

Nick
 
