I never understood the cult of the singularity. I mean, even if mind uploading happens, you do not really "upload" your mind; you just put a copy of its memories, interactions, and personality into a computer system, so that virtually there would be no way to differentiate you from it. Just like the teleportation paradox, it suffers from the fact that you are just creating a copy, and if the original dies, "you" died. The copy, as a separate entity, might live on eternally as long as the electricity is paid, but the original Aepervius the human would have died.
Therefore I would not see that as immortality, more as a way to produce an immortal identical twin offspring of me.
Mildly interesting, but not that much.
I tend to agree with this, but there is another way to look at how it may be possible, or even the same. It sounds as though you've probably already considered the following, but I'll post it anyway for the benefit of the general discussion, and look forward to your response.
Suppose that instead of instantly uploading our consciousness into a machine brain, we develop technology on a nanoscale that slowly converts our biological neural networks, one connection at a time, over a period of, say, several weeks or months, in a manner that reproduces the same functions. Apart from sleeping, we would maintain our continuity of consciousness and wouldn't know the difference. When the transition was complete, could we not then claim to have made the "transition"?
Does the physical construction of the processing unit really have to be made of the same materials in the same configuration, or does it just need to perform the same job as well? Cells live and die all the time. We're not made of exactly the same atoms we had when we were born. We are very different as adults in many ways. My personal feeling on this is that the person who makes the gradual cellular transition would still be the same "person", owning the same "consciousness", retaining the same memories and possessing ownership of their "selves".
It would not be right or fair, at the point of complete transition, to suddenly declare this conscious being "dead", strip them of their rights, and start reading their will. In every measurable way that matters, I contend this person would in fact be the same "person" as before, with all the rights, privileges, and obligations that accompanied them before the transition to a new brain was complete.
Now this is where the problem gets tricky. In the preceding example we have no copy at the end of the process, and the transition maintains a continuity of consciousness that is indistinguishable from normal biological function. Now suppose we come up with a way to perform that task faster. How fast could this task conceivably be before the retention of self becomes untenable? What if it could happen instantly, and the person doesn't even know it has happened? Then what? Provided there are no "copies", there really isn't any difference between this and the first example above.
Where it starts to get murky is when we add the downloading issue into the mix and end up with copies. At the instant the download is complete and the "new you" is turned on, you are no longer a single-minded entity. What then? Can we say we are still in essence a single entity but with two minds ... a "multiprocessor unit" with independent sensory inputs capable of simultaneous autonomous operation? Would many such copies take on a persona similar to that of a "corporate entity", where all the "yous" are partly responsible for the actions of the collective? Or would the new "yous" be considered nothing more than advanced desktop computers, property you could switch off at will?
Whether we like it or not, if we keep progressing the way we are, these are the kinds of issues we're facing. It's exciting and frightening at the same time. We live in amazing times.
j.r.