
Why should you be emotionally invested in strong AI?

rocketdodger · Philosopher · Joined Jun 22, 2005 · Messages: 6,946
I am aware that there are many individuals emotionally invested in the notion that strong AI is false. I can understand why -- the idea that our consciousness has some magical component that is beyond the boring and cold scientifically understood world is one that can be tempting.

But there are also very good emotional reasons to support the notion that strong AI is true.

Because if strong AI is true, the implications aren't limited to a realization that we are all just meatbags having the illusion of non-deterministic thought. There is so much more.

If strong AI is true, humans will someday be able to upload their consciousness into any suitable substrate. This is not only the closest thing to immortality that could be available in our universe, it is far better than immortality. The ability to upload our consciousness implies the ability to modify it as well -- in any way we desire. Being able to upload also implies the ability to travel at lightspeed between suitable locations. So if you are interested in living forever, or living in any way you could possibly think of, you should want strong AI to be true.

Now you might say "but religion tells us that immorality is available now -- we don't have to wait for the technology, which might not arrive for hundreds of years."

To that I reply that my own estimates put the arrival of such capability at less than 50 years from now. In fact, such a thing might be possible within 20 years or so, and economic feasibility would follow within a few decades.

But even if you can't wait that long, you can always freeze yourself. An implication of strong AI being true is that technology to thaw you and bring you back to life becomes unnecessary -- all that is needed is the technology to scan your frozen brain and extract the topology of the neural networks contained within. After that, the upload technology takes care of everything else.

Hopefully I have shown that there are emotional reasons to support strong AI that are just as good, if not better, than those for opposing it.
 
I am not very knowledgeable in this area so forgive my question, but why are you assuming that strong AI will allow the ability to upload individuals' consciousness?
 
I am not very knowledgeable in this area so forgive my question, but why are you assuming that strong AI will allow the ability to upload individuals' consciousness?

Because an implication of strong AI is that there is no loss of "essence" of consciousness, whatever that may be, when the intelligence is instantiated upon a non-biological substrate.

That is, if AI can really be conscious, then there is no logical reason why our own consciousness could not be transferred to a non-biological substrate.
 
You do realize that this part is a bunch of hooey, don't you?

I think it is a bunch of hooey that those frozen people will ever be un-frozen and "fixed" or whatever they think will happen.

I do not think it is implausible that at some point in the future -- certainly within a few hundred years, much sooner if my estimates are correct -- there will be technology to extract from a frozen brain the information needed to completely reconstruct a model of the neurons and their connections.

Assuming the brain is kept at a low enough temperature (maybe current practice falls short, but that isn't a fault of my argument), there is no reason any neural connections should degrade past the point where the topology can be recovered.
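The kind of data such a scan would need to recover can be pictured as a weighted directed graph. This is a purely conceptual sketch, not a real connectomics format; the neuron IDs and weights are invented for illustration:

```python
# Toy illustration (not a real connectomics format): a scanned brain's
# recoverable topology pictured as a weighted directed graph.
# Neuron IDs and connection weights here are invented for the example.

connectome = {
    "n1": {"n2": 0.8, "n3": -0.2},   # n1 excites n2, inhibits n3
    "n2": {"n3": 0.5},
    "n3": {"n1": 0.1},
}

def downstream(neuron):
    """Neurons that receive a connection from `neuron`."""
    return sorted(connectome.get(neuron, {}))

print(downstream("n1"))  # ['n2', 'n3']
```

The point of the sketch is only that the information at stake is relational -- who connects to whom, and how strongly -- which is exactly what a sufficiently cold brain would preserve.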
 
Now you might say "but religion tells us that immorality is available now -- we don't have to wait for the technology, which might not arrive for hundreds of years."

.

Oh yes, I think religion all too often does preach the availability of immorality. Just look at the recent sex scandals in the RCC.

No technology required.

....

Sorry, couldn't resist...

....

I'm a PKD and Asimov fan, too. AI would certainly make things more interesting, wouldn't it?
 
Because an implication of strong AI is that there is no loss of "essence" of consciousness, whatever that may be, when the intelligence is instantiated upon a non-biological substrate.

That is, if AI can really be conscious, then there is no logical reason why our own consciousness could not be transferred to a non-biological substrate.

By "uploading" are you talking about moving consciousness from one place to another, or simply copying it?

If I uploaded my consciousness to a substrate, would I be me, or would the substrate-I be me? Would either me or the substrate-me be disposable? Is it sufficient for my personality and memory to survive for me to survive, from a subjective standpoint?

Can you envision a world in which everyone is constantly dying and being recreated, but nobody notices because the personalities and memories are continuous, and we are unable to experience our own death and resurrection subjectively?
 
By "uploading" are you talking about moving consciousness from one place to another, or simply copying it?

If I uploaded my consciousness to a substrate, would I be me, or would the substrate-I be me? Would either me or the substrate-me be disposable? Is it sufficient for my personality and memory to survive for me to survive, from a subjective standpoint?

Can you envision a world in which everyone is constantly dying and being recreated, but nobody notices because the personalities and memories are continuous, and we are unable to experience our own death and resurrection subjectively?
You are assuming a "me" that doesn't exist (according to the way of thinking espoused in this thread).

All you have is a brain, and processes running on the brain. The processes are self aware, and refer to something called 'me'. But there is no 'me' there.

If this is all true (and I think it is, I don't see any mechanism but the brain to create all this), then if you were to copy your brain state to silicon or whatever, the processes running on silicon would think "I'm still me!!!". Meanwhile, the processes running on the brain would think "hey, that silicon sure is acting a lot like me".

It is absolutely no different from the fact that you don't have my consciousness. That's not because there is a "you" and "me", but because the processes running in each of our brains are not networked in any way.

It's also no different, given these assumptions, than you falling asleep and waking up. There is no 'me' thing - it's just that the processes running in your brain now have access to memories of previous times when they were active, and your brain creates the fiction of a 'me'. But if we swapped out your brain for silicon, you couldn't tell. If we ran 10,000 instances of your brain on different computers, none of those 10,000 could tell - they'd all still think they were the you of the meat brain.
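The "copies can't tell" point can be made concrete with a toy sketch -- a pure illustration under this thread's assumptions, not a claim about real minds. If the answer to "am I the original?" is computed from internal state alone, identical copies must answer identically until their experiences diverge:

```python
import copy

# Toy sketch: a "mind" reduced to its internal state (memories plus a
# self-label). Every copy answers questions about itself identically,
# because the answers come from internal state alone.

original = {"memories": ["childhood", "yesterday"], "name": "me"}
instances = [copy.deepcopy(original) for _ in range(3)]

# At the instant of copying, no instance can distinguish itself:
assert all(inst == original for inst in instances)

# Divergence begins only once experiences differ afterward:
instances[0]["memories"].append("woke up in silicon")
assert instances[0] != instances[1]
```

Nothing in the copied state marks one instance as "the real one"; distinctness only appears through subsequent, differing input.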
 
By "uploading" are you talking about moving consciousness from one place to another, or simply copying it?

One and the same, according to the computational model.

If I uploaded my consciousness to a substrate, would I be me, or would the substrate-I be me?

Both would be you for an instant, after which they would diverge. If it worries you, go ahead and keep a pistol near the original, like in The Prestige.

Would either me or the substrate-me be disposable?

Up to you.

Is it sufficient for my personality and memory to survive for me to survive, from a subjective standpoint?

Yes. Your subjective experience could be recreated (or restarted, as it were) from the neural map data. Kind of like what happens when you go to sleep and then wake up -- there is a break in subjective experience but you are still the same person nonetheless.

Can you envision a world in which everyone is constantly dying and being recreated, but nobody notices because the personalities and memories are continuous, and we are unable to experience our own death and resurrection subjectively?

Yes, in theory it would be identical to your experience right now.

But furthermore I can envision even a world where you can recombine with other instances of yourself and merge the memories.

Think about it -- try to remember a place you have been, and the stuff you did there. Now ask yourself this -- were you thinking about that, at all, prior to me mentioning it in the above statement? That is what it would be like to acquire memories from another copy of yourself -- you wouldn't know the difference.
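That merge could be pictured as a simple union of episodic memories -- again just a toy model, resting on the (unrealistic) assumption that memories are discrete, order-free items:

```python
# Toy model: merging memories from two copies of the same person.
# Assumes (unrealistically) that memories are discrete, order-free items.

copy_a = {"learned to ski", "paris trip"}
copy_b = {"paris trip", "wrote a novel"}

merged = copy_a | copy_b   # set union: shared memories collapse naturally
print(sorted(merged))      # ['learned to ski', 'paris trip', 'wrote a novel']
```

The union captures the intuition in the paragraph above: a memory you didn't originate just sits there, indistinguishable from your own, until something recalls it.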

Pretty cool stuff, if strong AI is true.
 
Yes. Your subjective experience could be recreated (or restarted, as it were) from the neural map data. Kind of like what happens when you go to sleep and then wake up -- there is a break in subjective experience but you are still the same person nonetheless.

That makes perfect sense but seems so intuitively incorrect at the same time.

If I were to upload my brain into a robot brain right now, there would be two of "me", but it seems like the "real me" would still be the one operating in this body. And when my body dies, "real me" would die, too, even if the duplicate lived on.
 
Quite a while back, I read a discussion of the concept in OMNI magazine. The idea (according to the article) was that at the point one was ready to make the "transfer" (or upload or whatever), one would switch back and forth between the body's POV and the machine's POV.
Eventually (provided sufficient computational power) the two would become indistinguishable, at which point you could dispose of the biological body and simply continue on as an AI.

Fred Pohl had a nice exploration of what this might be like in one of his Heechee novels, with the ability to "think" so much faster than a human that speaking to one would be an exercise in multi-tasking to avoid being bored....
It's an interesting notion, but one that's down the road a bit.
 
That makes perfect sense but seems so intuitively incorrect at the same time.

If I were to upload my brain into a robot brain right now, there would be two of "me", but it seems like the "real me" would still be the one operating in this body. And when my body dies, "real me" would die, too, even if the duplicate lived on.

You should go read the "teleporter" thread if it still exists. It really challenged my thinking on this matter, and brought me over to RD's position after much internal debating.
 
Quite a while back, I read a discussion of the concept in OMNI magazine. The idea (according to the article) was that at the point one was ready to make the "transfer" (or upload or whatever), one would switch back and forth between the body's POV and the machine's POV.
Eventually (provided sufficient computational power) the two would become indistinguishable, at which point you could dispose of the biological body and simply continue on as an AI.

Fred Pohl had a nice exploration of what this might be like in one of his Heechee novels, with the ability to "think" so much faster than a human that speaking to one would be an exercise in multi-tasking to avoid being bored....
It's an interesting notion, but one that's down the road a bit.

That is a really good idea, and would make me much more comfortable. You could have the bio body and the AI share memory for the switching period so that it feels very seamless to the mind being transferred.
 
Two questions.

1.) RD, what are you smoking?
2.) Can I have some? :)
 
It's a risky form of insurance, but if we can unfreeze people while preserving their organs at some point in the future, then it's not exactly a bunch of hooey.
.
Storing a bunch of people, with the population increasing as it is... is it realistic to expect the future would -want- more people?
Especially since, as in the typical sci-fi scenario, these corpsicles are infected with long-eradicated diseases to which the living population has no resistance.
 
That is a really good idea, and would make me much more comfortable. You could have the bio body and the AI share memory for the switching period so that it feels very seamless to the mind being transferred.

Yes.

In fact, this is the solution that I proposed in the teleporter thread. If you are hesitant about your consciousness being reduced to numbers, then you can always use this slower option.

My version was to teleport neurons one at a time, so that at any given instant neurons at the source are getting impulses from neurons at the destination and vice versa, until all the neurons have been moved to the destination.
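The one-at-a-time scheme can be sketched as a loop -- purely conceptual, with a list standing in for the brain and invented neuron names; the interesting property is the invariant that the whole network exists at every step, merely split across substrates:

```python
# Conceptual sketch of the neuron-at-a-time transfer: at every step the
# network is whole, just split across source and destination, until the
# source is empty. Names and structures are invented for illustration.

source = ["n1", "n2", "n3", "n4"]   # neurons still in the biological brain
destination = []                     # neurons already on the new substrate

while source:
    neuron = source.pop(0)           # move exactly one neuron per step
    destination.append(neuron)
    # Invariant: nothing is ever duplicated or lost mid-transfer.
    assert len(source) + len(destination) == 4

print(destination)  # ['n1', 'n2', 'n3', 'n4']
```

Because there is never a moment when two complete copies exist, this version sidesteps the "which one is me?" worry raised earlier in the thread.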
 
