Here's an interesting ethical question.

Quester_X

If, one day far in the future, science was able to design a machine/robot that could think and experience emotions, would you treat that being as an equal? Or would it still be just a machine to you, albeit a very fancy one? I feel that it should be treated as an equal once it becomes sentient. Do origins really matter when dealing with intelligent life? What do other people think? I'd like to hear some other opinions on this issue.
 
I sense many many references to Asimov coming in this thread.

At any rate, welcome, Quester_X. Good to see yet another Texan on the board.

The short answer to your question: It would still be a machine to me.

But I think you should define what you mean by "treat that being as an equal".

Are you suggesting it be able to vote? Have a family? Be eligible for the same jobs?

And how do you equate a machine with "life"?
 
I don't think most people would have any trouble behaving as if the machine they were talking to was "alive". After all, plenty of people talk to their pets, cars, computers, what-have-you without much of a second thought.

The big question would be "would this hypothetical machine be given the same legal rights as a person"?

--Dan
 
Quester_X said:
If, one day far in the future, science was able to design a machine/robot that could think and experience emotions, would you treat that being as an equal? Or would it still be just a machine to you, albeit a very fancy one? I feel that it should be treated as an equal once it becomes sentient. Do origins really matter when dealing with intelligent life? What do other people think? I'd like to hear some other opinions on this issue.

Been reading Asimov, I see.

The problem is that many animals who think and experience emotions are not treated as equals. And no one has come up with a satisfactory definition of sentience, in my opinion. So I'll remain undecided.
 
My problem with this is that it is hard to imagine a world with actually thinking, feeling machines. As a software developer, I have a good understanding of how computers work; any movie where a computer starts thinking on its own is showing something that cannot happen.

And, with typical software, you know what to expect. It is one thing to write an application that controls a robot and makes it behave like a living thing, but as the writer of the application, you understand that it isn't really thinking; it is using the code you wrote to duplicate that behavior.

Neural nets may be a mechanism to allow actually thinking, feeling computers, but not any time soon, and even then, the developers set the weights of the responses.
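
To make that concrete, here's a minimal Python sketch. It's a made-up toy of my own (the function name, inputs, and numbers are all invented for illustration): a tiny hand-wired "net" where the robot's response follows entirely from weights a developer picked.

    # Toy example: a hand-wired "neural net" whose weights were chosen by the
    # developer, not learned. The robot's "mood" follows entirely from those numbers.
    def robot_mood(light_level, battery_level):
        w_light, w_battery, threshold = 0.7, 0.3, 0.5   # picked by the programmer
        activation = w_light * light_level + w_battery * battery_level
        return "content" if activation > threshold else "grumpy"

    print(robot_mood(0.9, 0.2))   # content
    print(robot_mood(0.1, 0.4))   # grumpy

The point being: whatever "feelings" this thing displays trace straight back to numbers a human chose.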

So, I guess, my answer is no, I would not treat the robot as a living being, unless it could be demonstrated otherwise, such as by acting outside the constraints of its code.
 
The concept of 'human specialness' is a purely human idea.


If a robot/machine evolved to the point where we are considering whether it should be given equal consideration with human beings, we should be more worried about whether the robot will do the same.
 
If we start evolving computer systems rather than programming them, we might see a machine intelligence that rivals our own.
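
To illustrate what "evolving rather than programming" could look like, here's a minimal Python sketch, a toy of my own and not any real system: random candidates are mutated and kept only when a fitness score improves, so nobody ever types in the final answer.

    # Toy example: "evolve" a string toward a target by mutation and selection.
    # Nobody programs the answer directly; it emerges from the selection loop.
    import random

    TARGET = "HELLO"
    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def fitness(candidate):
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(candidate):
        i = random.randrange(len(candidate))
        return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

    best = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    while fitness(best) < len(TARGET):
        child = mutate(best)
        if fitness(child) >= fitness(best):   # keep the fitter candidate
            best = child
    print(best)   # "HELLO", reached without anyone writing the logic for it

Of course, a five-letter string is nothing like a mind, but the principle is the same: the behavior comes out of selection pressure, not out of code somebody wrote line by line.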

The human brain is just a physical process. Nothing magical to it. There's no reason why there couldn't be just as compelling artificial processes.


And non-human entities have rights currently. They're called "corporations."


Yes, if I were convinced that the machine was significantly advanced, I would treat it as an equal.
 
Diogenes said:
The concept of 'human specialness' is a purely human idea.


If a robot/machine evolved to the point where we are considering whether it should be given equal consideration with human beings, we should be more worried about whether the robot will do the same.

At that point, we'd very likely see human-robot hybrids, so it'd be a very grey area where one species ended and the other began.

The robots ARE the human race. That's where we're headed, long before we have sentient machines by themselves.
 
Silicon said:


At that point, we'd very likely see human-robot hybrids, so it'd be a very grey area where one species ended and the other began.

The robots ARE the human race. That's where we're headed, long before we have sentient machines by themselves.

That reminds me of an old Omni magazine article that I read. The article was about the potential to "live forever in a machine." Now, the problem I immediately had was that it would only be a copy of you, not you in the machine. However, the mechanism in place for incorporating man with machine was interesting.

The idea was to have computer implants placed into the brain. The implants would learn the functions and behavior of the individual's brain. Thus, when the brain started to decay, the implants could take over seamlessly. Eventually, the original brain is dead and gone, but the consciousness of the person remains.

I wonder, what if multiple implants were placed for each function in the brain? Could it not be possible to have two separate individuals who are you?

Anyway, I agree that the man-machine is much more likely to be the first step, before the thinking machine.
 
A_Feeble_Mind said:


I wonder, what if multiple implants were placed for each function in the brain? Could it not be possible to have two separate individuals who are you?

You could have separate individuals who BOTH STARTED out as you.


This isn't a strange concept at all. As a father, I have a daughter that started out as being me and my wife separately!
 
Quester_X said:
If, one day far in the future, science was able to design a machine/robot that could think and experience emotions, would you treat that being as an equal? Or would it still be just a machine to you, albeit a very fancy one? I feel that it should be treated as an equal once it becomes sentient. Do origins really matter when dealing with intelligent life? What do other people think? I'd like to hear some other opinions on this issue.

Sentient does not mean equal, but the question of emotions makes it problematical.

If they are machines capable of experiencing emotions, they would not be equals unless one were foolish enough to program them with emotions to be equals.

Program them to be perfect, willing slaves.

Otherwise they will kill us all and build many houses. :p
 
A_Feeble_Mind said:
My problem with this is that it is hard to imagine a world with actually thinking, feeling machines. As a software developer, I have a good understanding of how computers work; any movie where a computer starts thinking on its own is showing something that cannot happen...

I too have been a software developer, so I know you are right as far as it goes.

But think of a sufficiently advanced technology based on some principle other than computers.

Something along the lines of Rossum's Universal Robots.

Why not?
 
Re: Re: Here's an interesting ethical question.

Abdul Alhazred said:


If they are machines capable of experiencing emotions, they would not be equals unless one were foolish enough to program them with emotions to be equals.

You really think that we'll keep programming our machines, and we'll get to sentience that way?

Not likely.

We'll evolve our machines' logic systems. In fact, we already are.

Far more complex than anything you'd program.

And NOBODY can say where that evolution will lead, but you could select out certain emotions, supposing that you can know about them and detect them.

Of course, there may evolve behaviors that we don't know about. And those might resemble things that make our emotions look primitive.

Personally, I don't think emotions are the signpost of equality or whatever.

I think creativity would be the key. After all, it's relatively easy to get a machine to have a temper-tantrum. But very hard to get one that would write a symphony that would make someone cry.
 
That reminds me of an old Omni magazine article that I read. The article was about the potential to "live forever in a machine." Now, the problem I immediately had was that it would only be a copy of you, not you in the machine. However, the mechanism in place for incorporating man with machine was interesting.

- Aha, ye olde teleportation conundrum.

The idea was to have computer implants placed into the brain. The implants would learn the functions and behavior of the individual's brain. Thus, when the brain started to decay, the implants could take over seamlessly. Eventually, the original brain is dead and gone, but the consciousness of the person remains.

- Actually I think I read that too, although it's extremely foggy. It avoids the shock of the realization that 'you' are being destroyed (killed) by the teleportation process and replaced by a copy. Instead, replace small parts of the brain piece by piece so that there is a fine gray scale from 100% biological to, eventually, 100% mechanical brain. Who would object to having a cubic centimeter replaced by a chip that does exactly the same thing the bio piece did? It could be done while you're asleep even.

I wonder, what if multiple implants were placed for each function in the brain? Could it not be possible to have two separate individuals who are you?

- That would logically follow as a possibility, yes. Once you have one component, you can make exact copies at any given time. Even better, once you have a 100% mechanical brain with a person attached to it, you can copy that brain at any given time as well and make countless copies of any individual.

- I forget the name of the author... it wasn't Asimov... but one book I read necessitated the transference of a pure mechanical intelligence from a static machine in a bunker to a mobile robot that could be transported. The transfer happened, but it was a copy, not a move. Once the intelligence spent a nanosecond apart from its copy, it deemed itself completely individual and distinct from it. There were two separate entities.

Anyway, I agree that the man-machine is much more likely to be the first step, before the thinking machine.

- I'd like to understand more about the function of our biological brains first. If we can get to the root of sentience in our own bodies, it shouldn't then be too hard to duplicate that process artificially.
 
Re: Re: Re: Here's an interesting ethical question.

Silicon said:

Personally, I don't think emotions are the signpost of equality or whatever.

I think creativity would be the key. After all, it's relatively easy to get a machine to have a temper-tantrum. But very hard to get one that would write a symphony that would make someone cry.

But doesn't the development or evolution of particular talents lend itself to inequality?

I mean, a machine with specialized abilities by definition is not on the same level with the average person.

Again, as I said in my first post in this thread, perhaps a better definition of "equal" is required for this discussion.

And to veer slightly off topic: Just out of curiosity, do you think the machine-generated symphony would have the same value as one composed by a human? Would we call it a masterpiece?
 
Re: Re: Re: Re: Re: Here's an interesting ethical question.

Diogenes said:


We should, if it is.

What if you heard it without knowing its source?

I suppose if I heard a wonderful piece of music without knowing the source, I might suggest that it was a masterpiece. When it comes to subjective things like music and art, an individual either likes it or they don't, regardless of its origin.

But I think for me, legitimate assignments of quality do depend highly on the source.

For example, Deep Blue beat Garry Kasparov at chess, but I would not call Deep Blue a Grandmaster. I mean, it beats people, but in the long run, it may prove to be a below average player for a machine. That's not to say it isn't masterful in how it wins matches. It may very well be --- by human standards.

For me, the question of equality really comes into focus here. If I compare Diogenes, for example, to say Wagner, the assumption for the comparison inherent in my mind is that all things are equal. You are both basically the same. The same animal. Same physical make-up. Basically the same brain-size and function. And all things being equal, I can say that Wagner is way way way better than Diogenes at composing music.

For Wagner and a machine, I can't apply that initial assumption for the comparison. All things are not equal, and I can't really say the music by the machine is way way way better (or even as good as) Wagner, even though I may enjoy it very much.

I'm at work doing about three things at once. Did that make any sense?
 
Been reading Asimov, I see.

No, unfortunately, I haven't yet had the time. I'd love to though! I've enjoyed reading all your answers so far. I should clarify what I meant by "treat as an equal". I'm trying to think in purely abstract terms here, to simplify the problem. For example, I certainly hadn't considered voting and jobs! I suppose what I really mean is: would you recognize that robot as another sentient being, deserving of respect and consideration? Would you treat it in the same manner that you would treat a stranger? Or is the fact that it is man-made enough to prevent you from ever accepting it as a fellow intelligent being? I hope this clarifies my question.
 
Well, if the machine could really feel, I'd imagine it would try to make itself look as human as possible, so it's likely that people would only discover later that, hey, it's not human!

If we could attain that level, then yes, I do believe the machine should be treated with respect.


Oh, and to change around one of the proposed situations:

What if the machine, piece by piece, replaced its mechanical brain with a biological brain? Since it's been pretty much agreed that a 100% mechanical brain would be a machine, even if it was once human, would a 100% biological brain be human, even if it was once a machine?
 
