
Sentient machines

Why wouldn't I?
You wouldn't because you want to take behaviors which you think allude to conscious perception at face value and say that they unquestionably indicate consciousness. This is, at least, how your statements so far can be interpreted. If you really wouldn't do such a thing and I read you incorrectly, then good for you.
Perhaps it is a false dichotomy in that sense, but only if there is any meaningful difference between "organic substances" and inorganic ones. However, is there? Not in the slightest. Both are made of the same atoms, so there is no reason why there should be any difference between the two. Either there is some kind of "natural kind" to an organic substance, which is of course nonsense, or there is some special 'ghost in the machine' to organic substances. How, exactly, is there a difference between the two?
I don't think you could find a single chemist that would tell you that the properties of different chemicals are the same because all chemicals are composed of protons, neutrons, and electrons. It's pretty presumptuous of you, in the face of all the countervailing experimental evidence, to proclaim all chemicals or chemical compounds the same. Organic molecules have properties which are distinct to them just as much as any other family of molecules will have its own defining attributes.
 
You wouldn't because you want to take behaviors which you think allude to conscious perception at face value and say that they unquestionably indicate consciousness. This is, at least, how your statements so far can be interpreted. If you really wouldn't do such a thing and I read you incorrectly, then good for you.

I apologise if my comments have come off this way, for this is not how I think. I think there is no such thing as 'consciousness', for consciousness is just a word we have applied to a specific phenomenon created by physical processes in the brain. A p-zombie, to my understanding, is a creature that appears conscious and has the same processes in the brain. No, I can find no difference between this and any human, for truly something with the same processes as the brain must be conscious.

I don't think you could find a single chemist that would tell you that the properties of different chemicals are the same because all chemicals are composed of protons, neutrons, and electrons. It's pretty presumptuous of you, in the face of all the countervailing experimental evidence, to proclaim all chemicals or chemical compounds the same. Organic molecules have properties which are distinct to them just as much as any other family of molecules will have its own defining attributes.

Firstly, I did not say properties. I know different chemicals have different properties. However, you now seem to be saying that consciousness is created by the chemicals that make up the brain. This is different. I was arguing that, if consciousness is created by the processes in the brain, then there is no difference between a 'natural' brain and a 'mechanical' brain. If you argue that it is, somehow, the chemicals which form consciousness, I'd be interested as to how you come to this conclusion.
 
I apologise if my comments have come off this way, for this is not how I think. I think there is no such thing as 'consciousness', for consciousness is just a word we have applied to a specific phenomenon created by physical processes in the brain. A p-zombie, to my understanding, is a creature that appears conscious and has the same processes in the brain. No, I can find no difference between this and any human, for truly something with the same processes as the brain must be conscious.
If this is what you're doing, then you're just taking the element of subjective experience out of the definition of consciousness. You're not describing consciousness anymore. You're basically saying that everyone is a p-zombie, that there is no such thing as a private subjective experience, but are then changing the definition of the word "consciousness" in order to demonstrate to yourself that people aren't p-zombies. You can't change the conceptual nature of things by relabeling concepts to give their observable interaction the superficial appearance of fitting your worldview. You seem to think that the debate over consciousness is just a semantic one when it is not.
Firstly, I did not say properties. I know different chemicals have different properties. However, you now seem to be saying that consciousness is created by the chemicals that make up the brain. This is different. I was arguing that, if consciousness is created by the processes in the brain, then there is no difference between a 'natural' brain and a 'mechanical' brain. If you argue that it is, somehow, the chemicals which form consciousness, I'd be interested as to how you come to this conclusion.
I think you might not understand what "organic" means. Organic compounds denote a class of molecules that are carbon-based. It has nothing to do with what is "natural" or what is "mechanical."
 
Consciousness is the awareness of a thing. Awareness can be determined only if a change in the observed thing results in a change in the aware thing.

What is the entity that is aware? If you see something, is your eye aware of it, or does it merely process signals? Are the parts of the brain that monitor and process signals from the eye aware?
 
If this is what you're doing, then you're just taking the element of subjective experience out of the definition of consciousness. You're not describing consciousness anymore. You're basically saying that everyone is a p-zombie, that there is no such thing as a private subjective experience, but are then changing the definition of the word "consciousness" in order to demonstrate to yourself that people aren't p-zombies. You can't change the conceptual nature of things by relabeling concepts to give their observable interaction the superficial appearance of fitting your worldview. You seem to think that the debate over consciousness is just a semantic one when it is not.

Firstly, you are the one with a problem with people being p-zombies, not I. Secondly, you misunderstand my viewpoint. To me, we are just robots. Thus, while there are subjective experiences, there is no 'magical' consciousness, as it is normally thought of. We just accept stimuli and return an output. You seem to argue that there is more to it than that. I've asked for your proof. You have said that we all have consciousness, or the "experience of our experience" (as far as I can tell). I've said this time and time again: materialism gives both an explanation for consciousness and is able to predict it. I've yet to hear a theory that does better.

I think you might not understand what "organic" means. Organic compounds denote a class of molecules that are carbon-based. It has nothing to do with what is "natural" or what is "mechanical."

*sigh* No, Batman Jr., I do not. I know exactly what an organic substance is. This is, naturally, not what I was referring to. You had said (or implied) that there is a difference between a human brain and a computer that simulates perfectly a human brain. The human brain being the organic, and the computer brain being the mechanical. I asked you for proof that there would be any difference between a human brain and a perfect simulation of a human brain. I've yet to read it.
 
I think you might not understand what "organic" means. Organic compounds denote a class of molecules that are carbon-based. It has nothing to do with what is "natural" or what is "mechanical."

I don't really get this argument. Are you suggesting that only organic compounds could be put together to form a conscious brain?
If so, I'm sure that you at least recognise that the structure of those organic compounds is necessary, but perhaps it's more than just the structure; it also has to be composed of those compounds specifically, for some reason, in order to work?
Okay. But what reason do you have for suggesting that? Is there any reason to believe that what the brain does is dependent on what it's made of?
What I mean is that while it might be true that the material of the brain is a good substance for what it does, why should it be the only possible one?
A bridge could be made of wood or steel or aluminum or rope and still function. They would be different in some ways, but for the function of a bridge, given a specific gap, there are multiple materials that might work for making it.
Computers are the same: silicon chips are a good thing to use to make computers, but transistors can still do the job. Or vacuum tubes.

Maybe brains aren't like that. Okay, but I can't really see why.

That said, even if it's true, what would stop us from making a computer from organic compounds?
 
I'd like to add my own viewpoint to this thread.

I think it's possible to create a computer that is conscious. I think that a perfect simulation of the human brain would necessarily be conscious.
I also think that it would be possible to create a conscious computer that was very, very different from a human brain, and even much less complex (probably), if we understood consciousness well enough and had as a goal creating a conscious machine. I can't see why we would have that goal, though.

On the other hand, just because a computer could simulate the actions of a human being well enough to fool anyone doesn't prove that it is conscious, at least not to me.
The reason is that we have no reason to believe that the way the computer is mimicking those actions is similar to the way that we create them in the first place. We might be far less efficient than it, or it might be less efficient than us. On the other hand, I can see that similarity might be likely.
One poster suggested that it's not a given because, after all, we all know that people are capable of creating a false impression of emotion. If we can lie about what we're feeling, why not a computer?
On the other hand, I find it difficult to lie about my emotions, and I think most of us find it at least more difficult than presenting them as they are. I think this is because it actually is more difficult, in principle, to present an emotion without feeling it than it is to feel it.
But I could be wrong about that.

So I think if a robot acted human enough, I'd figure it was conscious, but I wouldn't put as much certainty in that as I do in other human beings. But I also think there are other ways we might determine its consciousness. Some of those we haven't figured out yet.
 
Firstly, you are the one with a problem with people being p-zombies, not I. Secondly, you misunderstand my viewpoint. To me, we are just robots.
I don't have a problem with p-zombies. My agnosticism depends on being open to the possibility of p-zombies existing, even to the possibility that I myself am a p-zombie.
Thus, while there are subjective experiences, there is no 'magical' consciousness, as it is normally thought of. We just accept stimuli and return an output.
The phenomena our brains exhibit of accepting stimuli and returning outputs are not sufficient evidence for subjective experience. Subjective experience can be nonexistent and, at the same time, these scientifically observable processes could remain unaffected. This is exactly my point.
You seem to argue that there is more to it than that. I've asked for your proof. You have said that we all have consciousness, or the "experience of our experience" (as far as I can tell). I've said this time and time again: materialism gives both an explanation for consciousness and is able to predict it. I've yet to hear a theory that does better.
The part which is the "more to it than that" is the subjective experience. You cannot observe subjective experience, and your saying that I said that we all have subjective experience is a straw man. I have maintained throughout the course of this entire argument that we can't know of any such thing. With that in mind, you also can't claim your materialism to have predictive and explanatory value when it is impossible to check your predictions and explanations.
*sigh* No, Batman Jr., I do not. I know exactly what an organic substance is. This is, naturally, not what I was referring to. You had said (or implied) that there is a difference between a human brain and a computer that simulates perfectly a human brain. The human brain being the organic, and the computer brain being the mechanical. I asked you for proof that there would be any difference between a human brain and a perfect simulation of a human brain. I've yet to read it.
Whatever, but we now know I was trying to highlight the chemical differences between the makeup of a computer and a brain. If it is to be assumed that the computer wouldn't use organic compounds, then that would constitute proof that the computer simulation would be different from the actual brain. Since these differences in chemical makeup would go unaccounted for in a computer replica of the neurology of the brain, you could not conclude that the simulation and the brain are the same.

roboramma said:
I don't really get this argument. Are you suggesting that only organic compounds could be put together to form a conscious brain?
No, I'm saying that chemical makeup couldn't be discounted as a factor in creating consciousness.
roboramma said:
If so, I'm sure that you at least recognise that the structure of those organic compounds is necessary, but perhaps it's more than just the structure; it also has to be composed of those compounds specifically, for some reason, in order to work?
We know that behavior is caused by structure. In order to show that structure and subjective experience are related, you would first have to show that subjective experience is a necessary epiphenomenon of behavior, and then construct a syllogism using the premises linking structure to behavior and behavior to subjective experience in order to prove that structure causes subjective experience. This is impossible given that subjective experience cannot be observed.

roboramma said:
Okay. But what reason do you have for suggesting that? Is there any reason to believe that what the brain does is dependent on what it's made of?
Proving that it is dependent on chemical makeup is not my point. I'm explaining why it hasn't been proven to be independent of chemical makeup.

roboramma said:
What I mean is that while it might be true that the material of the brain is a good substance for what it does, why should it be the only possible one?
I don't claim that it is. I only say that it is possible that some substances wouldn't produce subjective experience.
roboramma said:
A bridge could be made of wood or steel or aluminum or rope and still function. They would be different in some ways, but for the function of a bridge, given a specific gap, there are multiple materials that might work for making it.
This is true, but we cannot check the functions necessary for the existence of consciousness, so we can't tell which substances work and which don't.

roboramma said:
Computers are the same: silicon chips are a good thing to use to make computers, but transistors can still do the job. Or vacuum tubes.

Maybe brains aren't like that. Okay, but I can't really see why.

That said, even if it's true, what would stop us from making a computer from organic compounds?
You could make a computer out of organic compounds to equalize things more, but you still wouldn't be able to tell if the human brain that it is being compared to is conscious to begin with.
 
On the other hand, just because a computer could simulate the actions of a human being well enough to fool anyone doesn't prove that it is conscious, at least not to me.
Then the behavior of a human being wouldn't be enough to prove to you that he or she was conscious.

That being the case: what would? What, if anything, leads you to believe that humans are conscious?
 
Then the behavior of a human being wouldn't be enough to prove to you that he or she was conscious.

That being the case: what would? What, if anything, leads you to believe that humans are conscious?
Proof and "leads you to believe" I think are quite different but perhaps I'm wrong. Also there is no absolute proof to anything but that fact should not itself allow for anything less than scientific rigor when we are talking about proof, right? In any event, what leads me to believe that humans are conscious are emotions including empathy and the ability to recognize that another person is capable of emotions and thought. I could assume that emotion is not conscious based but this seems irrational to me. Just thinking off the top of my head here. Please feel free to pick apart my reasoning (assuming there is any there to pick apart).

RandFan
 
Then the behavior of a human being wouldn't be enough to prove to you that he or she was conscious.

That being the case: what would? What, if anything, leads you to believe that humans are conscious?
First off, as I said, I would take the computer simulating human behavior well enough as evidence that it was conscious, and I would probably treat it as such.
But I take it as less strong evidence than a human displaying those same behaviors. Why? Because I can be reasonably certain that not only the behaviors themselves, but also the mechanisms behind them are the same in other humans as they are in me.
They were developed by the same process (natural selection), and are probably the same throughout our species.

There is less reason to believe that the underlying mechanism in a robot making a simulation of human consciousness would be the same.

If two different people were to independently write a computer program that was to have specific outputs, they might do so in very different ways but still deliver the same outputs. It doesn't mean the programs are the same. And anything else that arose from those programs (maybe bugs) wouldn't necessarily be the same from one to the other.

I don't think that one could create a human-like intelligence without either designing consciousness into it somehow, or having consciousness arise from it. But I don't think that's a foregone conclusion either. It seems likely, but not given. And slightly less likely than that two human minds, which have not only the same outputs but the same programmer, are both conscious. I doubt that consciousness just appeared in me after evolution caused the rest of the human race to act conscious. But a computer that was only programmed to mimic it doesn't have that same argument.
 
Then the behavior of a human being wouldn't be enough to prove to you that he or she was conscious.

That being the case: what would? What, if anything, leads you to believe that humans are conscious?
Just to make my last post more concise: behavior plus similar origin leads me to believe other humans are conscious. Only behavior would lead me to believe a robot was conscious. A strong argument still, I think, but not the same argument.
 
Batman: Let us assume that we know for certain that we are conscious. Or more specifically, we each know that we ourselves are conscious. Now, we have no way of knowing if anyone else is conscious, but we assume that they are based on their actions. If, however, we make this assumption, then why do we not also make the assumption that a computer would be too, based on its actions? If you say "because a computer is made of different stuff", I would ask how you know that this makes any difference. I do not need to show you that it doesn't make any difference; you need to show me that it does. Sure, they are made of different chemicals, but how would this affect the processes in the brain? If you accept human consciousness but deny computer consciousness, I say you are either attributing a ghost in the machine to human consciousness, or you have some kind of definition of a 'natural' kind of brain and an 'unnatural' kind of brain. I think you need to make your position clearer to continue this discussion, so I will ask you a few questions I would like answered.

1) Why do you believe other humans are conscious?
2) Why can a computer not be conscious based on these same criteria?
3) How does the chemical makeup of the brain affect its processes (keeping in mind that, as Robo said, computers can be made of different materials and still have the same function)?
4) What is your basis for believing that there is any difference between acting conscious and being conscious?
5) How do you define "subjective experience"?
 
Batman: Let us assume that we know for certain that we are conscious. Or more specifically, we each know that we ourselves are conscious. Now, we have no way of knowing if anyone else is conscious, but we assume that they are based on their actions. If, however, we make this assumption, then why do we not also make the assumption that a computer would be too, based on its actions? If you say "because a computer is made of different stuff", I would ask how you know that this makes any difference. I do not need to show you that it doesn't make any difference; you need to show me that it does. Sure, they are made of different chemicals, but how would this affect the processes in the brain? If you accept human consciousness but deny computer consciousness, I say you are either attributing a ghost in the machine to human consciousness, or you have some kind of definition of a 'natural' kind of brain and an 'unnatural' kind of brain. I think you need to make your position clearer to continue this discussion, so I will ask you a few questions I would like answered.

1) Why do you believe other humans are conscious?
2) Why can a computer not be conscious based on these same criteria?
3) How does the chemical makeup of the brain affect its processes (keeping in mind that, as Robo said, computers can be made of different materials and still have the same function)?
4) What is your basis for believing that there is any difference between acting conscious and being conscious?
5) How do you define "subjective experience"?
If we believe one to be conscious based on their actions and the computer exhibits those same conscious actions, then yes, we would have to say the computer is conscious. I'm not talking about this problem in terms of our day-to-day pragmatism but from an epistemological point of view instead. In other words, what I'm really doing is questioning my own perceptions of what is conscious and what isn't.
 
If this is the case, I see no reason to deny a computer consciousness.
If you use behavior as the defining criterion, then you can't deny a computer consciousness; this is correct. On the other hand, if you use private subjective experience as the defining criterion, then it is possible to deny the consciousnesses of computers and humans alike.
 
If you use behavior as the defining criterion, then you can't deny a computer consciousness; this is correct. On the other hand, if you use private subjective experience as the defining criterion, then it is possible to deny the consciousnesses of computers and humans alike.

I agree. But isn't it a bit of a useless criterion? I have no particular need to be a solipsist, and I'm sure you don't either.
 
I agree. But isn't it a bit of a useless criterion? I have no particular need to be a solipsist, and I'm sure you don't either.
It's not useless because it helps us to understand why science can never understand what consciousness is or determine whether it exists or is just an imaginary phantom of behavior.
 
It's not useless because it helps us to understand why science can never understand what consciousness is or determine whether it exists or is just an imaginary phantom of behavior.

It is only like this if we are solipsists. There are other problems with being a solipsist beyond just what I said before.
 
