
On Consciousness

Is consciousness physical or metaphysical?


  • Total voters: 94 (poll closed)
Status
Not open for further replies.
What the hell?

I didn't say chemicals felt anything. I said that the feeling of fear is a hormonal reaction.

Actually you said:
How would a machine "feel" without the associated chemistry? I mean, fear is hormones, right?

But I apologise for taking your comment too literally.

There is an interesting article on people who are unable to feel fear here:
http://www.newscientist.com/article...n-the-fearless-reveals-new-ways-to-panic.html

It seems, in the case of fear at least, that the emotion is largely the result of the function of a particular area of the brain.
 
I had the same thought -- what is it about hormones that makes feelings?

A hormone in the brain is analogous to a global variable in a computer program. A useful way to think about emotion hormones is as neurotransmitters not local to synapses.

There's nothing special about specific hormones that makes them integral or specific to feelings. That is, there's nothing about the molecule for a pleasure hormone that's pleasurable. The molecule is arbitrary. It just needs to match the receptors that translate it into action potentials in nerve cells.

To emulate, say, a pleasure hormone in a computer, we could declare a variable, name it "iDopamine", increment it when images of pretty things are detected, decrement it over time, and have it read by routines that detect it, alter behavior accordingly, and remember it. Make sense?
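That mechanism can be sketched in a few lines. Everything here beyond the name "iDopamine" (which is from the post above) is a hypothetical illustration of the idea, not anyone's actual implementation:

```python
# Sketch of the "hormone as global variable" idea: a single scalar signal,
# raised by stimuli, decaying over time, read by behavior routines.

class HormoneSystem:
    def __init__(self, decay_rate=0.1):
        self.iDopamine = 0.0          # global "pleasure hormone" level
        self.decay_rate = decay_rate  # fraction lost per time step

    def perceive(self, image_is_pretty):
        """Increment the hormone when a pleasing stimulus is detected."""
        if image_is_pretty:
            self.iDopamine += 1.0

    def tick(self):
        """The hormone decays over time, like reuptake or metabolism."""
        self.iDopamine *= (1.0 - self.decay_rate)

    def choose_behavior(self):
        """Behavior routines read the global level and act on it."""
        return "seek more" if self.iDopamine > 0.5 else "explore"


h = HormoneSystem()
h.perceive(image_is_pretty=True)  # pretty thing seen: level rises to 1.0
print(h.choose_behavior())        # prints "seek more"
```

The point of the sketch is the "global variable" part: nothing reads the variable's name or structure, only its level, just as nothing about the dopamine molecule itself is pleasurable.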

I'm not saying that it's impossible to program. I'm saying that unless we put it in there (for no reason, by the way; I think giving computers emotions is a terrible idea, and their lack of emotion is their strength), computers will be intelligent and conscious but cold and emotionless.
 
... computers will be intelligent and conscious but cold and emotionless.

I suspect they'll be warm (and possibly emotionless). The energy requirements of that level of computation will see to that.
 
I'm not saying that it's impossible to program. I'm saying that unless we put it in there (for no reason, by the way; I think giving computers emotions is a terrible idea, and their lack of emotion is their strength), computers will be intelligent and conscious but cold and emotionless.

You say that like that's a bad thing.
 
I'm not saying that it's impossible to program. I'm saying that unless we put it in there (for no reason, by the way; I think giving computers emotions is a terrible idea, and their lack of emotion is their strength), computers will be intelligent and conscious but cold and emotionless.
Neural networks are already being trained through a system of "rewarding" right decisions and "punishing" wrong ones. Would the variables used to register this not be akin to hormones in humans?

In other words, pleasure and pain are already programmed into computers, and it makes sense. Other emotions such as rage or sadness do not make sense, and are in any case also more complex.
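The reward/punishment scheme can be sketched as a minimal illustration -- here a perceptron-style update driven by a scalar reward. This is my own toy example, not a description of any particular system:

```python
# A single linear unit whose weights are nudged whenever its action earns
# "punishment" (a negative reward). The scalar `reward` plays the role the
# post assigns to pleasure/pain variables.

weights = [0.0, 0.0]
bias = 0.0
lr = 0.1  # learning rate

def act(x):
    """The unit's decision: fire (1) or not (0)."""
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

# Target behavior: logical OR. Reward +1 for a right action, -1 for a wrong one.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
for epoch in range(10):
    for x, target in data:
        reward = 1 if act(x) == target else -1
        if reward < 0:  # "punish": shift the weights toward the correct action
            step = lr if target == 1 else -lr
            weights[0] += step * x[0]
            weights[1] += step * x[1]
            bias += step

print([act(x) for x, _ in data])  # prints [0, 1, 1, 1] -- the unit learned OR
```

Note that the reward number itself is arbitrary, just like the hormone molecule: all that matters is that some routine reads its sign and adjusts behavior.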
 
Neural networks are already being trained through a system of "rewarding" right decisions and "punishing" wrong ones. Would the variables used to register this not be akin to hormones in humans?

In other words, pleasure and pain are already programmed into computers, and it makes sense. Other emotions such as rage or sadness do not make sense, and are in any case also more complex.

...and we've been doing this for a long time: punishing and rewarding simulated neural networks. The Wikipedia article on neural network software dates it to 1986. This was right around the time that I first experimented with adding emotions to computer programs.
 
I think there may be another option:

"Consciousness is a kind of data processing and the brain is a machine that can be in principle replicated in other substrates, but general purpose computers are just not made of the right stuff."

The idea that we could make a computer conscious may be as fanciful as thinking we can make trees conscious or that we can make lobsters achieve human-level consciousness.

I think that sometimes people elide the idea that there is nothing non-physical about consciousness with the idea that consciousness can be easily replicated out of any old junk. But that might not be true.

I was going to write something in this direction, but angrysoba did a great job. So I'll just quote him and applaud...

Consciousness IS dependent on our physical brains; but that does NOT mean that it can necessarily be duplicated in any other substrate. We simply do not know enough about how the brain functions even for physical systems, let alone the complexity and nature of consciousness, to predict that at this time.
 
I was going to write something in this direction, but angrysoba did a great job. So I'll just quote him and applaud...

Consciousness IS dependent on our physical brains; but that does NOT mean that it can necessarily be duplicated in any other substrate.
Consciousness is informational. That means that it can necessarily be duplicated in any other substrate.
 
You think it's not possible for a machine we built to spontaneously reach a dualistic conclusion about itself? To contemplate the nature of its qualia? To wonder where the ghost is in its machine, how it got there, or if it would survive its unplugging?

I don't see why not. If we build the right environment for it, then yes, it is quite possible. But how can we convince the naysayers that our machine is conscious? (Remember the p-zombie argument?) I do not think we can. Maybe it is because consciousness has not yet become a scientific term: every theory of consciousness so far has failed to explain or satisfactorily integrate subjective experience into the whole scheme of the theory.
 
I was going to write something in this direction, but angrysoba did a great job. So I'll just quote him and applaud...

Consciousness IS dependent on our physical brains; but that does NOT mean that it can necessarily be duplicated in any other substrate. We simply do not know enough about how the brain functions even for physical systems, let alone the complexity and nature of consciousness, to predict that at this time.

Peacefully, I disagree. Yes, modeling one trillion neurons with an average of 5,000 connections each may be difficult. However, in principle it seems likely to be something that can be replicated on machines.

Now that does not mean that it would not require a lot of effort.
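For a rough sense of the scale those figures imply, here is a back-of-envelope calculation. The neuron and connection counts are the post's; the 4-byte-weight-per-connection figure is my own assumption:

```python
# Storage needed just to hold one weight per connection.
neurons = 1_000_000_000_000   # 10^12, the figure from the post
connections_per_neuron = 5_000
bytes_per_connection = 4      # assumed: one 32-bit weight per synapse

total_bytes = neurons * connections_per_neuron * bytes_per_connection
print(total_bytes / 1e15, "petabytes")  # prints "20.0 petabytes"
```

Twenty petabytes of weights is enormous but not unimaginable, which fits the post's claim: difficult, yet not impossible in principle.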
 
I was going to write something in this direction, but angrysoba did a great job. So I'll just quote him and applaud...

Consciousness IS dependent on our physical brains; but that does NOT mean that it can necessarily be duplicated in any other substrate. We simply do not know enough about how the brain functions even for physical systems, let alone the complexity and nature of consciousness, to predict that at this time.

If it's physical it can be duplicated.
 
I don't see why not. If we build the right environment for it, then yes, it is quite possible. But how can we convince the naysayers that our machine is conscious? (Remember the p-zombie argument?) I do not think we can. Maybe it is because consciousness has not yet become a scientific term: every theory of consciousness so far has failed to explain or satisfactorily integrate subjective experience into the whole scheme of the theory.

I'm pessimistic that we could ever have an objective test of the subjective experience. Our machine that acts so conscious that it wonders how its qualia arose will be judged intuitively (subjectively) by the human administering its Turing test, I'm afraid.

What could possibly be an objective test for the presence of subjective experience?
 