On Consciousness

Is consciousness physical or metaphysical?


Poll closed; 94 voters.
And this is in reference to?
Verklagekasper, of course. I thought I'd included enough mimicry to make it obvious. He's just giving your standard argument from ignorance backed up by special pleading, so there's not much point trying to argue with him directly. The bits of information theory he can't misunderstand, he'll define into irrelevance.
 
I figured but I wasn't sure, thanks.
 
Argument from incredulity again.
No. You said you're claiming "that neural computer systems show characteristics that we can recognise in brains" - as if that was something impressive. You seem unaware that neural computer systems were designed to show exactly those characteristics. There's nothing mysterious about neural computers displaying the characteristics that they were designed to display, just as there is nothing mysterious about Super Mario showing the characteristics that he was designed to display. There is nothing mysterious about neural computers, no matter how much you'd like to mystify them.
 
Non sequitur. How is impressiveness or mysteriousness relevant to building a conscious computer? Neural computer systems simply work, and there's no evidence they'd never be conscious.
 
No. You said you're claiming "that neural computer systems show characteristics that we can recognise in brains" - as if that was something impressive.
I did not say it was impressive. You claim that the brain cannot be a computer because you do not believe it - an argument from incredulity. You would impress me more if you could point to something that a computer cannot do, but a brain can. You tried it with subjective feelings but failed, so perhaps you should try to salvage your argument, or find a better example.

There is nothing mysterious about neural computers, no matter how much you'd like to mystify them.
It is the claim of science that there is nothing mysterious about brains either, so what is your point? You are the only one who thinks that 'mysteriousness' is part of any argument here.
 
I think one issue is we keep making such a big deal out of consciousness. How could Descartes possibly have known that animals were not conscious? Sounds like he's just making stuff up.

I think the argument for quantum consciousness has to do with quarks' indeterminacy. We can't measure exactly what state they are in, so they determine their own state, therefore must have free will and be conscious. I think that's how the argument goes.

There are people who claim machines could never be conscious because we can't imagine how they could have feelings so they'd never write good poetry. Then they make the leap that the brain is a quantum computer, somehow deriving its feelings from quarks or something. It's one hell of an argument from ignorance -- I know.

FWIW we have taught some great apes and parrots to have conversations with us, and their thoughts are frankly not very interesting. All they use speech for is to beg, in simple or complex ways. One could argue that's all we do, though often in fiendishly subtle ways.


Yes, I think Descartes is well-known for his unhealthily excessive skepticism. I only brought him up to point out what modern philosophers of science should avoid, in case we actually witness the awakening of a new consciousness.
As for the rest, I totally agree.
 
I did not say it was impressive. You claim that the brain cannot be a computer because you do not believe it - an argument from incredulity. You would impress me more if you could point to something that a computer cannot do, but a brain can.

At the moment, no computer can produce the sort of hologram that we call consciousness.

There currently exists no design, not even a concept, for building a computer that is conscious.

And if the current trends in neuroscience continue, there never will be, because holograms cannot be programmed, they must be produced.

Of course, the kind of hologram that our minds are… well, we're using the term very loosely there.

But regardless, let's be clear that consciousness is the result of electromechanical processes, not "informational" ones (whatever that may mean).

If you want to assert that the brain is some kind of computer, it's up to you to demonstrate that, not up to anyone else to disprove something you haven't demonstrated in the first place.
 
Seriously, hasn't all this been ALREADY demonstrated, Piggy? I mean, it's like people asking us to do the work all over again for relativity or JFK's assassination or UFO pictures being fake just because they don't believe it.

From what I've seen in this thread and previous ones on the subject, it's been done to death.
 
Yes. I see it as moving goalposts. Dualists keep stating that machines will never do such-and-so (e.g. beat chess masters), then when machines do, they say, "oh, that's nothing because they can't do THIS (e.g. wing gliding)."

Again and again, there's a task computers can't do well (e.g. handwriting recognition), but when neural networks are simulated in the computer, suddenly machines can. It's reasonable to extrapolate that a neural network simulated in a computer will do what we do, including consciousness.

What evidence is there that it never could?
 
I reckon there isn't any. If we can somehow manage that the simulated neural networks of a computer will definitely "feel" fear, who could then deny that the dualistic approach consequently fails? Yes, we could decidedly assert that there is no ghost in the machine.

From Wikipedia: A large body of neurophysiological data seems to support epiphenomenalism; see Bereitschaftspotential.

In Consciousness Explained, Daniel Dennett distinguishes between a purely metaphysical sense of epiphenomenalism, in which the epiphenomenon has no causal impact at all, and Huxley's "steam whistle" epiphenomenalism, in which effects exist but are not functionally relevant.

I do not agree with Dennett. He states that a quale or conscious experience would not belong to the category of objects of reference on this account, but rather to the category of ways of doing things (the same charge that Gilbert Ryle leveled against a Cartesian "ghost in the machine").

My own opinion is that what we call consciousness is an epiphenomenon.
 
[..]If we can manage somehow that the simulated neural networks of a computer will definitely "feel" fear[...]
Because fear is a subjective feeling, this will never be proven to everybody's satisfaction. Many people do not even accept that other creatures can feel the same as humans, even though they have the necessary brains.

When captured eels are thrown on salt to remove their layer of slime, they writhe in pain, but many fishermen believe that eels - being non-mammalian animals - cannot feel pain.

It is so easy to deny, and absolutely impossible to prove.
 
It is so easy to deny, and absolutely impossible to prove.

Absolutely. This is why I used a conditional expression.
But for the scientistic approach, it doesn't really matter; what appears to be the case is the case ;)

The scientistic approach is a clear-cut tool, for which we are grateful to Bacon and Popper. Science, gradually and effectively, eliminates idealism and dualism. We're simply gonna have to wait. :)
 
You think it's not possible for a machine we built to spontaneously reach a dualistic conclusion about itself? To contemplate the nature of its qualia? To wonder where the ghost is in its machine, how it got there, or if it would survive its unplugging?
 
What the hell?

I didn't say chemicals felt anything. I said that the feeling of fear is a hormonal reaction. Computers not having those, why would they have emotions at all? Thoughts, sure, but emotions? Where would they get them?

I had the same thought -- what is it about hormones that makes feelings?

A hormone in the brain is analogous to a global variable in a computer program. A useful way to think about emotion hormones is as neurotransmitters not local to synapses.

There's nothing special about specific hormones that make them integral to or specific to feelings. That is, there's nothing about the molecule for a pleasure hormone that's pleasurable. The molecule is arbitrary. It just needs to match the receptors that translate it into the action potentials in nerve cells.

To emulate, say, a pleasure hormone in a computer, we declare a variable, name it "iDopamine", and have it incremented when images of pretty things are detected, decremented over time, and read by routines that register it, alter behavior, and remember it. Make sense?
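The "hormone as global variable" idea above can be sketched in a few lines. This is a toy illustration only: the class, the method names, the decay rate, and the 0.5 threshold are all invented for the example, not any real API.

```python
class Agent:
    """Toy agent with a global 'pleasure hormone' variable, per the post above."""

    def __init__(self):
        self.i_dopamine = 0.0  # the global variable standing in for a hormone level

    def see_pretty_thing(self):
        # detecting a rewarding stimulus increments the hormone variable
        self.i_dopamine += 1.0

    def tick(self):
        # the hormone decays over time
        self.i_dopamine *= 0.5

    def behavior(self):
        # routines elsewhere read the global variable and alter behavior
        return "approach" if self.i_dopamine > 0.5 else "wander"


agent = Agent()
print(agent.behavior())   # wander: no stimulus yet
agent.see_pretty_thing()
print(agent.behavior())   # approach: hormone level is high
agent.tick()
agent.tick()
print(agent.behavior())   # wander: the level has decayed below threshold
```

The point of the analogy survives the simplicity: the variable itself is arbitrary, and only the routines that read it give it behavioral meaning.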
 
How would a machine "feel" without the associated chemistry? I mean, fear is hormones, right? Why would I want a machine to have that?

It is difficult to generalize regarding "feelings" and "emotions" and the role hormones may play in these states. An emotion arises from activity in circuits in particular brain areas. As such it would be just as possible to build emotional reactions into a machine, as to build in the capacities for self referential "thought", planning, memory, sensation, etc.

In mammals, the emotion of fear is accompanied by hormone release from the adrenals resulting in peripheral vaso-constriction, vaso-dilation in particular vascular beds (muscle), increased heart rate, etc. The effect of the hormones on the body is being monitored by the sensory systems, and so the activation of sensory circuits in particular patterns (racing heart, pressure in the head, wet underwear, etc.) is part of the overall neural state of being afraid. Fear is still a neural state however.

The role of oxytocin in bonding and attraction can be described in a similar way. The overall state of being attracted, or in "love" involves the sensory patterns evoked in neural circuits in response to the activity of the hormones on the body, but the emotional state itself is still just a complex of neural activity in multiple interconnected circuits in the brain.

In some cases the hormones may have direct access to certain neural circuits where they may play direct roles in synaptic transmission, but these will be limited to areas where the blood brain barrier is ineffective.

If emotions can be described as complex patterns of neural activity in particular neural systems within the brain, then there is no reason to believe they could not be built into a thinking machine, even without the activity of systemic hormones.
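As a toy illustration of that last claim, here is a hedged sketch of "fear" as a joint pattern of activity across interconnected circuits, with a simulated bodily signal standing in for the sensory feedback described above. The circuit names and update rules are invented for the example; nothing here models real neurophysiology, and no systemic hormones appear.

```python
def update_state(state, threat):
    """One time step of a toy two-circuit fear loop (illustrative only)."""
    # an amygdala-like circuit responds to the threat signal, with decay
    amygdala = min(1.0, state["amygdala"] * 0.5 + threat)
    # simulated bodily feedback (a racing heart) tracks circuit activity,
    # standing in for the sensory monitoring of hormone effects
    heart = 0.5 * state["heart"] + 0.5 * amygdala
    # "fear" here is just the joint pattern, not any single hormone
    fear = amygdala * heart
    return {"amygdala": amygdala, "heart": heart, "fear": fear}


state = {"amygdala": 0.0, "heart": 0.0, "fear": 0.0}
for _ in range(5):
    state = update_state(state, threat=1.0)  # sustained threat drives the loop
# under sustained threat the pattern converges toward full activation,
# while with threat=0.0 it stays (or decays back to) quiescent
```

The design choice worth noticing is that "fear" is computed from the interaction of circuits, exactly the sense in which the post argues an emotion could be built in without the hormones themselves.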

One would want to be cautious about building in too strong a sense of self-preservation, I would think.
 
