Are You Conscious?

Are you conscious?

  • Of course, what a stupid question: 89 votes (61.8%)
  • Maybe: 40 votes (27.8%)
  • No: 15 votes (10.4%)

Total voters: 144
No, it isn't. I've already said that I can have different behaviours for the same pain, and the same behaviour for different pains. I do not experience pain due to watching how other people behave when they are (presumably) suffering pain. I just behave in a similar way.

Pain is a SET of behaviours, then.

Anybody who has ever suffered a toothache in silence knows that pain is not behaviour.

You know, you should really think before stating things like that. I've had toothaches, and pain is clearly a behaviour.
 
I'm talking about my own pain. I can behave entirely differently for the same pain through simple choice.

Women from different cultures behave quite differently in childbirth. Are we to assume that their actual experience of pain is different?

A statement like "pain is behaviour" is so obviously, provably wrong that it could only be made in a philosophical discussion.

The only reason it's so obviously wrong is that you're using "behaviour" to mean "a whole individual's behaviour". It is PART of your behaviour, in other words, behaviour of part of you. Your reaction to the pain is another matter entirely.
 
No, it isn't. I've already said that I can have different behaviours for the same pain, and the same behaviour for different pains. I do not experience pain due to watching how other people behave when they are (presumably) suffering pain. I just behave in a similar way.

Anybody who has ever suffered a toothache in silence knows that pain is not behaviour.

Judging from his use of the term, I think Belz is using the word 'behavior' to mean 'something that happens' rather than 'something someone does'.
 
Judging from his use of the term, I think Belz is using the word 'behavior' to mean 'something that happens' rather than 'something someone does'.

Well, not exactly, but close. You have to remember that, when a muscle contracts, it behaves in some way. "Why do we have contracting muscles?" is the same question as "Why do we have qualia?"; and both mean "why does part X of me behave in this way?"
 
I think most scientists would state this simply as "pain elicits behavior (x, y, z...)"

The pain is a form of qualia, and how we react to it externally (public behavior, e.g., screaming or reflexively moving) may or may not be consciously sensed internally as other forms of awareness qualia at all.
 
I'm not arguing that computation plays no role in consciousness, or that thinking is not a form of computation. The problem with computationalism is that it approaches the subject strictly as a computational abstraction. Consciousness is not simply a matter of performing the right ops; what is essential is the -kind- of system performing the ops. What we should be focusing on is finding out what substrates provide conditions sufficient for producing consciousness.


I'm not sure that I would agree with that. I just listed out all the potential problems that I can see with a digital computer duplicating what neurons do, but I'm not sure that there isn't a way to overcome these issues -- maybe the AI folks can help me out here.

Initially I thought that spatial summation might be a problem, but I don't think it is. Temporal summation is clearly no problem.

The yes/no, 0/1 easily duplicates what occurs at the axon hillock. The multiple inputs (dendritic synapses) can be duplicated with multiple weighted inputs.
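
To put that in concrete terms, here's a toy sketch of the kind of unit I have in mind -- weighted inputs standing in for dendritic synapses, a hard threshold standing in for the all-or-nothing decision at the axon hillock. The weights and threshold are made-up numbers, not a claim about real neuron parameters:

```python
# Toy sketch: weighted inputs stand in for dendritic synapses, and a hard
# threshold stands in for the all-or-nothing decision at the axon hillock.
# All weights and parameter values here are illustrative only.

def fires(inputs, weights, threshold=1.0):
    """Return 1 (spike) if the weighted input sum reaches threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))  # spatial summation
    return 1 if total >= threshold else 0

# Three "synapses": two excitatory, one inhibitory (negative weight).
print(fires([1, 1, 0], [0.6, 0.5, -0.8]))  # 1 -- summed excitation crosses threshold
print(fires([1, 1, 1], [0.6, 0.5, -0.8]))  # 0 -- the inhibitory input pulls it back down
```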

The two big problems, as far as neurons are concerned, are these: (1) I don't see an easy way to duplicate the actions of metabotropic receptors, which can essentially alter the membrane threshold for seconds to minutes (it's more complicated than that), or the other metabotropic receptors that cause longer-term changes (essentially in threshold) by turning on certain genes; and (2) we start life with orders of magnitude more neurons than we have now, most of which die because they do not receive input (they don't learn).
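
Problem (1) is easier to see with a toy model: imagine the threshold itself as a slow state variable, one that a modulatory input can shift and that only relaxes back toward baseline over many time steps. Again, the numbers and the update rule here are made up purely for illustration:

```python
# Toy sketch of point (1): a metabotropic-style input does not drive the
# output directly; it shifts the firing threshold, which then relaxes
# back toward baseline over many time steps. Parameters are illustrative.

class ModulatedNeuron:
    def __init__(self, baseline=1.0, recovery=0.99):
        self.baseline = baseline   # resting threshold
        self.threshold = baseline
        self.recovery = recovery   # fraction of the threshold shift retained each step

    def modulate(self, amount):
        """Metabotropic-style event: lower the threshold for a long time."""
        self.threshold -= amount

    def step(self, drive):
        """One time step: fire if drive reaches the current threshold,
        then let the threshold decay slowly back toward baseline."""
        spike = 1 if drive >= self.threshold else 0
        self.threshold = self.baseline + (self.threshold - self.baseline) * self.recovery
        return spike

n = ModulatedNeuron()
print(n.step(0.8))   # 0 -- subthreshold at the resting threshold of 1.0
n.modulate(0.4)      # modulatory input lowers the threshold to 0.6
print(n.step(0.8))   # 1 -- the same drive now fires
# ...and for hundreds of steps afterwards the threshold is still recovering.
```

Getting the real timescales right, let alone the gene-expression effects, is exactly the part I don't see an easy way to do.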

Other problems seem to me to be conceptual -- we view the nervous system and perception in particular ways that are simply not correct. We think that there are separate sensory and motor systems, but there is simply no such clear-cut division. We don't so much receive visual information at higher levels as hypothesize what is out there and construct our image of the world from the lower-level inputs we have received. Seeing is an active process; we see things that we find reasons to attend to.

The other issue concerns what emotion and feeling may be. If it is the case that they are processed sensory information, with cognitive reasoning, that promotes a behavioral tendency, then it isn't clear to me what type of computation would perform that task. One consequence is that it wouldn't be just computation; it would necessarily have to be causal activity -- a behavioral tendency makes no sense as a computation, but only within a causal system -- so this is only going to be possible in a robot. Doing it on a desktop just isn't going to work.

I think we simply need to think of computation in a different way. Certainly not as pure abstraction, since that will never be able to do what needs to be done for consciousness. The brain works as a sensorimotor integrator. We cannot leave out the motor side of the equation and get an answer to this issue.
 
Judging from his use of the term, I think Belz is using the word 'behavior' to mean 'something that happens' rather than 'something someone does'.

It's because it's clearly a confusing use of the word that I'm disinclined to accept it. Experience - what we feel - is something entirely different to behaviour - what we do. Saying that experience is actually behaviour, but a different sort of private behaviour that only we see, is just a way to combine two things together when the entire discussion is about their relationship.

Accepting that pain is a private behaviour is just another way of accepting that we understand what pain is - when we don't.
 
I'm not arguing that computation plays no role in consciousness, or that thinking is not a form of computation. The problem with computationalism is that it approaches the subject strictly as a computational abstraction. Consciousness is not simply a matter of performing the right ops; what is essential is the -kind- of system performing the ops. What we should be focusing on is finding out what substrates provide conditions sufficient for producing consciousness.
My question is, why do you think it is even possible for the substrate to matter? No matter what the substrate, running the same algorithm will give the same result.

In other words, consciousness is simply a matter of performing the right ops; what is irrelevant is the kind of system performing the ops.
 
My question is, why do you think it is even possible for the substrate to matter? No matter what the substrate, running the same algorithm will give the same result.

In other words, consciousness is simply a matter of performing the right ops; what is irrelevant is the kind of system performing the ops.

Do you honestly mean to argue that the chemical properties of the biological substrate have no bearing at all on the sensations that are produced, or whether or not they can be produced at all? Are you really that dense?
 
Do you honestly mean to argue that the chemical properties of the biological substrate have no bearing at all on the sensations that are produced, or whether or not they can be produced at all?
Yes.

Are you really that dense?
Are you? I'd suggest you take it up with Church and Turing, but they're both dead. David Deutsch is still around though.

It matters that the substrate supports computation of the necessary complexity and speed relative to the environment. But that's all that matters - and all that can matter. It is, for example, impossible for the human mind not to be simulatable.

The pattern of computations matters. The substrate does not, and cannot, matter beyond the computations that are performed. Those computations, performed on any suitable substrate, real or virtual, will produce the same result. Anything else is mathematically impossible.
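
Here's a trivial illustration of the point -- three quite different implementations ("substrates") of one and the same computation, which necessarily agree on every input. The function is an arbitrary stand-in:

```python
# Toy illustration of substrate independence: three different
# implementations ("substrates") of the same computation necessarily
# agree on every input. The function chosen is arbitrary.

def fib_recursive(n):
    return n if n < 2 else fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_matrix(n):
    # The same computation again, via fast 2x2 matrix exponentiation.
    m, result = (1, 1, 1, 0), (1, 0, 0, 1)  # base matrix, identity
    while n:
        if n & 1:
            result = _matmul(result, m)
        m, n = _matmul(m, m), n >> 1
    return result[1]

def _matmul(x, y):
    return (x[0]*y[0] + x[1]*y[2], x[0]*y[1] + x[1]*y[3],
            x[2]*y[0] + x[3]*y[2], x[2]*y[1] + x[3]*y[3])

# Same ops in effect, wildly different realizations, identical answers.
assert all(fib_recursive(n) == fib_iterative(n) == fib_matrix(n)
           for n in range(15))
```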
 
AkuManiMani, I want to double-check on this: Are you suggesting that two different substrates, executing the same series of computations, will produce different results?

If not, then what are you suggesting?
 
I think we simply need to think of computation in a different way.

Subjective experiences are not numerical outputs -- they're physical products of brain activity. If one wants to produce consciousness, one must first understand the physics of how the brain produces it. Hunting for the 'right' switching pattern won't cut it.

Certainly not as pure abstraction, since that will never be able to do what needs to be done for consciousness. The brain works as a sensorimotor integrator. We cannot leave out the motor side of the equation and get an answer to this issue.

Motility, while necessary for survival, is not relevant to consciousness, per se.
 
AkuManiMani, I want to double-check on this: Are you suggesting that two different substrates, executing the same series of computations, will produce different results?

If not, then what are you suggesting?

You're still thinking strictly in terms of abstraction, Pixy.

The answer to your question is that they will produce the same symbolic outputs, but they will not have the same physical results. Do you understand?
 
The pattern of computations matters. The substrate does not, and cannot, matter beyond the computations that are performed. Those computations, performed on any suitable substrate, real or virtual, will produce the same result. Anything else is mathematically impossible.

I'm going to try one last time, PixyMisa.

Physical objects are not numbers, and reality does not reduce to abstracted symbols. If you want to produce a specific physical effect -- such as fire, electrical current, or fission -- you must meet the physical requirements for producing those effects; it's not a matter of switching patterns. In such instances substrate is essential. Consciousness is a physical effect of the brain; it is not a symbolic output. One needs to produce sufficient physical conditions if one's goal is to produce consciousness. Why is this such a difficult concept for you to grasp?
 
You're still thinking strictly in terms of abstraction, Pixy.
I'm thinking in terms of information processing. That's not abstract, that's physical.

The answer to your question is that they will produce the same symbolic outputs, but they will not have the same physical results. Do you understand?
No.

If I have the same set of computations running on any number of substrates of equivalent computational power, I will get the same answer each time. If these sets of computations comprise a human mind, I will get the same mind each time. If I ask each mind a question, I will get the same answer each time (modulo quantum noise and chaotic uncertainty).

Physical objects are not numbers, and reality does not reduce to abstracted symbols. If you want to produce a specific physical effect -- such as fire, electrical current, or fission -- you must meet the physical requirements for producing those effects; it's not a matter of switching patterns.
You can simulate all of these, and you can simulate consciousness in precisely the same way. Of course, we cannot generally map physical properties between the simulations and the world containing the simulation. We can, however, transfer information between the two. Thus we can simulate a conscious entity and communicate with it.

In such instances substrate is essential. Consciousness is a physical effect of the brain; it is not a symbolic output.
Everything is a physical process. However, we identify consciousness by its manipulation of symbols. Thus simulated consciousness has precisely the same effect on the real world as real consciousness.

One needs to produce sufficient physical conditions if one's goal is to produce consciousness. Why is this such a difficult concept for you to grasp?
Because those sufficient physical conditions are spelled out by the Church-Turing thesis, and thus substrate does not, and cannot, matter.
 
That's odd. Most people don't want to die, which is one reason we invented religion. Personally, I'd like to at least live as long as I'd like... which isn't going to happen as far as I know. But since I don't believe in things just because they'd be convenient to me...

Now, that would be the start of an interesting thread. Is the human desire for an afterlife really the main reason for religion? The answer might seem to be an obvious yes, but on reflection, I'm not so sure. There are too many cases where religions either don't preach any afterlife beliefs at all (the Sadducees) or don't preach anything remotely resembling coherent Christian beliefs (Judaism up until at least the Maccabees). And when you get right down to it, the idea of reincarnation as an unsatisfactory stopgap until you can escape the wheel of birth and rebirth isn't my idea of a decent afterlife at all. ;)


Oh, btw... I'm reading Steven Rose's latest book right now. He couldn't disagree more with Dennett about a lot of things, but he doesn't like the concept of qualia either. :rolleyes:
 
