
On Consciousness

Is consciousness physical or metaphysical?


Sorry, but no.

Consciousness is, before anything else, the ability to examine your own mental processes: to be able to respond, in some form, to "A penny for your thoughts." Car ECUs can do this. Many computer programs, including some that are sure to be on your laptop, can do this.
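To make the ECU/laptop claim concrete, here is a minimal sketch in Python (purely illustrative, with made-up names; not anyone's actual firmware) of a program answering a query about its own internal state rather than about the outside world:

Code:
# A minimal sketch of a program reporting on its own internal state.
# All names are hypothetical; this is not any real ECU's diagnostic code.

class Monitor:
    def __init__(self):
        self.readings = []   # internal state: sensor values seen so far
        self.faults = []     # internal state: faults this program has flagged

    def record(self, value):
        self.readings.append(value)
        if value > 100:      # threshold picked arbitrarily for the example
            self.faults.append("overrange at sample %d" % len(self.readings))

    def penny_for_your_thoughts(self):
        # The answer is about the program's own state, not the outside world.
        return {"samples_seen": len(self.readings), "faults": list(self.faults)}

m = Monitor()
for v in (42, 87, 113):
    m.record(v)
print(m.penny_for_your_thoughts())
# {'samples_seen': 3, 'faults': ['overrange at sample 3']}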


What assumptions?


Where has it been definitively established what consciousness is? If there is anything to be gleaned from the endless consciousness threads at JREF, it is that what consciousness is remains either unknown or very ambiguously defined (it seems to become whatever suits whichever research group is examining it).

…just a point…you don’t ‘examine your own mental processes’. You are your own mental processes. If you ‘examine’ them, then you are the examination of them (unless there are two of you in there).

…but, QUITE OBVIOUSLY, car ECUs (and laptops) are not the equivalent of a human being. The how, exactly and specifically, and the why and where, are not questions that need comprehensive, definitive answers in order to reach this conclusion (which, I would venture, is Mr. Corey’s point).

Put another way…round pegs do not fit in square holes. A theory that says they do is wrong.

So perhaps you should come up with another word for whatever it is you are referring to when you talk about the ability of a car ECU to ‘examine its own mental process’ (what was that quote about using anthropomorphic terms when talking about computers?). It’s got nothing to do with ‘people wanting to feel special’ (whether we are or not). It’s got everything to do with the simple fact that the consciousness that people experience is light years from that ‘experienced’ (if that word even applies) by a car ECU! Sensation, remember…that’s the definitive quality of human consciousness…not SRIP.

Why don’t we have a little competition? A word that can be used when talking about the ‘consciousness’ of a car ECU. How about cecuc (car ECU consciousness)? A car is cecuc. A human being is conscious. They are NOT the same thing, therefore we use different words to describe them.
 
…just a point…you don’t ‘examine your own mental processes’. You are your own mental processes. If you ‘examine’ them, then you are the examination of them (unless there are two of you in there).
Yep. Self-referential information processing.

…but, QUITE OBVIOUSLY, car ECUs (and laptops) are not the equivalent of a human being.
So?

So perhaps you should come up with another word for whatever it is you are referring to when you talk about the ability of a car ECU to ‘examine its own mental process’ (what was that quote about using anthropomorphic terms when talking about computers?).
Why?

It’s got everything to do with the simple fact that the consciousness that people experience is light years from that ‘experienced’ (if that word even applies) by a car ECU!
Would you argue that an amoeba is not alive?

Sensation, remember…that’s the definitive quality of human consciousness…not SRIP.
Quite on the contrary.
 
A quick bit of Googling turned up some interesting stuff:

http://infopractical.livejournal.com/77298.html
http://mnemotechnics.org/x/forums/daniel-tammet-840.html

This looks like the book mentioned in that second post, but I haven't read it:

http://www.amazon.com/Moonwalking-Einstein-Science-Remembering-Everything/dp/159420229X

And a reminder that we need to be skeptical is always on topic.

Edit: This post: http://incorrectpleasures.blogspot.com.au/2011/10/about-daniel-tammet-excerpt-from-my.html covers the subject in detail and with lots of further links and references.

I'm convinced. Tammet is a fraud.
 
…but, QUITE OBVIOUSLY, car ECUs (and laptops) are not the equivalent of a human being.

Obviously. But look at what they are said to be conscious of and compare that to what a human is conscious of. That's a very clear difference, and seems to be exactly the difference that you are talking about.
 
If you weren't conscious of anything, how would you know?

I wondered the same thing. I figured you'd be conscious only of the fact that you're not conscious of anything, apart from that fact itself. In other words, your brain is consciously repeating "I'm not conscious of anything" until it gets too bored.
 
How tightly linked is unsphexishness with consciousness?

If an entity is sphexish is it necessarily unconscious?

If an entity is unsphexish, is it necessarily conscious?

I wrote a successful anti-sphexishness routine for enemy intelligence in one of my games. Does that mean it had an inkling of consciousness?

The original SA article on sphexishness ended with a joke about how people sometimes behave sphexishly. I guess you could argue that was a blind spot in their consciousness.
 
It's got self-reference. But it doesn't do anything.

As I've said any number of times, self-reference alone isn't consciousness. Self-referential information processing is consciousness.
I can't see any reason why that short program couldn't be compiled into a set of instructions that were equivalent to something like this:

Code:
1. obtain a supply of at least 3 boxes, each large enough to hold 5 marbles.
2. obtain a supply of at least 9 marbles.
3. empty all boxes.
4. label the boxes: box0, box1, box2 ...
5. put 2 marbles in box2.
6. if box2 has 2 marbles then put 1 marble in box1.
7. empty box0.
8. rest.
Executing those instructions is doing something (you might even call it "information processing") but you cannot unambiguously come to the conclusion that it is self-referential without interpreting it in a particular way. Who or what chooses the particular interpretation of any process such as this?

Perhaps the programmer was just using a very convoluted approach to leave each box containing the same number of marbles as its "index"?

Hence my earlier comment noting that the variable names in the source code ultimately mean nothing.

Where does the "meaning" come from for any particular "computation" on some arbitrary Turing Machine? The atoms? Or the programmer?
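To illustrate the naming point with an actual snippet (my own sketch in Python, not the short program under discussion): the marble procedure can be rendered with labels that suggest self-reference or with labels that suggest nothing at all, and the machine behaves identically either way.

Code:
# Two renderings of the marble/box procedure above. Purely illustrative;
# these names are hypothetical, not taken from the snippet being discussed.

# Rendering 1: labels that suggest self-reference to a human reader.
self_state = 2             # "put 2 marbles in box2"
if self_state == 2:        # "if box2 has 2 marbles..."
    awareness_flag = 1     # "...then put 1 marble in box1"
scratch = 0                # "empty box0"

# Rendering 2: labels that suggest nothing at all.
b2 = 2
if b2 == 2:
    b1 = 1
b0 = 0

# The machine does exactly the same thing in both cases; any "self-reference"
# is supplied by whoever chooses how to interpret the labels.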
 
Is this what you think being "self-aware" is?

Well, all this depends on how we define these words, right? "Aware" is receiving input from something. "Aware of X" means getting input from X. "Self-Aware", which I equate with consciousness, but not necessarily to human-level consciousness, is getting input from yourself/your own state.

The piece of code you showed me doesn't seem to get any form of data from the "self", nor use it in any way that distinguishes it from data from elsewhere. But if it can distinguish itself from other stuff, I guess you could say that it barely fits the broadest possible definition of the word.

Mind you, that's just defining consciousness as a concept, not as its most complex (human) iteration.
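If it helps to pin down that distinction, here is a toy contrast (my own sketch in Python, with hypothetical names): one routine whose input comes only from outside, and one whose input includes its own state and which uses that state to decide its own next behaviour.

Code:
# Toy contrast between "aware of X" and the broad sense of "self-aware" above.
# Purely illustrative; names and thresholds are made up.

state = {"mode": "idle", "errors": 0}   # the routine's own state

def react_to_world(sensor_value):
    # "Aware of X": input comes only from elsewhere.
    return "high" if sensor_value > 10 else "low"

def react_to_self():
    # "Self-aware" in the broad sense: input is the routine's own state,
    # and it is used to decide how the routine itself behaves next.
    if state["errors"] > 0:
        state["mode"] = "recovering"
    return state["mode"]

print(react_to_world(12))    # high
state["errors"] = 1
print(react_to_self())       # recovering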
 
When you meditate, can't you clear your mind so you're not conscious of anything?

I have cleared my mind while meditating. You remain conscious of the sensations received from your sensory apparatus, in a heightened way. It's just the thinking, or stream of thought, which is absent.

As I have pointed out, consciousness is a bodily experience, not something in the mind.
 
Well all this depends on how we define these words, right ? "Aware" is receiving input from something.
But are you also saying the converse is true, i.e. "to receive input from X is to be aware of X"?
Because I think that would be hard to support. The rocks on the beach receive input from the sun and reradiate it. Would you say the rocks are aware of the sun, or the process of heating a rock is aware of the sun? Or what?
"Aware of X" means getting input from X. "Self-Aware", which I equate with consciousness, but not necessarily to human-level consciousness, is getting input from yourself/your own state.
 
I can't see any reason why that short program couldn't be compiled into a set of instructions that were equivalent to something like this:

Code:
1. obtain a supply of at least 3 boxes, each large enough to hold 5 marbles.
2. obtain a supply of at least 9 marbles.
3. empty all boxes.
4. label the boxes: box0, box1, box2 ...
5. put 2 marbles in box2.
6. if box2 has 2 marbles then put 1 marble in box1.
7. empty box0.
8. rest.
Executing those instructions is doing something (you might even call it "information processing") but you cannot unambiguously come to the conclusion that it is self-referential without interpreting it in a particular way. Who or what chooses the particular interpretation of any process such as this?
The Universe. Self-referential information processing is the most computationally efficient method for implementing complex adaptive behaviours. The behaviour of such a system is quantitatively different from a system without self-reference. See the poor sphex wasp and its starving babies for details.
 
How tightly linked is unsphexishness with consciousness?

If an entity is sphexish is it necessarily unconscious?
Sphexishness is a quality of behaviours, not necessarily of organisms. Whack the knee with that little hammer and your leg will jerk every time.

We'd label an organism sphexish if it exhibited no behaviours that broke out of the mold. I'm not sure that sphex wasps are actually sphexish in general. There are certainly examples of other insects that are far more adaptable.

If an entity is unsphexish, is it necessarily conscious?
Pretty much. A larger brain can perform more complex behaviours while still being unconscious. You can't get human complexity, or anything remotely near it, though. (Searle's Chinese Room argument fails to note that for the Room to function as described it would need to be larger than the observable Universe.)

I wrote a successful anti-sphexishness routine for enemy intelligence in one of my games. Does that mean it had a inkling of consciousness?
Hard to say without details. It is the easiest way to code such a thing, but you could just write a really huge if/then statement instead. Depending on the use case, it might work well enough. If you're trying to pack bee-level smarts into a tiny bee brain, though, you don't have such luxuries.
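A sketch of the contrast I mean (hypothetical Python, not the routine from your game): a fixed stimulus/response table versus a routine that inspects its own recent choices and breaks out when it notices it keeps repeating a failing action.

Code:
# Sphexish vs. anti-sphexish enemy behaviour, sketched purely for illustration.
# All names, situations and thresholds are made up.

import random

def sphexish_choice(situation):
    # Fixed stimulus -> response table: the same situation always produces
    # the same action, however badly that keeps going.
    table = {"player_visible": "charge", "player_hidden": "patrol"}
    return table[situation]

class AntiSphexish:
    def __init__(self):
        self.recent = []    # the routine's record of its own behaviour

    def choice(self, situation, last_action_worked):
        action = sphexish_choice(situation)
        self.recent.append((action, last_action_worked))
        last_three = self.recent[-3:]
        if len(last_three) == 3 and all(a == action and not ok for a, ok in last_three):
            # Self-referential step: noticing that its own repeated action keeps
            # failing, the routine abandons the table and tries something else.
            action = random.choice(["flank", "retreat", "call_allies"])
        return action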

The original SA article on sphexishness ended with a joke about how people sometimes behave sphexishly. I guess you could argue that was a blind spot in their consciousness.
Sure. After all, most of what we do skates past our conscious awareness.
 
"This", from Annoid's oft repeated quote, refers to: "a careful and precise definition of consciousness". They're right. We don't have a careful and precise definition. We have a bunch of definitions that are pretty vague and general.

Yup, those neurologists and doctors, they don't have a clue.

Consciousness is being aware, awake and responsive. But then that is not magical or transcendent enough, or loose enough, for many philosophers on the forum.
 