When you meditate, can't you clear your mind so you're not conscious of anything?
I never have. I've experienced different, perhaps slower, states of awareness, but not "not conscious of anything".
Sorry, but no.
Consciousness is, before anything else, the ability to examine your own mental processes. To be able to respond to "A penny for your thoughts." in some form. Car ECUs can do this. Many computer programs, including some that are sure to be on your laptop, can do this.
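By way of a toy illustration (my own sketch, with made-up names, not anything taken from an actual ECU), here's a program that keeps a record of its own activity and can report on it when asked:
Code:
# A crude software analogue of answering "A penny for your thoughts.":
# the program examines its own record of what it has been doing.
import time

class SelfMonitor:
    def __init__(self):
        self.started = time.time()
        self.log = []                 # the program's record of its own doings

    def do_work(self, task):
        self.log.append(task)         # processing, noted by the processor itself

    def penny_for_your_thoughts(self):
        # Examine the record of its own activity and put it into words.
        uptime = time.time() - self.started
        return f"Up {uptime:.1f}s; recently did: {self.log[-3:]}"

m = SelfMonitor()
for task in ["read oxygen sensor", "adjust fuel mix", "store fault code"]:
    m.do_work(task)
print(m.penny_for_your_thoughts())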
What assumptions?
http://www.amazon.com/Gödel-Escher-Bach-Eternal-Golden/dp/0465026567
Where has it been definitively established what consciousness is?
If you weren't conscious of anything, how would you know?
I have to think about it. I've never been good at meditation.
Yep. Self-referential information processing.
…just a point…you don’t ‘examine your own mental processes’. You are your own mental processes. If you ‘examine’ them, then you are the examination of them (unless there are two of you in there).
So?
…but, QUITE OBVIOUSLY, car ECUs (and laptops) are not the equivalent of a human being.
Why?
So perhaps you should come up with another word for whatever it is you are referring to when you talk about the ability of a car ECU to ‘examine its own mental processes’ (what was that quote about using anthropomorphic terms when talking about computers?).
Would you argue that an amoeba is not alive?
It’s got everything to do with the simple fact that the consciousness that people experience is light years from that ‘experienced’ (if that word even applies) by a car ECU!
Quite on the contrary.
Sensation, remember…that’s the definitive quality of human consciousness…not SRIP.
A quick bit of Googling turned up some interesting stuff:
http://infopractical.livejournal.com/77298.html
http://mnemotechnics.org/x/forums/daniel-tammet-840.html
This looks like the book mentioned in that second post, but I haven't read it:
http://www.amazon.com/Moonwalking-Einstein-Science-Remembering-Everything/dp/159420229X
And a reminder that we need to be skeptical is always on topic.
Edit: This post: http://incorrectpleasures.blogspot.com.au/2011/10/about-daniel-tammet-excerpt-from-my.html covers the subject in detail and with lots of further links and references.
The book predicted that computers probably won't ever beat humans in chess, though Deep Blue beat Garry Kasparov in 1997.
I can't see any reason why that short program couldn't be compiled into a set of instructions that were equivalent to something like this:
1. obtain a supply of at least 3 boxes, each large enough to hold 5 marbles.
2. obtain a supply of at least 9 marbles.
3. empty all boxes.
4. label the boxes: box0, box1, box2 ...
5. put 2 marbles in box2.
6. if box2 has 2 marbles then put 1 marble in box1.
7. empty box0.
8. rest.
It's got self-reference. But it doesn't do anything.
As I've said any number of times, self-reference alone isn't consciousness. Self-referential information processing is consciousness.
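For what it's worth, the box-and-marble recipe compiles straightforwardly into Python; here's a minimal sketch (my own translation, purely illustrative):
Code:
# Steps 1-4: obtain and label the boxes; all start empty (0 marbles).
boxes = {"box0": 0, "box1": 0, "box2": 0}

boxes["box2"] = 2             # step 5: put 2 marbles in box2
if boxes["box2"] == 2:        # step 6: test box2's contents...
    boxes["box1"] += 1        #         ...and put 1 marble in box1
boxes["box0"] = 0             # step 7: empty box0
                              # step 8: rest
print(boxes)                  # {'box0': 0, 'box1': 1, 'box2': 2}
It runs and halts, but whether step 6's test counts as the program referring to itself is an interpretation question, which is exactly where the thread goes next.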
Is this what you think being "self-aware" is?
But are you also saying the inverse (obverse?) is true, i.e. "To receive input from X is 'to be aware of X'"?
Well, all this depends on how we define these words, right? "Aware" is receiving input from something.
"Aware of X" means getting input from X. "Self-Aware", which I equate with consciousness, but not necessarily to human-level consciousness, is getting input from yourself/your own state.
I can't see any reason why that short program couldn't be compiled into a set of instructions that were equivalent to something like this:
Code:
1. obtain a supply of at least 3 boxes, each large enough to hold 5 marbles.
2. obtain a supply of at least 9 marbles.
3. empty all boxes.
4. label the boxes: box0, box1, box2 ...
5. put 2 marbles in box2.
6. if box2 has 2 marbles then put 1 marble in box1.
7. empty box0.
8. rest.
Executing those instructions is doing something (you might even call it "information processing") but you cannot unambiguously come to the conclusion that it is self-referential without interpreting it in a particular way. Who or what chooses the particular interpretation of any process such as this?
The Universe. Self-referential information processing is the most computationally efficient method for implementing complex adaptive behaviours. The behaviour of such a system is quantitatively different from a system without self-reference. See the poor sphex wasp and its starving babies for details.
Sphexishness is a quality of behaviours, not necessarily of organisms. Whack the knee with that little hammer and your leg will jerk every time.
How tightly linked is unsphexishness with consciousness?
If an entity is sphexish, is it necessarily unconscious?
Pretty much. A larger brain can perform more complex behaviours while still being unconscious. You can't get human complexity, or anything remotely near it, though. (Searle's Chinese Room argument fails to note that for the Room to function as described it would need to be larger than the observable Universe.)
If an entity is unsphexish, is it necessarily conscious?
Hard to say without details. It is the easiest way to code such a thing, but you could just write a really huge if/then statement. Depending on the use case, it might work well enough. If you're trying to pack bee-level smarts into a tiny bee brain, though, you don't have such luxuries.
I wrote a successful anti-sphexishness routine for enemy intelligence in one of my games. Does that mean it had an inkling of consciousness?
Sure. After all, most of what we do skates past our conscious awareness.
The original SA article on sphexishness ended with a joke about how people sometimes behave sphexishly. I guess you could argue that was a blind spot in their consciousness.
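For anyone curious what an anti-sphexishness check might look like in game code, here's a minimal sketch (entirely my own, with hypothetical names, not the routine described above): the agent watches its own recent actions and abandons a behaviour it keeps repeating without progress:
Code:
# A sphexish agent repeats a fixed routine forever; this one monitors its own
# recent history and breaks out of unproductive loops. Hypothetical sketch.
from collections import deque

class Enemy:
    def __init__(self):
        self.recent = deque(maxlen=5)   # the agent's record of its own actions

    def act(self, preferred_action):
        # Self-check: have the last few attempts all been the same action?
        # If so, stop "dragging the cricket back to the burrow" and try
        # something different.
        if len(self.recent) == self.recent.maxlen and len(set(self.recent)) == 1:
            action = "try_something_else"
            self.recent.clear()
        else:
            action = preferred_action
        self.recent.append(action)
        return action

e = Enemy()
for _ in range(12):
    print(e.act("charge_player"))   # after 5 futile charges, it switches
Whether that self-check amounts to an inkling of consciousness is, of course, exactly the question being argued above.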
You mean you couldn't even be bothered to go back a few posts and see what you said?
"This", from Annoid's oft repeated quote, refers to: "a careful and precise definition of consciousness". They're right. We don't have a careful and precise definition. We have a bunch of definitions that are pretty vague and general.