The biggest difference that appears on these forums, with regard to the question of consciousness, seems to me to be between one class of poster who thinks consciousness is either (a) unique to human minds or (b) unique to the minds of higher-order mammals, and another class of poster who thinks consciousness is substrate-independent to some degree. Interestingly, the first class often consists of idealists and dualists of various stripes, believers in psychic phenomena or out-of-body experiences, and the religiously inclined, while the second consists of materialists, physicalists, and those who reject that there is anything 'special' at all about human thought.
The first class of poster seems to argue largely from appeals to emotion, arguments from incredulity, strawman attacks, ad hominem attacks, and similar fallacies; rarely, if ever, do they offer an actual thoughtful, rational argument in these discussions. I'm not sure how to label the arguments of the second class of poster, simply because I don't know the right terms; but essentially, they argue by defining their terms first and then explaining how things fit those definitions - and whatever does not fit is often discarded as irrelevant or non-existent.
Now, I personally find PixyMisa's version of consciousness unfailingly coherent and logical, and while I can see some problems with it (when pointed out by others), frankly, it's a good deal better than anything else posited on this subforum. The most common objection put forth is that this version of consciousness would mean that toasters, cars, and cell phones might (gasp!) be conscious.
OK, so what's the problem, honestly, if toasters, cars, and cell phones are conscious? Really? Why should that set off red flags or have us wagging our heads in disbelief? Is consciousness really such a mystical experience that we cannot conceive of other beings unlike ourselves as possessing it? And where DO we draw the line, then?
If consciousness isn't 'self-referential information processing', as Pixy defines it, what is it? I cannot personally identify anything within my own conscious experience that isn't self-referential information processing - nothing at all. One of you brought up 'pain'... but that's a simple one. Nerves send information to the brain, and the brain processes that information. You then receive, within awareness, a report of damage or potential damage, tagged with a physical location indicating the part of the body affected (or, often, misreporting it). You have been educated to associate that report with the word 'pain' and with a number of potential responses - one of which is likely already being selected by other consciousnesses within your brain and will be reported to you in a moment.
Simple self-referential information processing.
And all nerve stimulation is the same - and along with that comes emotion, feeling, taste, sight, and every other stimulation we have. Pain is no different from sight or temperature or thought. It's all information processing.
So, logically, that means that any system that can gather information, process that information in some way, and is in some way aware of itself is conscious. I see no problem with that - any computer with an anti-virus program is at least somewhat conscious (or, more accurately, has conscious parts). Many animals are conscious to some degree. Rocks? Well, rocks lack any means of gathering information, any processors, and any self-referencing mechanisms, so I don't think we can count rocks.
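Just to make the three criteria concrete - gathering information, processing it, and referencing its own state - here's a toy sketch in Python. Everything in it (the class name, the trivial processing step, the thresholds) is my own invention for illustration; it's a minimal example of the structure being described, not a claim about what real minds or anti-virus programs do:

```python
class SelfMonitor:
    """Toy illustration of 'self-referential information processing':
    gathers input, processes it, and keeps a record about its own activity."""

    def __init__(self):
        self.history = []  # the system's record of its own past processing

    def sense(self, signal):
        """Gather information from outside (like a nerve or sensor reading)."""
        processed = abs(signal)  # trivial stand-in for 'processing'
        # Self-reference: consult the system's own record before responding.
        seen_before = processed in self.history
        self.history.append(processed)
        return {"value": processed, "seen_before": seen_before}

    def introspect(self):
        """Report on the system's own internal state, not the outside world."""
        return {"events_processed": len(self.history)}


monitor = SelfMonitor()
monitor.sense(-3)
report = monitor.sense(3)
print(report["seen_before"])   # True - the second reading matches its own record
print(monitor.introspect())    # {'events_processed': 2}
```

Trivial, obviously - but that's rather the point of the definition: the difference between this and a brain is one of scale and richness of self-reference, not of kind.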
I guess what I'm trying to get at, for AkuManiMani, WestProg, and others, is this: why shouldn't we take seriously the notion that a smart toaster, an advanced recon drone plane, or a self-monitoring computer program are conscious? What is the key element that makes these things non-conscious, and humans conscious? Is it, as suggested by another, a matter of organic compounds versus metals and plastics? Would a toaster be conscious if we made the same functional components, but used carbon-based fleshy bits, maybe with some kind of warm-blooded heating system to cook the toast? Is it a matter of evolution? Would a drone plane be conscious if it evolved through a natural process of selection pressures over millions of years? Or does it become a matter of pure mysticism? Are only self-monitoring computer programs allowed to call themselves conscious if they appear magically, created by a supreme being?
Surely you're not suggesting that emotions are a necessary aspect of consciousness above and beyond self-reference and information processing; emotions are, after all, nothing more than biochemical reactions reported through the nerves to the brain - and hence processed information. Nor can you be claiming that a body (beyond the brain) is necessary for consciousness, unless you think quadruple amputees are necessarily less conscious than other people. What could it be, then? What is the golden key that separates the conscious meat-computer in our skulls from the nonconscious silicon computer at our fingertips?
At least Pixy offers a clear definition, and defends it. The other side offers nothing other than 'what we experience'. And they can't even rationally define the 'we' involved!
EDIT: Does anyone know why the forum keeps logging me out before I can finish typing a response? Thanks.