The reason is that I see consciousness as a pure subject. If several states are allowed, then consciousness becomes an object. As I see it, thoughts are objects, not the subject which is aware of the thoughts. Emotions are objects, and consciousness is the state that is aware of the emotions. All experiences are objects: content experienced by the subject, which is consciousness.
I think you might be using the words "object" and "subject" a little loosely, there. But, I have no time to debate the finer points of semantics and equivocations and stuff, right now.
Mind and consciousness are not the same 'thing'. Mind can be defined as consciousness plus the content experienced in consciousness, such as thoughts, emotions, etc.
From what I have been told: the mind is what the brain does. Consciousness is a property of the mind, and by extension the brain.
In the video with Christof Koch that I posted earlier, he talked about something called Integrated Information Theory:
At first glance, I think the "observations" Integrated Information Theory makes are actually flawed:
"The first is that every observable conscious state contains a massive amount of information. A common example of this is every frame in a movie."
If you care to read a good, modern book on the science of neurology and/or consciousness, such as those written by Susan Blackmore or Richard Wiseman, you will discover experimental evidence that we are NOT, in fact, aware of a massive amount of information at any one time. Only a very, very small percentage of the information thrown at us, such as from a frame of a movie, do we actually become conscious of. The rest is filled in by the narrative-reconstructive aspects of our brains, powered by (heavily biased) pattern-recognition systems, etc.
I think Giulio Tononi is working from the wrong premise, if he wishes to unravel the mysteries of consciousness.
"All of the information you have gleaned from conscious states is highly, and innately, integrated into your mind. It is impossible for you to see the world apart from all of the information that you are conscious of."
I think it is more accurate to say that much of the information you have gleaned from conscious states is made up by various parts of the mind, reconstructed as best they can from the scraps of information you actually got from your senses.
Technically, I guess you could say it is true that the information is "highly, and innately, integrated into your mind", but for reasons very different from what Integrated Information Theory implies.
The statement "it is impossible for you to see the world apart from all of the information that you are conscious of" is demonstrably false, given experiments that have verified the existence of blindsight. There are forms of information you are not conscious of that can still tell you something about what you are sensing.
If one seeks to quantitatively measure consciousness, a more direct approach might be to measure how successfully an agent can act independently within its environment. Or, even better: find a way to determine how aware someone is of their own existence. But both are rather tricky to do right now. Just measuring information processing, though easier to achieve, does not really get at the subject.
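For what "measuring information processing" even means here, a toy example may help. This is NOT Tononi's actual phi calculation (which involves searching over system partitions and is far more involved); it is just my own minimal sketch of the underlying intuition, that a "whole" can carry information its disconnected parts do not. The function name and the two-bit example are my own illustration, not anything from IIT itself.

```python
import math

def mutual_info(joint):
    """Mutual information (in bits) between X and Y, given a joint
    distribution expressed as a dict mapping (x, y) -> probability."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p   # marginal distribution of X
        py[y] = py.get(y, 0.0) + p   # marginal distribution of Y
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Two perfectly correlated bits: the whole system carries 1 bit of
# shared ("integrated") information.
whole = {(0, 0): 0.5, (1, 1): 0.5}

# "Cut" the connection: treat the two bits as independent parts.
# The shared information vanishes.
parts = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

print(mutual_info(whole))  # 1.0 bit
print(mutual_info(parts))  # 0.0 bits
```

The point of the sketch: this kind of quantity is easy to compute for any system, conscious or not, which is exactly why measuring information processing alone seems like a weak proxy for consciousness.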
Hmm... If Giulio Tononi is correct, then computers could have consciousness.
I also think computers could have consciousness, at least potentially. But for very different reasons than Tononi's.
If we can isolate the algorithms and systems in the mind that generate and sustain conscious awareness, there is no reason why we would not be able to, eventually, incorporate those things into computers. There is probably nothing "magical" about biological systems that would make them the only things capable of consciousness.