Yeah, we do.
We know it's produced by the brain. That is sufficient to demonstrate that you are wrong.
Wrong.
What I (Hofstadter, Dennett, and many others) did was to examine the fundamental difference between systems we call conscious and systems we don't call conscious, and then realise that this is our operational definition. You keep piling random baggage on whenever you get an answer you don't like, but our fundamental definition of consciousness is exactly as I have described it.
Wrong. GWT is equally applicable to AI and humans - and not very interesting.
There is no global access state. That's impossible. All there is is signals passing from neuron to neuron.
There has to be self-reference. That is what, at its core, consciousness means.
Wrong. It can't not be a self-referencing loop.
Well, it's been a few days now since this post, but for my part I appreciate that your Strong AI position could mean that pretty much all of the above are correct, though I would still strongly dispute that there is a consensus on this among cognitive neuroscientists or related professionals.
However, this notion of yours that consciousness is inherently self-referencing is, I'm quite clear, either wrong or something I am misunderstanding. Can you explain? How, for example, is my current conscious visual vista self-referencing?
Nick