rocketdodger said:
None of them require or attempt to equate consciousness with human consciousness.
Well, it sounds to me like you are. Take, for example, the following statement:
Our consciousness is the only consciousness we really know. You can't establish others by fiat. What you can do is dissect human consciousness and ask yourself, and us, is this really essential for consciousness?
Nope. You missed my point entirely. I believe that we, some or most animals, and potential AI all share certain fundamental elements of consciousness. I'm applying reductionism, realizing that consciousness can only be reduced so far before it ceases to be consciousness.
But you aren't telling the whole story here.
Our own subjective experience is the only subjective experience we really know. As far as the rest of consciousness goes -- external behaviors -- we are clearly not talking about the same thing (and neither is Pixy).
Because I really know your external behaviors, and whether I want to call them conscious behaviors or not. I also really know the behaviors of my dog, or a chimp at the zoo, or a chipmunk in my back yard. And I call them all conscious as well.
You just made my point without realizing it. All those behaviors you believe are conscious in animals seem so because of the correlation they have to behavior you know is conscious in yourself, and because of your knowledge that those animals probably share similar brain architectures. When your dog effortlessly runs someplace, you probably correlate that with unconsciousness, because you run without thinking about or "feeling" how you need to move your legs. You assume it's unconscious. However, since the dog seems to be running to play or to achieve some goal-directed behavior, you naturally correlate that with the conscious feelings you have when engaged in such volition.
All humans can do is define things in relation to what we perceive and can understand, which is ultimately limited by our brains and language - a huge barrier. I am perfectly willing to accept that there are aspects of awareness, cognition, thinking, etc. we can never grasp except perhaps in their shadows and by abstract extrapolation and analogy. I think mankind will one day, perhaps upon the singularity occurring, create a new form of superconsciousness (SC). The SC will probably be able to directly sense such things as 4-D Klein bottles, of which we can only imagine shadows. They will be able to maneuver in and understand a limitless range of dimensions. For them QM will be intuitive. They may experience what is called hive-mind and simultaneously experience an untold variety of interconnected realities of their own making. But beyond this are probably things I can't describe because my mind isn't even wired to enable me to imagine them. I'm the amoeba to their mouse. How could a mouse or your dog infer everything it is to be humanly conscious? If there were such a thing as dog science, their brains would limit their epistemology analogously to how ours must be limited.
And furthermore I can look at the neural activity in a chipmunk while it is conscious and try to elucidate how those external behaviors arise. To me, that is the same thing as trying to elucidate how consciousness arises, because the external behaviors are an aspect of consciousness.
Yes, I'd do the same thing. But it begs the question: it's ultimately constrained by the behaviors you call conscious to begin with. How do you know that a turtle is swimming to the other side of the pond because it wants to, or is aware of what it's doing? Or a worm burying itself? Or a bacterium moving away from light?
I've argued in earlier posts how science needs to go about this. But you can't just define consciousness by an external behavior. You have to understand the principles of the computations that are generating it, not simply map behavior to brain signals. When we glimpse the fundamental algorithm(s) that yield consciousness, I think it may be immediately self-evident that they must yield consciousness (though not necessarily sufficient for all forms of consciousness), even before we test them. I think it will turn out to be simple and elegant, like most deep fundamental discoveries, and afterwards we'll wonder what the big deal had been about. That's science.
We have this argument all the time here, and it is why I wish people would be more specific in what they are referring to. A chipmunk can be conscious, because it can also be unconscious, yet a chipmunk clearly cannot have the same subjective experience as a human. So what are you talking about -- the subjective experience of a human, or the external behaviors that people label "being conscious" in animals (including ourselves)?
Neither, as I've just explained.
To have a complete definition, you also have to be able to tie an observable in consciousness itself to an operational definition that explains it.
We have done that with SRIP.
I think it can be done, but you haven't proven it sufficiently to call it a definition yet. Neither you nor Pixy has dealt with the problems of qualification and equivalence in the definition, which I think I demonstrated conclusively.
Really, it is painfully clear to me that you simply consider the term "conscious" to be stronger than we do. I wish we could move on. You know what, I will even come over to your side of the fence if it means we can move on. If you want me to reserve "consciousness" for something more complex, I can do that, since you agree that SRIP is a necessary prerequisite.
What do you want to move on to exactly? I don't care what side of the fence you're on. I just want to provoke greater understanding and read good arguments.
You can take any premise for consciousness you want to and run with it all you like. But if you pick the wrong ones you can end up believing rocks are conscious.
I'm happy with this definition: Consciousness is the mental state that initiates when the alarm clock wakes you up and ends after consuming too many beers.