If you define "conscious" simply in terms of "being awake," then I agree.
But that isn't what you, or any other HPC proponent, are doing.
You start with something that is self-evident -- being awake, being aware, experiencing things, whatever. Then you extrapolate and include all sorts of other things that are not self-evident. That is not logically valid. If you want to talk about something being self-evident, you have to stick to only what is self-evident.
And you do this because you lack a formal definition of "conscious." So you think, "well, if I am awake I am conscious, and if I am conscious I must be able to experience qualia and subjectivity, and since it is self-evident that I am awake, it must be self-evident that qualia and subjectivity exist."
That is a fallacy.
Rocket, the rationale isn't "Oh, I'm awake, so therefore I must be able to have qualia." The experience of being awake is qualitative; any subjective experience at all is qualia. It's not an additional property of being conscious and awake -- it is consciousness.
I'm literally stunned that you don't seem to be picking up on this in the slightest.
AkuManiMani said:
What's circular about saying "consciousness exists as a phenomenon; consciousness is a requisite of knowledge"? It's no more circular than saying "mass is a real property; mass is a requisite of weight."
Because any formal definition of consciousness must be predicated on the existence of knowledge. If you disagree, just go ahead and try to define "consciousness" without somehow relying on the notion of "to know."
I already gave an example of such in the thought experiment I proposed in post #353. The subject of the thought experiment is conscious, in the physiological sense, but does not have knowledge of anything because they are sensorially cut off from their environment.
Consciousness does not have a tautological relationship with knowledge; it is merely a necessary prerequisite for it. Just as an object cannot register weight unless it has mass, so an entity cannot have knowledge unless it is conscious. There is absolutely no logical contradiction or circular reasoning in this statement. For the life of me, I cannot understand why you don't see this.
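A quick way to see the shape of the analogy is the standard Newtonian relation (my illustration, not something from the thread itself):

```latex
% Weight is defined in terms of mass and the local field strength g,
% so "mass is a requisite of weight" is informative, not circular:
W = mg, \qquad m = 0 \;\Longrightarrow\; W = 0 \text{ for every } g.
```

The claimed parallel: consciousness stands to knowledge as mass stands to weight -- the first term appears in the account of the second, but not conversely.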
AkuManiMani said:
We know that consciousness exists as a phenomenon and that each of us experiences this state at various periods of the day; this is a given. What we don't know is what in physics necessitates or governs conscious [i.e. subjective] experience. This is the reason why we are stuck with informal, 'fuzzy' definitions. For reasons that I've already mentioned, it is evident that self-referential intelligence is not a sufficient requisite for conscious experience.
Another fallacy.
Are you seriously claiming that lack of knowledge of the mechanism causing a phenomenon necessarily prevents us from at least operationally defining the phenomenon?
That's just the problem: neither you nor anyone else has an operational definition of qualitative experience [i.e. consciousness]. There are various methods of defining and modeling computational functions, but absolutely nothing in the way of describing how such functions translate into conscious thought. The field of AI lives up to its namesake: artificial intelligence. But intelligence and consciousness are not the same thing.
AkuManiMani said:
Of course.
Modeling in finer detail the exact physiological processes that give rise to said consciousness will require considerably more than simply stating consciousness as a given. The point of me assigning an "X" variable to consciousness is to serve as a conceptual placeholder until there is such a formal method of modeling what it is, exactly. There is no convincing evidence that we have such a formal system yet. My purpose here is to suggest possible avenues of investigation to determine a means of crafting such a system. My guess is that we need to study the physical process of instances that we do know are conscious [e.g. living brains] and work from there.
Well, perhaps we have been too harsh on you then -- you clearly know nothing about computer science and computation theory.
All the fundamentals we need to describe human consciousness are already known. We know exactly how an individual neuron behaves. The question, as with any complex problem, is how to arrange the fundamentals into something greater than the sum of its parts.
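For concreteness, here is a minimal sketch of the standard leaky integrate-and-fire model, one of the simplest formalisms for single-neuron behavior (the parameter values are illustrative placeholders, not measurements from any particular neuron):

```python
# Leaky integrate-and-fire neuron: dV/dt = (-(V - V_rest) + R*I) / tau,
# with a spike and reset whenever V crosses threshold. Parameters are
# illustrative placeholders, not fitted to any real cell.

def simulate_lif(currents, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, resistance=10.0):
    """Return (voltage trace, spike times) for one input current per step."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(currents):
        # Forward-Euler integration of the membrane equation.
        v += dt * (-(v - v_rest) + resistance * i_in) / tau
        if v >= v_thresh:            # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset              # hyperpolarized reset after the spike
        trace.append(v)
    return trace, spikes

# Drive the cell with a constant input of 2.0 (arbitrary units) for 100 ms.
trace, spikes = simulate_lif([2.0] * 1000)
print(f"{len(spikes)} spikes, first few at t = {spikes[:3]} ms")
```

The single-unit dynamics really are just a small differential equation; the contested part is what happens when you wire some 86 billion of them together.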
Okay, so what is the difference between the neurons of a conscious brain and those of an unconscious brain? What is it about the activity of some neurons that produces qualitative experiences? How do the contributions of all those neurons come together in the unified experience of being conscious? Is simply having neurons sufficient for an organism to generate consciousness?
There have been great strides in computational theory over the past century or so, and the field of AI has produced a lot in a relatively short period of time. Even so, merely creating intelligent systems is not the same as producing conscious experience. Such processes underlie the most basic of biological systems, and we know that, in and of themselves, they are not sufficient to produce conscious experience.
I feel like you aren't clear on just how much greater a phenomenon can be than the sum of its parts. Let me make it clear just how much -- infinitely.
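The stock illustration of that point is a cellular automaton such as Conway's Game of Life, where two one-line rules generate unbounded global complexity -- the system is even Turing-complete. A generic sketch (not anything specific to this thread):

```python
# Conway's Game of Life on an unbounded grid, represented as a set of
# live (x, y) cells. The entire "physics" is two rules: a dead cell with
# exactly 3 live neighbors is born; a live cell with 2 or 3 survives.
from collections import Counter

def step(live_cells):
    """Advance the automaton by one generation."""
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live_cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# A "glider": five cells whose pattern propels itself across the grid forever.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(4):
    print(f"gen {generation}: {sorted(cells)}")
    cells = step(cells)
```

Nothing in the rules mentions "moving," yet the five-cell glider travels across the grid indefinitely; that is the sense in which a whole can outrun its parts.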
I feel really rotten pressing the issue like this, but I think that it is you who aren't appreciating the full depth of the problem. I don't think it's insoluble [or at least I hope not], but there are a lot of unanswered questions that our present knowledge barely even begins to address. :-/