The Hard Problem of Gravity

Wow... this is actually worth addressing...
Yy2bggggs: Likes to bring unconsidered points into every discussion.
Or another way to phrase it--I'm proselytizing critical thinking.
Also, he is the 'mazing Hero of Time that doesn't afraid of anything :D
Awww... you're just saying that 'cause I haven't ticked you off yet.
 
I'm sorry. Computers can use the phrase "I'm not convinced", but you claim they don't experience "qualia", whatever the hell they are. So obviously experiencing qualia, assuming they exist, is not a good criterion for determining consciousness.

So can books and walls. However, AFAWK, they only say such things when a human being has put them there in the first place.

When a computer program discovers its own qualia, and asserts it, rather than prints out a phrase inserted by a person, then that will at least be interesting. Until then, the computer is no more significant than any other medium for carrying a human being's thoughts.
 
So why don't you just define consciousness in terms of the set of behaviors exhibited by things you want to be conscious?

That would include all humans, cute furry animals, etc. Everything you want to be special.

I am serious -- why not just use a definition like that?

Because the critical element in consciousness is the assertion of consciousness. That's where this discussion begins. Yes, we derive an expectation that other human beings who haven't actually told us that they are conscious still are. We also suspect that animals are conscious, in some sense. But what compels us to investigate the matter is the fact that human beings spend so much time telling each other about their qualia.

So, yes, we could set up an entirely behavioural definition, but that would be missing the point. We wouldn't be trying to define consciousness in the first place if it wasn't something we actually experience.
 
I didn't overlook it. I never saw a definition in there. In fact, you had been so clear to me before that you DIDN'T have a definition that I was convinced of it.

Let's take a look:

AkuManiMani said:
I've stated repeatedly that the words 'consciousness' and 'awareness' are synonymous

Really? I took "consciousness" to mean "awareness of oneself". I guess if you want that to be "self-consciousness", then fine.

You could personally define it as such. But according to the standard definitions provided in English dictionaries and thesauri [the ones I'm employing], they are synonymous terms.


Is that your definition? What's "qualitative experience"? I honestly have no clue. How do you define it, and how do you observe it in anyone but yourself?

What is 'qualitative experience'? All that stuff that goes on in your mind when you're awake or dreaming.

How does one define it? By describing such experiences via qualitative language.

Want a scientific definition? That's still a work in progress.

How does one observe it in others? One cannot directly do so [at least as far as we know], but we can observe the physiological activity of humans that claim to be conscious, and compare it to instances where they are purportedly unconscious. From there one can infer what biological processes give rise to it and which entities exhibit those properties.


It's not a "game". I'm obviously not using the same definition of "information processing" as you and so I was asking for your definition.

The broader definition used by the physical sciences.



So, basically, instead of providing said definition you ask me to look up someone ELSE's definition? It's pretty clear that if you and I don't agree on the definition, then a third party's definition might just serve to complicate things, no?

I did finally get around to providing the definition I'm working from. I'm just getting a little weary of being asked for definitions ad nauseam even after I've repeatedly provided them.

Your emotional state is of no interest to me.

It should be if you want non-sarcastic replies :p
 
It isn't a problem. It's a question.

What do you consider the behaviour of a conscious entity?

Does a machine that can convince you it's human count?

It's a question for which no totally convincing argument has yet been presented, i.e. a problem.

As for the other two questions, I don't know. I have yet to draw a line in the sand, to be honest. I'm sympathetic to the arguments of the hard-AI proponents, but not entirely convinced they yet have all the answers. As I said, I'm not totally convinced they even have all the questions!
 
westprog:

Do you deny the Church-Turing thesis? Or that humans are effectively calculable?

One of the things that bugs me here is how the HPC is being treated. P-zombies would be perfectly capable of talking about their "p-qualia", and even if not, you seem to both be proposing a behavioral distinction and denying one.

Nothing about the HPC should be distinguishable by behavior. The behavioral aspect is classified as an "easy problem".
 
What is 'qualitative experience'? All that stuff that goes on in your mind when you're awake or dreaming.

LOL! Well, that was a good laugh! "All that stuff that goes on in your mind when you're awake or dreaming" is what I would call "Experience".

So if all that stuff is 'qualitative' then what the hell is quantitative?
 
When a computer program discovers its own qualia, and asserts it, rather than prints out a phrase inserted by a person, then that will at least be interesting.

How do you know it isn't but just isn't communicating it in English?

Seriously.
 
How do you know it isn't but just isn't communicating it in English?

Seriously.

Code walkthrough. If an AI program starts to assert consciousness, we can look at the code and see how it does it. Maybe step through it with a debugger. That's how we figure out how programs do things.
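
Just to illustrate the trivial case (a toy Python sketch; the function name and the phrase are invented for this example): a program whose "assertion" is nothing more than a string a person typed in. A walkthrough of something like this makes it obvious where the words came from.

Code:
def assert_consciousness():
    # The phrase below was inserted by a human author; the program discovers
    # nothing on its own, it only echoes what was put there, like text
    # printed in a book or written on a wall.
    print("I'm not convinced. I experience qualia.")

assert_consciousness()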
 
westprog:

Do you deny the Church-Turing thesis? Or that humans are effectively calculable?

One of the things that bugs me here is how the HPC is being treated. P-zombies would be perfectly capable of talking about their "p-qualia", and even if not, you seem to both be proposing a behavioral distinction and denying one.

Nothing about the HPC should be distinguishable by behavior. The behavioral aspect is classified as an "easy problem".

IMO much of what human beings do is attempting to convince each other that they are not p-zombies. We don't know for sure that a p-zombie wouldn't appreciate Mozart or Caravaggio, but that's the kind of test we apply to each other.

We don't know how much of what human beings are is due to their actual physical nature. It might be that they are calculable; it might not. I don't think anyone believes that a comet, for example, is calculable. We might calculate what a comet does, but we always realise that the comet and the calculation are very different things.
 
Code walkthrough. If an AI program starts to assert consciousness, we can look at the code and see how it does it. Maybe step through it with a debugger. That's how we figure out how programs do things.

Right.

So what does
Code:
qualia()
look like then?
 
That is EXACTLY my point. The formal 'definitions' of consciousness being proposed may actually apply to conscious processes. The problem is, their categorical nets are far too broad for them to count as a sufficient explanation for properties particular to what we call conscious experience.

Well, my contention is that there is nothing special in the human being to make it conscious. It's incremental, and very simple organisms have it to a much, much lesser degree. There is no hidden ingredient.
 
So can books and walls. However, AFAWK, they only say such things when a human being has put them there in the first place.

You only say things that your DNA and environmental factors have put there in the first place.

When a computer program discovers its own qualia, and asserts it, rather than prints out a phrase inserted by a person, then that will at least be interesting.

You seem to be under the mistaken impression that consciousness creates new data.
 
What is 'qualitative experience'? All that stuff that goes on in your mind when you're awake or dreaming.

I don't see a need to invent a new term for "perception", then.

How does one observe it in others? One cannot directly do so [at least as far as we know], but we can observe the physiological activity of humans that claim to be conscious, and compare it to instances where they are purportedly unconscious.

So qualia are useless. We observe behaviour.
 
It's a question for which no totally convincing argument has yet been presented, i.e. a problem.

As for the other two questions, I don't know. I have yet to draw a line in the sand, to be honest. I'm sympathetic to the arguments of the hard-AI proponents, but not entirely convinced they yet have all the answers. As I said, I'm not totally convinced they even have all the questions!

My opinion is that the "HPC" is an artifact of our dualistic perceptions. From this thread (see my last reply to Aku), no one can assert anything about consciousness other than the fact that we know it from behaviour.
 
Right.

So what does
Code:
qualia()
look like then?

When the program that passes the Turing test is written, I'd love to look at the code. Since no such program has been written, or looks likely to be written, I can't begin to speculate as to what the code would look like.

Step one is to have a program explore its environment, examine the world that we live in, and develop its own way of seeing. I wouldn't even be as demanding as the Turing test - I'd be happy enough with a program that saw the world in its own way. But it would have to experience it.
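
Purely as a toy sketch of that "explore and build its own picture" idea (Python; the grid world and every name here are invented, and it obviously comes nowhere near experiencing anything):

Code:
import random

# A tiny made-up "world": a 5x5 grid of terrain types.
WORLD = {(x, y): random.choice(["forest", "water", "rock"])
         for x in range(5) for y in range(5)}

def explore(steps=50):
    position = (0, 0)
    internal_map = {}  # the program's own record of what it has encountered
    for _ in range(steps):
        internal_map[position] = WORLD[position]  # observe the current square
        x, y = position
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        # wander one step, staying on the grid
        position = (max(0, min(4, x + dx)), max(0, min(4, y + dy)))
    return internal_map

print(explore())

Whether accumulating a map like that could ever count as "seeing the world in its own way", let alone experiencing it, is of course the whole question.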
 
My opinion is that the "HPC" is an artifact of our dualistic perceptions. From this thread (see my last reply to Aku), no one can assert anything about consciousness other than the fact that we know it from behaviour.

And I'm not disagreeing with you. The "problem" is defining exactly what those behaviours are. Which neither neurobiologists nor philosophers of mind can do yet, as you yourself realise.

Pixy and RocketDodger are asserting something much stronger.
 
You only say things that your DNA and environmental factors have put there in the first place.



You seem to be under the mistaken impression that consciousness creates new data.

And DNA and environmental factors have created this capacity for subjective experience. It is there. It's a fact to be dealt with. And nobody will really accept that a machine consciousness is equivalent to human consciousness in any important respect unless it produces the same thing.
 
