The Hard Problem of Gravity

Yeah, we do.

We know it's produced by the brain. That is sufficient to demonstrate that you are wrong.


Wrong.

What I (Hofstadter, Dennett, and many others) did was to examine what the fundamental difference is between systems we call conscious and systems we don't call conscious, and then realise that this is our operational definition. You keep piling random baggage on whenever you get an answer you don't like, but our fundamental definition of consciousness is exactly as I have described it.


Wrong. GWT is equally applicable to AI and humans - and not very interesting.


There is no global access state. That's impossible. All there is, is signals passing from neuron to neuron.


There has to be self-reference. That is what, at its core, consciousness means.


Wrong. It can't not be a self-referencing loop.
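
To make "self-referencing loop" concrete, here is a toy Python sketch, purely illustrative and with invented names; it is not anyone's actual model, just a system whose update step reads a representation of its own state and then rewrites it:

```python
# Toy illustration of a self-referencing loop: the system's update step
# consumes its own self-model and then rewrites it. Schematic only.

class SelfReferencingSystem:
    def __init__(self):
        self.world_model = {}  # what the system represents about its environment
        self.self_model = {}   # what the system represents about *itself*

    def step(self, sensory_input):
        # The loop: this update reads the system's own self-model...
        previously_held = self.self_model.get("beliefs_held", 0)
        # Ordinary processing: update representations of the world.
        self.world_model.update(sensory_input)
        # ...and then rewrites that self-model, closing the loop.
        self.self_model = {
            "beliefs_held": len(self.world_model),
            "grew_since_last_step": len(self.world_model) > previously_held,
            "last_input": sensory_input,
        }
        return self.self_model

system = SelfReferencingSystem()
print(system.step({"temperature": 21}))  # the system reports on itself
```

The point of the sketch is only that the system's next state depends on a representation of the system itself, which is what distinguishes it from a plain feed-forward process.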

Well, it's been a few days now since this post, but I appreciate that your Strong AI position could mean that pretty much all of the above are correct, though I would still strongly dispute that there is a consensus here amongst cognitive neuroscientists or related professionals.

However, I'm quite clear that this notion of yours that consciousness is inherently self-referencing is either wrong, or else I am misunderstanding you. Can you explain? How, for example, is my current conscious visual vista self-referencing?

Nick
 
Huh?

You don't see why an entity would need to differentiate itself from the rest of the world?

You must have meant something else because that is an utterly stupid thing to say. Of course, Nick227 has said it as well...

Human beings evolved consciousness because it was useful to humans. As far as we can tell it took many millions of years to do so. So thermostats evolved consciousness exactly how?

Quite how thermostats would evolve consciousness I don't quite see. Perhaps I'm just being utterly stupid. Maybe they get it just by really really needing it.
 
Human beings evolved consciousness because it was useful to humans. As far as we can tell it took many millions of years to do so. So thermostats evolved consciousness exactly how?

Quite how thermostats would evolve consciousness I don't quite see. Perhaps I'm just being utterly stupid. Maybe they get it just by really really needing it.

Who cares about the "need"? Who cares about the evolutionary process? We're talking about HOW consciousness operates. If we can tell HOW, then we may be able to replicate it, even if it serves no immediate purpose to the thing itself or if the "evolution" of it is completely different. You're placing barriers for the HPC that simply aren't there.

Why do you so WANT there to be an HPC?
 
...or neural impulses.



We already know why. It's because people are referring to something that isn't there, and need to make it fundamental in order to support the rest of the argument, which results in circularity. Remove the need for the thing that isn't there and you're left with no problem, "hard" or otherwise.

You also have no human beings. An unfortunate side effect.

Human beings will persist in thinking of themselves as human - except for the minority who think of themselves as smart thermostats.
 
Mostly, you mean. Sometimes you're conscious of it. Any idea why?



I've been reading the thread since the first post and I'm yet to see a definition of consciousness that isn't circular.

A process (or a group of processes) that the part of the environment that is labelled "Darat" does.
 
If you don't know then I doubt if I or anyone else will be able to make you know.

Here's a way to see if you know what is meant. Try to summarise your last week - people met, books read, interactions with partner, music listened to, meals eaten, pain and happiness.

Then summarise the same week in purely behavioural terms.

The difference between the two accounts is what we are discussing. If you think the two accounts are essentially equivalent, and that no useful information is lost from one to the other, then you won't think there's a hard problem of consciousness.

IMO the subjective account is fundamentally central to being human, and has yet to be explained in any meaningful way.

Better still, just tell me what this "difference" is; that way I will know what you mean.
 
If the only reason to assume that human beings possess consciousness is that they assert that they have it, then, by definition, a computer that also asserts it (it uses "I", for instance) can be assumed to have it as well.

The reason that human beings know that they are conscious is because they directly experience consciousness. The mere fact of knowing anything demonstrates consciousness.

It is so laughably easy to have a computer call itself "I" that it cannot be said to demonstrate anything. The same applies to an assertion of consciousness.

For a computer program to convincingly demonstrate consciousness, a first step would be for it to do so spontaneously.
 
However, I'm quite clear that this notion of yours that consciousness is inherently self-referencing is either wrong, or else I am misunderstanding you. Can you explain? How, for example, is my current conscious visual vista self-referencing?

As Mercutio has pointed out, consciousness is quite poor at examining itself. It seems to work best at examining other things.
 
Better still, just tell me what this "difference" is; that way I will know what you mean.

There's really no need to retreat from human experience (nor any possibility of doing so). I really don't think you need someone to tell you what it's like. What it feels like.
 
Who cares about the "need"? Who cares about the evolutionary process? We're talking about HOW consciousness operates. If we can tell HOW, then we may be able to replicate it, even if it serves no immediate purpose to the thing itself or if the "evolution" of it is completely different. You're placing barriers for the HPC that simply aren't there.

If we can tell how, we might be able to replicate it.

Why do you so WANT there to be an HPC?

I don't want there to be. That's like saying that I want the universe to exist. It doesn't matter what I want. I can't make it any different by wishing.

Perhaps you should examine why you are so concerned that there should not be an HPC.
 
I'm going to pile on and give my opinion:

I think everyone agrees that what we experience is sensory information. Now this sensory information is processed in several complicated ways that I cannot venture to understand, but I think it is reasonable to assume that processing sensory information in your brain 'feels like' something. My question to Akumanimani and westprog would be this: Is there a reason you suspect that consciousness is more than what it feels like to process information? Or do you think that it isn't reasonable to assume that processing information feels like anything at all?

Okay, that makes two questions, actually...
 
I'm going to pile on and give my opinion:

I think everyone agrees that what we experience is sensory information. Now this sensory information is processed in several complicated ways that I cannot venture to understand, but I think it is reasonable to assume that processing sensory information in your brain 'feels like' something. My question to Akumanimani and westprog would be this: Is there a reason you suspect that consciousness is more than what it feels like to process information? Or do you think that it isn't reasonable to assume that processing information feels like anything at all?

Okay, that makes two questions, actually...

Well, as I see it there is a basic complication here in that it's well known that the brain also processes vast amounts of information unconsciously. For me this fact is what creates the potential for a "hard problem."

The key question is thus to ask whether there is a qualitative difference between conscious and unconscious processing. Obviously, at a subjective-experiential level there is an immense qualitative difference, but can one demonstrate that this exists at a neuronal level in the brain? Or can it be that actually there is no difference, it's simply that the brain is completely conscious yet this self-consciousness we experience is but an aspect of it?

Pixy's assertion, which I don't buy, is that the only difference between so-called "conscious" and "unconscious" processing is the presence of a self-referencing loop in the former.

Nick
 
I'm going to pile on and give my opinion:

I think everyone agrees that what we experience is sensory information. Now this sensory information is processed in several complicated ways that I cannot venture to understand, but I think it is reasonable to assume that processing sensory information in your brain 'feels like' something. My question to Akumanimani and westprog would be this: Is there a reason you suspect that consciousness is more than what it feels like to process information? Or do you think that it isn't reasonable to assume that processing information feels like anything at all?

Okay, that makes two questions, actually...

They will say that "feel" is already predicated on consciousness.

So you should change your questions accordingly and ask them again.
 
Or can it be that actually there is no difference, it's simply that the brain is completely conscious yet this self-consciousness we experience is but an aspect of it?

Yes.

Your precious GWT explains it quite well.

The self-consciousness "you" experience is the globally accessible stuff. The rest of the private modules are also conscious, and experience, and all that jazz, but the "outer you" has no access to that stuff, as GWT asserts.

How would you know whether a subsystem of your brain was itself conscious, under GWT? You wouldn't have access to such information, any more than you would have access to the behavior of individual neurons in your own head.
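
For what it's worth, GWT's broadcast idea can be put in a few lines of toy Python. This is my own illustration, not Baars's formal model, and the module names and salience numbers are invented: private modules compute in parallel, one wins access to the workspace, and only the winner's content is globally visible.

```python
# Toy Global Workspace sketch: several private modules hold candidate
# contents; only the most salient one is broadcast. The "outer you"
# sees the broadcast, never the modules' internals. Illustrative only.

modules = {
    "vision":        {"content": "red apple",   "salience": 0.9},
    "hearing":       {"content": "distant hum", "salience": 0.3},
    "interoception": {"content": "hunger",      "salience": 0.6},
}

def broadcast(modules):
    # Winner-take-all competition for access to the global workspace.
    winner = max(modules.values(), key=lambda m: m["salience"])
    return winner["content"]

print(broadcast(modules))  # -> 'red apple'; the losing contents stay private
```

On this sketch, the question above is exactly right: nothing in the broadcast tells you anything about what the losing modules are like from the inside.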
 
You also have no human beings. An unfortunate side effect.

You are AGAIN assuming your conclusion. Why the hell do you persist in doing that?

People have been experiencing unreal things for all of recorded history, and more. So why would your perception of this "special" consciousness be different?

You wave your hands, claiming to "know" that the mind is special, but when pressed, you can't even show why.

Human beings will persist in thinking of themselves as human - except for the minority who think of themselves as smart thermostats.

Why would it make a difference to you?
 
The reason that human beings know that they are conscious is because they directly experience consciousness. The mere fact of knowing anything demonstrates consciousness.

Does experiencing ghosts prove ghosts? UFOs?

It is so laughably easy to have a computer call itself "I" that it cannot be said to demonstrate anything. The same applies to an assertion of consciousness.

YOU're the one who said that the only way to know that humans are conscious is that they claim to be, and that "I" placed me outside of the machine realm. Make up your mind.

For a computer program to convincingly demonstrate consciousness, a first step would be for it to do so spontaneously.

"Spontaneously" ? What the hell do you mean by that ?
 
Perhaps you should examine why you are so concerned that there should not be an HPC.

I don't care, actually.

What I've been doing so far in this thread is to try and find out whether we need to use Occam's Razor on this one, or whether we should look further. You say the latter, but unlike your opponents in this debate you seem unable to explain why, except by appeal to humans' gut feelings.
 
Yes.

Your precious GWT explains it quite well.

The self-consciousness "you" experience is the globally accessible stuff. The rest of the private modules are also conscious, and experience, and all that jazz, but the "outer you" has no access to that stuff, as GWT asserts.

How would you know whether a subsystem of your brain was itself conscious, under GWT? You wouldn't have access to such information, any more than you would have access to the behavior of individual neurons in your own head.

I agree. But for Strong AI a new question now arises with GWT. How is, say, visual phenomenology innately self-referencing, as Pixy asserts? It appears, from Edelman's and others' research, that global access is switched by the thalamus, but how is the phenomenology itself innately self-referencing? For me, GWT challenges Strong AI far more than it supports it.

Nick
 
The reason that human beings know that they are conscious is because they directly experience consciousness. The mere fact of knowing anything demonstrates consciousness.

...snip...

Perhaps if I re-write what you have posted, using the definitions you have given for some of the words, it will help you understand the problem you have:

The reason that human beings [undefined word] that they are [undefined word] is because they directly [undefined phrase]. The mere fact of [undefined word] anything demonstrates [undefined word].
 
