The Hard Problem of Gravity

Well, as I see it there is a basic complication here in that it's well known that the brain also processes vast amounts of information unconsciously. For me this fact is what creates the potential for a "hard problem."

The key question is thus to ask whether there is a qualitative difference between conscious and unconscious processing. Obviously, at a subjective-experiential level there is an immense qualitative difference, but can one demonstrate that this exists at a neuronal level in the brain? Or can it be that actually there is no difference, it's simply that the brain is completely conscious yet this self-consciousness we experience is but an aspect of it?
Good points. Stuff to think about.

Pixy's assertion, which I don't buy, is that the only difference between so-called "conscious" and "unconscious" processing is the presence of a self-referencing loop in the former.

Nick
I have no idea about self-referencing processing, but it sounds like a fairly reasonable explanation.

They will say that "feel" is already predicated on consciousness.

So you should change your questions accordingly and ask them again.
I disagree. That would just be a "yes" to the second question while evading the reason for it.
 
Okay, let me sum up my position:

Consciousness is still a 'hard problem' because there are no sufficient formal definitions of it. None of the definitions being put forward by the strong AI proponents here is sufficient, because there are innumerable examples of their operational criteria being met without actually producing conscious experience.
Could you give us one of these innumerable examples, please?
 
Because awareness is qualitatively different from unawareness.
What is this difference?

If we look at GWT (Global Workspace Theory)... if Strong AI is correct, then it must be that each parallel networked module is itself conscious, even though "I" am only aware of that which is being broadcast between modules.
That's not exact, but close enough, yes. Under GWT, the "I" is a synthesis of all those conscious processes. You are not aware of them, but they are what is actually doing everything. The conscious mind is an illusion with no causal efficacy (which we know from experiment), because all of that actually happens at a lower level that you cannot directly access. (Because you are the illusion.)
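To make the architecture under discussion concrete, here is a minimal toy sketch of a GWT-style broadcast cycle, assuming nothing beyond the description above: parallel modules whose private processing is invisible to one another, with a single winning content broadcast to all. The module names and salience score are invented for illustration; this sketches the shape of the claim, not any actual neural mechanism.

# Toy GWT-style broadcast cycle (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    inbox: list = field(default_factory=list)  # broadcasts received

    def propose(self, stimulus):
        # Private, module-local processing; never visible to other modules.
        # The salience score is a made-up stand-in for whatever scoring a
        # real system would do.
        salience = len(self.name + stimulus) % 100
        return salience, f"{self.name}: {stimulus}"

def workspace_cycle(modules, stimulus):
    # All modules process "in parallel"; only the most salient content is
    # broadcast back to every module -- that broadcast is the part the
    # "I" is aware of.
    winner = max(m.propose(stimulus) for m in modules)
    for m in modules:
        m.inbox.append(winner[1])
    return winner[1]

modules = [Module("vision"), Module("audition"), Module("memory")]
print(workspace_cycle(modules, "red light"))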
 
I don't know of any definition of consciousness that actually defines consciousness, never mind explains it.
Consciousness is self-referential information processing.

As soon as you said "I" you took things out of the realm of machines.
"I" is just the self-reference.

The machine doesn't need an I.
Need is irrelevant. If it's self-referential, it has an "I".
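For concreteness, here is a minimal sketch of what "self-referential information processing" could mean in this weak sense, assuming only a process whose input includes a description of its own state. The structure is invented for illustration, and no claim is made that the toy is conscious; the point is just that the "I" in its output is nothing but the self-reference.

# A process that consumes its own prior state alongside each new percept,
# so its output is partly a statement about the system itself.
def step(state, percept):
    report = (f"I (having processed {state['count']} inputs, "
              f"last: {state['last']!r}) now see {percept!r}")
    return {"count": state["count"] + 1, "last": percept, "report": report}

state = {"count": 0, "last": None}
for percept in ["light", "sound"]:
    state = step(state, percept)
    print(state["report"])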

Indeed, "I" is something we invented to explain consciousness.
Wrong.

If there's no consciousness, then there's no need to identify one atom as "Me" and another atom as "part of world".
Wrong.

The only reason to assume that human beings possess consciousness is that they assert they have consciousness.
Wrong.

The only reason to assume that animals have consciousness is because they are similar to human beings.
Wrong.

The grounds for assuming that a thermostat has consciousness are so tenuous as to be hardly reasons at all.
Yes! Correct! Which is why no-one suggests that thermostats are conscious.

Why, how, where and even who.
And whither, whence, and which.
 
Well, as I see it there is a basic complication here in that it's well known that the brain also processes vast amounts of information unconsciously. For me this fact is what creates the potential for a "hard problem."
Why is that a hard problem? Why is that any sort of problem?

The key question is thus to ask whether there is a qualitative difference between conscious and unconscious processing.
Why is this the key question?

Obviously, at a subjective-experiential level there is an immense qualitative difference
This is not obvious to me at all.

Or can it be that actually there is no difference, it's simply that the brain is completely conscious yet this self-consciousness we experience is but an aspect of it?
Not quite, but you're getting there.

Pixy's assertion, which I don't buy, is that the only difference between so-called "conscious" and "unconscious" processing is the presence of a self-referencing loop in the former.
Yes. And I'm still awaiting a coherent counter-argument.
 
Perhaps if I re-write what you have posted, using the definitions you have given for some of the words, it will help you understand the problem you have:

The reason that human beings [undefined word] that they are [undefined word] is because they directly [undefined phrase]. The mere fact of [undefined word] anything demonstrates [undefined word].

You should try speaking English to a few real people some time. You'd be amazed at how much of what they say reduces to [undefined word] or [word defined in terms of undefined word]. And yet they communicate.

Maybe it isn't me with the problem.
 
I don't care, actually.

What I've been doing so far in this thread is trying to find out if we need to use Occam's Razor on this one, or if we should look further. You say the latter, but unlike your opponents in this debate you seem unable to explain why, except by your appeal to humans' gut feelings.

I'm pleased that you are acknowledging that gut feelings actually exist. Now it's a matter of figuring out where they come from.
 
Does experiencing ghosts prove ghosts? UFOs?

No, because we don't directly experience ghosts or UFOs. We directly experience experience.

YOU're the one who said that the only way to know that humans are conscious is that they claim to be, and that "I" placed me outside of the machine realm. Make up your mind.



"Spontaneously" ? What the hell do you mean by that ?

I mean that any fool can write

10 PRINT "I AM CONSCIOUS"

and it means precisely nothing. It doesn't imply consciousness or identity.

If a program has a line in it that we put there to say the program is conscious, that does not mean the program is conscious. Because we could patch it to

10 PRINT "I AM NOT CONSCIOUS"

How can a deterministic program spontaneously assert consciousness? Beats me. That's a problem for the Hard AI crowd.
 
I'm going to pile on and give my opinion:

I think everyone agrees that what we experience is sensory information. Now this sensory information is processed in several complicated ways that I cannot venture to understand, but I think it is reasonable to assume that processing sensory information in your brain 'feels like' something. My question to AkuManiMani and westprog would be this: Is there a reason you suspect that consciousness is more than what it feels like to process information? Or do you think that it isn't reasonable to assume that processing information feels like anything at all?

Okay, that makes two questions, actually...

When "feel like" appears in the physics text books, we'll know we've explained "feel like". In the meantime we have to choose between something real yet unexplained, or something in some sense unreal.
 
You are AGAIN assuming your conclusion. Why the hell do you persist in doing that?

People have been experiencing unreal things for all of recorded history, and more. So why would your perception of this "special" consciousness be different?

It's possible for the experience to reference something unreal. It is not possible for the experience itself to be unreal. All it can pretend to be is another experience.

You wave your hands, claiming to "know" that the mind is special, but when pressed, you can't even show why.



Why would it make a difference to you?
 
No, because we don't directly experience ghosts or UFOs. We directly experience experience.
"Directly experience experience"? What does that even mean?

I mean that any fool can write

10 PRINT "I AM CONSCIOUS"

and it means precisely nothing. It doesn't imply consciousness or identity.
It's not self-referential information processing, so it fails my definition too.

So?

If a program has a line in it that we put there to say the program is conscious, that does not mean the program is conscious.
Correct, because it's not doing self-referential information processing.

Because we could patch it to

10 PRINT "I AM NOT CONSCIOUS"
Yeah, but in either case it's not doing self-referential information processing, so the whole argument is irrelevant.

How can a deterministic program spontaneously assert consciousness?
Self-referential information processing.

Beats me.
Self-referential information processing.

That's a problem for the Hard AI crowd.
It's a solved problem, and the answer is self-referential information processing.
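The contrast being drawn here can be made concrete. Below is a hypothetical side-by-side in Python: a fixed assertion, equivalent to the BASIC one-liner, next to a report computed from the program's own history. Only the second is self-referential even in this weak, toy sense, and neither is claimed to be conscious; the sketch only shows why patching the string changes nothing in the first case.

# Case 1: an inert, hard-coded assertion. Patching the text proves nothing.
print("I AM CONSCIOUS")

# Case 2: a report derived from inspecting the system's own state.
class Reporter:
    def __init__(self):
        self.log = []  # a record of the system's own operations

    def process(self, datum):
        self.log.append(datum)

    def report(self):
        # Computed from the system's own history, not hard-coded.
        return (f"I processed {len(self.log)} items, "
                f"most recently {self.log[-1]!r}")

r = Reporter()
r.process("input A")
r.process("input B")
print(r.report())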
 
No, I want to know how you are aware that it is qualitatively different to process unconsciously as opposed to consciously.

*eyetwitch*

Okay...deep breath....


Cyborg, from your posting history it would seem that you're a reasonably intelligent guy. I would like you to tell me why I would say that unconscious processing is qualitatively different from conscious processing... -_-
 
westprog said:
We directly experience experience.


I would actually qualify that statement. Hence the whole previous discussion about inferring and observing in terms of timing issues. From a subjective point of view, everything is happening in the present moment (there's no past or future from a first person's experiential perspective). To simplify a bit: When we "look back" we draw from memory and re-construct it in the present moment. When we "look forward" we imagine it in the present moment. It's a simplification because we never really remember exactly how it was.

Objectively speaking, though: We can't really subjectively experience the present moment presently – it has to be registered, by which point it has already become something else, if ever so slightly.

The moment we experience something it is already the past for that experience, hence we must re-construct it in some way or another. Do that enough times and it certainly feels like we're "continuously" experiencing experience as one single experience, but we are of course inferring that.

Only from a third-person perspective can we thus say that we experience experience. Observing that process "objectively" would look vastly different, though; it would look like we are constructing the experience of experience by inference.
 
Mostly, you mean. Sometimes you're conscious of it. Any idea why?

THAT'S MY ******** POINT!!

LORD!

-- I mean...

My point is that no one -- including the strong AI proponents -- has devised a sufficient formal explanation as to why there is any conscious experience at all. There is currently no operational description of qualitative experience and that is my point!

I've been reading the thread since the first post and I'm yet to see a definition of consciousness that isn't circular.

I've already expl--

Okay... Different tack now...

Belz...


Define 'I've'. Define 'been'. Define 'reading'. Define 'the'. Define 'thread'. Define 'since'. Define 'first'. Define 'post'. Define 'and'. Define 'I'm'. Define 'yet'. Define 'to'. Define 'see'. Define 'definition'. Define 'of'. Define 'consciousness'. Define 'that'. Define 'isn't'. Define 'circular'.

Define all of these words without referencing the dictionary, since all it does is 'circularly' reference itself.
 
If you weren't entirely reliant upon knee-jerk reactions, you might learn something.

In this case, Cyborg's question is the single most important question that can be asked when it comes to understanding how simple information processing can give rise to your own consciousness.

Why are you conscious of your toe at certain times and unconscious of it at others?

Just try to answer the question.

First, my reaction isn't 'knee-jerk'. It's born of frustration from seeing otherwise intelligent people ask the most asinine questions I've ever encountered with a straight face -- repeatedly.

Second, the question he asked is not the same as your 'toe' question, but since you insist, I'll answer it [Though, for the life of me, I cannot fathom why you can't figure out the answer yourself or why you seem to think it holds such rhetorical significance].

You become 'conscious' of your toe, or some other object, when either a) you consciously decide to shift your field of focus to said object, or b) an overriding sensory cue grabs your attention towards it. This is all assuming, of course, that you are in a conscious state to begin with. Otherwise you do not have the capacity to turn your conscious attention toward anything.
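A throwaway sketch of that two-route picture, with invented salience numbers and an invented threshold; it models the description above, not any actual attentional mechanism.

def attended_object(voluntary_target, sensory_cues, override_threshold=0.8):
    # (b) Bottom-up: a sufficiently strong cue grabs attention...
    strongest = max(sensory_cues, key=lambda c: c[1], default=None)
    if strongest and strongest[1] >= override_threshold:
        return strongest[0]
    # (a) ...otherwise top-down: attention stays where we chose to put it.
    return voluntary_target

print(attended_object("book", [("toe", 0.2), ("loud bang", 0.95)]))  # loud bang
print(attended_object("book", [("toe", 0.2)]))                       # book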
 
I'm going to pile on and give my opinion:

I think everyone agrees that what we experience is sensory information. Now this sensory information is processed in several complicated ways that I cannot venture to understand, but I think it is reasonable to assume that processing sensory information in your brain 'feels like' something. My question to AkuManiMani and westprog would be this: Is there a reason you suspect that consciousness is more than what it feels like to process information? Or do you think that it isn't reasonable to assume that processing information feels like anything at all?

Okay, that makes two questions, actually...

The thing is, every object contains and/or processes information. Simply saying 'consciousness is some class of information processing and that's all we need to know' is worth less than ****, as far as I'm concerned. Consciousness is subjective experience -- any and every subjective experience.

The processing of information in my brain and body [reflexive or otherwise] in the vast majority of cases is not conscious. Clearly, simply stating that 'computation dunnit' is not a sufficient answer.
 
AkuManiMani said:
My point is that no one -- including the strong AI proponents -- has devised a sufficient formal explanation as to why there is any conscious experience at all. There is currently no operational description of qualitative experience and that is my point!


You might come to that conclusion because people tend to conflate perspectives when they explain:

  • from a 1st PP we are observing whereas from a 3rd PP we are inferring;
  • from a 1st PP we are receiving whereas from a 3rd PP we are constructing.
We say "we observe" subjective experiences. It is nevertheless that same as "we infer" them.

We say, yeah but..., we are "feeling" them. But it is nevertheless the same as "constructing" them.

We say, yeah but..., we "have them". But it is nevertheless the same as "identifying" (i.e. constructing by inferring, including "we").
 
