The Hard Problem of Gravity

We have a physical theory for consciousness.


Darwin was, nonetheless, entirely correct.


Wrong analogy. Darwin was right, even before 150 years of research into the details. Consciousness as self-referential information processing is the Theory of Evolution of neuroscience. It's perfectly obvious once you understand it, but all the interesting stuff is in the details.

Darwin observed the phenomenon of Evolution. He did not know the mechanism. Until the mechanism was discovered by people investigating the physical structures, it remained a mystery.

In the case of consciousness, nobody knows how the physical structures generate it, and yet non-physicists promote the non-physical theory of self-referential information processing (SRIP) as if it were an explanation of anything. At least the people who thought proteins were the agent of replication were dealing with something real. SRIP isn't even wrong.
 
You are a self-referential information processing system. These "feelings" and "experiences" are merely sub-processes. They "feel" real to "you" because that "you" is nothing more than a synthesis of all these sub-processes.

Computers work exactly the same way.
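
As a toy sketch of what I mean (everything here is invented purely for illustration, and I'm not claiming such a loop feels anything): a system whose state includes a model of that same state, and whose reports come from the self-model rather than from the raw state.

Code:
class SelfReferentialSystem:
    def __init__(self):
        self.state = {"input": None, "valence": 0.0}  # raw sub-process data
        self.self_model = {}                          # the system's model of itself

    def sense(self, stimulus, weight):
        # Sub-process: update raw internal state from the environment.
        self.state["input"] = stimulus
        self.state["valence"] += weight

    def reflect(self):
        # The self-referential step: the system takes its own state as input.
        self.self_model = dict(self.state)

    def report(self):
        # The "I" is just a label attached to the synthesized self-model.
        return "I am experiencing %r with valence %.1f" % (
            self.self_model["input"], self.self_model["valence"])

s = SelfReferentialSystem()
s.sense("red circle", 0.5)
s.reflect()
print(s.report())   # -> I am experiencing 'red circle' with valence 0.5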

It is more complex than this, Pixy. Humans are not the same as AI here. The act of identification with thought or feeling creates an intensity that is otherwise not present. Thus "my sadness" feels more intense than "the sadness." Use of self-referential terminology increases identification. The notion that feelings "belong to someone" (albeit a user illusion) increases the intensity of feeling.

As far as we know, computers don't feel, so again there are these considerable unassessed differences between AI and human consciousness.

Nick
 
It is more complex than this, Pixy. Humans are not the same as AI here. The act of identification with thought or feeling creates an intensity that is otherwise not present. Thus "my sadness" feels more intense than "the sadness." Use of self-referential terminology increases identification. The notion that feelings "belong to someone" (albeit a user illusion) increases the intensity of feeling.
So, how are humans different to AI?

As far as we know, computers don't feel
What, exactly, is this strange assertion based upon?

so again there are these considerable unassessed differences between AI and human consciousness.
What differences?
 
So why all the gum-flapping if you don't even know WHAT consciousness is?

I'm "gum-flapping'' because there are self-deluded tools that have convinced themselves that they do know exactly what it is when they clearly do not. Simply defining a process and then labeling it 'consciousness' is not defining consciousness. They're essentially pulling an answer outta their @$$ and claiming it to be gold standard. Such an approach is not only presumptuous and counter productive; its downright idiotic.


Actually, that's wrong. It's more equivalent to saying "things fall because that's what being heavy entails," which would be pretty much correct, even without a precise theory of gravity. Newton even managed to formulate one without knowing what gravity was.
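
For what it's worth, the law Newton formulated is purely descriptive:

F = G * m1 * m2 / r^2

The force F between masses m1 and m2 at distance r, with G an empirically measured constant. It predicts falling perfectly well while saying nothing about what gravity is, which is exactly the point: a quantitative theory can precede any account of the underlying nature.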

[FYI, the sweeping declarations being touted here by S-AI proponents are as feeble as the theistic explanation of gravity.]

Consciousness hasn't had its Newton yet, and it will probably be a while before it has its Einstein. In the meantime, it's counterproductive for people to pretend that it has been properly defined already, because it most definitely has not. The strong AI proponents here are putting the cart before the horse. They are essentially claiming to have created actual gravity via computer programming before there is even a physical explanation of what gravity is. Attempting to simulate an undefined physical function and then declaring such a simulation attempt to be the actual phenomenon is absolutely asinine.
 
This might be interesting if it had anything to do with, well, anything.

Forget about "phenomenology". It's not going to lead you anywhere; indeed, it can't. Instead, study visual perception. Study what actually happens. Listen to the lecture series. Read Hofstadter.

I'm away in Holland, so I'm not reading GEB at the moment, though I enjoy it. But from what I read before, I don't think Hofstadter is really dealing with visual phenomenology. Inner dialogue, for sure, but not visuals.

So, for the 3rd time, can you explain just how visual phenomenology self-references? Just a little bit of explanation? Go on, Pixy. Actually give something instead of all these thinly veiled put-downs and protestations that I should read or listen to other material. Give, man.

Nick
 
This is why I don't spend a lot of time responding to PixyMisa.

There is quite clearly not a physical theory of consciousness.
I've already given you one, so you're wrong.

This is not something ambiguous, or debatable.
That much is correct. And you're wrong.

Physics is a well-defined field, and we know what's involved in publishing theories. We know what a theory is in physics, and there simply isn't one.
You're confused.

Physicists don't study consciousness. That's neuroscience. Evolution is a physical theory, but it's not part of physics courses.

The only physicist involved in the area in any way that I'm aware of is Penrose, and he would not class his speculations as having reached the stage of a theory yet.
I'd class his speculations as being complete nonsense, but this is irrelevant.

There's plenty of neurological research going on, but while there's a wealth of biological information, it does not add up to a theory which explains consciousness.
Yes it does, and I've told you what it is.
 
Simply defining a process and then labeling it 'consciousness' is not defining consciousness.
Then I'm sure you can point out a flaw in our definition.

Note that this flaw has to be coherent, well-defined, operationally defined, real, and actually a flaw in our definition.

We'll be right here.
 
The flaw is that the definition doesn't "feel" right.

For some reason we should consider this a deadly serious problem.
 
Because we do directly experience our own awareness. There's nothing intervening.

I suggest that you, as an adult, have forgotten just how much learning it took for you to be able to speak coherently about your own private behavior. These things, which you claim are the only ones we directly experience, are far more difficult for us to describe than are objects in our environment. It's easy for us to point to circles, red, fuzzy, distant, or heavy; describing pain is difficult ("is it a stabbing pain, a throbbing pain, a dull pain...?"; note that we use terms based on things in our environment to describe the pain inside us!). Describing love, more so ("when you say you love me, do you mean the same thing I mean when I say I love you?").

There is quite a bit intervening when we experience our own awareness, but you have forgotten it, since most of it took place while you were still becoming verbal. To learn "dog", all you had to do was learn to agree with your verbal community that this group of objects were dogs, and other things were non-dogs. There were any number of dogs around to point to, and virtually all of your verbal community could see them, just as you could.

But... when learning your awareness, your pain, your love, learning any of the feelings, between which and you there is "nothing intervening"... the only way to learn those was from people who had no access to your feelings, and to whose feelings you had no access. So you learned them through the public behaviors, objects, and situations that accompany them. If our feelings are the same, it is because similar bodies (including nervous systems, for you reductionists out there), in similar situations, can be assumed to behave similarly (assumed, not proven).

All the similarity is in the physical systems. To the extent that one assumes a "consciousness" that arises separately from the behavioral response to an environmental stimulus, there is no reason to assume that this "consciousness" (in scare quotes to distinguish it from my behavioral definition, which is actually coherent) is the same from person to person. If it is something that is not explainable by the self-referential feedback loops, if it is a mystery, then we have no reason to assume that we are talking about the same "experience" from person to person. The reasons we have to assume things are similar are all included in the feedback-loop/behavioral version of awareness.

Long story short... (too late!), we actually have considerably less confidence in our introspective accounts than in publicly available (I won't call them "objective", since the objective/subjective distinction is a superfluous can of worms) relationships.
 
Then I'm sure you can point out a flaw in our definition.

Your definition implies that the entire universe is conscious. Unless one is an idealist, this definition does not suffice. Unless you want to start declaring the unconscious and comatose 'conscious', I strongly suggest you rethink your assumptions.

Note that this flaw has to be coherent, well-defined, operationally defined, real, and actually a flaw in our definition.

The flaw is that while your definition is coherent, well-defined, and operationally defined, it empirically fails. Another flaw is that the S-AI definition assumes too much and explains nothing beyond defining itself. If you were to be beaten into a senseless coma, then by the S-AI operational definition of consciousness you would still be conscious. By any reasonable standard, this is a neon sign of a clue that the S-AI definition is bull.
 
Your definition implies that the entire universe is conscious. Unless one is an idealist, this definition does not suffice. Unless you want to start declaring the unconscious and comatose 'conscious', I strongly suggest you rethink your assumptions.

...snip...

Wanted to tackle your comment from a slightly different direction, so making this second post.

Unless you want to hold that there is the "entire universe" and there is also consciousness (a form of dualism), then the "entire" universe is conscious.
 
Your definition implies that the entire universe is conscious.
I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a statement.

The flaw is that while your definition is coherent, well-defined, and operationally defined it empirically fails.
Empirically fails what?

Another flaw is that the S-AI definition assumes too much and explains nothing beyond defining itself.
It explains consciousness.

If you disagree, well, you should know the drill by now.

If you were to be beaten into a senseless coma, then by the S-AI operational definition of consciousness you would still be conscious.
Nope.

You would not be conscious. There would still be conscious processes going on in your brain. The two are not the same.
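
As a toy illustration of the distinction (entirely my own construction, not a model of any actual brain): sub-processes keep running during a coma, but the global synthesis that constitutes "being conscious" does not, so "conscious processes" and "a conscious person" come apart.

Code:
def sub_processes():
    # Low-level activity that continues regardless of the global state.
    return {"vision": "dark", "nociception": 0.2, "memory": "intact"}

def synthesize(outputs, integration_running):
    # "You" being conscious = the self-referential synthesis running.
    if not integration_running:
        return None            # sub-processes persist, but no unified "you"
    unified = dict(outputs)
    unified["self"] = "synthesis of the above"
    return unified

print(synthesize(sub_processes(), integration_running=False))  # coma: None
print(synthesize(sub_processes(), integration_running=True))   # awake: unified report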

By any reasonable standard, this is a neon sign of a clue that the S-AI definition is bull.
No, it's merely a neon sign that you haven't understood the argument.
 
Wanted to tackle your comment from a slightly different direction, so making this second post.

Unless you want to hold that there is the "entire universe" and there is also consciousness (a form of dualism), then the "entire" universe is conscious.
That depends on what your definition of "is" is. ;)
 
No, it doesn't. That you start with such a misunderstanding shows you do not understand the definition that they have been putting forward.

The universe is a self-referential computational process. Operationally, it meets the S-AI criteria for consciousness.
 
Nope.

You would not be conscious. There would still be conscious processes going on in your brain. The two are not the same.

Then clearly, you are using 'consciousness' as a pseudonym for a much wider range of phenomena than it actually encompasses. Again, you're simply defining yourself out of a problem instead of working towards solving it.

edit: Anywho, I gotta get ready for school. I'll continue this discussion later when I gotz moar time.
 
Then clearly, you are using 'consciousness' as a pseudonym for a much wider range of phenomena than it actually encompasses. Again, you're simply defining yourself out of a problem instead of working towards solving it.

edit: Anywho, I gotta get ready for school. I'll continue this discussion later when I gotz moar time.


Aku,

Defining ourselves out of the "problem" is precisely the solution. That is what reductionism amounts to -- properly defining the issue, explaining the components with lower level structures, and realizing that the original "problem" was that the issue was so poorly defined at the start.

If you look at the history of philosophy and its proposed 'solutions', you will see that the original 'hard problem' has been solved. Originally, what separated human consciousness from animals was the whole issue of awareness of self, awareness that we are aware -- which is solved through self-referential loops.
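
A bare-bones sketch of that loop (illustrative only; the function names and structure are invented, and nothing here is claimed to feel like anything): awareness of awareness is just a second-order representation whose input is the first-order representation itself.

Code:
def first_order(world):
    # Awareness of the world: a representation of the stimulus.
    return {"percept": world}

def second_order(state):
    # Awareness of awareness: a representation of the representation.
    state["meta"] = "this system is representing %r" % state["percept"]
    return state

state = second_order(first_order("a red circle"))
print(state["meta"])   # -> this system is representing 'a red circle'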

The new 'hard problem' is just something that was cast aside earlier because it wasn't thought worthy of discussion -- namely, the passions. I don't understand at all why people think it is so difficult to explain emotions/motivational states/moods. We just haven't tried to do so until very recently. Think about it -- they have to show up somehow for them to be anything. They happen to show up as the feeling of what happens. Nothing mysterious there -- we just need to understand the wiring diagrams of limbic/thalamic projections and we'll move forward.

This discussion reminds me too much of the current ID crowd who have God -- the Ultimate Reality, the Meaning of All Existence -- designing bacterial flagella. Consciousness is now reduced simply to the feeling of what happens -- that part of mind action that was considered not only not-mysterious in the past, but unworthy of attention.

What folks here seem most interested in conveying, from what I can tell, is that the former supposedly 'hard problem' has been solved. This new 'hard problem' is window dressing, something that we can finally begin to deal with since the self-referential bits turn out to be fairly simple after all.

Personally, I don't see why or how emotion/mood/motivation are all that hard of a problem.
 
