The Hard Problem of Gravity

It is only via introspection that any of us can recognize our own consciousness [by Pixy's definition, introspection is the same as consciousness -- but I digress]. It is our capacity for introspection that allows us to create language to label, organize, and communicate our thoughts and other subjective states. The only way we can assume that others are also conscious is by inferring from our own introspective examinations. Introspection is just as empirical, and essential, as extrospection.

No. The way we learn our own introspective states is by observation of others. This is quite clear. If we labeled our thoughts and other subjective states through introspection, we could not (unless we were psychic) know what another person meant by "I'm hungry", "I'm angry", or "I love you."

There may be much left to learn about the process, but what you have just written is clearly not in the running.
 
No. The way we learn our own introspective states is by observation of others. This is quite clear. If we labeled our thoughts and other subjective states through introspection, we could not (unless we were psychic) know what another person meant by "I'm hungry", "I'm angry", or "I love you."

There may be much left to learn about the process, but what you have just written is clearly not in the running.

We acquire the cultural labels of our internal states by observing others. The states and introspective capacities are there independent of, and prior to, the acquisition of any learned language. Language itself is merely an evolved capacity for communicating these states among entities which share some [or all] such states in common.
 
There have been multiple attempts, historically. Introspection, as practiced by James & Wundt, for example, proved flawed. Wundt's more objective measures are still used today--particularly the method of subtraction--and scientifically examine private experience. Fechner and Weber's psychophysical methods are still used today. So, it may be logically impossible, but don't tell the people who are doing it already.
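For what it's worth, the subtractive method mentioned here can be shown with a toy calculation; the reaction times below are invented purely for illustration:

```python
# Toy illustration of the subtractive method (Donders' logic, as used
# by Wundt): estimate the duration of a mental stage by subtracting
# the mean reaction time of a simple task from that of a choice task.
# All reaction times below are invented for illustration.

simple_rt_ms = [215, 230, 221, 240, 224]   # respond to any stimulus
choice_rt_ms = [350, 342, 365, 331, 357]   # discriminate, then respond

mean_simple = sum(simple_rt_ms) / len(simple_rt_ms)
mean_choice = sum(choice_rt_ms) / len(choice_rt_ms)

# The difference is read as an estimate of the inserted mental stage
# (stimulus discrimination plus response selection).
stage_estimate = mean_choice - mean_simple
print(f"estimated stage duration: {stage_estimate:.1f} ms")
```

The point of the method is exactly what's claimed above: a private process is never observed directly, but the difference between two public measurements is taken as an estimate of it.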
So you have the observation of the experience of the feeling of the typing. Do you have qualia of the observation of the experience of the feeling of the typing? How many turtles in total? (I am just pointing out the layers of hypothetical constructs or explanatory fictions that are presupposed in that one sentence.)
The language frankly isn't that important. I can make the same point about the sentence "I feel myself typing." That sentence can be explained thusly: Nerves in the fingers send signals to the brain which lead to tactile processing activities which lead to language processing activities which lead to motor processing that causes the fingers to write the sentence. Even though the sentence expresses an experience, no actual experience is required to explain its having been written.
Given my views, no. But frankly, the example and explanation are so far removed from my views that even "no" is an odd answer.
For argument's sake, just presume this is all done under controlled conditions and everything or whatever you want.
The brain is actually part of the body, you know that. It is not a dualistic relationship. And reductionism is not explanation. It's description, albeit at a different level. What caused the brain to cause the body...?
How did I claim the brain was not part of the body? I said the brain sends signals to the body as in a hand or the larynx or what have you; that doesn't mean I think the brain is immaterial.
If one presupposes dualism, then perhaps there is no scientific way of studying experience. Which was kinda my point. "From some scientific views; not from all."
My argument has nothing to do with presupposing dualism. No immaterial concept is invoked in saying that expressions of having experienced things can be explained without the concept of experience and that experience is therefore either nonexistent or existent but incapable of being studied.
 
I know. It only has relevance when you say a rock switches "just like" a thermostat, because a rock does not switch "just like" a thermostat. If it did, you wouldn't have to provide the new definitions for the ON states.

It is, however, functionally equivalent to a thermostat in computational terms. No, you can't plug in a rock to replace a thermostat, but then you can't plug in a metric thermostat to replace one designed for Imperial units, either.

However, the point is actually important. In terms of the processes going on in the rock, they are quite different to what's going on in the thermostat - though not as different as between the thermostat and a logic gate on a Pentium, or a neuron.

Absolutely. If you want to say rocks switch, then under my definition of consciousness rocks are conscious.
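A sketch of what "functionally equivalent in computational terms" amounts to, with an invented setpoint and an invented thermal-expansion model; note that the rock only becomes a switch once we choose where to draw the ON/OFF line, which is exactly the relabelling being disputed:

```python
# Sketch of the "functionally equivalent switch" claim under debate:
# two physically different systems abstracted as the same two-state
# switch over temperature. The 20 C setpoint and the expansion model
# are invented for illustration.

SETPOINT_C = 20.0

def thermostat_state(temp_c: float) -> str:
    # A heating thermostat: calls for heat below its setpoint.
    return "ON" if temp_c < SETPOINT_C else "OFF"

def rock_length_mm(temp_c: float) -> float:
    # A rock's length varies continuously with temperature
    # (crude linear thermal-expansion model, invented numbers).
    return 100.0 * (1 + 1e-5 * temp_c)

def rock_state(temp_c: float) -> str:
    # The rock only becomes a "switch" once we *define* a cut on its
    # length -- the relabelling of ON states the thread argues over.
    return "ON" if rock_length_mm(temp_c) < rock_length_mm(SETPOINT_C) else "OFF"

# Under that chosen encoding the two compute the same one-bit function,
# by very different physical processes.
for t in (15.0, 20.0, 25.0):
    print(t, thermostat_state(t), rock_state(t))
```

Whether that chosen encoding is doing all the work, or none of it, is the actual disagreement.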



"Computational" is a subset of "physical," so I already assume consciousness is physical in nature.

All computation has to take place using some kind of physical process. However, the Hard AI model, AFAIAA, assumes that all computations of the same algorithm are equivalent, even when they are physically different. I don't see how that can possibly be considered a physical model.

My "certain knowledge" is derived from the fact that all observable results of consciousness are computational. That means consciousness is computational. To show otherwise, all you have to do is find a single counterexample.

Of course, you can't. We have been asking you for years, and you haven't ever done so.

I'm not aware of any actual, real thing that is not created by an actual physical process.

Maybe you should define "physical" to begin with?

Physical effects involve matter and energy interacting in space and time. The hard AI theory is that it doesn't matter how much matter and energy is involved, how it's distributed, how it interacts or how long it takes. The energy transfer can be arbitrarily small. As we've established above, the interaction can be electro-chemical processes, semi-conductor electronics, knobs and levers or even rocks and streams of water. All will give rise to the same effect.

I'm not sure that people making this claim realise quite how extraordinary it is. There is no other physical phenomenon which is independent of just about every physical parameter. We are supposed to believe that this effect arises anyway, and, to make things even more bizarre, there isn't even a physical theory to tie down exactly when and how these patterns arise.

The problem with the idea that computation is in some sense "real" is that the only thing which can detect that something called computation is going on is a human being. The rest of the universe can cope quite happily with no such concept. A rock or a thermostat or a computer can be described as if each were entirely independent objects, working in different ways to give different results. The only need we have for the non-physical concept of computation is to engage with the human mind. And so we end up with the circular idea that because the human mind is the only thing for which the concept of computation makes a difference, the human mind must work by computation.

As to the claim that all observable results of consciousness are computational - that's again just an assertion. Can a symphony be composed computationally? Can a novel be written? Can even a simple conversation be carried out? The fact is that most of what human consciousness does has not been emulated by computation in forty years of trying, so how is it possible to make such an extravagant claim?
 
We know what 'subjective experience' means. What is being sought is an objective way of defining, identifying, and reproducing it.

FYI, I don't think the Turing test is a valid way of determining consciousness. It's merely an intelligence test, not a consciousness test. All the Turing test does is highlight the epistemological problem of inferring consciousness in entities other than ourselves. The means to overcoming this problem is to find the necessary objective correlates of subjective experience, not simply inferring from contingent correlates of gross behavior.

Well, I don't accept the Turing test as being proof of consciousness. It's merely a prerequisite to examine the device that lets us think it's human to figure out how it does it. Obviously something like Eliza isn't conscious, and we don't need to worry too much about exactly how it works.
 
Somebody said something really cool and insightful in this thread. Can't quite remember which page it was on though.

It summed up all the arguments in a concise and elegant way, left nothing important unsaid, and finished off with a really funny joke.
 
We acquire the cultural labels of our internal states by observing others. The states and introspective capacities are there independent of, and prior to, the acquisition of any learned language. Language itself is merely an evolved capacity for communicating these states among entities which share some [or all] such states in common.

Babies seem quite able to experience the sensation of "I'm hungry" without the capacity to label it, and they certainly don't recognise it in others.
 
We know what 'subjective experience' means. What is being sought is an objective way of defining, identifying, and reproducing it.


Perhaps. What is the definition then? If we examine it, perhaps we can make headway?

Subjective seems clear -- it is private. It is "experience" and "awareness" that seem a bit fuzzy. So how do we define them? How do we go about working with these concepts?
 
Babies seem quite able to experience the sensation of "I'm hungry" without the capacity to label it, and they certainly don't recognise it in others.

Interesting, hunger. You'd think that, since we can "experience the sensation of 'I'm hungry' " from such a young age, we'd be better at acting on our hunger. Babies will eat when they are agitated, when they are bored, when they are anxious (I am using the labels that the mothers put on their babies' actions), as well as when they are hungry. They will cry for multiple reasons as well. How is it you say they recognize this sensation, when so much of your conclusion is based on your own inference about their behavior? (When you get to assign motives to someone else, it is much easier to think they are acting for the reasons you suspect.)

No one has greater access to our introspective hunger than we do; if anyone knows how hungry we are without looking at our public behavior, it is us. And yet, who has not loaded up a buffet plate, eaten their fill, looked at the pile of food remaining and said (or thought) "I guess I wasn't as hungry as I thought I was"? Who has not gone back for seconds, thirds, etc. (my record is seven complete meals) and said (or thought) "Wow, I was hungrier than I thought I was!"? Note here, even when judging our own hunger, we defer to our actual observable behavior (how much we ate) as the real measure of our hunger.

My son no longer feels hunger. He is diabetic, and his eating is not something that depends on his feeling hunger. It is dependent on his blood sugar and insulin levels. If he needs to eat, he needs to eat, no matter what he feels. If he's hungry, but doesn't have his insulin, he can't eat. And so the private behavior of feeling hungry ceased to be a good predictor of eating (as it is for those of us who get hungry in a place where food is readily available).

As a daycare provider, I watched kids who thought they were sick, but were just hungry, or thought they were just hungry, but were sick.

Our feelings are more difficult to learn than, say, our colors. We just don't tend to remember learning them, since our parents and others have worked on those words much earlier than colors. But we still experience misattribution of emotions as adults. And boy do we experience it as kids, and as babies. We've just forgotten.
 
The problem with the idea that computation is in some sense "real" is that the only thing which can detect that something called computation is going on is a human being. The rest of the universe can cope quite happily with no such concept. A rock or a thermostat or a computer can be described as if each were entirely independent objects, working in different ways to give different results. The only need we have for the non-physical concept of computation is to engage with the human mind. And so we end up with the circular idea that because the human mind is the only thing for which the concept of computation makes a difference, the human mind must work by computation.

Nope.

The thing that we call "computation" exists regardless of human observers. The word "computation" does not.

As to the claim that all observable results of consciousness are computational - that's again just an assertion. Can a symphony be composed computationally? Can a novel be written? Can even a simple conversation be carried out? The fact is that most of what human consciousness does has not been emulated by computation in forty years of trying, so how is it possible to make such an extravagant claim?

"Nobody has been able to produce the number 29865268282093850285829820672073265 by adding 1 at a time, so how is it possible to make such an extravagant claim?"

lol.

You might want to look up mathematical induction sometime.
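Spelled out, the induction alluded to here is just the standard schema, with P(n) read as "n can be produced from 0 by adding 1, n times":

```latex
% P(n): "n is producible from 0 by n additions of 1"
\[
\Bigl( P(0) \;\wedge\; \forall n\,\bigl(P(n) \rightarrow P(n+1)\bigr) \Bigr)
\;\Longrightarrow\; \forall n\, P(n)
\]
```

The base case and the one-line step settle producibility for every natural number, including the 35-digit one above, without anyone carrying out the additions.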
 
AkuManiMani said:
We acquire the cultural labels of our internal states by observing others. The states and introspective capacities are there independent of, and prior to, the acquisition of any learned language. Language itself is merely an evolved capacity for communicating these states among entities which share some [or all] such states in common.
I think you're simplifying too much in terms of creating a dichotomy that really isn't there in the way you have presented it. Labeling is also part of our internal interpretation, and whilst being an integral part of interpretation, it also changes and creates new internal states in reference to that, which in turn gives rise to new interpretations and new states etc. Thus, interpretation and labeling are also part of the whole conditioning by which we for some reason then feel tempted to say that "introspective capacities are there independent of..." It's really not that clear cut at all. In much, we have learned, internalized and directed what we feel and how we experience our "own" internal states (including that of which is our own and which is not).

We also learn how to feel by learning to interpret context (where labels are an integral part too). In certain contexts the same actions can be felt differently. Seeing someone slipping can induce a variety of internal states depending on the context being a 'clown' doing his 'act' on a 'stage', or your 'grandpa' 'struggling' down the 'stairs'. Often we learn to like caviar, beer, coffee or smelly cheese.
 
It is, however, functionally equivalent to a thermostat in computational terms.
Wrong.

No, you can't plug in a rock to replace a thermostat, but then you can't plug in a metric thermostat to replace one designed for Imperial units, either.
Of course you can.

However, the point is actually important. In terms of the processes going on in the rock, they are quite different to what's going on in the thermostat - though not as different as between the thermostat and a logic gate on a Pentium, or a neuron.
What is the difference between a thermostat and a logic gate in a Pentium, please? (We'll leave neurons for later.)

All computation has to take place using some kind of physical process. However, the Hard AI model, AFAIAA, assumes that all computations of the same algorithm are equivalent, even when they are physically different.
Church-Turing thesis.

I don't see how that can possibly be considered a physical model.
You are conflating two things. The Church-Turing thesis - i.e. the universal rules of computability - and physical processes of computation. The only confusion is what you have introduced.

I'm not aware of any actual, real thing that is not created by an actual physical process.
Good!

Physical effects involve matter and energy interacting in space and time. The hard AI theory is that it doesn't matter how much matter and energy is involved, how it's distributed, how it interacts or how long it takes.
Not that it doesn't matter; specifically, when you are looking at the information rather than the substrate in which it is encoded, the information is equivalent.

The energy transfer can be arbitrarily small.
Not arbitrarily.

As we've established above, the interaction can be electro-chemical processes, semi-conductor electronics, knobs and levers or even rocks and streams of water. All will give rise to the same effect.
All give rise to the same information, given the same input and the same logic. (And excluding physical failure.) It cannot be otherwise; it would be a logical contradiction.
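The "same input, same logic, same information" claim can be made concrete with a toy circuit; both "substrates" below are of course simulations, invented for illustration:

```python
# Toy version of the "same input, same logic -> same information" claim:
# one circuit (XOR built from NANDs) evaluated on two "substrates":
# Python booleans, and a crude token model where a wire carries the
# string "marble" or nothing. Both substrates are invented illustrations.

def nand_bool(a: bool, b: bool) -> bool:
    return not (a and b)

def xor_bool(a: bool, b: bool) -> bool:
    # Standard 4-NAND construction of XOR.
    c = nand_bool(a, b)
    return nand_bool(nand_bool(a, c), nand_bool(b, c))

def nand_relay(a: str, b: str) -> str:
    # A wire is "high" when it carries the token "marble".
    return "" if (a and b) else "marble"

def xor_relay(a: str, b: str) -> str:
    # The very same circuit wiring, on the token substrate.
    c = nand_relay(a, b)
    return nand_relay(nand_relay(a, c), nand_relay(b, c))

# Fix the encoding (True <-> "marble") and the two substrates yield
# identical truth tables -- the same information.
for a in (False, True):
    for b in (False, True):
        relay_out = xor_relay("marble" if a else "", "marble" if b else "")
        print(a, b, xor_bool(a, b), relay_out == "marble")
```

Given the same wiring diagram and the same encoding of inputs, a disagreement between the two truth tables would be a logical contradiction, not a physical possibility.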

I'm not sure that people making this claim realise quite how extraordinary it is. There is no other physical phenomenon which is independent of just about every physical parameter.
There is no physical phenomenon which is independent of just about every physical parameter. You're just making stuff up.

We are supposed to believe that this effect arises anyway, and, to make things even more bizarre, there isn't even a physical theory to tie down exactly when and how these patterns arise.
Wrong, and wrong.

The problem with the idea that computation is in some sense "real" is that the only thing which can detect that something called computation is going on is a human being.
And wrong. Human beings detect computation through computation.

The rest of the universe can cope quite happily with no such concept.
And wrong. Computation requires a physical substrate, and of necessity has a physical effect.

A rock or a thermostat or a computer can be described as if each were entirely independent objects, working in different ways to give different results.
Well, of course they can.

And both human beings and cheeseburgers can be modeled through quantum mechanics.

The only need we have for the non-physical concept of computation is to engage with the human mind.
And wrong. It is a specific class of physical behaviours.

And so we end up with the circular idea that because the human mind is the only thing for which the concept of computation makes a difference, the human mind must work by computation.
And wrong, and wrong. False premise and strawman.

As to the claim that all observable results of consciousness are computational - that's again just an assertion.
No it isn't.

Can a symphony be composed computationally?
Yes, of course.

Can a novel be written?
Yes, of course.

Can even a simple conversation be carried out?
Yes, of course.

The fact is that most of what human consciousness does has not been emulated by computation in forty years of trying, so how is it possible to make such an extravagant claim?
The fact is rather that you are wrong about this, and that you have already been shown to be wrong. Repeatedly. With cream.
 
Our feelings are more difficult to learn than, say, our colors. We just don't tend to remember learning them, since our parents and others have worked on those words much earlier than colors. But we still experience misattribution of emotions as adults. And boy do we experience it as kids, and as babies. We've just forgotten.
Very good point, Mercutio!
 
It is only via introspection that any of us can recognize our own consciousness [by Pixy's definition, introspection is the same as consciousness -- but I digress]. It is our capacity for introspection that allows us to create language to label, organize, and communicate our thoughts and other subjective states. The only way we can assume that others are also conscious is by inferring from our own introspective examinations. Introspection is just as empirical, and essential, as extrospection.

For me, the complexity here is the assumption that introspection is happening to someone, that there is an actual subject of experience once one examines at a level below that of the whole brain or whole organism. What is valid for a whole organism does not necessarily exist in the manner it is intuitively felt to exist at a smaller window of examination.

Nick
 
Our feelings are more difficult to learn than, say, our colors. We just don't tend to remember learning them, since our parents and others have worked on those words much earlier than colors. But we still experience misattribution of emotions as adults. And boy do we experience it as kids, and as babies. We've just forgotten.

A small point, perhaps, but do you mean to say that we learn our actual feelings? For me feelings are autonomous states that happen in reaction to stimuli. We learn to label them as having certain qualities and we learn to associate them with certain triggers, but we don't learn to have them. One might also say that the developing sense of self facilitates the arising of certain feelings, as it learns to perceive the world in terms of subject-object relationships, but again this is not learning to have feelings. It also learns to repress certain feelings and develop avoidance behaviours.

Nick
 
I would say that it is learned to a great extent at least. Under the process of differentiation there is sensation. At least I have more or less been forced to learn to distinguish between many different kinds of pain. If you dig deep enough it seems quite obvious that pain in and of itself is somewhat of a robust shorthand for interpreting and categorizing sensations at a higher abstraction level.

It's always surprising how what at first glance could be felt as "having unbearable pain" seems less so when attention moves towards the actual sensation and thus past the first interpretation, or when it moves completely away from it. It's also fascinating how good you might feel immediately after the pain stops or subsides. It often seems that the "unbearable" part is an additional layer of the whole interpretive cascade, arising from not being able to move attention into the sensation or away from it: a sort of intolerable rocking back and forth with no end in sight. We interpret the sensation and our behavior, and calculate future possibilities, seemingly all at once, and finally label it accordingly. In other instances we might simply pass out.

I would also say that pain is however less influenced by learning and context than for instance all the variations of moods, such as the scale from elevation to angst. Those seem much more dependent on learning and interpretation vis-à-vis brute sensation. Not to mention aspects of taste, which interestingly enough is also used when referring to music or art etc.
 
Nope.

The thing that we call "computation" exists regardless of human observers. The word "computation" does not.

That is something to which I cannot agree. No computation can take place without a key to interpret it. Such a key must always be in the possession of an observer, who must be, AFAWK, a human being.

That's also the flaw with the universe simulated with a row of stones idea. Only the person laying out the stones knows what the simulation means. The stones aren't in possession of the key, without which it is meaningless.
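The "key" point can be illustrated with a toy row of sixteen stones; the bit string and both keys below are invented examples:

```python
# Sketch of the "interpretation key" objection: the same physical
# arrangement (a row of stones, modelled as a bit string) reads
# entirely differently under different keys. Both keys are invented.

stones = "0100100001101001"   # 16 stones: gap = 0, stone = 1

# Key 1: read the row as two 8-bit ASCII characters.
as_ascii = "".join(
    chr(int(stones[i:i + 8], 2)) for i in range(0, len(stones), 8)
)

# Key 2: read the whole row as a single unsigned integer.
as_number = int(stones, 2)

print(repr(as_ascii), as_number)   # the stones themselves don't choose
```

One side of the thread takes this to show that the computation lives in the key-holder; the other, that the physical process is the computation and the key is just one description of it.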

"Nobody has been able to produce the number 29865268282093850285829820672073265 by adding 1 at a time, so how is it possible to make such an extravagant claim?"

lol.

You might want to look up mathematical induction sometime.
 
westprog said:
That is something to which I cannot agree. No computation can take place without a key to interpret it. Such a key must always be in the possession of an observer, who must be, AFAWK, a human being.


From Wiki:
Computations as a physical phenomenon said:
A computation can be seen as a purely physical phenomenon occurring inside a closed physical system called a computer. Examples of such physical systems include digital computers, quantum computers, DNA computers, molecular computers, analog computers or wetware computers. This point of view is the one adopted by the branch of theoretical physics called the physics of computation.

An even more radical point of view is the postulate of digital physics that the evolution of the universe itself is a computation - Pancomputationalism.


I wonder what would happen to 'computation' in a scenario when the last human in the universe observing a computer doing the 'computing' dies, but the computer keeps working. Does computation now disappear? Why wouldn't that which is referred to as 'computation' continue?
 
And DNA and environmental factors have created this capacity for subjective experience. It is there. It's a fact to be dealt with. And nobody will really accept that a machine consciousness is equivalent to human consciousness in any important respect unless it produces the same thing.

That's nice. Now, suppose we call "subjective experience" "private behaviour", as Mercutio suggests. Does that make it seem less mysterious?
 
