The Hard Problem of Gravity

Ah, but the definition of consciousness as a fuzzy set of public behaviors is once more precisely what you are using. And is the only definition that is available to you. And is considerably simpler, and yet more useful and explanatory, than any definition that hinges on the unobservable.

What do you mean 'unobservable'? Each of us 'observes' our consciousness firsthand every day -- it's the fundamental basis of all observations. The definition that I'm going by is the one that *I* personally experience and, by proxy, assume that others do in some capacity as well. It doesn't matter how simple and convenient the definition you're proposing is -- it simply does NOT explain what it's claiming to explain. I don't intend to sweep my questions under the rug in favor of the cop-out definition being pushed here by Pixy et al.

I wouldn't even go as far as to say that consciousness is 'unobservable' in the public sense, either. The fact that we can see the effects of it in action via technologies like MEG, ECoG, and EEG shows that, in principle, we can measure the external correlates of conscious experience. Heck, using electrical stimulation we can even crudely induce effects on the conscious experience of a subject.

Given the above examples, it seems clear that conscious experience is, or is strongly correlated with, EM activity throughout the nervous system. Better understanding this correlation lies not in understanding computation alone but in the physical processes that give rise to it. This is why I stress that consciousness almost certainly is a field phenomenon. All the sensory information taken in by the body to the brain is converted to these electrochemical signals in the nervous system. Any model of consciousness that does not implicitly take the biophysics of the nervous system into account is futile. There isn't any reason why, in principle, we should not be able to gain an understanding of how specific qualitative experiences correlate with these EM processes and why. If there is more to the story than that, then it will make itself known through further investigation. In the meantime, the 'simpler' explanations not only miss the point -- they are epistemological dead weight.


(BTW, even in the person you describe, there are processes that absolutely are conscious, in a meaningful way, even if the person is unconscious. Someone knocked unconscious may well lose bladder control, but not heartbeat and respiration, which do have feedback processes. Someone who loses those is dead.)

Well, I distinguish intelligence from consciousness. Biological processes are examples of natural rather than artificial intelligence. I think that, generally, we pretty much have a solid grasp on how autonomic intelligence works. What we still don't understand is conscious experience -- which is not the same as intelligence. I don't think that anyone here is arguing that consciousness cannot be understood or explained; we're just pointing out the obvious fact that it hasn't been -- yet. Pretending the problem is going to go away merely because some people have chosen to redefine it is NOT the means to gaining such an understanding.
 
We [by which I mean people in general] can't define what consciousness is "operationally", because we don't know how it operates.
We [by which I mean people who have studied the matter] do.

The entire problem is that we don't know how consciousness operates, so creating an "operating" definition is far too premature.
No.

First, define what you mean by consciousness. If you get that definition correct, you will find that an operational definition is straightforward. If you pile on irrelevant baggage, you'll get nowhere.

Like AAM said, you're defining yourself out of a problem. You're not solving it, not at all.
No. I have correctly identified the problem and solved it.

If you think otherwise, then you actually have to show that I am wrong. Show me something that we attribute to consciousness that (a) actually exists and (b) is not explained by my model.

[By the way - no comment on the fact that the Game of Life, which you hold up as an example of an imaginative program, robotically produces the same output for the same input, every single time?]
No. I didn't use the word "imaginative" at all. I said that Life generates behaviours that are not explicit in the program.

But besides that, why do you think it isn't imaginative? The only general way to predict the outcome of a given pattern in Life is by running the automaton. It's deterministic, where the real world isn't, but does that mean you're claiming that true imagination is random?
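To make the point concrete, here is a minimal sketch of Life in Python (illustrative only, not code from anyone in this thread): the four rules are fully explicit and deterministic, yet the glider's diagonal travel appears nowhere in them, and the only general way to discover it is to run the automaton.

```python
from collections import Counter

def step(cells):
    """One Life generation; `cells` is the set of live (x, y) coordinates."""
    counts = Counter((x + dx, y + dy)
                     for x, y in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3 -- the whole rule set.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# Four generations later the glider has moved one cell diagonally: the same
# five-cell shape, translated. The behaviour is implicit in, but nowhere
# stated by, the rules -- and it is identical on every run.
assert state == {(x + 1, y + 1) for x, y in glider}
```

Nothing in the rule set mentions "glider" or "movement"; those are discovered by running it, which is the sense in which the output is deterministic yet not explicit in the program.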

If you want to define it that way, then we can just disable ECC and you get what you want, sooner or later. (Even with ECC, you get what you want - just later rather than sooner.) Computers are, after all, only pseudo-deterministic.
 
No. I have correctly identified the problem and solved it.

If one didn't know any better, they'd be led to think that you suffer from megalomania. It would explain your argumentation by fiat and dogmatic inability to recognize when you're demonstrably wrong.

If you think otherwise, then you actually have to show that I am wrong.

People here have already; numerous times and numerous ways. You're just psychologically incapable of recognizing it.

Show me something that we attribute to consciousness that (a) actually exists

Conscious experience. If you don't experience it yourself, tough luck.

and (b) is not explained by my model.

Erm...hows about all of it?
 
Mercutio said:
I agree with this. Until the Rocket, PM, et al have a definition for consciousness which distinguishes between an awake and aware human and one in a coma, I don't think they have a sufficient definition for consciousness.

We (our language community) are using "conscious" for several related but separate things. A conscious person and a conscious process are both meaningful terms, but the meaning is not identical. A conscious person is defined by engaging in a set of observable behaviors (whether you agree with this or not, every example here has followed this definition).

Actually, I like your observable behaviors definition just fine. That definition, unlike the self-referential definition, is able to distinguish between an awake and aware human and one in a coma. Where you and I disagree seems to be more along the lines of what constitutes a causal agent. However, I'm still not clear on what you consider to be a causal agent. I don't suppose you'd care to respond to my post #530? You've clearly spent a good deal of time thinking about the problem. I value your opinion on the matter.
 
The mystery isn't whether or not there are qualia. The question is why are there qualitative experiences at all and what physical principles give rise to specific qualitative states.

Is it also a mystery "why 1 == 1 and what physical principles give rise to the identity axiom?"

Because the answers are the same, actually.

To once again use the mass analogy: An object can have mass but, under certain conditions, it may not register a weight. A person on earth may weigh 198 lbs.; that same person in a space station may weigh nothing. Regardless of their context they still have mass. What I'm saying is that consciousness has the same relation to knowledge as mass has to weight.
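Numerically, the mass/weight half of the analogy is just the relation w = m·g (hypothetical figures below, purely to illustrate the intrinsic-property-vs-context distinction):

```python
# Mass vs. weight, with hypothetical figures: mass is intrinsic,
# weight is what that mass registers in a given context (local g).
mass_kg = 90.0      # roughly 198 lbs of mass, context-free

g_earth = 9.81      # m/s^2 at the Earth's surface
g_station = 0.0     # effective g while in free fall aboard a station

weight_on_earth = mass_kg * g_earth      # registers a weight (in newtons)
weight_on_station = mass_kg * g_station  # registers nothing at all

assert weight_on_station == 0.0 and weight_on_earth > 0.0
```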

So you take an ill-defined term -- consciousness -- and predicate the definitions of other terms on it? So now "knowledge" is also ill-defined?

This isn't what most people call "making progress."

So basically, you're ignoring the original problem, giving a much simpler problem its name, and then, after you solve the simpler problem you claim to have solved the original. Like I've emphasized earlier, you won't make any headway in solving the problem if you keep averting your eyes from it.

Well you are correct in that respect.

When I was a child I was terrified of closets because I thought there might be alien eggs in them. It was quite a problem for me, figuring out how I could monitor all the closets in our house to make sure there were no eggs, every night. When I got a little older (smarter) I figured out the problem of "whether there were aliens or alien eggs in the first place" and guess what -- the problem of monitoring the closets disappeared!

I've stated to you exactly what it is you're missing. You just choose to believe that co-opting the label 'consciousness' is the same as solving it.

You have stated exactly what you think I am missing.

You have also stated that you can't specify exactly what it is that you have stated.

Or am I wrong? Have you decided to produce a coherent definition of qualia?

There is information flow between neurons whether an individual is conscious or not. There is self-referential feedback not only between all somatic cells but within them, via autocrine signaling. These kinds of computational processes go on all the time in every living organism and they are not sufficient, in and of themselves, to produce the subjective experience of consciousness.

By your definition even a person in a dreamless, non-REM, sleep is conscious. In such states a person is clearly unconscious even though, biologically, they meet your criteria for what constitutes consciousness. For this reason, it is obvious that your definition of consciousness, while relatively simple, is inaccurate.

No. The person is not conscious -- components of their brain are conscious. For the entire person to be conscious requires the synthesis of many waking behaviors, or what most people call human consciousness.

I'd be glad to talk more on this subject with Nick. I do not think consciousness is a human specific phenomenon, but it's clear that you and I aren't referring to the same thing when we speak of 'consciousness'.

Yes, it is clear. I have an idea what I am talking about, you have no idea what you are talking about. Let us try a little experiment, eh?

You tell me which entities from the list below are conscious and why they are conscious:

1) Human
2) Worm
3) Dolphin
4) Octopus
5) Moray Eel
6) Chimpanzee
7) Crow
8) German Shepherd Dog
9) Chihuahua Dog
10) Polar Bear
11) Tiger
12) Severely Retarded Human
13) Infant Human
14) Garden Snake
15) Fetal Human in first trimester
16) Squirrel
17) Sea Otter
18) Rat
19) Dragonfly
20) Humpback Whale
 
Erm...hows about all of it?

How about the feeling of hunger? Or the sensation of Red? Or the knowledge that one is conscious?

All those things are explained by the computational model. The details haven't been fleshed out yet, but they are just details -- the things that need explaining are explained.

Kind of like a mechanic can tell you how an internal combustion engine works without knowing the exact piston diameter or camshaft timing, or even how many pistons there are or whether a camshaft is being used at all.

The principles are known. Anyone sufficiently educated on the subject with even a little imagination can figure out how they might lead to the end result.
 
Actually, I like your observable behaviors definition just fine. That definition, unlike the self-referential definition, is able to distinguish between an awake and aware human and one in a coma.
That just tells us that the difference between an awake and aware human and one in a coma isn't whether they have any conscious processes or not. It's whether they're awake and aware. Which are different things.

As I keep saying, you have to define what you mean by consciousness. Until you do that, your arguments won't make sense to anyone - not even you.
 
AkuManiMani said:
The mystery isn't whether or not there are qualia. The question is why are there qualitative experiences at all and what physical principles give rise to specific qualitative states.

Is it also a mystery "why 1 == 1 and what physical principles give rise to the identity axiom?"

Because the answers are the same, actually.

Rocket, really? Really??

So now you're saying that consciousness actually is axiomatic and requires no further explanation, or are you just tossing out a red herring?

So you take an ill-defined term -- consciousness -- and predicate the definitions of other terms on it? So now "knowledge" is also ill-defined?

This isn't what most people call "making progress."

Eh? Dude, I just explained, logically, why an entity cannot know something unless it is conscious. A book does not know the information it contains; a computer does not know the information that it carries. Simply containing information is not the same as knowing said information. My point is that you cannot make any progress in regards to explaining consciousness if you cannot even get past an issue as rudimentary as this.

AkuManiMani said:
So basically, you're ignoring the original problem, giving a much simpler problem its name, and then, after you solve the simpler problem you claim to have solved the original. Like I've emphasized earlier, you won't make any headway in solving the problem if you keep averting your eyes from it.

Well you are correct in that respect.

When I was a child I was terrified of closets because I thought there might be alien eggs in them. It was quite a problem for me, figuring out how I could monitor all the closets in our house to make sure there were no eggs, every night. When I got a little older (smarter) I figured out the problem of "whether there were aliens or alien eggs in the first place" and guess what -- the problem of monitoring the closets disappeared!

So deep scientific questions that you haven't the stomach to face are just bogeymen in the closet as far as you're concerned, eh? Pretty much what I surmised earlier; I just never thought that you'd just shamelessly come out and say it.

You have stated exactly what you think I am missing.

You have also stated that you can't specify exactly what it is that you have stated.

Or am I wrong? Have you decided to produce a coherent definition of qualia?

Dude, it's getting beyond parody right now. I've already provided as coherent a definition of it as any word ever had. I'll take your continued evasion as a concession.

AkuManiMani said:
There is information flow between neurons whether an individual is conscious or not. There is self-referential feedback not only between all somatic cells but within them, via autocrine signaling. These kinds of computational processes go on all the time in every living organism and they are not sufficient, in and of themselves, to produce the subjective experience of consciousness.

By your definition even a person in a dreamless, non-REM, sleep is conscious. In such states a person is clearly unconscious even though, biologically, they meet your criteria for what constitutes consciousness. For this reason, it is obvious that your definition of consciousness, while relatively simple, is inaccurate.

No. The person is not conscious -- components of their brain are conscious. For the entire person to be conscious requires the synthesis of many waking behaviors, or what most people call human consciousness.

Oh? So basically everything is conscious then. By your definition, not only are bacteria conscious but even the very atoms they're made from. Hell, let's just go wild and say that since the universe is a self-referencing computational process all reality must be conscious! If you were an Idealist why didn't you just come out and say so?


Yes, it is clear. I have an idea what I am talking about, you have no idea what you are talking about. Let us try a little experiment, eh?

Correction. I know what I'm talking about and what you're talking about. The problem is, you don't want to recognize what it is I'm actually talking about because you'd be forced to recognize that you're not describing what you think you are ;)

You tell me which entities from the list below are conscious and why they are conscious:

Sure, I'll play along. :rolleyes:

Being a human I can vouch 100% that I am conscious. Other humans, like infants, can be assumed to be conscious by extension. First trimester fetuses don't seem likely to be conscious; from what I've read about that stage of a person's development it seems likely that they are in a state akin to unconscious sleep [we can't know for sure until we make more significant headway in answering the EMA]. The chordates on your list seem pretty certain to have consciousness by dint of having such similar physiology that one would be hard pressed to argue that they would not generate consciousness [though they would be much harder subjects to study in this regard since they are unable to communicate their internal states the way a human could]. Mollusks, and other multicellular invertebrates, seem like plausible candidates for being conscious as well. Though, if they are, it is probably qualitatively much different than our own [again, we can't know how or in what way they would be because we still don't have an adequate grasp of the EMA].

The real question is: Using YOUR alleged understanding of consciousness could you explain how specific qualitative experiences arise in humans and what kind of qualitative experiences non-humans like eels would experience? What is it that determines the subjective quality of a conscious organism's experience and how could one create an entity from scratch that could reproduce them?

Of course, no one really knows the answers to these questions and we currently don't have the scientific means to, which is why it's called a hard problem. When, using your alleged 'understanding' of consciousness, you can explain how to generate an entity with specific qualitative perceptions of stimuli and explain how and why they arise, you can claim to have solved the EMA. In the meantime, stop pretending that you've already solved it.
 
How about the feeling of hunger? Or the sensation of Red? Or the knowledge that one is conscious?

All those things are explained by the computational model. The details haven't been fleshed out yet, but they are just details -- the things that need explaining are explained.

Kind of like a mechanic can tell you how an internal combustion engine works without knowing the exact piston diameter or camshaft timing, or even how many pistons there are or whether a camshaft is being used at all.

The principles are known. Anyone sufficiently educated on the subject with even a little imagination can figure out how they might lead to the end result.

Okay. Build a system that experiences sounds as colors. Provide me a link explaining the computational model of nausea or vertigo. How wide is the range of potential subjective experience? What kinds of computations give rise to each?

Explain to me how a bat experiences its own echolocation. Do they experience it in a way analogous to our hearing, does it evoke the experience of something akin to a visible map, or do bats experience it in a qualitative way completely alien to human experience?

Is there anyone who's proposed a viable means of recreating the subjective experiences of one animal in another completely different species? How about recreating those experiences in a present-day AI system?
 
GRAVITY GRAVITY GRAVITY!

All hail to Gravity so Strong
let Newton's apples fall
bring forth the ground
and all fall down
and crown it
force of all!

bring forth the ground
and all fall down
and crown it
force of all!

That's just in case it wants my praise in order to keep working.
 
I do not believe we currently can explain the behavior of objects being affected by gravity. There are a number of qualities exhibited by such objects, such as "falling," that defy a full mathematical description.

Thus I advocate the notion of a "Hard Problem of Gravity," or "HPG," that must be solved if we are to eventually grasp the full nature of gravity.

Among the notions supported by the HPG is the philosophical "gombie" or "gravitational zombie," an object that behaves exactly as if it is being acted upon by gravity yet is not being acted upon by gravity.

The HPG is particularly startling because it implies that everything we drop might actually be a gombie instead of a non-gombie. In fact, if you have gone skydiving, or jumped from a diving board, or even walked upright, you might be a gombie!

P.S. If the local university in your area is hiring post-docs in philosophy, please let me know, I am currently unemployed.
Nice. I'm pretty impressed. Not having read the rest of this thread, I have to wonder if you have managed to work The Matrix into your position? If so, how? Can you link me to the post?

At any rate, awesomely funny stuff. Keep up the good work!
 
AkuManiMani said:
What do you mean 'unobservable'? Each of us 'observes' our consciousness firsthand every day -- its the fundamental basis of all observations. The definition that I'm going by is the one that *I* personally experience and, by proxy, assume that others do in some capacity as well.

I'm curious about whether we actually observe our "consciousness" firsthand or if "we" simply infer it via the organism's capacity to connect different operations together into a sort of meta-cognition – pattern recognition ("I", "we", "us", "me").

For some operations, the brain and the neural context is their external environment. Only when certain operations are taking place – like "self-systems" – will it even be possible to create a distinction between the everyday notion of internal and external environment. A great deal of the brain appears not to have a clue that there is a brain or an organism or that there is identity.

When you say that consciousness is the prerequisite for all such observations, what addition to the description have you actually made, except for labelling all the complex interactions going on as such? From my perspective, I would have to say that those operations – whatever they are – are the prerequisite for us to later claim we are conscious (or that we "observe consciousness"). The difference between us lies in considering consciousness as a property vs. a mechanism (as a behavioural variable).
 
I'm curious about whether we actually observe our "consciousness" firsthand or if "we" simply infer it via the organism's capacity to connect different operations together into a sort of meta-cognition – pattern recognition ("I", "we", "us", "me").

For some operations, the brain and the neural context is their external environment. Only when certain operations are taking place – like "self-systems" – will it even be possible to create a distinction between the everyday notion of internal and external environment. A great deal of the brain appears not to have a clue that there is a brain or an organism or that there is identity.

When you say that consciousness is the prerequisite for all such observations, what addition to the description have you actually made, except for labelling all the complex interactions going on as such? From my perspective, I would have to say that those operations – whatever they are – are the prerequisite for us to later claim we are conscious (or that we "observe consciousness"). The difference between us lies in considering consciousness as a property vs. a mechanism (as a behavioural variable).

Hmm...Well when I say that we observe it directly I mean that the process of observation, regardless of the underlying mechanisms, is a conscious process. As individuals, our conscious minds do not directly observe the biological processes that give rise to it. All we observe -- all we can observe -- is the end result of conscious awareness.

Being cognizant of one's distinction from the environment is incidental to whether one is conscious. There are some who reach states of consciousness, via meditative practice, that report having the mental distinction between 'self' and 'non-self' dissolve [it is probably even possible to achieve this through some sort of external manipulation of the brain]. The fact that their specific experience changed does not negate the fact that they were still consciously experiencing. Cognitive functions and systems are auxiliary to the actual experience of consciousness. It is more than possible to have such functions continue without there being any conscious experience whatsoever.

Like I mentioned earlier, a newborn baby's body contains all the information required to build a human but that child is not born with knowledge of biology. The entire body performs all the computational capacities, and more, that some here allege to be identical to consciousness. The point I'm making is that these are not sufficient, in and of themselves, to produce conscious experience.

The mechanisms that ultimately give rise to consciousness are not themselves conscious. There is still much to learn in regards to the 'hows' and 'whys' of this whole process. I find the whole attitude exhibited by some here of declaring that the question is not important [or dogmatically declaring "*I* already solved it"] to be extremely unscientific. If they truly believed that, they should just pack their bags and retire because they've nothing more to contribute.
 
You're not describing consciousness, then. You've redefined the problem to be something it isn't and then claimed to have solved it. You're basically just playing a labeling game, rendering your conclusions a complete non sequitur. In essence, all your argumentation in support of your position is...how shall we say it...? Irrelevant.

Yes. For me Pixy takes a strong AI definition of consciousness and then proceeds to defend this definition, never really stopping to investigate whether the myriad phenomena of human consciousness really fit with it. Anything which does not agree with his definition must be wrong because, well, just because! It's like arguing with the HAL computer from 2001.

The problem, as myriad commentators acknowledge, is that we don't have a proper agreed definition of the term "consciousness." If you're a computer this is a major drag, of course.

Nick
 
AkuManiMani said:
Hmm...Well when I say that we observe it directly I mean that the process of observation, regardless of the underlying mechanisms, is a conscious process. As individuals, our conscious minds do not directly observe the biological processes that give rise to it. All we observe -- all we can observe -- is the end result of conscious awareness.

OK, let's start with this one. I think we're making progress, though. I would say that it would be better if we could somehow make a distinction between the mechanism and the observation – in which case we're always only observing something (i.e., observation of). Without the mechanism there's no observation in any case. The mechanism is thus primary; it only allows observation of that to which there is access – mainly other processes or representations thereof, which are then interpreted and re-interpreted according to where access is gained henceforth (in a sort of cascading way). Obviously, access to more stable memory systems is also included.

When you say that the process of observation basically constitutes a conscious process, I cannot help but see a rather odd redundancy in how you define them as being the same thing. Moreover, how can it be "regardless of the mechanism" if there's no observation without it, hence no conscious process without the mechanism? I would say that it is the mechanism which correlates more closely to that of "consciousness", and what is being accessed correlates more closely to the content of the "observation".

Obviously it is thus ultimately a conscious process in toto, but that would primarily be because of the inferred underlying mechanism rather than due to the content of what is observed (which can change and still not make a fundamental difference to the experience – read: inference – of being conscious).

This would also be the reason why it's so darn hard to simply rely on subjective experience when trying to pinpoint what we mean by consciousness. The irony is that from a first person perspective we are inclined to infer it from the constant change of observed content, whereas from a third person perspective it would be inferred from looking at the mechanism, regardless of the observed content.

Finally, if the content would not change at all, could we say we are conscious in any meaningful way? How about if it would change, but in a very limited and repetitive fashion?
 
Not in the sense that human beings remember information. Computers put data in and put data out. Human beings construct and experience memory, on a continuous basis.

You're making a useless distinction. Humans put data in and out also, and it's possible to have a computer do it continuously.
 
Exactly. People somehow manage to equate 'having a reason for making a choice' into 'having no choice'.

People always have reasons or 'causes' for the actions they take. If you know someone really well, you could probably predict what actions they would take in certain circumstances.

How you can jump from these simple facts to 'we have no choice in our actions' is beyond me.

In any case, Third Eye, who cares? Even if we DON'T have a choice, which is, technically, true, it doesn't change anything. From our perspective, since we aren't aware of all the variables involved, we still make choices.
 
In other words, if consciousness is the result of computation, then it can occur just the same by simulating neurons one at a time using a monastery full of monks writing on paper with feather-quill pens. So could any other process we know of, by the way. That is a fact.

I always said so: monks can do everything.
 
Well, that is entirely my point, really.

I stated that "computers only do what they're told to do". As a counter-argument, Pixy suggested that the Game of Life proved that computers could go beyond their deterministic programming, because the resultant patterns were not programmed into the source-code. I'd argue that the opposite is true - the fact that these patterns are predictable and deterministic, given the limited rule-set and initial breeder pattern, shows that it's not a great example of a computer program doing something it was not programmed to do. As a counter-argument, it fails - because the Game of Life does exactly and only what it's supposed to do, in a deterministic fashion, every single time. If consciousness does this, it does not do so self-evidently, which means further discussion and thought is required.

If there are examples of such programs (I don't know), I'd be interested to learn of them.

You seem to be under the mistaken impression that life, and humans in particular, operates differently.
 
If one didn't know any better, they'd be led to think that you suffer from megalomania.

It seems that the "problem" of consciousness appears so insurmountable to you that anyone that claims to understand it is either crazy or simply mistaken.

Conscious experience. If you don't experience it yourself, tough luck.

I'm sure you'd be hard-pressed to define what you mean by conscious experience, and how to determine if someone else has it. I've been asking this question for quite some time now, without a response.

Erm...hows about all of it?

One example will do.
 
