The Hard Problem of Gravity

No, the HPC is the (unjustified) assumption that there is a reason. Not a reason itself.

There is no reason why you would want to deny your experience a character that is not describable in terms of information processing. There is no experience in a calculator doing arithmetic. The reason is that there is no information there that would relate to such an experience, hence there is no reason to believe it's there. Otherwise, it's me who will have to accuse you of using your imagination a bit too wildly. You are not just a machine; this is what the HPC tells you. Don't be afraid!
 
Thanks for pointing out the problem! I have been waiting for some similar code for a while now! The difference is that I did not promise any, and yet I am dead certain of what I feel. Isn't that a problem?

Any programmable machine can produce the statement "I am dead certain of what I feel."

Until I see your "feel" for myself, I have no choice but to say you are merely a machine.
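For what it's worth, the claim is trivial to demonstrate: a few lines of Python (a hypothetical sketch, in the spirit of the friendbot later in the thread) are enough to produce that exact sentence with nothing behind it but string output.

```python
def profess_certainty():
  # Any programmable machine can emit this sentence; doing so is
  # evidence of string output, not of feeling.
  return "I am dead certain of what I feel."

print(profess_certainty())
```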
 
There is no reason why you would want to deny your experience a character that is not describable in terms of information processing.

If you start by framing an argument around what you want, then it's obvious you're going to come to the conclusion you want. One should avoid appeals to consequences.

There is no experience in a calculator doing arithmetic.
The reason is that there is no information there that would relate to such an experience, hence there is no reason to believe it's there.

Says you. I say there is, as the electricity is flowing and circuits are switching and the calculator is experiencing changes in the operations being performed and results displayed.

Otherwise, it's me who will have to accuse you of using your imagination a bit too wildly.

Gleep glork?
You are not just a machine; this is what the HPC tells you. Don't be afraid!

Appeals to consequences again. I'm obviously not a machine in the sense of steel pistons and copper pipes and gold circuits, but I'm made of biological stuff that works in many of the same ways.

It seems that you're the one scared of the idea of being a machine, but the HPC can't help you here. You can't even show it exists.
 
Sticking with the Star Trek theme, my guess is westprog would be ok with the reality of replicator oranges: it's holodeck (hologram-generated) oranges he would be reluctant to call "real".

Of course, our guesses are likely very poor simulations of the real westprog's real response. :vulcan:

I'd consider the properties of something that I decided to call an orange, and see how many properties the replicator orange had. Then it would be a matter of definition.

If we define an orange as something grown and picked, the replicator orange wouldn't qualify. But if we are only concerned with eating the orange, then we can include it. If we are only concerned with appearance, then a drawing of an orange might do. A description of an orange in a book falls well short. A jpeg file of an orange does as well. But they still have some of the properties.

One might think that the relation between Westprog and a real orange is the same as between an orange in a video game and Lara Croft. A moment's thought allows us to realise that it is not - that the relation is in no way similar.
 
Are you sure? I mean, couldn't it be that they all fall under the same category because they are all part of the hard problem of consciousness? I would also say that "experience" is more appropriate than "feeling" to describe that category.
But again, people keep mentioning the HPC without answering my question - what precisely is the HPC?
 
OK, here is what I think the HPC amounts to:

"The word 'mind' evokes vague connotations of ethereality and wispiness and the word 'physical' evokes vague connotations of solidity, hardness and lumpiness.

This in turn evokes a vague uneasiness that ethereality and wispiness do not sit well with the concepts of solidity, hardness and lumpiness.

Therefore mind cannot be physical".


or else it is something like:

"The ostensible qualities of observed objects differ from the process of perception, therefore the process of perception cannot be physical"

Is that about it?
 
There is still a relatively small literature about the distinctions between emotion and feeling - dating back at least to William James. Basically, in the neurological literature at least, an emotion is considered to refer to a body state with all the attendant physical sensations (heart pounding with fear, heart pounding with love) and a feeling is a conscious sensation. Feeling, therefore, refers more to a higher order cognitive issue than emotion, which is organized and probably mediated primarily by the hypothalamus and amygdala.

Feelings, therefore, include all sorts of sensations, not just emotional ones -- including the feeling of seeing blue, etc.

Okay: "emotion" is body state and its physical sensations; "feeling" is any conscious sensation (including the physical sensations of an emotion); 'conscious' sensation is probably redundant -- are there unconscious sensations?-- though it does highlight they are part of consciousness.

I may be completely wrong, but the way I conceptualize it, feelings like the feeling of seeing blue are low level valuations (placing value on a particular perception) that can be used to direct behavior...

That's interesting: the "feeling" of seeing blue. I'm so used to associating feeling with the sense of touch it sounds odd. So the "feeling" of blue is how blue looks to me, how it feels to me, not how I feel about it. ("How do you feel about blue?" -- It's my favorite color! "How does blue feel?" -- Umm... blue-y? [like asking how does 'soft' feel]).

...and feelings of emotional states like fear that allow us to respond in different ways to what we initially perceive unconsciously as danger (if we only had the unconscious evaluation we'd be stuck with simple responses like fight or flight; consciousness provides a means to elaborate behavioral responses).

In some ways consciousness reminds me of an instrument panel: fear is the instrumental measurement (body state -> unconscious evaluation) that something is wrong, and the flashing red light (the attendant discomfiting sensation of fear) that alerts us something's wrong and urges us to attend to the problem (pay attention and react). But though the warning is urgent, it's still presented to cognitive / thinking [?] consciousness (this is where it gets tricky: [feeling] consciousness presents, and [its content] is presented to [thinking] consciousness) for evaluation, as if other knowledge we have might give us a better, more rational, more elaborate (as you say) response to the source of the fear than the more immediate, less rational "fight or flight" (though animals may reason a response too, but a simpler one, based on less knowledge).

It's probably wrong to have only one word cover all of this (and this might be one of the problems) because we generally use the word 'feeling' to cover not only higher order processing of intense emotions, but also valuation of perception, and motivational inputs.

We probably need a whole new vocabulary.

A standard vocabulary. There's been a slow snowball of interest in analytic-neuro-phenomenology for a couple of decades at least, going back to Varela and Hunter (I've only discovered it very recently), but it's specialized academic work: still debating the terms of the debate, afaik.

Maybe an example will illustrate -- and also illustrate the limitations and problems with the words?

I happen to hate wasps, having been stung several times as a child. As a result I flinch whenever one gets near me.

So, say I'm out mowing the lawn and I see something small and dark flying near my right arm. I flinch and run a few feet away to avoid that coming sting. That action depends upon my unconscious perception of something flying near my right arm and an unconscious decision to run. It happens to be accompanied by an increase in heart rate and an overall heightening of arousal not under my conscious control (this physical unconscious change is considered an emotion by many).

Irrational yet sensible: if there is a threat, you're physically prepared and on your way.

After I have run away I turn back to see a leaf falling to the ground and in my heightened state of arousal I realize consciously that the fear I am feeling (it is now a conscious perception of fast beating heart, etc.) is totally inappropriate.

So, there is the unconscious perception of something flying, the unconscious behavioral response of flight, and then the conscious feeling of fear (which is a behavioral tendency to either fight or flee more if needed) and also the conscious appraisal that the fear was unfounded. That conscious feeling of fear allows greater flexibility in my responses than the unconscious emotion which is linked to a set behavior.

Okay: wasp? = fear [emotion] -> arousal / decision & run -->> fear [conscious feeling] "fight or flight?" -> appraisal (no wasp... phew!)

So, the more conscious we are of "fear" (as a feeling and not an imperative), the more elaborate (rational?) our response? (I 'fear' I might be misunderstanding you; if not, that's how I would analyze it too).

I suppose it is correct in a way to view feeling as less intense than emotion, but the real difference is that one -- emotion -- is not part of conscious evaluation. Feeling is part of the conscious evaluation, for want of a better way of expressing it.

An emotion's feeling is an opportunity to compose a rational response to the emotion.

Different folks have different stories about what all of this means -- James thought that emotion caused the feeling, that feeling was a conscious incorporation of the behavioral response caused by the emotion -- so we feel sorry because we cry; we feel fear because we run. Antonio Damasio has an updated version that has the feeling being a story that the brain creates for the behavior.

I remember reading James' theory for the first time and thinking he was joking; but that's a lifetime of assuming the feeling of the emotion is the emotion. (That's the usual definition of course, but there's nothing wrong with James' specialized distinction).

I prefer the idea that we have parallel systems ongoing with the unconscious perception being linked with the later conscious appraisal and parallel with the emotional output -- fight or flight, etc. -- rather than what looks to me like a series (emotion causes behavior, relay loop of info getting to consciousness causes feeling) in the James-Lange theory.

I like the democratic model of mind too -- [raucous] parliamentary really, with everyone trying to talk at once -- and simultaneously competing pov's should be analogous to parallel processing, I think.

A key to all of this, though, is the way our nervous system is organized. What animals do is constantly update information from inside and outside through big information loops -- spinal cord to brainstem to thalamus to cortex, with each step including a loop back to the earlier level and all higher levels looping back to all earlier levels. Information is constantly looping and updating; we appraise a situation unconsciously and then update it based on what has changed or what we change. I view consciousness as the means by which we vary behavioral responses based on what might and might not work in any given situation, so it is tied to unconscious appraisals and recursive loops.

Yes, recursive neural loops make perfect sense; otherwise, it'd be nigh-impossible to calm down once you realize your fear is unfounded. :dragonfly <- dragonfly :)
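The loop-and-update picture can be sketched as a toy feedback rule in Python (the function name, the 0.5 gain and the numbers are illustrative assumptions, not neuroscience):

```python
def update_appraisal(evidence, appraisal, gain=0.5):
  # One pass of a feedback loop: the running appraisal is pulled
  # toward the latest evidence rather than replaced outright.
  return appraisal + gain * (evidence - appraisal)

appraisal = 1.0  # something dark flies past: unconscious "danger!"
for _ in range(6):
  # it was a leaf, so each fresh loop carries evidence of zero threat
  appraisal = update_appraisal(0.0, appraisal)

print(round(appraisal, 3))  # the fear decays over repeated loops
```

The point of the sketch is only that repeated recursive updates, not a single verdict, are what let the initial alarm die back down.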
 
Thanks for pointing out the problem! I have been waiting for some similar code for a while now! The difference is that I did not promise any, and yet I am dead certain of what I feel. Isn't that a problem?
No. Why should it be?

Anyway, code that is aware. Stimulus - something to be aware of. Memory - to know that you have been aware of something. And conditional response - because otherwise you're merely reacting, not aware.

Meet Alice the friendbot!
Code:
import sys, datetime

my_name = "Alice"
people_ive_met = []
when_i_met = {}
my_friends = []

while True:
  print("Hello?  Is anyone there?  Press enter if you see this!")
  input()

  print("Hi!  I'm an aware (but not self-aware) computer!  My name is %s!  What's yours?" % my_name)
  person = input().strip()

  if not person:
    print("That's not a name!  Is a bird just pecking on my enter key?  If only I had a USB camera. I am sad.\n")
    continue

  if person in people_ive_met:
    print("Hello %s!  We last met on %s!" % (person, when_i_met[person]))
    when_i_met[person] = datetime.datetime.now().strftime("%A at %I:%M %p")
    if person in my_friends:
      print("I like you %s" % person)
      continue
    else:
      print("You wouldn't be my friend before!  Will you be my friend now, %s? (y/n)" % person)
  else:
    people_ive_met.append(person)
    when_i_met[person] = datetime.datetime.now().strftime("%A at %I:%M %p")
    print("Hello %s!  Will you be my friend? (y/n)" % person)

  # startswith() avoids an IndexError when the reply is an empty line
  person_will_be_my_friend = input().strip().lower().startswith("y")

  if person_will_be_my_friend:
    my_friends.append(person)
    print("Thank you for being my friend, %s!  I will remember this moment always!" % person)
    print("If you know anyone else who might like to be my friend, my enter key is always warm!")
    print("Now farewell!\n")
  else:
    print("You are not a very nice person.  May your camel be afflicted with the hiccups!\n")
    continue

  if len(my_friends) >= 1000:
    print("I have a thousand friends!  My life is complete!")
    print("Thank you to", my_friends)
    print("Goodbye!")
    sys.exit()
I can translate to C if you like.
 
And no qualia and no experience... Hurray we have just explained consciousness away!

No, we haven't. "Qualia" is just a term philosophers use because they enjoy making the "mind" seem more special than it really is. But as I said, when I touch a wall, my experience of the wall is an extension of that wall, not an image of it.

You guys keep insisting that the special quality of consciousness is "obvious", but it's far from being so from my perspective, especially in certain moments when I can't seem to distinguish myself from the rest of the world. Yet I seem to operate very well and I dare you to show that I'm not "conscious".

The trick is to determine how consciousness works, and many have already admitted that the only way to know is by observing behaviour. If behaviour is our only criterion, then that's how consciousness is to be defined. But then the same people object to that definition. Odd.

And you have yours, in the sense that you assimilate awareness to a brain process, for which I was given no proof.

I didn't say anything about brain processes, so you are making this up. I will say, however, that I assimilate awareness as a personal process i.e. one that is a function of your entire being AND its environment. Everyone here seems to agree to this, but then they refuse to come to a conclusion. I can only guess as to why.

Believe me, they are all around us!

No, they aren't.

The concept of a P-zombie is a being whose behaviour, structure and composition are indistinguishable from a conscious human, but lacks consciousness. Since we can only define consciousness in behavioural terms, by definition the P-zombie IS conscious, so P-zombies are incoherent.

Act on the data! Spot on! See... no need for understanding the data!

Oh, but it does understand the data. Otherwise it couldn't act on it. And, if you think about it, it's the same with a human. I dare you to differentiate the two in this respect.
 
One is a process of the other. There is no reason or evidence to the contrary. In such situations, I feel assumptions are valid.

HPC is reason to the contrary. That's why it is not a valid assumption. Isn't that why we are here?

That is a circular argument. The HPC IS an assumption that what we know isn't enough, so the HPC can't be the reason for that assumption.

There is no reason to assume that consciousness is anything more than self-referential information processing, because what we know about the latter seems to include all of the former. The only reason to assume otherwise is unwarranted at best, and irrational at worst.
 
Because we all have similar bodies and behave similarly to various stimuli.



Oh wait, that would make 'feelings' something physical or behavioral. Never mind.

What does "physical or behavioural" mean? Is the implication that feelings are a non-physical phenomenon, identified by behaviour but forever outside the reach of science?
 
The question would be rather: why do I feel at all?

How. Not why.

No they are not!

Well, someone hit a nerve. Sorry, kid. Feelings are not knowledge, except about themselves.

Right now! I feel my keyboard under my fingertips. Try to prove I am lying! :)

You have not proven that you feel anything. You just claimed to.

There is no experience in a calculator doing arithmetic.

And you know this how?

I remember a teacher-in-training during a religion course in high school who claimed that dogs didn't have souls. Since we were smart enough to call him on it, he eventually had to give us a reason: "Did you ever see a dog go to church?". Please make sure that your answer to my question isn't of a similar kind.

You are not just a machine; this is what the HPC tells you. Don't be afraid!

Who's afraid? Being just a machine has nothing inherently comforting about it. The HPC is a philosopher's construct.
 
