The hard problem of consciousness

I view the problem as artificial. Consciousness and qualia aren't even well defined. Shouldn't we begin by establishing that something like them actually exists in the first place as something other than a quaint notion we'd like to believe?

At this point the debate seems to be on the level of:

If the materialist view is correct, then how do you explain cuitsimushu? What do you mean, "does it even exist?" How can someone be so ignorant as to deny their own cuitsimushuness?! Without cuitsimushu there'd be no purpose for living at all; we'd be like animals.
 
Shouldn't we begin by establishing that consciousness or qualia exist in the first place...
Sure, what would you call being self-aware? The ability to think abstractly? To have a theory of mind? If we decide that consciousness doesn't exist, then we need to come up with a different word or words, and/or act as if these things don't exist. What do you think?

...as anything but a convenient abstraction and quaint notion we'd like to believe in?
There are a few problems with this sentence, given the context. "We'd like to believe in"? If there is no such thing as consciousness, how can we decide, and how can we believe? I think you need to rethink this one a bit. These types of criticism seem to me to become recursive: you can replace one concept with another, but in the end the problem is the same.

Or at least, can we define the words in an objective manner?
Again, I think you need to rethink your complaint. Whatever it is that is happening, it is happening. I don't think the solution lies in semantics.
 
@RandFan,
I agree that "HPC is a valid scientific question"; I simply think that it is a rather empty philosophical question (or is that last bit a tautology?).
I look at the existence of subjective experience as simply a step in the way piles of atoms behave. The simplest organisms have at least a rudiment of self/other, if only in the most easily explained chemical terms: most stuff is kept "out", some is allowed to become "part of self" through simple chemical reactions or osmosis. More complex organisms perceive more portions of reality through easily explained processes: acting on physical or chemical gradients in order to better their existence. These examples are not what anyone would call "subjective", as they are merely responses to stimuli, but it is only a small step to the behavior of the simpler animals, which are also not regarded as self-aware. Fish, for example, have never, as far as I can tell, been described as conscious of their existence, but they display behavior far more complex than anything billions of fishermen have been able to reduce to stimulus/response.
At some point a pile of atoms demonstrates actions that are difficult to predict from the last pile of atoms we examined.

Humans are more aware of their reality than paramecia, and in easily demonstrated ways. Humans are also more aware than fish, but in ways different from the ways that both humans and fish are different from paramecia. Dogs are different from fish too, but describing how seems to put them pretty near humans. The great apes, as well as porpoises, whales, elephants, and others, have awareness of reality that suggests that they have subjective experiences of it.

I apologize for rambling, my point is that there is no reason for the "Great Big Philosophical Question" when there are a bunch of good little scientific questions.
I agree with you, with one exception: I wouldn't say that it is a bunch of little scientific questions. It might be, but I kind of doubt it; that really is neither here nor there, though. When we can give a complete accounting, then we will be able to say with confidence what it is. Until then we are simply speculating. I think we are on the cusp of a major paradigm shift, but I have to be honest: this might simply be hopeful thinking and arrogance. We'll see.
 
I view the problem as artificial. Consciousness and qualia aren't even well defined. Shouldn't we begin by establishing that something like them actually exists in the first place as something other than a quaint notion we'd like to believe?

This is where I find the argument extremely strange. If I were to list the things that I am actually 100% certain exist, I would start with my own consciousness and sensations - and then stop. I don't know at any given moment if there is a material world of any kind, or even if I exist in time as a single entity. All I actually know for certain is the momentary sensation.

To say that this is an illusion makes no sense to me. If it's an illusion, what (as Randfan has observed) is having the illusion? What makes it illusory? How can it be an illusion if I'm directly experiencing it? To think that the sensation itself is illusory, while the world it seems to describe is necessarily real, seems a backwards view. If the sensations are illusions, how can we then assume that they reflect a genuine reality?

This is where I wonder if others necessarily experience qualia at all. The only evidence we have that anyone else experiences such sensations is that they claim to do so. The qualia don't seem to be necessary in order that human beings function.
 
Sure, what would you call being self-aware?

Self-awareness. I can't see any insurmountable obstacle that would prevent you from building an automaton capable of reasoning about itself or its existence. I don't think this is a "hard problem".

The ability to think abstractly?

Abstract thought. The difficulty is not in building machines that work with such problems; indeed, text, music, images, mathematical equations, musical notes, colours, maps, volumes, and surfaces can all be represented as binary data in current-day CPUs if you wish. What current computers and programs lack is the ability to develop and test these abstract models of the world around them, or purely abstract notions in general; but then, that's difficult for humans too. I don't see why this must be a "hard problem" either.
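A trivial sketch of that point (the example values are my own, not from the thread): a piece of text, a colour, and a musical pitch all reduce to the same kind of raw bytes; only the interpretation differs.

```python
# Three very different kinds of data, all stored as plain bytes.
# (Illustrative values only; 261.626 Hz is roughly middle C.)
import struct

text = "middle C".encode("utf-8")     # text as bytes
colour = bytes([255, 0, 0])           # an RGB red as three bytes
pitch = struct.pack("d", 261.626)     # a frequency as an IEEE-754 double

for label, blob in [("text", text), ("colour", colour), ("pitch", pitch)]:
    print(label, blob.hex())
```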

To have a theory of mind?

Theory of mind. I don't see why a computational machine could not have a simplified model of itself that helps it survive the very complex world around it without understanding much of its own workings, or why it could not extend this model to other actors similar in construction to itself, to better understand their behaviour and better achieve the desired ends in its model of the world. This does not appear to be a "hard problem" either.
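A toy sketch of that idea (the class, the "hunger" variable, and the decision rule are all my own invention, not a claim about how minds actually work): an agent keeps a crude model of its own decision rule and reuses that same model to predict another, similar agent.

```python
# A toy agent whose "self-model" is a crude summary of its own decision rule.
# Reusing that model on a similar agent is a bare-bones theory of mind.

class Agent:
    def __init__(self, hunger):
        self.hunger = hunger  # a single observable state variable

    def act(self):
        # The agent's actual behaviour.
        return "seek food" if self.hunger > 0.5 else "rest"

    def predict(self, other):
        # The agent assumes others are built like itself and applies
        # its own simplified rule to their observable state.
        return "seek food" if other.hunger > 0.5 else "rest"

me, you = Agent(hunger=0.2), Agent(hunger=0.9)
print(me.predict(you))                # "seek food"
print(me.predict(you) == you.act())   # the model is right in this case
```

The interesting feature is that `predict` contains no special machinery: it is just the agent's own rule pointed at someone else, which is roughly what the post proposes.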

If we decide that consciousness doesn't exist, then we need to come up with a different word or words, and/or act as if these things don't exist. What do you think?

I think that there are many interesting problems here, and whenever you solve one, that which is called consciousness retreats until there is nowhere left to go. I view consciousness as a dualist-inspired notion to begin with.

There are a few problems with this sentence, given the context. "We'd like to believe in"? If there is no such thing as consciousness, how can we decide, and how can we believe?

Postulate that there is a computational machine with a theory of mind: a convenient abstraction that evolved for its survival, very useful for regulating its own behaviour and predicting the behaviour of other such machines. Postulate that this abstraction contains notions which are useful for survival but not true. Postulate that this machine surrounds itself with similar machines, and that a means of communication evolved to convey the abstract notions in this model, either because it didn't know that the model it had of itself was wrong, or because these abstract notions are useful shorthand for describing the behaviour of others.

Postulate that this machine is capable of formulating and testing abstractions about the world around it, and uses the notion of belief from its theory of mind to describe to other such machines what it is doing when it provisionally accepts something to be true.

I think that consciousness, like so many other preconceived notions, will evaporate away when we finally manage to examine it closely enough.
 
One of the biggest mysteries in science is consciousness... Or is it?

I know we have a lot of materialists here, so I would like their take on this, and of course the non-materialistic POV too.

As I said before, it's not a problem as such, just a set of loosely related and vague questions. The answers to some are, admittedly, "we don't know", but there are so, so many "don't knows" in our universe.

Why does this particular "don't know" give rise to such metaphysical soul searching?

OK, here is my take:
* "Why should physical processing give rise to a rich inner life at all?"
Don't know, but is there any good reason why it should not?
* "How is it that some organisms are subjects of experience?"
Because it has been a useful survival feature.
* "Why does awareness of sensory information exist at all?"
Why does anything exist at all? Why does energy exist? Why does matter exist? What is energy? What is matter?

Why do scientists still have jobs?
* "Why do qualia exist?"
See above. In fact we don't quite know what it means to say that qualia exist.

Imagine the color red existing independently of an intelligent observer. Would it still be red? No, it would be meaningless to say it would still be red.

So clearly qualia, whatever they are, are the end product of a complex system.
* "Why is there a subjective component to experience?"
Because the usefulness of experience to any organism has primarily been to orient itself within its environment. So clearly experience is necessarily subjective.
* "Why aren't we philosophical zombies?"
Why on earth should we be philosophical zombies?
* "Phenomenal Natures are categorically different than behavior"
Probably. So what?
 
This is where I wonder if others necessarily experience qualia at all. The only evidence we have that anyone else experiences such sensations is that they claim to do so.
Not true. We observe people behaving in the way that we do in response to conscious experience.

There are only two possibilities:

1. They contain some advanced adaptive mimicry mechanism.
2. They experience sensations in the way we do.

We observe that their brains have the same machinery as ours.

So unless it were somehow possible for the same machinery to perform two very different functions, the overwhelmingly reasonable conclusion is that they experience sensations the way we do.
The qualia don't seem to be necessary in order that human beings function.
Can you explain why you think that?
 
This is where I find the argument extremely strange. If I were to list the things that I am actually 100% certain exist, I would start with my own consciousness and sensations - and then stop. I don't know at any given moment if there is a material world of any kind, or even if I exist in time as a single entity. All I actually know for certain is the momentary sensation.

Please begin by defining consciousness instead of telling me that you have it, whatever it is. As it is now, I have no idea what you really mean in the paragraph below.

This is where I wonder if others necessarily experience qualia at all. The only evidence we have that anyone else experiences such sensations is that they claim to do so. The qualia don't seem to be necessary in order that human beings function.

What specifically are qualia and why do sensations have to be more than just a piece of internal labeling attached to sensory information as it is processed and analysed?
 
Damn. I have occasionally joined threads like this to see if anybody could explain to me what qualia are, in the sense of being anything in particular. So far, even hard-core dualists have failed. That may, of course, be because I'm unusually dense.

Unfortunately, reading this thread so far, I don't think an explanation is forthcoming.

To me the hard problem of consciousness is what the problem is.

Hans
 
Self-awareness. I can't see any insurmountable obstacle that would prevent you from building an automaton capable of reasoning about itself or its existence. I don't think this is a "hard problem".

I don't see a problem with a machine trying to reason either; no hard problem there. Automated computation, math formulas, information processing: no problem at all. When we add an experiencer, we have a problem.

In other words, if you were this highly complex electro-chemical computer in the classical sense, you wouldn't be experiencing anything; there wouldn't be anything more than computation.

Robin said:
"Why should physical processing give rise to a rich inner life at all?"

Don't know, but is there any good reason why it should not?

Yes.

I like to think of the problem in terms of AI. If you were to build an AI that truly had experiences, and was not merely programmed to "act" as if it were having experiences, how would you build such a thing? Do you believe that this would be achieved by adding more complexity? More transistors, more memory, more data, and more sophisticated algorithms? What you get is more information processing at a faster rate. Does faster computation equal experience/consciousness?

The only thing that comes close to experience in a computer is a transistor reacting to a signal. This is not even an experience but merely a physical reaction based on its properties. No matter how many trillions of transistors the AI has, its "experience" is limited to what a single transistor can do, and that is to react to an electrical impulse: to forward it to the next transistor, or stop it there, and so on. If an AI's sense perception (for example, seeing a flower) were a few million transistors flipping on/off, what would experience these transistors/signals? Another set of transistors? And where does the information go from there?

Sorry about the rambling, but my point is: there is no conceivable way for transistors or neurons to achieve more than computation.

The problem is: experience is not a function in the classical sense.
 
Please begin by defining consciousness instead of telling me that you have it, whatever it is. As it is now, I have no idea what you really mean in the paragraph below.

In short:
consciousness = being able to experience. Note that experience is different from computational processing and analysis; these can be done without the ability to experience.
 
If you know how objective reality leads to subjective experience, then let the world in on it and pick up your Nobel prize. I assure you it is waiting for anyone who can adequately explain it, and I suspect that the folks who are currently reverse-engineering the brain may very well be the ones to take it home.

How are these guys modeling feelings? It seems to me, and I willingly confess that I'm not much up on these things, that the issue of emotions will be significant in understanding the aspect of "richness" attributed to inner life. How do you make a machine feel?

Nick
 
Damn. I have occasionally joined threads like this to see if anybody could explain to me what qualia are, in the sense of being anything in particular. So far, even hard-core dualists have failed. That may, of course, be because I'm unusually dense.

Unfortunately, reading this thread so far, I don't think an explanation is forthcoming.

To me the hard problem of consciousness is what the problem is.

To me, the only thing I don't need explained to me is what my own sensations feel like, and what the experience of consciousness is like. But I have no way of knowing if my sense of consciousness is in any way related to anyone else's.
 
I don't see a problem with a machine trying to reason either; no hard problem there. Automated computation, math formulas, information processing: no problem at all. When we add an experiencer, we have a problem.

Why? How do you conclude that?

In other words, if you were this highly complex electro-chemical computer in the classical sense, you wouldn't be experiencing anything; there wouldn't be anything more than computation.

How do you conclude that?

I like to think of the problem in terms of AI. If you were to build an AI that truly had experiences and not merely programmed to "act" as it were having experiences, how would you build such a thing?

The trick about AI is that it is not programmed to act. It acts based on experience.


Do you believe that this would be achieved by adding more complexity? More transistors, more memory, more data, and more sophisticated algorithms? What you get is more information processing at a faster rate. Does faster computation equal experience/consciousness?

Is there any way we can know, currently?

Sorry about the rambling, but my point is: there is no conceivable way for transistors or neurons to achieve more than computation.

The problem is: experience is not a function in the classical sense.

How do you conclude this?

What you are really saying is that we have not currently built a computer that appears to be conscious. That is probably true, but how do you get from there to the conclusion that we cannot?

Hans
 
How are these guys modeling feelings? It seems to me, and I willingly confess that I'm not much up on these things, that the issue of emotions will be significant in understanding the aspect of "richness" attributed to inner life. How do you make a machine feel?

Nick

Add variables to its internal state with labels such as confidence_not_fear and happiness_not_sadness, which are adjusted based on external stimuli and estimation and prediction algorithms. The values of these variables could then be used to affect the rest of the system, which in turn affects the values of these variables.

The trick (as with meat-based life) is keeping everything stable.
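A minimal sketch of this scheme (the variable names come from the post above; the update rule, the decay factor, and the thresholds are my own assumptions): stimuli push the emotional variables around, a damping term pulls them back toward baseline for stability, and the rest of the system reads them to bias behaviour.

```python
# Emotional state as plain variables, nudged by stimuli and damped
# back toward a baseline so the feedback loop stays stable.

class EmotionalState:
    def __init__(self, decay=0.9):
        self.confidence_not_fear = 0.0
        self.happiness_not_sadness = 0.0
        self.decay = decay  # damping keeps the loop from running away

    def update(self, stimulus_threat, stimulus_reward):
        # Stimuli push the variables; decay pulls them toward baseline.
        self.confidence_not_fear = (self.decay * self.confidence_not_fear
                                    - stimulus_threat)
        self.happiness_not_sadness = (self.decay * self.happiness_not_sadness
                                      + stimulus_reward)

    def behaviour_bias(self):
        # The rest of the system reads these values, which in turn
        # shapes what stimuli arrive next, closing the loop.
        return "withdraw" if self.confidence_not_fear < -0.5 else "explore"

state = EmotionalState()
state.update(stimulus_threat=1.0, stimulus_reward=0.0)
print(state.behaviour_bias())  # a strong threat biases behaviour to "withdraw"
```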
 
In another thread Skiba has admitted to having a Psyleron REG-1. He claims that the output responds to his "intentions". If a conscious computer could be built, I wonder if he believes that it would be able to "influence" the REG-1 in the same way?

Leon
 
Add variables to its internal state with labels such as confidence_not_fear and happiness_not_sadness, which are adjusted based on external stimuli and estimation and prediction algorithms. The values of these variables could then be used to affect the rest of the system, which in turn affects the values of these variables.

The trick (as with meat-based life) is keeping everything stable.

How do you create the actual conscious bodily sensation of feelings?

Nick
 
How do you create the actual conscious bodily sensation of feelings?

Nick

If the emotional variables are used to affect the rest of the system, sensors which provide feedback about the current state of the system would pick up the bodily sensations of feelings.

E.g., the confidence_not_fear variable affects the poop variable, which is sent to the poop sub-system. The state of the poop hardware is picked up by a sensor local to the hardware and is fed back to the 'brain' to modulate other variables and systems, such as the anal sphincter sub-system.
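The loop described above can be sketched as follows (the sub-system, sensor, gains, and starting value are all illustrative assumptions): an emotion variable drives a body sub-system, a local sensor reads the sub-system's state, and that reading feeds back into the 'brain' to modulate the emotion variable.

```python
# Interoceptive feedback loop: emotion variable -> body sub-system ->
# local sensor -> back to the 'brain' to adjust the emotion variable.

def gut_subsystem(drive):
    # Hypothetical effector: converts an emotional drive into a body
    # state, clamped to a physically plausible range.
    return max(0.0, min(1.0, 0.5 + drive))

def gut_sensor(body_state):
    # Local sensor: reports the body state back to the controller.
    return body_state

confidence_not_fear = -0.4  # a fearful starting state
for _ in range(5):
    body = gut_subsystem(confidence_not_fear)
    sensed = gut_sensor(body)
    # The 'brain' reads the bodily sensation and nudges the emotion
    # variable toward the neutral set point of 0.5.
    confidence_not_fear += 0.1 * (0.5 - sensed)

print(round(confidence_not_fear, 3))  # closer to neutral than it started
```

With a small feedback gain the variable relaxes toward baseline, which is the stability the earlier post flagged as the hard part of the engineering.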
 
