
The Hard Problem of Gravity

A bold claim. Let us get to the root of this issue.

What do you have in your brain, that allows you to "understand anything at all", that a computer brain could not possess?

Westprog, you still haven't answered this question. I say there is nothing. There is no special thing about our brain that a computer brain could not have.

You and all the other navel gazers are starting with the conclusion that there IS something special, and trying to find it. That is the so-called 'hard problem'.

I see no evidence for anything special or 'magic' that our brains have that a computer couldn't have, if sufficiently complex.
 
Only in the metaphorical sense! ;)

I understand your thought experiment, but it's not really what I was asking. You seem to have jumped to the empirical plane prematurely. I'm still staying on the philosophical plane and trying to work out the conceptual boundaries before proceeding. What you have done is to restrict the thought experiment to inhibiting the child from receiving any outside stimuli. That's only half of the equation, and an overstep. What I'm asking is: what if 'all' "content" is removed, including hunger, thirst, swallowing, digesting, the feeling of a heartbeat, etc.? In other words, total void. Would there then be 'experience of nothing', or 'no experience'? This is a conceptual question.

I haven't ignored your thought experiment, and I will get back to it. But I want to sort out this conceptual precursor of it first. You don't even have to think about what 'removing all content' means in biological terms, just what it would entail in philosophical terms.

Let's suppose, for the sake of argument, that in such a case, experience/awareness would cease.

What does this tell us? Does it mean that experience is something entirely subsidiary to and dependent upon external stimulus, and hence can be disregarded as something in and of itself? I tend to think not. Things can be entirely created and dependent on other things, and yet exist in their own right.

I would suppose that even if all sensation is removed, a mind would still recycle its own memories.
 
I would suppose that even if all sensation is removed, a mind would still recycle its own memories.

Yes, but those memories are of sensations, experiences, and all thought is based off those. If you never had ANY experiences, how could you even be said to exist?
 
Third Eye Open said:
I guess we haven't been reading the same thread.
Almost like déjà vu!

westprog said:
But there is no evidence that inanimate objects are conscious. None whatsoever. A belief that they must be is making the exact same mistake that was made by the people who insisted that the Earth was at the centre of the solar system - overriding science with philosophical preference.
But this whole thread is about different perspectives on approaching "consciousness", not about a fixed, agreed-upon point of departure for the problem. Thus when you say there's no evidence, it's basically a meaningless supposition so far.
 
I understand your thought experiment, but it's not really what I was asking. You seem to have jumped to the empirical plane prematurely. I'm still staying on the philosophical plane and trying to work out the conceptual boundaries before proceeding. What you have done is to restrict the thought experiment to inhibiting the child from receiving any outside stimuli. That's only half of the equation, and an overstep. What I'm asking is: what if 'all' "content" is removed, including hunger, thirst, swallowing, digesting, the feeling of a heartbeat, etc.? In other words, total void. Would there then be 'experience of nothing', or 'no experience'? This is a conceptual question.

I haven't ignored your thought experiment, and I will get back to it. But I want to sort out this conceptual precursor of it first. You don't even have to think about what 'removing all content' means in biological terms, just what it would entail in philosophical terms.

Okay, gotcha.

Since I'm conceiving of consciousness as a field of some kind [sorry, I know you wanted me to refrain from jumping into the 'empirical', but I think this may be appropriate, bear with me >_<], it stands to reason that it would have the same general properties as other fields; one of them would be the capacity to have an overall zero value over a particular spatial extent. My guess is that any region with this overall zero value could be said to be unconscious.

I think I addressed this somewhat from a purely ontological perspective in my expository post:

"Inside"/"Outside" Aspects

Now...to deal with the whole 'inside'/'outside' business. It's a bit mind-bending once you actually get it... The 'outside' aspect of reality is the objective state of things and the 'inside' is the subjective perception of things:

-The 'outside' is the world as is [observed-objective aspect], which I'll shorten to WaI

-The 'inside' is the world as seems [observing-subjective aspect], which I'll shorten to WaS

What I'm speaking of aren't separate realms or universes but one of many dialectical aspects of reality. The WaS is part of -- within -- the WaI; in other words, every perception is part of objective reality. On the other hand, the WaI, in order to be perceived at all, must have subjective qualities and so, in turn, falls within the WaS.

>>>>>>The subject/object relation is a fundamental part of reality and because of this no meaningful language can be generated that doesn't assume such<<<<<<

I suppose conscious experience would, by definition, be the first-person 'inside' perspective of a conscious field; the field IAOI or the 'beingness' of the field. Qualia would be the 'inside' correlation of the 'outside' aspect of field activity. Qualia could only be experienced from the 'inside' of a conscious field; absent a non-zero conscious field, there are no 'inside' qualia.

I hope that's the class of answer you had in mind :o
 
Consciousness is a computer program.
You don't know that.
Yes I do.
No, you don't

What else can it be?
Ahhh... so, you're making an argument from incredulity

The brain is a computer.
No

The brain is a brain
Consciousness is the result of brain activity. That makes consciousness a computer program.
No, it does not make consciousness a computer program

For you to argue that it does simply illustrates that you don't understand programming and/or consciousness - no big deal... very few understand the former... maybe no-one really understands the latter, yet

If you want to get an idea of why/how I differentiate consciousness from a program, read up on Systems Analysis

For some reason, I suspect you know this already

What I don't understand is why you want/need to blur the two to the point of inanity

:confused:
 
Westprog, you still haven't answered this question. I say there is nothing. There is no special thing about our brain that a computer brain could not have.

You and all the other navel gazers are starting with the conclusion that there IS something special, and trying to find it. That is the so-called 'hard problem'.

I see no evidence for anything special or 'magic' that our brains have that a computer couldn't have, if sufficiently complex.

Then please explain in what sense a computer executing a program "understands" what it is doing. Does a rock falling downhill "understand" that it is falling downhill? Does a payroll program understand that what it is doing has to do with pay? How could it? It's just shuffling bits. And no matter what programs you put into a computer, it's just shuffling bits.

Of course, if you believe that human consciousness is purely computational in nature, you have to believe that it can be fully emulated computationally. But that's a matter of faith, not science.
 
Then please explain in what sense a computer executing a program "understands" what it is doing. Does a rock falling downhill "understand" that it is falling downhill? Does a payroll program understand that what it is doing has to do with pay? How could it? It's just shuffling bits. And no matter what programs you put into a computer, it's just shuffling bits.

Of course, if you believe that human consciousness is purely computational in nature, you have to believe that it can be fully emulated computationally. But that's a matter of faith, not science.

If I turn on my television, you could say I 'understand' what I am doing. But in what way? I don't know exactly what ligaments and muscles moved in my hand in order to push the button, I don't know what neurons fired in order to send the command to my hand, and I don't know what route the signal took through my nerves to get there. I also know very little about what went on in the remote or in the television itself, yet you would say I 'understand' what I did.

When a computer runs an antivirus program (to use an analogy from earlier), it may not 'know' what parts of it are doing what, or exactly what the program is doing, but it has a 'memory' of being told to do this, does it, and then stores the information about what it did for later reference.

The computer understands that it was told to run the virus scan, and understands that it is running a virus scan. Perhaps from the computer's point of view (not knowing about us pushing its buttons) it 'wanted' to run a virus scan, and thus 'decided' to.
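The record-and-reference behaviour described above can be sketched in a few lines of Python. This is a toy illustration with invented names (`log`, `run_scan`), not a claim about how real antivirus software works; whether such a record counts as 'memory' or 'understanding' is exactly what's in dispute in this thread.

```python
import time

# A toy stand-in for the scan history a real scanner might keep on disk.
log = []

def run_scan(requested_by):
    # The program records the instruction it was given...
    log.append({"event": "scan_requested", "by": requested_by, "t": time.time()})
    result = {"files_checked": 3, "threats_found": 0}  # pretend work
    # ...and records what it did, for later reference.
    log.append({"event": "scan_finished", **result})
    return result

run_scan("scheduler")

# Later, the program can consult its own past:
last = [e for e in log if e["event"] == "scan_finished"][-1]
print(last["threats_found"])  # 0
```

The 'memory' here is nothing but a list of records that a later step reads back; the argument is over whether that mechanical loop deserves the word.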

ETA: Also, I apologize for calling you a 'navel gazer'.
 
What I don't understand is why you want/need to blur the two to the point of inanity

I think the motivation is seen in some of the comments made on this and other threads. If you don't believe that consciousness is produced by a computational process, then you have to admit that it's something of a mystery. Just having an open question lets in "magic".
 
If you want to get an idea of why/how I differentiate consciousness from a program, read up on Systems Analysis

For some reason, I suspect you know this already

What I don't understand is why you want/need to blur the two to the point of inanity

:confused:

I think he does so possibly for two reasons.

One, because, as the saying goes, "when your only tool is a hammer, every problem starts to look like a nail". PixyMisa seems to be capable of relating to the world only in terms of computer logic. He apparently feels very comfortable with this state of affairs and can't [or won't] see any compelling reason to change it.

Two, because he finds the thought that there could be anything that cannot be expressed in such terms very disconcerting -- perhaps even downright terrifying. Notice how he goes to great lengths to be obtuse on precisely the points that would render his entire world view invalid. If he were to ever allow himself to consider what is really meant by consciousness he'd be forced to face the fact that there are things in the world that he doesn't have an absolute grasp of.

He's not as slow as he's putting on; he's willfully and deliberately stonewalling. I guarantee you won't get anywhere in this discussion with him :covereyes
 
If I turn on my television, you could say I 'understand' what I am doing. But in what way? I don't know exactly what ligaments and muscles moved in my hand in order to push the button, I don't know what neurons fired in order to send the command to my hand, and I don't know what route the signal took through my nerves to get there. I also know very little about what went on in the remote or in the television itself, yet you would say I 'understand' what I did.

No, I don't have perfect understanding - but I do have a desire to watch the TV, and I know that by pushing a button I can gratify that desire. I can derive enjoyment (or not) from watching the TV. Just in the simple act of pressing the button, there's a huge amount going on, on many levels.

When a computer runs an antivirus program (to use an analogy from earlier), it may not 'know' what parts of it are doing what, or exactly what the program is doing, but it has a 'memory' of being told to do this, does it, and then stores the information about what it did for later reference.

The computer understands that it was told to run the virus scan, and understands that it is running a virus scan. Perhaps from the computer's point of view (not knowing about us pushing its buttons) it 'wanted' to run a virus scan, and thus 'decided' to.


But it doesn't understand that it's running a virus scan. It has no memory of what it did before. All it has is a pattern of bits. Some of the bits are data, some are programs - though really the only difference is that the program part isn't normally changed. The CPU manipulates them to produce a new pattern. There is no understanding of any kind going on. In order to claim that a computer/program understands something - anything at all - you have to either set the bar so low that "understanding" becomes essentially meaningless, or else indulge in an anthropomorphic fantasy of the kind indulged in by people who name their cars.
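For what it's worth, the "it's all just bits" point can be demonstrated directly in Python: a function's compiled code and ordinary data are both plain byte patterns to the machine. A minimal sketch (the function `add_one` is made up for illustration):

```python
def add_one(x):
    return x + 1

# The "program" part: the function's compiled bytecode is just bytes...
program_bytes = add_one.__code__.co_code

# ...and so is ordinary data.
data_bytes = bytes([1, 2, 3])

# Both are the same kind of object; only convention says one is
# "program" and the other "data".
print(type(program_bytes) is type(data_bytes))  # True
```

Whether "just bytes" settles anything about understanding is, of course, the very question being argued.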

ETA: Also, I apologize for calling you a 'navel gazer'.

I've seen some very pretty navels in my time. Not my own, however.
 
I think he does so possibly for two reasons.

One, because, as the saying goes, "when your only tool is a hammer, every problem starts to look like a nail". PixyMisa seems to be capable of relating to the world only in terms of computer logic. He apparently feels very comfortable with this state of affairs and can't [or won't] see any compelling reason to change it.

Two, because he finds the thought that there could be anything that cannot be expressed in such terms very disconcerting -- perhaps even downright terrifying. Notice how he goes to great lengths to be obtuse on precisely the points that would render his entire world view invalid. If he were to ever allow himself to consider what is really meant by consciousness he'd be forced to face the fact that there are things in the world that he doesn't have an absolute grasp of.

He's not as slow as he's putting on; he's willfully and deliberately stonewalling. I guarantee you won't get anywhere in this discussion with him :covereyes

If you went all psycho-babble like that on me, I'd put you on ignore too.

Content = 0
 
But it doesn't understand that it's running a virus scan. It has no memory of what it did before. All it has is a pattern of bits.

I don't understand how you can differentiate the two. The pattern of bits IS a memory. What is a memory other than a record of past events? A computer records past events, and references them later for other tasks.

Some space alien with a different type of brain than ours (not a biologist, so I have no creative example) could say something like "These creatures cannot be aware, all they have is firing neurons that react to stimuli, where do they store information?"

This is just starting to sound like human chauvinism to me.
 
What is a memory other than a record of past events?
This is a less-than-trivial question; one that can't be answered in such a cursory manner

A computer records past events, and references them later for other tasks.
Only if it's both instructed to do so and has the capacity (cache, RAM, HDD) to do so


This is just starting to sound like human chauvinism to me.
This is continuing to sound like erroneous anthropomorphism to me
 
Only if it's both instructed to do so and has the capacity (cache, RAM, HDD) to do so

Well duh, of course only if it has the capacity to. A human only has memory if it has the capacity to store it (some people's brains are damaged and cannot store new memories, or some people cannot access old ones [coma]). A human only recalls memories if it is 'instructed' to by some outside source. This doesn't change the fact that you have them. The mechanism is different; the result is the same.



This is continuing to sound like erroneous anthropomorphism to me

By calling this anthropomorphism you are asserting that consciousness is a uniquely human characteristic.
 
By calling this anthropomorphism you are asserting that consciousness is a uniquely human characteristic.
No, I'm not

If anything, I might assert something along the lines of consciousness being a (so far) uniquely 'living' characteristic - i.e., beyond machines
 
No, I'm not

If anything, I might assert something along the lines of consciousness being a (so far) uniquely 'living' characteristic - i.e., beyond machines

So consciousness has something to do with reproduction? Or metabolism? Growth? Adaptation to environment? Some combination of these?
 
