The Hard Problem of Gravity

Yet the context in which Mr Smith sees a spider is always going to be different from the one in which Mrs Smith does. Mrs Smith might also have had bad experiences with spiders in the past, having found one in her bed when she was a child. So when both see the spider, you could say it's the same stimulus, but it will be routed in through different pathways, spreading out to different parts of the brain, before reaching a more general reaction. At what point in the process does it become problematic to talk about their having the same stimulus?

Furthermore, Mr Smith being scared of snakes whereas Mrs Smith is scared of spiders does not mean they are scared of them in exactly the same way, although they both label it as being scared.

Indeed, we don't know whether Mr and Mrs Smith experience anything in the same way. We don't know if they see colours the same way, or taste food the same way. The experience could be entirely different.

We assume that similar behaviour indicates similar qualia - but we don't know that for sure. It's quite possible that there are four billion entirely different experiences of "red".

Ah yes, which brings the issue back to what Mercutio called 'semantic generalization', which I also touched on way back in post #193.
 
You mean dreams, yes? Dreams are a form of conscious state. What I'm referring to is unconscious sleep; the period of time where your conscious awareness is shut off.

I know of no such time period. Quite often I nod off for about a minute, literally, and I have time to dream, and I most certainly dream right up to the moment I wake up in the morning. As far as I'm concerned, I'm either awake or dreaming.
 
It depends upon what you consider thinking "in the same sense as a human being". Would you consider it possible that a chimpanzee thinks in the same sense as a human being? A dog? A cat? A mouse? As far as I know, computers have not even reached ant-level complexity.

I would have thought that the "sense in which humans think" is the very question we don't have an answer to yet.

You can safely assume that the one thing I'm promoting on this thread is our ignorance.

AI may help provide that answer. In fact, even the failures of AI will provide useful information.

I've no problem with AI research - it's the assumption that AI will inevitably produce conscious beings that I reject. The statement that conscious programs already exist I can't accept at all.
 
No, it isn't.

Your knowledge of your own consciousness is dependent upon, among other things, your knowledge of self versus non-self. That makes it non-axiomatic. Unless you want to rewrite the rules of logic so that axioms can depend on each other.

The knowledge of self arises out of experience. We distinguish between what is and is not ourselves on the basis of what is part of our awareness.
 
Then please explain in what sense a computer executing a program "understands" what it is doing. Does a rock falling downhill "understand" that it is falling downhill? Does a payroll program understand that what it is doing has to do with pay? How could it? It's just shuffling bits. And no matter what programs you put into a computer, it's just shuffling bits.

So are you.
 
I don't know why there's this absolute terror at the thought that there might actually be something unique or unusual about human beings.

Terrified of being special? Odd.

What's funny is, this is the exact same argument made against theists who are terrified about being NOT special.
 
The knowledge of self arises out of experience. We distinguish between what is and is not ourselves on the basis of what is part of our awareness.

Yes, I agree. Awareness and experience are required for knowledge of self, and knowledge of self is required for consciousness.
 
No, you aren't. Or at least, if you are, you have some fundamental misunderstandings about physics and mathematics.

You see, anything in physics can be mathematically described. And if something can be mathematically described then what we know about computation theory is applicable to it. Thus, even if you are correct and consciousness is some kind of "field," it will still be the result of an equation or algorithm, which are mathematically the same thing.

There seems to be a massive misapprehension here. If consciousness is the result of some kind of physical field effect (and I'm far from accepting that as true) then it will probably be describable mathematically. Does this mean that carrying out the computation will be equivalent to the actual field? Can we generate gravity by playing around with Newton's or Einstein's equations?

It is not possible to create things in the real world by mathematical simulations, whether performed on paper or on a computer. If consciousness depends on a field, it will not be created in a computer.
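The distinction being argued can be made concrete with a minimal sketch (my own illustration, not from the thread): evaluating Newton's law of gravitation numerically. The constants and masses below are the standard textbook values; the point is simply that the computation produces a number describing gravity, while nothing in the machine is actually attracted to anything.

```python
G = 6.674e-11  # Newtonian gravitational constant, N*m^2/kg^2

def gravitational_force(m1, m2, r):
    """Attractive force (N) between point masses m1, m2 (kg) separated by r (m)."""
    return G * m1 * m2 / r**2

# Force on a 1 kg mass at the Earth's surface
# (Earth mass ~5.972e24 kg, mean radius ~6.371e6 m):
force = gravitational_force(5.972e24, 1.0, 6.371e6)
# The result is about 9.8 N -- a correct mathematical description
# of gravity, but no gravity has been generated by running it.
```

The simulation and the field it describes live at different levels: the former manipulates representations of masses and distances, the latter actually moves them.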
 
Westprog, meet SHRDLU.

Sure, it's just shuffling bits. But so are you.

Stanford said:
SHRDLU was written in MacLisp for the ITS system, vintage 1970.

That's fairly typical for the big AI breakthroughs. I expect that back then they assumed that nearly forty years on they'd have intelligent robots in every house. Instead they've nearly mastered walking.
 
Regarding the "mind" recycling "its own" memories? Well, wouldn't that also be what we are calling "experiencing"? In that case it's not empty of all content, is it? Moreover, how would we know the memories belong to the "mind"?

In the context where AkuManiMani's thought experiment applies, where there would never have been any outside stimuli (see below): why should we assume the experiencing is happening in the person's "mind" if she has never been exposed to matters of identity and the social constructs we have, "mind" being one of them? It's certainly plausible that the child would not think that it is the one doing the experiencing. Not to mention what "memories" would even entail in this context. If there's no construction of a subject/object division, what does the term 'experience' mean here? Would it be the same "thing" as it is for us, who surely are capable of making such distinctions, at least at particular times during the day?

One of the amazing tricks that happen in human development (and possibly in some animals, though we don't know) is this sense of self - and the corresponding concept of non-self. In the absence of experience, there'd be no need of such a concept.

It strikes me that one way to bring a human mind closer to the condition of an executing computer program would be to remove the experience - including, as suggested, the experience of memory.
 
westprog said:
One of the amazing tricks that happen in human development (and possibly in some animals, though we don't know) is this sense of self - and the corresponding concept of non-self. In the absence of experience, there'd be no need of such a concept.

It strikes me that one way to bring a human mind closer to the condition of an executing computer program would be to remove the experience - including, as suggested, the experience of memory.


Just to be clear: Do you mean removing the experience of self?
 