The Hard Problem of Gravity

My point is not related to this. If you replicate the visual cortex with a massive array of interconnected tin cans or a vast population of interconnected humans performing simple functions (a la Block)....how does it see?

Nick

The same way it sees now -- by reasoning.
 
My point is not related to this. If you replicate the visual cortex with a massive array of interconnected tin cans or a vast population of interconnected humans performing simple functions (a la Block)....how does it see?

Nick

It doesn't. Just as a word or symbol is not the concept(s)/object(s) it's used to represent, so an operational representation of the process of sight is not identical to sight. The only way to reproduce sight is to use sufficient formal knowledge of it to physically reproduce it. Simply having random objects crudely ape the process of visual stimulus is not enough.

So...the Chinese Nation if interconnected correctly would, as a whole, experience vision?

Nick

Maybe if we had some technology to literally link their minds together. Clearly, such a feat is a long way off -- if it's even physically possible. In the meantime it's more than safe to assume that the answer to that question is "no".
 
The notion that it is switching state is actually just humans attributing function. With the thermostat, electron activity is switched up or switched down by the movement of the bimetallic strip. It's a simple physical process. The notion that a state is switched...the AC capacitor charges, coolant begins to flow, the fans start to move...is purely human attribution of functionality.

So is consciousness.

Or are you claiming that the notions of consciousness you have could exist without your existence?
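
To make the thermostat point concrete, here's a toy model (purely illustrative; the setpoint and the "heating"/"idle" labels are my own, not anything an actual device contains):

```python
# Toy model of a bimetallic-strip thermostat: physically, it is nothing
# but a threshold comparison. The labels "heating" and "idle" exist only
# in our description of it, not in the device.

def thermostat(temperature_c: float, setpoint_c: float = 20.0) -> bool:
    """Return True if the contact is closed (current flows)."""
    return temperature_c < setpoint_c

# The same physical fact, described two ways:
contact_closed = thermostat(18.0)                 # the physics: circuit closed
state = "heating" if contact_closed else "idle"   # the human attribution
print(contact_closed, state)
```

The function computes exactly one comparison; everything else is commentary we layer on top.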
 
So...the Chinese Nation if interconnected correctly would, as a whole, experience vision?

Nick

Of course.

Why wouldn't it?

Is there some mathematical rule stating that vision is a property unique to biological neural networks?
 
It doesn't. Just as a word or symbol is not the concept(s)/object(s) it's used to represent, so an operational representation of the process of sight is not identical to sight. The only way to reproduce sight is to use sufficient formal knowledge of it to physically reproduce it. Simply having random objects crudely ape the process of visual stimulus is not enough.

Personally, I'm not clear either way. Maybe the computational theory of consciousness (Strong AI) is valid, maybe not. Actually, I don't think anybody really knows yet. Dennett is one of the most innovative and vocal SAI proponents around, but I still find his rhetoric suspiciously antagonistic (vaguely reminiscent of Pixy) and I don't think he really knows. The most level-headed researchers seem clear to me that we don't yet know the answer.



Maybe if we had some technology to literally link their minds together. Clearly, such a feat is a long way off -- if it's even physically possible. In the meantime it's more than safe to assume that the answer to that question is "no".

Actually, it's a bit different from how you're interpreting it. Block says that each human just acts as a neuron, connecting to the next with a mobile phone. Again, I'm suspicious of this, and also of Searle's Chinese Room experiment, which likewise purports to refute Strong AI. Both are loaded thought experiments: both Searle and Block want to make SAI look simply ridiculous.

This ridiculing goes on a lot in this area of research, and to me it basically indicates that neither side really has much of a clue, when all's said and done.

Nick
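
For concreteness, the "each person acts as a neuron" job Block describes can be sketched as a McCulloch-Pitts threshold unit (the wiring and weights below are my illustration, not Block's):

```python
# Each person in Block's setup performs one simple function: receive
# numbers from a few phone contacts, apply a fixed rule, pass the result
# on. A McCulloch-Pitts threshold unit captures that job.

def person_as_neuron(inputs, weights, threshold):
    """One citizen's entire task: a weighted sum and a comparison."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three 'citizens' wired to compute XOR -- trivial individually, yet
# such units are composable into arbitrarily complex functions.
def tiny_network(x1, x2):
    h1 = person_as_neuron([x1, x2], [1, 1], 1)     # acts as OR
    h2 = person_as_neuron([x1, x2], [1, 1], 2)     # acts as AND
    return person_as_neuron([h1, h2], [1, -1], 1)  # OR and-not AND = XOR

print([tiny_network(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```

The whole dispute is whether scaling this up to a brain's worth of units would produce experience, or merely a description of it.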
 
It doesn't. Just as a word or symbol is not the concept(s)/object(s) it's used to represent, so an operational representation of the process of sight is not identical to sight. The only way to reproduce sight is to use sufficient formal knowledge of it to physically reproduce it. Simply having random objects crudely ape the process of visual stimulus is not enough.

You would be correct except for the fact that you only have access to representations to begin with.

Tell me -- what is the difference, in cognitive terms, between seeing X and seeing a perfect illusion of X?
 
Of course.

Why wouldn't it?

Is there some mathematical rule stating that vision is a property unique to biological neural networks?

So...how would this vision manifest? Would each person see or would some vast meta-entity be created, and if so how would this affect each individual?

Nick
 
If nobody is interested in the results of that decision then... well... nobody is interested in the results.

That's exactly right. The thermostat isn't interested. The molecules aren't interested. The rock isn't interested. The only element that's interested in the business is the human being.
 
Notice how you are using the 3rd PP when you say "...to have access to its own consciousness..."

Since we're using your own way of reasoning here: How do you have access to your own consciousness from a 1st PP? Do you not see the duality here? The only way to get away from that duality is to say that access is consciousness.

You might also say that you are conscious of your consciousness, but isn't that a little bit like saying you can lift yourself up from your own bootstraps?

Yes, access to your consciousness is consciousness. Experiencing your experience is experience. It's not infinite regress, it's not pulling yourself up by your bootstraps, it's just a matter of reaching the irreducible end of the line.
 
You would be correct except for the fact that you only have access to representations to begin with.

Tell me -- what is the difference, in cognitive terms, between seeing X and seeing a perfect illusion of X?

We can only see instances of the concept of "X".

The symbol can be expressed in various ways and still be an instance of "X":

X

X

X

Etc...

In any case, X is merely a symbol. There is no such thing as the 'illusion' of a symbol; symbols are inherently stand-ins for concepts. That's like saying there is the illusion of an illusion; such a statement is redundant. Either the symbol elicits appropriate associations in a subject or it does not. The only illusion in such cases would be if a subject mistook a representation for what it's representing. "X" is "X"; there is no illusion of "X" because it is an exemplar of itself.

What we're referring to are specific classes of physical phenomena -- in this case conscious phenomena. Merely carrying out a formal representation of a phenomenon is not the same as actually reproducing it. Representations are illusory just as simulations of actual phenomena are illusory. Simulated gravity in a computer game is not an instance of actual gravity, and an array of tin cans mimicking neural nets is not conscious.

Consciousness, as such, can only be directly perceived by a conscious entity and specific conscious states can only be communicated via representations. The representations are not themselves conscious but serve to elicit conscious associations in entities with such capacities.
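
The simulated-gravity example is easy to make literal. A minimal sketch (the constant is standard gravity; the scenario and numbers are illustrative):

```python
# "Simulated gravity": numbers updated by simple Euler integration.
# The variable y 'falls' in the model's bookkeeping, but nothing
# physical is attracted to anything -- the representation of the
# phenomenon is not the phenomenon.

G_ACCEL = 9.81  # m/s^2, standard gravitational acceleration
DT = 0.1        # timestep in seconds

def drop(height_m: float) -> float:
    """Return the simulated time for an object to 'fall' to the ground."""
    y, v, t = height_m, 0.0, 0.0
    while y > 0:
        v += G_ACCEL * DT   # update velocity
        y -= v * DT         # update position
        t += DT             # advance clock
    return t

print(round(drop(100.0), 1))  # close to the analytic sqrt(2h/g) ~ 4.5 s
```

Everything the program does is real (transistors switching), but no mass anywhere accelerates at 9.81 m/s^2 on its account.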
 
In fact, you have no idea what consciousness could possibly be, aside from what it feels like, so it's anathema to you that someone actually would.

I've agreed that the definition of consciousness is fuzzy. However, you've claimed that the concept of information processing and reading data is not. So why no rigorous definition?
 
So is consciousness.

You're claiming that consciousness cannot exist without a human attribution of functionality applied to it?

If Strong AI is correct then consciousness is a phenomenon that arises in a sufficiently complex network of switches. Say a billion, to be on the safe side.

Nick
 
Wait, wait... are you seriously saying that, for instance, the information that your food might be poisoned is of equal value to you as information about the number of children in a village in India?

No. Because I'm a human being, who makes value judgements.

And that the information about external temperature is of equal value to a thermostat as information about what pigment was used to paint the living room?

It's of exactly the same value. The thermostat doesn't make value judgements. It's a very weird form of anthropomorphism to suggest it does, which most of us outgrow by the age of ten or so.

Again... you don't get it -- value is relative.

No. Value is a human concept. Possibly zebras and ants have a concept of value. Thermostats do not.

No it isn't. "Want" and "preference" are perfectly valid concepts that are easy to formalize.

It's easy to make up engineering meanings that have nothing to do with the way human beings use the words. They have no physical meaning.
 
You're claiming that consciousness cannot exist without a human attribution of functionality applied to it?

If Strong AI is correct then consciousness is a phenomenon that arises in a sufficiently complex network of switches. Say a billion, to be on the safe side.

Nick

So basically, their position is a concession that they don't really know what consciousness is:

"Oh, it's just a bunch of really complicated operations & stuff. The more complicated the operations, the more conscious they are. There's no need to define it beyond this description" :rolleyes:
 
"Oh, it's just a bunch of really complicated operations & stuff. The more complicated the operations, the more conscious they are. There's no need to define it beyond this description" :rolleyes:

I know that this wasn't directed at me, but I feel the need to reply.

Consciousness existing has nothing to do with complexity. The idea that it does is something that a lot of people here are hung up on. Conscious systems obviously can vary in complexity, the system existing within the human brain being the most complex one that we currently know of.

I would say that the more complex the system, the more deep, rich, and complex the experience/awareness will be. This is not to say that consciousness cannot exist in a much less complex system. That system would obviously have a more shallow experience, but it would not lack consciousness due to being simple, nor would I call it "more conscious" because it is more complex.
 
I know that this wasn't directed at me, but I feel the need to reply.

Consciousness existing has nothing to do with complexity. The idea that it does is something that a lot of people here are hung up on. Conscious systems obviously can vary in complexity, the system existing within the human brain being the most complex one that we currently know of.

I would say that the more complex the system, the more deep, rich, and complex the experience/awareness will be. This is not to say that consciousness cannot exist in a much less complex system. That system would obviously have a more shallow experience, but it would not lack consciousness due to being simple, nor would I call it "more conscious" because it is more complex.

Good. Then we agree on something. Unfortunately, there seem to be others here who think that merely having computational operations [reflexive or otherwise] is sufficient to produce conscious experience. Consciousness may indeed be a class of computation, but it's pretty obvious that no one has yet succeeded in formally mapping what it is.
 
Good. Then we agree on something. Unfortunately, there seem to be others here who think that merely having computational operations [reflexive or otherwise] is sufficient to produce conscious experience. Consciousness may indeed be a class of computation, but it's pretty obvious that no one has yet succeeded in formally mapping what it is.

Well, I can't read Pixy or RD's mind, but I think that they would agree with me on the complexity bit. I think that some people *coughwestprogcoughcough* are using the ultra complex human manifestation of consciousness as a guideline for what consciousness is. This is wrong.

This will lead you to believe that humans are the only conscious entities, and this is clearly wrong.
 
Flying, of course, is done with feathered wings. Thus, by definition, insects don't fly. Nor do airplanes. Especially airplanes, since they aren't even alive.
 
So...how would this vision manifest? Would each person see or would some vast meta-entity be created, and if so how would this affect each individual?

Nick

How does your vision manifest? Does each neuron see or is some vast meta-entity created, and if so how does it affect each neuron?

*rolls eyes*

It is clear you have absolutely no capacity to understand where any of us computational people are coming from. If I were you I would try to get something that even remotely resembles an education before you continue with this "investigation" of yours.
 
