The Hard Problem of Gravity

When the program that passes the Turing test is written, then I'd love to look at the code. Since no such program has been written, or looks like being written, I can't begin to speculate as to what the code would look like.

Step one is to have a program explore its environment, examine the world that we live in, and develop its own way of seeing. I wouldn't even be as demanding as the Turing test - I'd be happy enough with a program that saw the world in its own way. But it would have to experience it.

... so, what does
Code:
Experience()
look like?
 
... so, what does
Code:
Experience()
look like?

Show me a program that produces experience, and I'll look at the code. Since no such program has been produced, we can but surmise.

It's not up to me to guess just how you will get experience into the code. It's your project. Forgive me if I remain unconvinced that it's going to happen.
 
Which yet again begs the question of how it is you're supposed to know what that looks like.

Apparently you know what it doesn't look like and that's about it.
 
Show me a program that produces experience, and I'll look at the code. Since no such program has been produced, we can but surmise.

It's not up to me to guess just how you will get experience into the code. It's your project. Forgive me if I remain unconvinced that it's going to happen.

lol
 
When a computer program discovers its own qualia, and asserts them, rather than printing out a phrase inserted by a person, then that will at least be interesting. Until then, the computer is no more significant than any other medium for carrying a human being's thoughts.
Okay, then SHRDLU. Fine, done that.
 
Well, I'm not a philosopher and I don't understand how a brain (chemical scum) produces the subjective experiences that I have. So, it's a hard problem for me anyway. I don't even know where to start with an answer. But I don't let this get to me. The tragedy is I have some books about this ("Consciousness Explained", "How the Mind Works", "The Society of Mind", "Neurophilosophy", "Brainchildren", and "The Mind's I") but I haven't started reading them. I just bought them because the subject is fascinating :duck:
 
Well, I'm not a philosopher and I don't understand how a brain (chemical scum) produces the subjective experiences that I have. So, it's a hard problem for me anyway. I don't even know where to start with an answer. But I don't let this get to me. The tragedy is I have some books about this ("Consciousness Explained", "How the Mind Works", "The Society of Mind", "Neurophilosophy", "Brainchildren", and "The Mind's I") but I haven't started reading them. I just bought them because the subject is fascinating :duck:

Welcome to the circus, Tranca!

Your self-awareness is a subjective experience.
From an objective, scientific, engineering point of view, this is very, very inconvenient, not to mention merely anecdotal and not subject to objective verification or negation.

Now if we could just dismiss the subjective from the topic ...

:lol2:

Some of us can't help but try to find models of consciousness in simpler systems that don't get subjective on us.

Enjoy!
 
Welcome to the circus, Tranca!
yes, welcome!
Your self-awareness is a subjective experience.
Um... only under some philosophical views. "Subjective", historically, presupposes a dualistic chasm between subjective and objective. From at least one other view, such things are simply "private" experience, not scientifically inconvenient at all. Private experience is still observable. It's just that the number of observers is one. We do have control procedures that can be implemented to minimize error even in this sort of observation.
From an objective, scientific, engineering point of view, this is very, very inconvenient, not to mention merely anecdotal and not subject to objective verification or negation.
From some scientific views; not from all.
Now if we could just dismiss the subjective from the topic ...

:lol2:

Some of us can't help but try to find models of consciousness in simpler systems that don't get subjective on us.
Some of us study what is there, and don't make **** up that is not there.
Yes!
 
Um... only under some philosophical views. "Subjective", historically, presupposes a dualistic chasm between subjective and objective.


Oh yeah, i also have a small book by John Searle about this gravity problem.
In his view, there's no problem. It's like saying ice can't be icy because the individual molecules can't be said to be solid...
 
yes, welcome!
Um... only under some philosophical views. "Subjective", historically, presupposes a dualistic chasm between subjective and objective. From at least one other view, such things are simply "private" experience, not scientifically inconvenient at all. Private experience is still observable. It's just that the number of observers is one. We do have control procedures that can be implemented to minimize error even in this sort of observation.
From some scientific views; not from all.
Some of us study what is there, and don't make **** up that is not there.
Yes!

Indeed, Mercutio!
My view is that the self of self-awareness isn't really there (as something that could be really there).
We do have private experiences, and error must be minimized.
My use of "subjective" here doesn't refer to any metaphysical substance or self. There are simply different kinds of experience, as you point out.

There's a sense in which our "I" is an object of our private observation, while at the same time there's an awareness of observing that's not the "I" of observation.

Sunrises, rainbows, and marvelous mirages.
 
Which yet again begs the question of how it is you're supposed to know what that looks like.

Apparently you know what it doesn't look like and that's about it.

That's always the starting point.

I've said that I regard the Turing test as still the best rough estimate of consciousness. If I talk to a robot/program and it seems to talk back, I'd find that, if not entirely convincing, at least interesting.
 
I've said that I regard the Turing test as still the best rough estimate of consciousness. If I talk to a robot/program and it seems to talk back, I'd find that, if not entirely convincing, at least interesting.

I find it more interesting that I'm certain people I've talked to would fail the Turing test at times. From this perspective I find it hard to understand the view that machines can't do these things, when I can almost see the cogs whirring to spit out the prepackaged cultural responses to the mundane trivia of the day. The sort of exalted communication that we see as elevating our species to great heights is not commonplace - most of it is done unthinkingly and automatically, regurgitating and repackaging others' thoughts. (And yes, I include myself in all these things.)
 
Nope.

I can claim that when you drop a rock, it isn't actually the rock that changes, it is us and the rest of the universe that shifts. And you have no way to prove otherwise.

External properties are relative to the observer. If you want to include the position of a rock in some behavior, it must be the behavior of the rock + observer system.

With a thermocouple bending due to heat, the relevant properties change regardless of any property of any observer. The molecules within the system change position and energy relative to each other. Because of this, one can mathematically show the behavior of the thermostat to have changed without referencing any external entities.

You can't do that with a rock that merely changes position, velocity, momentum, etc.

If the rock heats up, or breaks, or otherwise has an internal state change, then the change stops being observer relative and the behavior of just the rock does indeed change.

Fortunately rocks do heat up, break and otherwise have internal state changes.

BEHAVIOR -- a change in the relative physical properties of a system.

SYSTEM -- several physical objects.

Logical definitions are a superset of physical definitions.

You can take my logical definition and make it physical in any number of ways. All you have to do is replace the variables and constants with real world entities.

I did so with a thermostat. A thermostat switches because the only time the AC is in the ON state is when the incoming power to the system and the thermostat itself are also in the ON state. And these ON states are clearly defined -- for the AC and incoming power, ON corresponds to current flowing in a given direction above a given threshold, and for the thermostat, ON corresponds to the thermocouple having bent enough to touch the contact.
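
To make that concrete, here is a minimal sketch (in Python; the numbers and function names are my own illustrative choices, not anything specified above) of the AC being ON exactly when incoming power is ON and the thermocouple has bent enough to touch the contact:
Code:
# Toy model of the thermostat-as-switch description above (illustrative only).
# For the AC and incoming power, ON means current above a threshold; for the
# thermostat, ON means the thermocouple has bent enough to touch the contact.

CONTACT_GAP_MM = 1.0          # hypothetical gap the thermocouple must close
CURRENT_THRESHOLD_A = 0.5     # hypothetical current threshold for "ON"

def thermocouple_deflection_mm(temperature_c, setpoint_c):
    """Crude stand-in for how far the strip bends past the setpoint."""
    return max(0.0, (temperature_c - setpoint_c) * 0.2)

def ac_is_on(incoming_current_a, temperature_c, setpoint_c):
    power_on = incoming_current_a > CURRENT_THRESHOLD_A
    thermostat_on = thermocouple_deflection_mm(temperature_c, setpoint_c) >= CONTACT_GAP_MM
    # The AC is ON only when incoming power and the thermostat are both ON.
    return power_on and thermostat_on

print(ac_is_on(1.0, 30.0, 22.0))   # True: power present, strip fully bent
print(ac_is_on(1.0, 22.5, 22.0))   # False: strip hasn't reached the contact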

I asked you to do so with a rock, to demonstrate how a rock can switch.

Have you done that yet?

ON is when a drop of water is allowed to pass by and erode a channel. OFF is when it is blocked - due to the rock expanding and contracting in the heat. It's a system in exactly the same way.

Ahh, see -- you said it yourself. It's movement in the thermostat which changes its state. Not movement of the thermostat.



They change position relative to each other -- that's how they exchange information. Their position relative to an observer is irrelevant as far as the exchange is concerned.



Even if volcanoes switch, they are not a single rock, they are a system of multiple rocks (and other stuff), and so you cannot show a rock to switch using a volcano as an example.

Kind of like how I told you a thermocouple all by itself does not switch.

A rock in a vent will act as a switch. It will block or release the lava flow according to temperature and pressure.

If you are accepting that a volcano is a network of switches every bit as much as a computer, then I think that's progress.
 
I find it more interesting that I'm certain people I've talked to would fail the Turing test at times. From this perspective I find it hard to understand the view that machines can't do these things, when I can almost see the cogs whirring to spit out the prepackaged cultural responses to the mundane trivia of the day. The sort of exalted communication that we see as elevating our species to great heights is not commonplace - most of it is done unthinkingly and automatically, regurgitating and repackaging others' thoughts. (And yes, I include myself in all these things.)

That's why the Turing test is not trivial. Getting a chatbot to put LOL and :) in replies to messages wouldn't be difficult.
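
For contrast, a throwaway sketch (Python; the keywords and replies are invented for illustration) of the kind of trivial chatbot that sprinkles "lol" and ":)" into canned replies - the point being that this much is easy, and nothing like passing the Turing test:
Code:
import random

# A deliberately shallow "chatbot": keyword-matched canned replies plus filler.
CANNED = {
    "hello": "hey! how's it going?",
    "weather": "yeah, the weather's been crazy lately",
    "work": "ugh, don't remind me about work",
}
FILLERS = ["lol", "haha", ":)", "yeah totally"]

def reply(message):
    msg = message.lower()
    for keyword, canned in CANNED.items():
        if keyword in msg:
            return canned + " " + random.choice(FILLERS)
    return random.choice(FILLERS)   # no keyword match: just emit filler

print(reply("Hello there"))          # canned greeting plus filler
print(reply("What do you think?"))   # bare filler - where it falls apart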

However, I'm reasonably confident that none of the participants in this thread are artificial entities. Not even Pixy. Just the few thousand bits of entirely digital information that we are passing back and forth are enough to indicate that everyone here is a human being. Some of the posts might be from more than one person - or there might be one person sending all of them (except mine). But there's a human being behind every one.

This of course gives Rocketdodger and Pixy a great opportunity. If they can suddenly reveal that one of the posters is in fact a sophisticated AI program, then it gives huge support to their position. But even if it is theoretically possible, it hasn't been done yet. Calculating splash damage from thrown acid is something programs are very good at. Conversation, even at its most Pinteresque, is something they are very bad at.
 
Okay, with you so far. Not saying I agree with you, though!


No, he's not. He's talking about consciousness in general.

After all, what is consciousness?

If you are aware of the outside world, is that sufficient? An ant is aware of the world; so too a planarian, or a paramecium. Clearly aware, because they can respond appropriately to events.

No. Consciousness is the awareness of self. And that is inherently and obviously self-referential.

Everything else is baggage.

I disagree. Most aspects of consciousness need no awareness of self for them to manifest. All of the sensory states conform to this, and feelings too. Until inner dialogue, and identification with inner dialogue, comes along there is simply no narrative self, no user illusion. Nothing happens to anybody until the thought arises that this is going on. Things simply are.

Look around you and ask...without inner dialogue where is the self-reference? One might consider there's a little in the reactive dispositions of the body perhaps, the way we move to stimuli or have attention directed, but this aside you won't find any.

It is completely possible to observe anything without any notion of self and actually this does not contradict Strong AI. It merely contradicts the notion that self is inherent in consciousness.

Nick
 
Fortunately rocks do heat up, break and otherwise have internal state changes.

Fortunately, I already said that.

Are you even paying attention?

SYSTEM -- several physical objects.

Obviously. And you are still ignoring the "relative" component.

ON is when a drop of water is allowed to pass by and erode a channel. OFF is when it is blocked - due to the rock expanding and contracting in the heat. It's a system in exactly the same way.

Well, not quite, but you are getting closer.

"Not quite" because "water is allowed to pass by and erode a channel" isn't consistent from rock to rock. If you want to make it consistent, you have to include something like "where the exit angle is within 15 degrees of the entrance angle," etc.

Otherwise the ON state would be satisfied for water hitting the rock and rolling off in any direction, depending on where the channel was.

But, provided such constraints were included, you could then make a computer out of rocks and channels of water. And since you could make a computer, you could make a conscious entity as well (if consciousness is computation). In fact you could simulate an entire human brain, given enough rocks and channels.
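
Purely to illustrate the "computer out of rocks and channels" point, here is a toy sketch (Python) that idealizes each constrained rock-and-channel junction as a boolean gate - the physical readings in the comments are my own idealization, not anything established in the thread - and composes them into a one-bit adder:
Code:
# Idealization: True = water passes the junction (within the angle constraint),
# False = the channel is blocked, e.g. by a rock that has expanded in the heat.

def channel_and(a, b):
    """Water exits only if both feeder channels are flowing."""
    return a and b

def channel_not(a):
    """A supply channel that is blocked whenever the input channel flows."""
    return not a

def channel_or(a, b):
    """Two channels merging into one: output flows if either input flows."""
    return a or b

def channel_xor(a, b):
    return channel_and(channel_or(a, b), channel_not(channel_and(a, b)))

def half_adder(a, b):
    """A one-bit adder built entirely from the idealized channel gates."""
    return channel_xor(a, b), channel_and(a, b)   # (sum, carry)

for a in (False, True):
    for b in (False, True):
        print(a, b, half_adder(a, b))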

Now, this is an inescapable conclusion given the premises.

Clearly, your choice is to reject the premises -- consciousness is more than computation, because otherwise a system of rocks and channels could be conscious.

My choice is to accept that consciousness might be different than what I thought it was -- because a system of rocks and channels could be just as conscious as I am.

A rock in a vent will act as a switch. It will block or release the lava flow according to temperature and pressure.

If you are accepting that a volcano is a network of switches every bit as much as a computer, then I think that's progress.

Yes, it is a network of switches, like a computer. However, since the output is not organized in any way, it is not a computer. It is akin to what would result from a child getting ahold of a VLSI design workstation as opposed to an educated professional.

And, before you go off on your nonsense again, let me make it clear that the concept of "organized" is not human centric -- it is based on statistical mathematics. Something is more organized than something else if there is a lower likelihood it resulted from random chance.
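
As a toy illustration of that idea, here is a sketch using compressibility as a rough stand-in for "unlikely to have arisen by random chance" (the proxy metric is my own substitution, not the poster's definition): structured output compresses well, while output indistinguishable from chance does not.
Code:
import os
import zlib

def organization_score(data: bytes) -> float:
    """Crude proxy for organization: how compressible the data is.
    Random bytes barely compress; patterned output compresses a lot."""
    return 1.0 - len(zlib.compress(data)) / len(data)

structured = b"ON OFF " * 1000     # highly patterned "output"
random_ish = os.urandom(7000)      # output indistinguishable from chance

print(organization_score(structured))   # close to 1: strongly organized
print(organization_score(random_ish))   # about 0 (or slightly negative)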
 
I disagree. Most aspects of consciousness need no awareness of self for them to manifest. All of the sensory states conform to this, and feelings too. Until inner dialogue, and identification with inner dialogue, comes along there is simply no narrative self, no user illusion. Nothing happens to anybody until the thought arises that this is going on. Things simply are.
Included in "things" is the very thinking that this is going on.

Note that you do here, explicitly, speak of different aspects of consciousness; once again I remind people that we use that word very sloppily, and that looking for one explanation to cover all uses cannot succeed at any level other than the behavior of the organism and the language used to describe it. Other than that, you must settle for reducing one aspect of consciousness at a time. Pixy has done this. It does not fit other aspects by your analysis here; there is no requirement that it should.
Look around you and ask...without inner dialogue where is the self-reference? One might consider there's a little in the reactive dispositions of the body perhaps, the way we move to stimuli or have attention directed, but this aside you won't find any.
Without self-reference where is the inner dialogue? See, it goes both ways. And even here, both "inner dialogue" and "self-reference" are being used sloppily. Self-referencing may certainly take place without inner dialogue. Perhaps vice versa, too--it depends on what one means by each term.
It is completely possible to observe anything without any notion of self and actually this does not contradict Strong AI. It merely contradicts the notion that self is inherent in consciousness.

Nick
In which aspect of consciousness? As long as we are using sloppy categorical terms, the question is sloppy.
 
But, provided such constraints were included, you could then make a computer out of rocks and channels of water. And since you could make a computer, you could make a conscious entity as well (if consciousness is computation). In fact you could simulate an entire human brain, given enough rocks and channels.

Or more.
 
Private experience is still observable. It's just that the number of observers is one. We do have control procedures that can be implemented to minimize error even in this sort of observation.
From some scientific views; not from all.
I don't think it is logically possible for a person even to observe their own private experiences in a scientific way. Say, for example, that I observed myself to be experiencing the feeling of typing these words. I would suppose, given your views, that you would take this to be reasonably good evidence of a particular experience taking place in me, since the described experience conforms with what would be expected of a person typing out words on a message board. To me, this would be an incorrect conclusion, because when someone makes an observation, they're already taking the discussion out of the conceptual realm of experience. An observation cannot be taken as evidence for an experience, because it is explainable in terms of calculations made by the brain which cause the body to behave in certain ways that imply a notion we call "experience." The report that I typed about how I was experiencing the sensation of typing can be explained as the consequence of brain activity causing my fingers to move in accordance with certain patterns. In this sense, we must leave open the possibility that consciousness is a fictitious concept invented by the methods of processing idiosyncratic to the brain. It is this idea that I believe David Chalmers was trying to convey with his p-zombie, and as there is no scientific way of studying experience in light of this idea, there is no way for us to understand what experience is and what its properties are, should it actually exist.
 
Fortunately, I already said that.

Are you even paying attention?



Obviously. And you are still ignoring the "relative" component.



Well, not quite, but you are getting closer.

"Not quite" because "water is allowed to pass by and erode a channel" isn't consistent from rock to rock. If you want to make it consistent, you have to include something like "where the exit angle is within 15 degrees of the entrance angle," etc.

That's an interesting new constraint - because it has no relevance to a single rock or thermostat, which is where this particular subthread started.

If a single switch has some degree of consciousness, then clearly a rock does as well.

Otherwise the ON state would be satisfied for water hitting the rock and rolling off in any direction, depending on where the channel was.

But, provided such constraints were included, you could then make a computer out of rocks and channels of water. And since you could make a computer, you could make a conscious entity as well (if consciousness is computation). In fact you could simulate an entire human brain, given enough rocks and channels.

Now, this is an inescapable conclusion given the premises.

Clearly, your choice is to reject the premises -- consciousness is more than computation, because otherwise a system of rocks and channels could be conscious.

Well, indeed it is. I find it quite plausible that a computer could be made out of rocks and water. I don't accept that it is certain that consciousness is purely computational. I see no particular reason to believe such a thing.

My choice is to accept that consciousness might be different than what I thought it was -- because a system of rocks and channels could be just as conscious as I am.

You seem to be entirely wedded to the conviction that consciousness is exactly what you thought it was. You start with the certain knowledge that consciousness is computational in nature, and derive everything else from that. If you start with the assumption that consciousness is physical in nature, you will end up with a quite different set of ideas.

Yes, it is a network of switches, like a computer. However, since the output is not organized in any way, it is not a computer. It is akin to what would result from a child getting ahold of a VLSI design workstation as opposed to an educated professional.

And, before you go off on your nonsense again, let me make it clear that the concept of "organized" is not human centric -- it is based on statistical mathematics. Something is more organized than something else if there is a lower likelihood it resulted from random chance.

We seem to be gradually getting closer to a physical definition, but we are still a long way off.
 