The Hard Problem of Gravity

The ultimate thing I'm after when I describe reality is making sure I have the proper relationships.

Yes, that is what one strives to get.

Nonetheless you can't really tell if you have it and you certainly don't get it by directly extracting it from reality. It still remains synthetic.

It leaves this abstract world untouched, unharmed, because the relationships in themselves are what math is about.

Yes, I know.

Okay, but isn't that a tautology?

Yes - but what else would you expect it to be consistent with? The series of squares isn't consistent with the series of cubes but I don't think you'd call these series inconsistent because of it.
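The squares-versus-cubes point can be sketched in a few lines of Python (my illustration, not something from the thread): each series is perfectly well defined on its own, and their disagreement with each other is not an inconsistency in either.

```python
# The series of squares and the series of cubes are each internally
# consistent, yet they disagree almost everywhere -- and that
# disagreement is not an "inconsistency" in either series.
squares = [n ** 2 for n in range(1, 6)]  # 1, 4, 9, 16, 25
cubes = [n ** 3 for n in range(1, 6)]    # 1, 8, 27, 64, 125

# Over these values they agree only where n**2 == n**3, i.e. at n = 1.
agreements = [n for n in range(1, 6) if n ** 2 == n ** 3]
print(squares)     # [1, 4, 9, 16, 25]
print(cubes)       # [1, 8, 27, 64, 125]
print(agreements)  # [1]
```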

Really all I was stating was that if you define something, that's its definition; and if you can't define something, it might as well have an infinitely long definition, because nothing smaller can describe it other than it being what it is.

Pi is what it is defined to be. No more, no less. We shouldn't expect it to change "just because". Either your algorithm is the same or it isn't.
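The "either your algorithm is the same or it isn't" point can be sketched in Python (my illustration; the Leibniz series is an arbitrary choice of pi algorithm). Because the arithmetic is exact and rational, the result is fully determined by the definition: run it twice, get the identical value.

```python
from fractions import Fraction

def leibniz_pi(terms):
    """Approximate pi via the Leibniz series 4*(1 - 1/3 + 1/5 - ...).

    Exact rational arithmetic, so the output is completely determined
    by the definition and the number of terms -- nothing changes
    "just because".
    """
    total = Fraction(0)
    for k in range(terms):
        total += Fraction((-1) ** k, 2 * k + 1)
    return 4 * total

# The same definition evaluated twice gives exactly the same value:
a = leibniz_pi(1000)
b = leibniz_pi(1000)
print(a == b)    # True
print(float(a))  # ~3.1406 (slow convergence, but deterministic)
```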

How would QM fit into the "law of excluded middle"? I'm trying to figure out if by "law of the excluded middle" you really mean something I would call "false dichotomy".

If QM can be defined in a way not covered by the above, (in that either you can write down a finite definition or you can't) then it's a false dichotomy.

Since the dichotomy has complete coverage that would be hard to achieve.

It really does look like reality can be both describable and not, at the same time--that there are things that can be described, and things that cannot be.

That's the problem with the fact that language allows paradox - mathematics included.

Be more specific. Is it your position that reality is not mathematical? That, say, there are no relationships in reality?

Reality is specifically what it is. Naming it "mathematical" achieves no more understanding than naming it "macaroon".

225 is related to the number of dots, or the cost of flubnars, or the momentum of the ball in kg m/s.

225 is not any of these things though is it? The meme does not have weight.
 

Well, if one judges that the very capacity to answer the question in any form constitutes being aware then it doesn't matter.

Again, I have to ask you if you think that actually means something.


Why?

If SHRDLU's entirely busy with the table, for a start he won't be able to answer any questions; and anyway, you are not aware of something that is constantly present and takes up all your awareness.

I imagine you are completely wrong.

How come? What is present in working memory is effectively what is present in awareness. No?

I see.

So if Descartes explains to you how he arrived at his cogito, and you realise that all that's really happening is neurons firing in his brain, then he doesn't understand any of this, he's just answering in a pat way?

He's just answering in a pat way. It depends of course whether you regard any response as determining awareness. And if that is so then why bother asking? Why not just see if the power's on?

Yes. It's a language issue - on two levels.

Are you aware of being aware?

Read Hofstadter. He actually uses SHRDLU as an example.

I will. But one can read Hof until blue in the face; SHRDLU is not developing a sense of self until it possesses the program that causes it to do this.

ETA: I looked up shrdlu in geb. I couldn't see where Winograd even discusses how shrd forms a sense of self. Presumably there are just basics - "I" "you" - pre-programmed in. Let me ask you a question....do you think that if a robot says "I" it necessarily has a sense of self?

Nick
 
Do we regard them as selves then, in the way we regard ourselves as selves?
What of a high level AI NPC in a game.
Have I killed someone when I blast him away?

This is the same issue that comes up in teletransporter discussions. For me death doesn't count if you can come back, particularly if you can come back instantaneously.

Nick
 
Nonetheless you can't really tell if you have it and you certainly don't get it by directly extracting it from reality.
Are you referring to not knowing whether or not your model of reality reflects the relationships that reality has?

If so, then certainly, I buy that.
It still remains synthetic.
I'm not sure there's a meaningful difference between calling it synthetic and not synthetic. I find it doubly appropriate in this particular thread to point this out. Even if we "create" a particular mathematical system, it still comes from the brain.

We can explore the same notion from an abstract perspective--even if we create a particular mathematical system, it's still very much a discovery in an abstract search space.

I'm not sure if there was supposed to be something significant about its being "synthetic"--I can't imagine that there would be, unless there's some implication that it's "not real", which I cannot for the life of me find grounds to accept.
Yes - but what else would you expect it to be consistent with?
Alright, sounds fair to me.
That's the problem with the fact that language allows paradox - mathematics included.
Not quite sure I follow. Did you mean an exclusive or, or not? If you did, I don't buy your math-allows-paradoxes out.
Reality is specifically what it is. Naming it "mathematical" achieves no more understanding than naming it "macaroon".
I'm not sure where you got that I was naming reality "mathematical". I was making claims that there is mathematics in reality. Things actually are related in reality. Sure, we come up with descriptions, but reality restricts correct descriptions--it does so by having entities that are interrelated.
225 is not any of these things though is it? The meme does not have weight.
You have 225 objects (speaking of the simple counting model) if and only if you can map them one to one to the standard "counting game" numbers, and your complete mapping runs out when you hit 225 by the rules. That's what it means to be 225. That is, so to speak, how much the meme weighs.

Of course, it has to be somehow applicable, to play that particular game. You need things you can map this way. I can play this with cups of water, for example, but I can't play it too well with water itself--at least not without introducing a different notion.

But there's no lack of things to play this game with. And lo and behold, every time I find thingies I can arrange into a 15 by 15 grid, I wind up with 225 thingies.
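The "counting game" above can be sketched as code (my formalization, using hypothetical thingies): you pair each object with the next counting number, and the last number used before the mapping runs out is the count.

```python
# You have 225 objects iff they can be put in one-to-one
# correspondence with the counting numbers 1..225, with the
# mapping running out exactly at 225.
def count_by_mapping(objects):
    """Pair each object with the next counting number;
    the last number used is the count."""
    n = 0
    for n, _ in enumerate(objects, start=1):
        pass
    return n

# A 15-by-15 grid of thingies:
grid = [("thingy", row, col) for row in range(15) for col in range(15)]
print(count_by_mapping(grid))  # 225
print(15 * 15)                 # 225
```

And, as the post says, every arrangement of thingies into a 15-by-15 grid plays out the same way.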
 
Well, if one judges that the very capacity to answer the question in any form constitutes being aware then it doesn't matter.
Yep.

If SHRDLU's entirely busy with the table for a start he won't be able to answer any questions and anyway you are not aware of something that is constantly present and takes up all your awareness.
Uh, I see a small problem with that sentence.

How come? What is present in working memory is effectively what is present in awareness. No?
No.

Awareness is a process.

He's just answering in a pat way. It depends of course whether you regard any response as determining awareness.
What we can examine is responses and processes. That's all there is.

If you don't consider the responses adequate, and you don't consider the processes adequate, then what you are looking for does not exist.

And if that is so then why bother asking?
That's what I'm asking you.

Why not just see if the power's on?
What does that tell us?

Are you aware of being aware?
Define "aware".

I will. But one can read Hof until blue in the face, SHRDLU is not developing a sense of self until it possesses the program to cause it to do this.
"Because you asked me to."

Sorry, Nick, either SHRDLU has a sense of self or there is no such thing.

ETA: I looked up shrdlu in geb. I couldn't see where Winograd even discusses how shrd forms a sense of self.
Introspection. Just like us.

Presumably there are just basics - "I" "you" - pre-programmed in.
Good grief. Read the book, Nick. It's a 700-page essay on why that assumption is false.

Let me ask you a question....do you think that if a robot says "I" it necessarily has a sense of self?
Do you think that is relevant to the question of whether SHRDLU has a sense of self? If so, why?
 
Find a copy of Micro Planner, you can have a SHRDLU of your very own. :)

Thanks, but what I think I need now is one of those counseling bots that can carry on a conversation and say empathetic things.
Such would be equal to my former girlfriend in having no capacity for actual empathy, but superior in being able to use empathetic vocabulary.

Well, selves are complex works of programming and biochemistry.
Someday we'll have to make some hard decisions about just how much respect we'll give artificial beings.
 
Originally Posted by Apathia
Do we regard them as selves then, in the way we regard ourselves as selves?
What of a high level AI NPC in a game.
Have I killed someone when I blast him away?

This is the same issue that comes up in teletransporter discussions. For me death doesn't count if you can come back, particularly if you can come back instantaneously.

Nick

If you meet the Buddha in the game, kill him.
 
Awareness is a process.

And what is taking place in working memory is then not a process?

Define "aware".

Why not just say that you don't want to answer, Pixy? That at least is being straight, instead of this default avoidance position of asking for definitions when you don't want to risk an answer. Can you understand the question "are you aware?" Because if you can, then to me you can inevitably understand the question "are you aware of being aware?"

"Because you asked me to."

Sorry, Nick, either SHRDLU has a sense of self or there is no such thing.

I don't agree.

Introspection. Just like us.

Introspection is just another processing function.

Good grief. Read the book, Nick. It's a 700-page essay on why that assumption is false.

Judging by the quality of your responses thus far, I doubt that is so. And it seems an excessive investment to make.

Do you think that is relevant to the question of whether SHRDLU has a sense of self? If so, why?

Because teaching a robot to say "I" does not imo mean that it has a sense of self. Selfhood may be conferred from the outside. A sense of self is an internal representation. If shrd is not programmed to have it, it doesn't have it.

Nick
 
Thanks, but what I think I need now is one of those counseling bots that can carry on a conversation and say empathetic things.
Such would be equal to my former girlfriend in having no capacity for actual empathy, but superior in being able to use empathetic vocabulary.

Well, selves are complex works of programming and biochemistry.
Someday we'll have to make some hard decisions about just how much respect we'll give artificial beings.

I think empathy will prove a major challenge for Strong AI. I think it mostly comes down to feelings. Artificial beings don't feel (yet), though they can have the behaviour feelings ideally create. But when two bodies meet the conscious interactions are only a small part of the communication. Prof Wolfe makes an error, imo and IIRC, when he says love can only be experienced if it's directed towards something.

Nick
 
And what is taking place in working memory is then not a process?
Processing? Just a thought.

Why not just say that you don't want to answer, Pixy?
Why not define your terms, rather than issuing random ad hominems?

That at least is being straight, instead of this default avoidance position of asking for definitions when you don't want to risk an answer.
Ad hominem.

Can you understand the question "are you aware?"
Of course I can understand it. But I don't know what you think the question means. I'm pretty sure that someone here just said something like
For me this is getting more complex now, because "aware" can have subtly different meanings in English.
But I don't know who that might have been...

Because if you can, then to me you can inevitably understand the question "are you aware of being aware?"
I know what I mean by that. I have no idea what you mean, or even if the question is meaningful given your definition.

That's why I asked for your definition. Which you haven't given.

I don't agree.
Yes, I know that. Why don't you agree?

Introspection is just another processing function.
Of course it is. What do you expect it to be, an invisible elf?

Judging by the quality of your responses thus far, I doubt that is so. And it seems an excessive investment to make.
Read the book.

Because teaching a robot to say "I" does not imo mean that it has a sense of self.
You haven't read the conversation between Winograd and SHRDLU, have you?

Selfhood may be conferred from the outside.
What?

A sense of self is an internal representation.
Of course it is. So?

If shrd is not programmed to have it, it doesn't have it.
Read the book. It's a 700-page essay on why that statement is wrong.
 
I think empathy will prove a major challenge for Strong AI.
Why?

I think it mostly comes down to feelings.
Which are just processing.

Artificial beings don't feel (yet)
Really? And your evidence for this assertion is....?

though they can have the behaviour feelings ideally create.
I see. So they behave in all ways as though they have feelings, but don't have feelings. They are f-zombies.

What is it that you think feelings are, Nick?

But when two bodies meet the conscious interactions are only a small part of the communication.
So?

Prof Wolfe makes an error, imo and IIRC, when he says love can only be experienced if it's directed towards something.
You need to listen to that lecture again, then, because YDNRC.
 
I'm not sure there's a meaningful difference between calling it synthetic and not synthetic.

Well, if it was not synthetic that would imply that you are making the rules.

I was making claims that there is mathematics in reality. Things actually are related in reality.

But this should cause no more problem than the claim that there is English in reality. Things actually have English nouns and verbs in reality.
 
Well, if it was not synthetic that would imply that you are making the rules.
Say what?

Maybe we're speaking a different language, but I interpret this to mean that you are suggesting that either mathematics is artificial, or we make up the rules. That makes no sense to me whatsoever.
But this should cause no more problem than the claim that there is English in reality.
225 is the same number as CCXXV. A XV by XV grid has CCXXV items in it. There's no language you can speak such that there's a different result. Methinks you have a false analogy here.
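The numeral-independence point can be illustrated with a small Roman-numeral parser (my sketch, not anything from the thread): 225 and CCXXV are just two names for the same count, so a XV-by-XV grid comes out the same either way.

```python
# Digit values for Roman numerals.
ROMAN = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s):
    """Parse a Roman numeral, subtracting a digit when a larger one follows."""
    total = 0
    for ch, nxt in zip(s, s[1:] + " "):
        value = ROMAN[ch]
        total += -value if nxt != " " and ROMAN[nxt] > value else value
    return total

print(roman_to_int("CCXXV"))                    # 225
print(roman_to_int("XV") * roman_to_int("XV"))  # 225
```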
 
Say what?

Maybe we're speaking a different language, but I interpret this to mean that you are suggesting that either mathematics is artificial, or we make up the rules. That makes no sense to me whatsoever.

No, descriptions of reality are synthetic because they are synthesised from observation. Mathematics is purely analytical - no observations are required. If one were to "make up the rules" that would be analytical.

225 is the same number as CCXXV. A XV by XV grid has CCXXV items in it. There's no language you can speak such that there's a different result. Methinks you have a false analogy here.

In which language is an apple not an apple?
 
We can explore the same notion from an abstract perspective--even if we create a particular mathematical system, it's still very much a discovery in an abstract search space.

But that search space is entirely defined by physical reality.

As are all abstract spaces.

But there's no lack of things to play this game with. And lo and behold, every time I find thingies I can arrange into a 15 by 15 grid, I wind up with 225 thingies.

This is not evidence that 15 multiplied by 15 would still equal 225 if there were no thingies.

There is, in fact, no evidence at all that 15 or 225 could even exist if there were no thingies.

Which is exactly the point. As you yourself said, mathematics is relationships. How can a relationship exist without things to relate?
 
I just wanted to point out to you, yy2b, that the pertinent issue for this thread is that westprog is claiming a mathematical proof is not sufficient for arguments about physical things.

In particular, he seems to think that the theorems of computer science do not necessarily apply to the physical world.

I (and everyone else here, it seems, except for you, who I am not sure about yet) disagree with such a claim -- if mathematics says something will happen, then reality will match.

Whether this is because mathematics depends on reality (which is my viewpoint) or because reality agrees with mathematics for an unknown reason (which seems to be yours) is irrelevant -- mathematics and reality will always agree.

Right?
 
I just wanted to point out to you, yy2b, that the pertinent issue for this thread is that westprog is claiming a mathematical proof is not sufficient for arguments about physical things.

In particular, he seems to think that the theorems of computer science do not necessarily apply to the physical world.

I (and everyone else here, it seems, except for you, who I am not sure about yet) disagree with such a claim -- if mathematics says something will happen, then reality will match.

Whether this is because mathematics depends on reality (which is my viewpoint) or because reality agrees with mathematics for an unknown reason (which seems to be yours) is irrelevant -- mathematics and reality will always agree.

Right?

Well as a point of order it is the issue of utility that matters. If the math can be used to make a model of reality then that is the important thing, the utility in making predictions.

But mathematics, like any symbolic communication, is only useful via referents; you would not have math if you had not learned how to count on your fingers, etc...
 
This is why I don't bother replying to Pixy any more. I have yet to see him make any kind of an argument on this subject. In an argument about consciousness, I'd at least expect someone to exhibit consciousness. A simple script would be able to run through my posts and put "wrong" "stupid" "irrelevant" "non-sequitur" at the paragraph marks.

Meh...argumentation is certainly not his strong point. The most unnerving part is that he does not seem to be able to see it.
 
No, descriptions of reality are synthetic because they are synthesised from observation.
Alright... makes a little more sense using that sense of the word.
In which langauge is an apple not an apple?
I don't understand the relevance of the question. In set theory, an apple would be an apple. So we're not even out of the realm of math yet.
rocketdodger said:
But that search space is entirely defined by physical reality.
Well, yeah, because we're physical beings. In this same sense, fairies are entirely defined by physical reality. I've no problems with the fact that we need brains in order to come up with a mathematical system, or to come up with fairies. But just because we come up with fairies doesn't mean there are fairies. And just because we come up with a mathematical system doesn't mean there are things that relate in the ways it talks about (though it does suggest that there's an isomorphic description, if that's what you mean).
This is not evidence that 15 multiplied by 15 would still equal 225 if there were no thingies.
Could it be something else? If not, there's hardly a difference between claiming that math does or doesn't exist when nothing is around, since in practice we're around trying to figure out if that's true. Before you tell me it has to be this way or that, you have to convince me that there's an actual meaningful sense in which it must be either this way or that way.
How can a relationship exist without things to relate?
Do you need 225 things in order for 15*15 to be 225? (Or Klein bottles in order for two Mobius strips to be able to be combined into them?) We're presumably talking about the relationships we're talking about, right?
 
