The Hard Problem of Gravity

Apathia said:
Do we regard them as selves then, in the way we regard ourselves as selves?
What of a high-level AI NPC in a game?
Have I killed someone when I blast him away?

I'd say that it's not consciousness itself that we care about, but the complexity of that consciousness. So the subroutine that manages the actions of an NPC might be conscious (in the sense we are using here), but less so than, say, an ant, and also possessing some other properties that make us less inclined to worry about snuffing it out. For example, it can be indefinitely reproduced, and also saved and recreated identically. Would we worry so much about killing even people if we knew for certain that there was reincarnation?

Would you argue that if someone created a program that was more 'complex' than a human it should be entitled to more legal rights than an actual human? Is complexity your criterion for ethics?
 
I (and everyone else here, it seems, except for you, who I am not sure about yet) disagree with such a claim -- if mathematics says something will happen, then reality will match.
Only if the mathematics apply. But if they do, most certainly. You can literally count on it. (And we do!)

Now we can't be absolutely sure they apply, but we don't really care about only what we can be absolutely sure of.
 
Only if the mathematics apply. But if they do, most certainly. You can literally count on it. (And we do!)

Now we can't be absolutely sure they apply, but we don't really care about only what we can be absolutely sure of.

You're right. The problem is when people assume that they have a sufficient description of the thing in question when they really don't.
 
Not if one works in flat space (Euclidean) and the other works in curved space (non-Euclidean).

If the two mathematicians ask different questions then they will get different answers. The ambiguity lies in the language, not the mathematics. Given the same axioms, and the same relationships, then the same results will emerge. Always, without fail.

That 1+1=2 is far more fundamental than the existence of the moon. We can be pretty sure that the moon exists. We can be absolutely certain that 1+1=2.
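(For what it's worth, that certainty comes cheap. As a minimal sketch, assuming the standard Peano definitions, where 2 is defined as S(1) and addition satisfies a + 0 = a and a + S(b) = S(a + b), the whole proof is

\[ 1 + 1 = 1 + S(0) = S(1 + 0) = S(1) = 2. \]

No observation of the moon, or of anything else, is needed.)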
 
Sure. So are the relationships that the language is trying to describe. And most certainly, the logical implications are part of it. The whole thing is math.

Language, and its conventions, are inherently open-ended. The difference between everyday spoken language and formal mathematics is that the latter deals in quantitative analysis while the former may convey qualitative elements as well.
 
Of course I can understand it. But I don't know what you think the question means. I'm pretty sure that someone here just said something like

But I don't know who that might have been...


I know what I mean by that. I have no idea what you mean, or even if the question is meaningful given your definition.

I explained the two potential meanings. I didn't just trot out "define terms," like some kind of bloody chatbot. If you understand the question, why didn't you just answer it? I don't see what the problem is.

That's why I asked for your definition. Which you haven't given.

I don't want to spend my day coming up with definitions for terms as basic as "aware," no doubt only to have those definitions questioned. I don't see the point. I submit that in this context the meaning is clear. You said you understood the question. Can you see a clear alternative meaning? If not, why not just answer it?


You haven't read the conversation between Winograd and SHRDLU, have you?

If you mean the one Hof writes with Wino using a pseudonym, I flicked through it fairly slowly.


I can regard you as a self. I can regard myself as a self.

Of course it is. So?

If shrd isn't programmed to create internal representations of self, why would it happen?

Nick
 
Why?


Which are just processing.


Really? And your evidence for this assertion is....?

We actually don't know anywhere near enough about human feelings to give them to machines or to know if that is possible.

If you take a behaviourist perspective that feelings are the executors of evolutionary logic then it is clearly possible to programme the behaviour of, say, empathy or sorrow or happiness, into a robot. But it will be (in almost certain likelihood) an f-zombie.
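By way of a purely hypothetical illustration (nothing here comes from any real system), this is the sort of thing "programming the behaviour of empathy" amounts to: a bare stimulus-response rule, with nothing that could plausibly be called a feeling behind it.

```python
# Hypothetical sketch: outward "empathy" as bare stimulus-response.
# Nothing in here experiences anything; it only maps inputs to consoling outputs.

SAD_CUES = {"sad", "lonely", "grieving", "hurt", "lost"}

def empathic_reply(utterance: str) -> str:
    """Return a consoling line if the input contains a sadness cue."""
    words = set(utterance.lower().split())
    if words & SAD_CUES:
        return "I'm so sorry you're going through that. I'm here for you."
    return "Tell me more."

print(empathic_reply("i feel so lonely today"))  # prints the consoling line
```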


I see. So they behave in all ways as though they have feelings, but don't have feelings. They are f-zombies.

For sure you can programme a computer with the behaviour of empathy. But I find it grossly unlikely (to say the least) that it experiences the feelings involved. Of course it's hard to definitively prove either way, given that a human cannot know what a computer feels.

You don't agree?

Nick
 
We actually don't know anywhere near enough about human feelings to give them to machines or to know if that is possible.

If you take a behaviourist perspective that feelings are the executors of evolutionary logic then it is clearly possible to programme the behaviour of, say, empathy or sorrow or happiness, into a robot. But it will be (in almost certain likelihood) an f-zombie.




For sure you can programme a computer with the behaviour of empathy. But I find it grossly unlikely (to say the least) that it experiences the feelings involved. Of course it's hard to definitively prove either way, given that a human cannot know what a computer feels.

You don't agree?

Nick

Would you expect Geppetto to admit that Pinocchio is not a real boy? ;)
 
So... English is in reality or not?
Sure is. Are you trying to prove something I'm not saying again?

ETA: Not all English statements are "in reality" in any meaningful sense. E.g., repeat "buffalo" 2^40 times.
 
You need to listen to that lecture again, then, because YDNRC.

I listened. I mean it's a little complex because he's also trying to distinguish love from emotions, but he does say that you can't really wake up in the morning and feel love. It's interesting because to me love, in the sense Wolfe doesn't really believe in, does tend to occur as self-awareness deepens, along with a tendency for identification with thought to decrease. This is another of those things that is going to be problematic for Strong AI at some point, I expect.

Nick
 
Sure is. Are you trying to prove something I'm not saying again?

Depends what you mean by "in reality" I guess.

ETA: Not all English statements are "in reality" in any meaningful sense. E.g., repeat "buffalo" 2^40 times.

So, I take it, not all mathematical statements are "in reality" in any meaningful sense?
 
I just wanted to point out to you, yy2b, that the pertinent issue for this thread is that westprog is claiming a mathematical proof is not sufficient for arguments about physical things.

I think this is a fairly conventional viewpoint. The essence of mathematics is that it does not make claims about the real world. A physical theory makes claims about the real world. Such a theory may have a mathematical formulation.

In particular, he seems to think that the theorems of computer science do not necessarily apply to the physical world.

I (and everyone else here, it seems, except for you, who I am not sure about yet) disagree with such a claim -- if mathematics says something will happen, then reality will match.

Mathematics never says that something will happen. It's always physics that makes physical predictions. There is no mathematical theory of gravitation. There's the Newtonian theory, which uses the mathematics of the inverse square law, but it's a physical theory. There's the Einsteinian General theory, which uses different mathematics, but it's still a physical theory.
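For reference, the inverse square law at the heart of the Newtonian theory:

\[ F = \frac{G\, m_1 m_2}{r^2} \]

The equation on its own is mathematics; the physical theory is the further claim that the symbols refer to actual masses, distances and forces.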

This is the problem with the field of computing - it's based on mathematical theory, not physical. That works fine if you are trying to establish mathematical truths. It's also helpful when creating devices to implement mathematical ideas. But there's no physical concept of computing.

Whether this is because mathematics depends on reality (which is my viewpoint) or because reality agrees with mathematics for an unknown reason (which seems to be yours) is irrelevant -- mathematics and reality will always agree.

Right?
 
Well, yeah, because we're physical beings. In this same sense, fairies are entirely defined by physical reality. I've no problems with the fact that we need brains in order to come up with a mathematical system, or to come up with fairies. But just because we come up with fairies doesn't mean there are fairies. And just because we come up with a mathematical system doesn't mean there are things that relate in the ways it talks about (though it does suggest that there's an isomorphic description, if that's what you mean).

Yes, that is exactly what I mean.

Even the most abstract theorem of mathematics -- far removed from any real-world application -- must be isomorphic to the description of some behavior of physical reality, because in the end that theorem itself is just a set of particles that behaves in a mathematically describable way.

Could it be something else? If not, there's hardly a difference between claiming that math does or doesn't exist when nothing is around, since in practice we're around trying to figure out if that's true. Before you tell me it has to be this way or that, you have to convince me that there's an actual meaningful sense in which it must be either this way or that way.

The issue is whether or not every mathematical theorem thought up by humans has an isomorphism to the physical behavior of reality.

Now, if mathematics does not exist without reality, then the answer to that is a trivial "yes."

If mathematics exists when nothing is around, though, then it is not so cut and dried.


Do you need 225 things in order for 15*15 to be 225? (Or Klein bottles in order for two Mobius strips to be able to be combined into them?) We're presumably talking about the relationships we're talking about, right?

No, of course not. What you do need is a system capable of assuming at least 225 distinct states.
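As a toy illustration of that point (the grid here is entirely made up), a 15-by-15 grid of cells is itself a system with 225 distinct states, and 15*15 = 225 falls out of simply counting them; no 225 separate objects are required:

```python
# A 15x15 grid: each (row, col) coordinate pair is one distinct state of the system.
states = {(row, col) for row in range(15) for col in range(15)}

assert len(states) == 15 * 15 == 225
print(len(states))  # 225
```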
 
The essence of mathematics is that it does not make claims about the real world.

For any type of thing, if you add a set of 4 of them to a set of 5 of them, you end up with a set of 9 of them.

That is a claim about the real world, and it is pure mathematics.

What on Earth do you think numbers are anyway?
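Here is that claim rendered as a trivial (and entirely hypothetical) Python check, with the usual caveat that the two sets must be disjoint, i.e. no thing is counted twice:

```python
# Two disjoint collections of (made-up) labelled things.
four_things = {"pebble_1", "pebble_2", "pebble_3", "pebble_4"}
five_things = {"pebble_5", "pebble_6", "pebble_7", "pebble_8", "pebble_9"}

combined = four_things | five_things  # set union; disjoint, so nothing collapses
assert len(combined) == 4 + 5 == 9
print(len(combined))  # 9
```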

A physical theory makes claims about the real world. Such a theory may have a mathematical formulation.

A physical theory is merely a particular instance of mathematics. As such, it is always mathematical.

Furthermore, there is absolutely no statement of mathematics that is not isomorphic to some physical theory. That is, unless you are Roger Penrose or one of his supporters.

Mathematics never says that something will happen.

Wrong. When you combine a set of five discrete things with a set of four discrete things you get a set of nine discrete things. That is what mathematics says, and it always happens.

It's always physics that makes physical predictions. There is no mathematical theory of gravitation. There's the Newtonian theory, which uses the mathematics of the inverse square law, but it's a physical theory. There's the Einsteinian General theory, which uses different mathematics, but it's still a physical theory.

And all of them are mathematical statements about physical objects. Nothing more.

Like I said, a physical theory is just a particular instance of mathematics. Kind of like a poem about a certain woman is a particular instance of English.

This is the problem with the field of computing - it's based on mathematical theory, not physical. That works fine if you are trying to establish mathematical truths. It's also helpful when creating devices to implement mathematical ideas. But there's no physical concept of computing.

You have been wrong about this for the entire duration of the thread. You still are.

At least you are consistent in your errors.
 
An apple is clearly not round in the same sense in which a sphere is round. Nothing in the physical universe is a perfect sphere. A sphere is a mathematical concept.

Irrelevant. Dodger didn't say the apple was spherical, but round. The exact shape of a particular apple can be described mathematically.
 
