The Hard Problem of Gravity

yy2bggggs, would you please weigh in on another issue?

Westprog contends that computation theory is not a physical theory.
...
What is your opinion?
My opinion is that there is a fundamental problem with communication that is occurring here. Westprog does not mean what you interpret him to mean, but he's not saying the right things to communicate what he does mean. It's very painful to watch.

(ETA: What makes it more painful is that the above isn't necessarily true... but I can see how it could be. As such, even if I stand in and try to argue his point, I'm not really sure it's his.)

(ETA: Also, keep in mind that I'm fairly skilled at OO programming, which corrupts the brain--because it pushes me into a crippled condition of thought whereby I'm forced to admit such ludicrous things as that circles are not ellipses.)
 
My opinion is that there is a fundamental problem with communication that is occurring here. Westprog does not mean what you interpret him to mean, but he's not saying the right things to communicate what he does mean. It's very painful to watch.

(ETA: What makes it more painful is that the above isn't necessarily true... but I can see how it could be. As such, even if I stand in and try to argue his point, I'm not really sure it's his.)

Knowing precious little about math, and trying not to add further, perhaps spurious, interpretation, it strikes me that what Westprog is saying is that... because mathematical relationships exist outside of the brain, in the physical world, the final nature of those relationships is non-physical. It strikes me that, should this be his position, then he is actually just railing against old interpretations of materialism.

Nick
 
If there were an artificial construct created with operational complexity comparable to, or greater than, that of a human, would you consider it to have 'greater' value than the life of a human?

For example, would it be justified to kill, or otherwise harm, a human to prevent harm from being done to said construct merely on the basis of its complexity?
Please reconsider this line of argument - Pixie's OP regarding 'what we care about' specifically stated "complexity of consciousness", not some vague overall complexity.
 
Please reconsider this line of argument - Pixie's OP regarding 'what we care about' specifically stated "complexity of consciousness", not some vague overall complexity.

I don't believe anyone much cares about complexity of consciousness. I identify myself as a human being, experience empathetic response with other humans, and it becomes hard to hurt them. I create social rules to protect myself and others. Somewhere way down the line complexity could be argued as a factor but I figure it's a long way down. Humans just don't think about this stuff so much. Our rules come more from the gut. Certainly when we look at other species a far more important factor than complexity is how fluffy they are.

Nick
 
It's only if GC is true that we run into a possible problem for provability. The only scenario in which we could not prove the GC either true or false is one in which the GC is true.

Yes, but that's why, if the only way to show GC is true is to enumerate all its instances, then the GC is undecidable: no matter how many instances you enumerate, it *may* be the case that the next instance is false.

Therefore you cannot formally decide that the GC is true if you are forced to use such a brute force method.
 
(ETA: Also, keep in mind that I'm fairly skilled at OO programming, which corrupts the brain--because it pushes me into a crippled condition of thought whereby I'm forced to admit such ludicrous things as that circles are not ellipses.)

Well, no it doesn't. It does force you to choose *ONE* set of relationships between objects, though, and that's where you have to compromise and perhaps choose that circles are not ellipses.
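For anyone who hasn't met the circle-ellipse joke before, here is a minimal Python sketch of the compromise being described (the class and method names are illustrative, not anything from the posts):

```python
class Ellipse:
    """A mutable ellipse with independently settable semi-axes."""
    def __init__(self, a, b):
        self.a = a  # horizontal semi-axis
        self.b = b  # vertical semi-axis

    def stretch(self, factor):
        """Scale only the horizontal axis -- legitimate for any ellipse."""
        self.a *= factor


class Circle(Ellipse):
    """The tempting 'a circle IS an ellipse' subclass."""
    def __init__(self, r):
        super().__init__(r, r)


c = Circle(1.0)
c.stretch(2.0)     # inherited, and perfectly legal for an Ellipse...
print(c.a == c.b)  # prints False: our "Circle" is no longer circular
```

Either Circle forbids stretch() (and then it can't substitute for Ellipse everywhere an Ellipse is expected), or it allows it (and then it isn't really a circle). That forced choice of one relationship is exactly the compromise.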
 
My opinion is that there is a fundamental problem with communication that is occurring here. Westprog does not mean what you interpret him to mean, but he's not saying the right things to communicate what he does mean. It's very painful to watch.

(ETA: What makes it more painful is that the above isn't necessarily true... but I can see how it could be. As such, even if I stand in and try to argue his point, I'm not really sure it's his.)

(ETA: Also, keep in mind that I'm fairly skilled at OO programming, which corrupts the brain--because it pushes me into a crippled condition of thought whereby I'm forced to admit such ludicrous things as that circles are not ellipses.)

When I say that computation is not a physical theory, I just mean that it's not a theory in physics. I hope that is clear enough.

Ideas in economics and poems and favourite chatup lines can all be modelled in the physical world, but that doesn't make them part of physics.
 
Well, no it doesn't. It does force you to choose *ONE* set of relationships between objects though and that's where you have to compromise and perhaps choose that circles are not ellipses.

If you derive circles from ellipses then you don't even have to do that. But while that's correct geometrically, it's messier to program.
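A sketch of what deriving circles from ellipses might look like in Python (hypothetical names again): geometrically honest, but every axis-changing operation must now guard the invariant, which is the messiness in question.

```python
class Ellipse:
    """An ellipse whose semi-axes can be rescaled."""
    def __init__(self, a, b):
        self.a, self.b = a, b

    def scale_axes(self, fa, fb):
        self.a *= fa
        self.b *= fb


class Circle(Ellipse):
    """A circle derived from Ellipse: correct geometrically (a == b),
    but every mutator that can change the axes must be overridden to
    preserve that invariant -- the 'messier to program' part."""
    def __init__(self, r):
        super().__init__(r, r)

    def scale_axes(self, fa, fb):
        if fa != fb:
            raise ValueError("uneven scaling would break the circle invariant a == b")
        super().scale_axes(fa, fb)
```

So `Circle(2.0).scale_axes(3.0, 3.0)` leaves a valid circle with both axes 6.0, while uneven scaling raises instead of silently producing a non-circle.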
 
When I say that computation is not a physical theory, I just mean that it's not a theory in physics. I hope that is clear enough.
Oh, it's clear. It's just that it's both pointless and false. Pointless, because most scientific theories aren't normally considered part of physics; they result from the principles of physics and are reducible to theories of physics, and that's what we mean by physical. It's also irrelevant that the scientists who study these specialised sub-fields of physics (like chemistry or biology) don't call themselves physicists.

Ideas in economics and poems and favourite chatup lines can all be modelled in the physical world, but that doesn't make them part of physics.
Economics is sociology is psychology is biology is chemistry is physics.

If it happens in the real world, it's physics.
 
Yes but that's why if the only way to show GC is true[not false] is to enumerate all its instances then the GC is undecidable
Fixed it for you (as phrased, it was sort of contradictory).

And no, that's incorrect. If it's impossible to prove that GC is false (GC mind you, not statements in general), then GC is true.

Think of it this way. Suppose you have a proof that it's impossible to prove the GC is false. That proof is ipso facto equivalent to a proof that GC is true.

Or think of it this way. You can build a device (an abstract one) that enumerates all of the integers, and stops once it finds a counterexample to the GC. This machine may or may not halt. If it's impossible to prove the GC false, it won't halt (assuming you build it correctly--described below). If it doesn't halt, GC is true. GC is equivalent to the statement that this machine won't halt.

Building the machine correctly requires that for any arbitrarily large number M that you choose, I can name a number S such that if your machine ever reaches the S-th step, it has either confirmed the GC up to M, or found a counterexample.

This is possible for GC because every candidate can be proven as either consistent with GC or inconsistent with it, in a finite number of steps, and I can build a machine such that I can give you a ceiling for how many steps it would take for each M (that is, I can build a machine such that I can do what I'm required to do above--given infinite tape, of course, and perhaps a spherical cow).
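The abstract device described above can be sketched in Python, truncated at a finite bound since the real machine might never halt (the function names are illustrative):

```python
def is_prime(n):
    """Trial-division primality test: slow, but finite for each n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True


def goldbach_holds(n):
    """Is the even number n >= 4 a sum of two primes?  Each candidate
    is settled one way or the other in a finite number of steps."""
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))


def search_counterexample(limit):
    """The enumerating machine, truncated at `limit` (the real abstract
    machine would run forever).  Halts with the first even number that
    is NOT a sum of two primes; returns None if GC holds up to `limit`."""
    for n in range(4, limit + 1, 2):
        if not goldbach_holds(n):
            return n  # the machine halts: GC is false
    return None  # no halt so far: GC confirmed up to `limit`


print(search_counterexample(10_000))  # prints None: no counterexample below 10,000
```

Remove the limit and, if GC is true, the loop never returns; that is exactly the sense in which GC is equivalent to the statement that the machine won't halt.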
Well, no it doesn't. It does force you to choose *ONE* set of relationships
In almost all cases, neither is a proper relationship, but I won't argue for that in this thread. I was mainly just having a joke at my own expense.
 
Please reconsider this line of argument - Pixie's OP regarding 'what we care about' specifically stated "complexity of consciousness", not some vague overall complexity.

The question I posed was partly rhetorical and partly out of genuine curiosity as to what his position is on the matter.

My point was that cognitive complexity ["complexity of consciousness" or w/e you wanna call it] is hardly relevant in ethical or moral considerations. What qualifies a subject as being worthy of moral consideration is whether or not it has the capacity to experience suffering -- or experience anything in a qualitative manner at all. If one wanted to argue for ethics on the basis of "complexity of consciousness" then the life of a person of average intelligence would easily trump that of someone who is mentally handicapped.

The fact of the matter is that his 'definition' of consciousness is so intellectually impoverished that one can't even formulate a sane and coherent basis of ethics from it.
 
The question I posed was partly rhetorical and partly out of genuine curiosity as to what his position is on the matter.

My point was that cognitive complexity ["complexity of consciousness" or w/e you wanna call it] is hardly relevant in ethical or moral considerations. What qualifies a subject as being worthy of moral consideration is whether or not it has the capacity to experience suffering -- or experience anything in a qualitative manner at all. If one wanted to argue for ethics on the basis of "complexity of consciousness" then the life of a person of average intelligence would easily trump that of someone who is mentally handicapped.

The fact of the matter is that his 'definition' of consciousness is so intellectually impoverished that one can't even formulate a sane and coherent basis of ethics from it.

That seems a bit confused to me. Are you saying we should define consciousness on the basis of who it's ok to kill or similar?

I think it's very hard to make intellectually sound statements here. It's more people's gut reactions that call the shots. With suffering... so, you're saying that if there's a lesion in your ventromedial PFC, which prevents you from feeling anything very much, then you have less right to live?

See what I mean?

Nick
 
Oh, it's clear. It's just that it's both pointless and false. Pointless, because most scientific theories aren't normally considered part of physics; they result from the principles of physics and are reducible to theories of physics, and that's what we mean by physical. It's also irrelevant that the scientists who study these specialised sub-fields of physics (like chemistry or biology) don't call themselves physicists.


Economics is sociology is psychology is biology is chemistry is physics.

If it happens in the real world, it's physics.

That is absolute rubbish and you should know better. There is nothing in current physical theory that is even remotely applicable to the domains of economics, sociology, or psychology. If anything, physical theory sets boundary conditions and limitations on what can be accomplished within those domains, but it does not dictate, predict, or fully explain their goings-on. I dare you to try and refute this. The onus is on you to not only explain how a field like sociology is directly reducible to known physics but also provide a link to a single scientific paper that illustrates that physics necessarily predicts the emergence of life and other higher order processes.

The ironic thing is that when it was proposed to you that there might be a way to integrate all those fields into one coherent theoretical framework, you outright pooh-poohed it.
 
When I say that computation is not a physical theory, I just mean that it's not a theory in physics. I hope that is clear enough.
Not quite. There's an ambiguity that's absolutely critical to address here--one that's forgivable for missing, because we don't usually tend to obsess with such things.

That ambiguity is in what exactly you mean by "not a physical theory". "Is" relationships are, on occasion, and particularly in this context, not as straightforward as they are usually treated to be.

Part of my OO joke was intended in the ha-ha-only-serious sense that even extremely obvious "is's", such as circles being ellipses, have subtle contexts in which what is meant by the "is" must be described. Since I take it a large number of netizens are naturally CS geeks, I'm generally assuming (or hoping) that there are a lot of people who can identify with this canonical example.

ETA: Put it this way. Every positive claim that rocketdodger is making is true, in the sense that he is intending it (culling out, for the moment, his claim that you're wrong). Do you agree or disagree? If you agree, yours is to show the sense in which you are intending your claim that is different. If you disagree, you're either wrong or you don't understand what rocketdodger is saying.

Oh, and in addition, it's entirely possible you mean something different by computation than rocketdodger does. That has to be nailed down exactly as well. It wouldn't hurt to nail down physical theory either. In fact, just going formal in general would help.

Argh... I keep adding to this, but I think it's necessary to clarify.

What conditions must be true in order to meaningfully say that computational theories are physical theories, and what conditions must be true (yes, it has to be treated separately--otherwise the "meaningless" category creeps in) in order to say that computational theories are not physical theories? The answer to both of these questions is what I was talking about above--they are the definition of "is". </clinton>
 
That seems a bit confused to me. Are you saying we should define consciousness on the basis of who it's ok to kill or similar?

I think it's very hard to make intellectually sound statements here. It's more people's gut reactions that call the shots. With suffering... so, you're saying that if there's a lesion in your ventromedial PFC, which prevents you from feeling anything very much, then you have less right to live?

See what I mean?

Nick

Oh, I gotcha :)

My point is that we extend moral rights to others (whether they be humans or critters) because of their presumed inherent capacity to experience. To empathize with a subject necessarily implies some shared subjectivity. One would prefer not to harm a dog because we can empathize with it subjectively -- not because it meets some explicit operational criteria of cognition.

Even in the case of euthanasia, it's no coincidence that the subjective capacity of the individual in question is central to the ethical consideration. We ask questions like "are they conscious?", "will they suffer?", and "do we have the right/responsibility to take their life?".

The same consideration extends to the question of abortion. One of the major contentions is whether or not an embryo/fetus can experience pain [i.e. does it have some subjective capacity] and whether or not this affords them legal rights.
 
You can build a device (abstract one) that does enumerate through all of the integers, and stops once it finds a counterexample to the GC. This machine may or may not halt.

Yes. This is what makes it formally undecidable. This is the reduction of finding GC to the Halting Problem.

If you could write a function to know if your machine would halt or not we could find GC. Hell, we could prove anything.

Since we can't, we can't, and your proposed machine doesn't change that. You do not know whether it will halt; you do not, therefore, know how many steps it would take to find a counterexample to the GC; and if the GC is true, you cannot compute that it is true. It is therefore formally undecidable.

Unless it's false and you find a counterexample. Then it's decided. But the point is that you don't know using this technique. Some other proof mechanism might do it but this one cannot. (And until one finds it you are still left not knowing one way or the other).
 
Oh, I gotcha :)

My point is that we extend moral rights to others (whether they be humans or critters) because of their presumed inherent capacity to experience. To empathize with a subject necessarily implies some shared subjectivity. One would prefer not to harm a dog because we can empathize with it subjectively -- not because it meets some explicit operational criteria of cognition.

I think one has to be careful when intellectualising our gut responses in situations like this. There can be a wisdom to feelings that transcends the intellect. This aside, for sure we can develop empathy towards dogs easily, and it can "feel wrong" that they should be killed.

Even in the case of euthanasia, it's no coincidence that the subjective capacity of the individual in question is central to the ethical consideration. We ask questions like "are they conscious?", "will they suffer?", and "do we have the right/responsibility to take their life?".

The same consideration extends to the question of abortion. One of the major contentions is whether or not an embryo/fetus can experience pain [i.e. does it have some subjective capacity] and whether or not this affords them legal rights.

Prior to Strong AI being recognised as the truth, someone somewhere is going to have to think of a whole heap of clever answers to a million ethical questions that will no doubt arise in its wake.

Nick

eta: I mean, it's a bit crazy all of it really, if you ask me. Anyway, even if the foetus feels pain it doesn't have a self to place the pain onto. I don't know, it seems to me that only in America are people so mad as to want to get lawyers for embryos. No doubt I'll get hassle for that one. But I do think it would be more reasonable to put energy into stopping wars first.
 
