Originally posted by Robin
> But the argument only shows that A doesn't know; it says nothing at all about what B knows. After all, if Penrose's reasoning is valid then it could also be used by B, which would simply be prevented from reporting this by the artifice of A.

A computer program doesn't "know" things in the same way that a person does. A person might know something, yet refuse to tell anyone about it. What a computer program "knows", on the other hand, is, by definition, just what it reports. What other meaning could be given to the expression "such-and-such a program knows such-and-such a fact"?
> But do we even know that A does not stop?

If it never gives the wrong answer, then it doesn't stop. This is straightforward: if it did stop, then, by that very act of stopping, it would be giving the wrong answer.
> Penrose's reasoning is that A does not stop because if it did stop it would not. That seems rather weak to me.

"If it did stop it would not" sounds slightly odd, I agree---it's not something that a nonmathematician would ordinarily say---but, strictly speaking, there's nothing wrong with it. If p implies not-p, we can logically conclude that p is false: p can't be true, because that would lead to a contradiction, and there's no problem with its being false. So it's false.
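The logical step above---that "p implies not-p" forces p to be false---can be checked mechanically. Here is a minimal truth-table verification in Python (an illustration I've added, not part of the original discussion), encoding material implication as "(not a) or b":

```python
def implies(a, b):
    """Material implication: a -> b is equivalent to (not a) or b."""
    return (not a) or b

# Verify that (p -> not p) -> not p holds for every truth value of p,
# i.e. that "p implies not-p" really does entail that p is false.
results = [implies(implies(p, not p), not p) for p in (True, False)]
print(results)  # -> [True, True]: the entailment holds in both cases
```

Since the formula comes out true under both assignments of p, it is a tautology: whenever p implies its own negation, p must be false.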
> the more correct conclusion would simply be that there is some algorithm that B cannot tell if it halts

Not just "some algorithm": we know which one. It's A. The only way we know such an algorithm exists in the first place is by explicitly constructing it. We construct an A, based on B, in such a way that B can't correctly say that A doesn't halt. We know B can't correctly say so, because if B did say so, A would halt, and B's answer would turn out to be incorrect after all.
There are two possibilities:
(1) B says that A doesn't halt; A halts; B gave the wrong answer.
(2) B doesn't say that A doesn't halt; A doesn't halt; B failed to give the right answer.
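The construction of A from B described above can be sketched in code. This is a hypothetical illustration I've added (the names `make_A`, `B1`, and the toy decider are mine, and a real B would of course be more than a one-liner); the point is the shape of the diagonal construction, in which A consults B's verdict about A itself and then does the opposite:

```python
def make_A(B):
    """Given a purported halting-predictor B, build a program A that
    defeats it: B cannot correctly say that A doesn't halt."""
    def A():
        if B(A) == "doesn't halt":
            # Case (1): B says A doesn't halt -- so A halts,
            # and B's answer turns out to be wrong.
            return "halted"
        else:
            # Case (2): B fails to say "doesn't halt" -- so A loops
            # forever, and B has failed to give the right answer.
            while True:
                pass
    return A

# Toy stand-in for B that always answers "doesn't halt" (case 1):
B1 = lambda prog: "doesn't halt"
A1 = make_A(B1)
print(A1())  # -> "halted": B1 gave the wrong answer about A1
```

Case (2) cannot be demonstrated by running the code, since A then never terminates; but that is exactly the situation in which B has silently failed to report the truth about A.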
> This is what I am trying to show with my 'Dunbar Conference' example - that a human mathematician working under the same constraints will also not be able to signal the correct answer.

Yes, I agree. That example is perfect.