I am sorry to bring an old post up like this, but this thread is moving faster than I can keep up!
You lost me with this quote.
The propositions H1, "there is a grue in my house," and H0, "there isn't," have equal probabilities to me because of my total ignorance about them. The logic is simple: I have no reason to believe that P(H1) > P(H0) and no reason to believe that P(H0) > P(H1), but P(H1) + P(H0) = 1; therefore, P(H1) = P(H0) = 0.5.
There is no reason to believe that P(H1) > P(H0), and there is also no reason to believe that P(H0) > P(H1), and this leads you to believe that P(H0) = P(H1). Why?
It's self-evident, isn't it? If I have no reason to believe that either H0 or H1 is more likely than the other, then I must believe that they are equally likely. The conclusion is logically necessary.
This is formalized by the principle of indifference, which says that if we have equal amounts of evidence for two propositions, we must assign them the same probability. No evidence for either of two propositions is just a special case of equal evidence.
I would have said there is no reason to believe any of these, so it would be impossible to assign a probability.
I would say that if you can't assign a prior probability to a proposition, then you can't learn about it, or at least you can't rationally assign a probability to it after you have collected evidence for (or against) it. Conversely, if you collect some data on the hypothesis, and do rationally assign a probability to the hypothesis after seeing the data, then you must have had a prior probability all along. Consider the following example.
You conduct an experiment and collect some data on a hypothesis, H1, and its complement, H0. You analyze the data using an appropriate statistical model and determine that the data are three times as likely under H1 as under H0. Now, let's say that on the basis of that experiment you conclude that H1 is three times as likely as H0. Then what you're telling me is that all the information you have on these hypotheses came from the experiment; you knew nothing about them beforehand. Now what happens if I plug your numbers into Bayes' Theorem, work backwards, and calculate your prior probabilities? I get .5 for each hypothesis, just as the principle of indifference dictates.
If, on the other hand, you conclude that the relative probabilities of H1 and H0 are anything other than 3:1—that is, any ratio other than what the data told you—then you must be taking other information into account (or making a mistake). That is, you had a reason to believe one of the hypotheses was more likely all along. For instance, if you now believe that H1 is
four times as probable as H0, and we plug your numbers into Bayes' Theorem, we find that your prior probability for H1 was not .50, but in fact .57, implying that you had a small preference for H1 over H0 from the start.
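That back-calculation is easy to check. Here is a small sketch in Python (the function name implied_prior is mine, not anything from the thread): posterior odds equal prior odds times the likelihood ratio, so dividing the posterior odds by the likelihood ratio recovers the prior odds, which convert to a prior probability.

```python
def implied_prior(likelihood_ratio, posterior_odds):
    """Return the prior P(H1) implied by the posterior odds and the likelihood ratio."""
    # Bayes' theorem in odds form: posterior odds = prior odds * likelihood ratio,
    # so the prior odds are posterior odds / likelihood ratio.
    prior_odds = posterior_odds / likelihood_ratio
    # Convert odds for H1 into a probability.
    return prior_odds / (1 + prior_odds)

# If you end up believing H1 is exactly 3 times as likely as H0 -- the same
# ratio the data gave -- your prior must have been even:
print(implied_prior(3, 3))  # 0.5

# If you instead end up at 4:1, your prior for H1 was 4/7, about .57:
print(implied_prior(3, 4))
```

Run with the numbers from the example above, it reproduces the .5 and roughly .57 priors exactly.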
Also, when I read the Wikipedia article that you linked to, I find:
I do not think that ESP and the existence of alien life are possibilities that are indistinguishable except for their names, and for that reason it seems that the principle of indifference is not applicable here.
I agree.