phobos
Unregistered
I made a post in the thread 'The Motivated Assumption' claiming that:
Unfortunately, I can't (say) encode a DVD rip of Two Towers in the finer decimal places of the 'a' value of my qubit and then teleport it to a friend. Even if I have a means of rigging my qubit so precisely, the receiver will never know the exact values of a and b, because when you measure a superposition it comes out |1> or |0>, not both. By statistical analysis of a whole _lot_ of my teleported qubits, he could determine the values of a and b to arbitrary accuracy (since |a|^2 gives the probability of |0> and |b|^2 the probability of |1>), but then I have to send so many that I might as well have used old-fashioned binary :-(
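Just to make that 'statistical analysis' step concrete, here's a rough sketch of what the receiver would have to do - my own toy illustration in Python, with a made-up amplitude, not anything from the post above:

import math
import random

# Toy illustration (made-up numbers): a qubit a|0> + b|1> with real,
# non-negative amplitudes.  Each teleported copy, when measured, gives
# |0> with probability a^2 and |1> with probability b^2 -- the receiver
# only ever sees these outcomes, never a and b themselves.

a = math.sqrt(0.3)         # hypothetical amplitude for |0>
b = math.sqrt(1 - 0.3)     # amplitude for |1>, so a^2 + b^2 = 1

for n in (100, 10_000, 1_000_000):
    zeros = sum(random.random() < a**2 for _ in range(n))  # count of |0> results
    est_a2 = zeros / n              # receiver's estimate of a^2
    est_b2 = 1 - est_a2             # and hence of b^2
    print(n, est_a2, est_b2, abs(est_a2 - a**2))

Running it for a few values of n gives a feel for how the receiver's estimate tightens as more copies are sent.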
For those who haven't been following the quantum mechanics of it all, suppose I've encoded Two Towers in a rigged coin. The digits of the probability of a heads result (written in binary, say) reproduce the binary representation of the film, which is 1.4 billion bytes (a standard two-CD-R rip).
The question is, how many tosses will it take before someone can expect to have a good-quality copy of the film? I claimed that it would be so many that I might as well have sent the binary directly - in other words, 11.2 billion tosses. However, probability theory isn't really my field, and so I could quite easily be wrong here. How many times would I need to toss the coin before I could reasonably claim to have established the probability of a heads to 11.2 billion binary places?
It seems, intuitively, that each toss would double the accuracy of my estimate of the probability, and thus add an extra digit to my binary representation of the probability. So I'd need exactly the same number of tosses as there are bits in the encoded message.
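To make that something you can actually play with, here's a small sketch - again my own, with a made-up 16-bit 'message' standing in for the film - that rigs the coin this way, tosses it n times, estimates the probability of heads from the frequency, and counts how many leading bits of the message the estimate gets right:

import random

# Made-up 16-bit "message" standing in for the 11.2-billion-bit film.
message = "1011001110001011"

# Rig the coin: P(heads) = 0.b1 b2 b3 ... with the message as the binary digits.
p_heads = sum(int(bit) * 2.0**-(i + 1) for i, bit in enumerate(message))

def recovered_bits(n_tosses):
    """Toss the rigged coin n_tosses times, estimate P(heads) from the
    frequency of heads, and count how many leading bits of the message
    that estimate reproduces before it first goes wrong."""
    heads = sum(random.random() < p_heads for _ in range(n_tosses))
    estimate = heads / n_tosses
    good = 0
    for bit in message:
        estimate *= 2
        digit = int(estimate)      # next binary digit of the estimate
        estimate -= digit
        if digit != int(bit):
            break
        good += 1
    return good

for n in (16, 1_000, 1_000_000):
    print(n, "tosses ->", recovered_bits(n), "leading bits recovered")

Plugging in bigger and bigger values of n gives an empirical feel for how the number of tosses relates to the number of bits recovered, which is exactly what I'm asking about.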