
Probability

phobos

I made a post in the thread 'The Motivated Assumption' claiming that

Unfortunately, I can't (say) encode a DVD rip of Two Towers in the finer decimal points of the 'a' value of my qubit and then teleport it to a friend. Even if I have a means of rigging my qubit so precisely, the receiver will never know the exact values of a and b, because when you measure a superposition it comes out |1> or |0>, not both. By statistical analysis of a whole _lot_ of my teleported qubits, he could determine the values of a and b to arbitrary accuracy (since a determines the probability of |0> and b the probability of |1>), but then I have to send so many that I might as well have used old-fashioned binary :-(

For those who haven't been following the quantum mechanics of it all, suppose I've encoded Two Towers in a rigged coin. The digits of the probability of a heads result (in a binary number base, let's say) reproduce the binary representation of the film, which is 1.4 billion bytes (a standard two-CDR rip).

The question is: how many tosses will it take before someone can expect to have a good-quality copy of the film? I claimed that it would take so many that I might as well have sent the binary directly - in other words, 11.2 billion tosses. However, probability theory isn't really my field, so I could easily be wrong here. How many times would I need to toss the coin before I could reasonably claim to have established the probability of a Heads to 11.2 billion binary places?
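(A quick sanity check on that figure, assuming the usual 8 bits per byte:)

```python
bytes_in_rip = 1_400_000_000  # 1.4 billion bytes, the stated two-CDR rip
bits = bytes_in_rip * 8
print(bits)  # 11200000000 - the 11.2 billion figure above
```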

It seems, intuitively, that each toss would double the accuracy of my estimate of the probability, and thus add an extra digit to my binary representation of the probability. So I'd need exactly the same number of tosses as there are bits in the encoded message.
 
Probability isn't my strong suit either, but I'll give it a quick go. For a binomial distribution, the standard deviation of the sample proportion is

(p(1-p)/n)^(1/2)

where p, in our example, is the probability of a heads result with our biased coin, and n is the number of tosses in our sample. Our 95% confidence interval is then the sample p plus or minus two standard deviations, so the interval scales as n^-(1/2). That suggests that to get an accuracy of one part in roughly 100 billion - which only pins down the first 37 or so bits of p - we would need n = 10^22 = lots of tosses. And every further bit of precision quadruples n.
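A quick numerical check of that n^-(1/2) scaling, as a hypothetical simulation (the 0.3 bias and the run counts are arbitrary choices):

```python
import random

def mean_abs_error(p, n, runs=1000, seed=1):
    """Average |sample frequency - p| over many simulated runs of n tosses each."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        heads = sum(rng.random() < p for _ in range(n))
        total += abs(heads / n - p)
    return total / runs

p = 0.3
e_small = mean_abs_error(p, 100)
e_large = mean_abs_error(p, 10_000)  # 100x the tosses
# The error should shrink by roughly sqrt(100) = 10, matching (p(1-p)/n)^(1/2)
print(e_small / e_large)
```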

Then again, perhaps that's a load of bull :D
 
Well, the information you get from the coin tosses is also just a string of ones and zeros, so you can't extract more information from them than the number of bits the tosses themselves contain.

In any case it wouldn't be more than you could get from a compressed version of the file with the same number of bits/tosses. I'd guess it would be a lot less, in fact, since the coin won't give you the same bit sequence each time.
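That bound is just the Shannon entropy of the coin: each toss carries at most one bit, and a biased coin carries less. A quick illustration (the example biases are arbitrary):

```python
import math

def binary_entropy(p):
    """Information per toss, in bits, of a coin with heads-probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # fair coin: 1.0 bit per toss
print(binary_entropy(0.9))  # heavily biased coin: well under 1 bit per toss
```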
 
phobos said:
The question is, how many tosses will it take before someone can expect to have a good quality copy of the film?
Leaving aside the questions of QM and probability, the quality of the copy is a function of the encoding system. Various encoding systems exist for sending the most "important" information first. It may not be necessary to double the information in order to double the quality.
 
phobos said:
How many times would I need to toss the coin before I could reasonably claim to have established the probability of a Heads to 11.2 billion binary places?

Actually, one can do much better than simply measuring in the |0>,|1> basis and using the relative frequency to estimate the values of a and b for a set of N qubits in the state a|0>+b|1>. One simple improvement is Bayesian feedback; the best (though technologically infeasible at present) is a joint measurement on all N qubits.
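For the classical coin analogue, "Bayesian feedback" reduces to tracking a Beta posterior over p. A minimal sketch, assuming a uniform prior (the toss counts here are made up):

```python
def beta_posterior(heads, tails, alpha=1.0, beta=1.0):
    """Posterior mean and variance of p under a Beta(alpha, beta) prior.

    With alpha = beta = 1 the prior is uniform on [0, 1]."""
    a, b = alpha + heads, beta + tails
    mean = a / (a + b)
    var = (a * b) / ((a + b) ** 2 * (a + b + 1))
    return mean, var

mean, var = beta_posterior(heads=70, tails=30)
print(mean, var)  # the posterior concentrates near the true bias as tosses accumulate
```

The posterior variance shrinks roughly as 1/n, so this doesn't escape the n^-(1/2) error scaling - it just uses each toss as efficiently as a classical observer can.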

The point is rather moot - teleportation provides absolutely no advantage for classical information transmission. The Two Towers is a particularly crappy string of classical information, but it's classical nonetheless...
 
