Ev and the basics of information theory.
The Shannon definition of information turns out to be mathematically equivalent to the negative of the quantum mechanical (von Neumann) definition of entropy. If you consider that entropy is a measure of randomness, this relationship becomes intuitively apparent: increasing the information in a system reduces the randomness and thus reduces the entropy.

So how does this relate to genetic evolution? One of the basic problems of information theory is to take an initial ensemble with an initial probability distribution to a final ensemble with a final probability distribution by the input of information. In other words, you take a more random, higher-entropy ensemble to a less random, lower-entropy ensemble by the input of information. When information theorists talk about 1 bit of information, they mean the amount carried by the answer to a single yes-or-no question: the answer to that binary question allows you to reduce the entropy, and therefore the randomness, by 1 bit.
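To make that bookkeeping concrete, here is a minimal Python sketch of the last point (the probability distributions are invented for illustration and are not taken from Ev itself): an ensemble of four equally likely possibilities carries 2 bits of uncertainty, and the answer to one yes-or-no question, which rules out half of them, leaves 1 bit.

    import math

    def shannon_entropy(probs):
        # Shannon uncertainty H = -sum(p * log2(p)), measured in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Initial ensemble: four equally likely outcomes -> 2 bits of uncertainty
    initial = [0.25, 0.25, 0.25, 0.25]

    # One yes/no answer eliminates half the outcomes; two remain equally likely
    final = [0.5, 0.5]

    print(shannon_entropy(initial))                           # 2.0
    print(shannon_entropy(final))                             # 1.0
    print(shannon_entropy(initial) - shannon_entropy(final))  # 1.0 bit gained

That before-minus-after entropy difference is the sense of "information in bits" used throughout this discussion, and it is the same kind of uncertainty-reduction accounting Ev performs when it reports information in bits.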