Re random: Many people insist that quantum mechanics, particularly once Heisenberg's Uncertainty Principle is considered, is truly random. A particle physicist will insist in turn that it is completely deterministic. What happens in any single interaction is not, in the physicist's view, the important thing; each single interaction is indeed random, by just about any definition you care to name. What the physicist insists is that the probabilities are determined, and they are very rigid. The mathematics is among the most rigorous and best-confirmed in all of science; the predictions made using Feynman's approach to quantum electrodynamics for the probabilities of interactions between photons and electrons have been confirmed to better than ten significant figures, and the precision keeps improving. That is far and away the most exact confirmation of a theoretical prediction in all of science.
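To make that concrete, here is a toy sketch in Python (my own illustration, nothing QED-specific about it, and the probability value is made up): each individual trial is unpredictable, but the long-run frequency is nailed to the fixed probability.

import random

p = 0.29  # the "determined" probability; the value itself is arbitrary
for n in (100, 10_000, 1_000_000):
    hits = sum(random.random() < p for _ in range(n))
    print(n, hits / n)  # the observed frequency closes in on 0.29 as n grows

Run it a few times: the individual trials wobble, but the last frequency barely moves.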
Thermodynamics, as we now know, is governed by the fluctuation theorem; but even before the FT was understood, philosophers understood thermodynamics to represent the ultimately deterministic outcome (2LOT, the law of entropy) of a random process (the collisions of molecules in a gas).
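Here is a minimal sketch of exactly that (an Ehrenfest-style two-box model, my own choice of illustration): every microscopic event is random, yet the macroscopic outcome is the same on every run.

import random

N, steps = 1000, 20000
left = N                            # start with every particle in the left box
for t in range(steps):
    if random.randrange(N) < left:  # pick one particle uniformly at random
        left -= 1                   # it was on the left, so it hops right
    else:
        left += 1                   # it was on the right, so it hops left
    if t % 5000 == 0:
        print(t, left)
print("final:", left)               # reliably hovers near N/2 = 500

Nothing in the microscopic rule prefers equalization; the deterministic-looking drift toward 500 is pure statistics.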
People who invoke entropy and information theory against evolution claim that it means order cannot arise from disorder; the exact opposite is the truth. Entropy is completely deterministic. Heat will always flow from a region of higher temperature to one of lower temperature until the two are equal, if a path is open to it. If that path is convoluted, the energy can be made to do work while it is flowing. Order therefore arises directly from chaos in machines as simple as the steam engine. What the 2LOT says is NOT that order cannot arise from chaos; what it says is that the TOTAL order of a closed system will decrease over time, no matter what else happens.
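The bookkeeping behind that steam-engine remark is short enough to spell out. This is a back-of-the-envelope sketch with made-up temperatures and a made-up engine efficiency: work (local order) comes out of the flow, and the total entropy of the reservoirs still goes up.

T_hot, T_cold = 500.0, 300.0                  # kelvin; arbitrary example values
Q_hot = 1000.0                                # joules drawn from the hot reservoir

eta_carnot = 1 - T_cold / T_hot               # best possible efficiency: 0.40
eta_real = 0.25                               # a real engine does worse (made-up figure)

W = eta_real * Q_hot                          # work extracted: the local "order"
Q_cold = Q_hot - W                            # heat dumped into the cold reservoir
dS_total = Q_cold / T_cold - Q_hot / T_hot    # net entropy change of the reservoirs

print(f"Carnot limit {eta_carnot:.2f}, actual {eta_real:.2f}")
print(f"work extracted: {W:.0f} J")                  # 250 J
print(f"total entropy change: {dS_total:+.2f} J/K")  # +0.50 J/K: total order went down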
I have seen several instances on this thread of people confounding order with determinism, and chaos (randomness) with indeterminism. It is time to clear the air. If you want to talk about genes, and living organisms interacting with their environments and each other, that is biology. But if you're going to talk about randomness, don't quote from non-authoritative sources; and understand clearly that chaos (randomness) is not the same thing as indeterminism. It is eminently possible for a completely deterministic system to be highly chaotic, and equally possible for a thoroughly random process to produce a highly ordered outcome.

If you want to talk about information, and order, and chaos, and determinism, and systems theory, you'd best talk to a physicist; biologists are not trained in this area. That's not a knock on biologists; they know all kinds of things physicists don't. I'm just saying that when I see order vs. chaos confounded with determinism vs. indeterminism, it raises my hackles. And I see it all the time in conversations about evolution.
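The textbook instance of a completely deterministic system that is nonetheless highly chaotic is the logistic map; here is a minimal sketch (my choice of example, not something from the thread):

r = 4.0                        # parameter value in the chaotic regime
x, y = 0.3, 0.3 + 1e-10        # two nearly identical initial conditions
for i in range(1, 61):
    x = r * x * (1 - x)        # the same deterministic rule applied to both
    y = r * y * (1 - y)
    if i % 10 == 0:
        print(i, round(x, 6), round(y, 6))
# no randomness anywhere, yet by roughly iteration 50 the two trajectories
# bear no resemblance to each other

That is chaos without a shred of indeterminism.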
Physicists use "stochastic" to refer specifically to those chaotic systems that are completely deterministic. The word has spread out from there; there are certainly dictionaries that gloss it with the less precise "random," but let's try to keep the original meaning in mind when we use it.
It boils down to this canard, which is kleinman's mantra: "So, the question is... Can you provide an example of a random mutation that is known to increase the information content of the genome?"
No problem, and this is why kleinman is so universally reviled: when faced with the kind of example that follows, he ignores it.
Chromosomes recombine at conception; two haploid gametes meet and form a diploid zygote. If a site on one chromosome of one gamete has mutated, is the zygote's genome more or less complex? Clearly, if the mutated site lies within a protein-coding sequence, a novel protein can be expressed. In such cases, then, the mutation adds information, from a purely biochemical point of view.
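For what it's worth, if "information content" is read in the Shannon sense (one standard way to cash the phrase out, and my choice here, not kleinman's), the arithmetic is trivial: a locus fixed for a single allele carries zero entropy, and a new mutant allele segregating in the population raises it. The frequencies below are made up.

from math import log2

def shannon_bits(freqs):
    # Shannon entropy, in bits, of a set of allele frequencies
    return -sum(p * log2(p) for p in freqs if p > 0)

print(shannon_bits([1.0]))       # one allele, fixed: 0.0 bits
print(shannon_bits([0.9, 0.1]))  # a mutant at 10% frequency: ~0.47 bits
print(shannon_bits([0.5, 0.5]))  # two equally common alleles: 1.0 bit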
Last but not least: I repeat that I think you have misjudged mijo. He knows enough physics to talk about entropy, information, order, chaos, randomness, and determinism. What he does not yet know, but may understand after reading the above (and, I would bet, understood after my earlier treatment of much the same subject), is that these terms are highly deceptive in the mouths of biologists. I would like to see some acknowledgment that someone trained in the physical sciences might easily make these kinds of mistakes, especially when dealing with a bunch of biologists who, quite frankly, don't appear to understand these terms any better than mijo does. Definition of randomness, indeed. Please make sure you understand what these terms mean, folks, before you misuse them any further.
Mijo, don't talk to biologists about physics; and don't try to understand biology through physics, either. Both are mistakes. Biology deals with systems that show emergent behaviors that are novel from the point of view of physics. We've only gotten physicists to admit, in the last thirty years, that there was anything interesting about chaos; expecting them to understand what is going on in biology is a waste of time. Similarly, because of physicists' contempt for the "soft sciences" (engendered by precisely the kind of fuzzy terminology we've seen on this thread), there is little point in talking physics with the biologists either; they don't want to hear physics terms like "stochastic," so what they hear instead is the "random" they think it means. Your initial mistake was confusing the two yourself; I hope I've given you some help with that.