
Randomness in Evolution: Valid and Invalid Usage

Assertions, assertions, assertions. All I see is assertions from cyborg.

Why don't you google natural selection and see how it's explained, cyborg?

Because I am perfectly consistent even if you fail to be.

So, for the record:

Natural selection means that genetic propagation depends on survival. It is not biased towards any particular gene surviving.

The environment defines the bias for any particular gene.

FAIL as usual from you.
 
So, for the record:

Natural selection means that genetic propagation depends on survival. It is not biased towards any particular gene surviving.

The environment defines the bias for any particular gene.

Epic fail for cyborg, yet again.

The events in the environment are what comprise natural selection. Since the environment defines the bias, natural selection is inherently biased. Natural selection does not exist outside the environment.
 
Will you look at that... another biologist who asserts that selection is not random:
http://scienceblogs.com/pharyngula/2008/04/i_get_email_13.php#more
Mr Armitage was shocked? Why? Look at the date on this paper:

H. J. Muller (1939) Reversibility in Evolution Considered from the Standpoint of Genetics. Biological Reviews 14: 261-80.

You know, "stochastic, mechanistic random shuffling of genes" certainly will produce amazing amounts of complexity, far more than we see in organisms. Fortunately, we also have this non-random process called "selection" that constrains rampant randomness to a more limited functionality. That was an idea that was figured out in 1859. Maybe Mr Armitage needs to work a little harder on keeping up with the contemporary literature.


And that's from today!

He's reviewing a creationist "paper"...
Let's see, that is not unlike Dawkins reviewing Behe's book:

http://www.mukto-mona.com/Articles/dawkins/inferior_design140707.htm
The crucial passage in "The Edge of Evolution" is this: "By far the most critical aspect of Darwin's multifaceted theory is the role of random mutation. Almost all of what is novel and important in Darwinian thought is concentrated in this third concept."

What a bizarre thing to say! Leave aside the history: unacquainted with genetics, Darwin set no store by randomness. New variants might arise at random, or they might be acquired characteristics induced by food, for all Darwin knew. Far more important for Darwin was the nonrandom process whereby some survived but others perished. Natural selection is arguably the most momentous idea ever to occur to a human mind, because it — alone as far as we know — explains the elegant illusion of design that pervades the living kingdoms and explains, in passing, us. Whatever else it is, natural selection is not a "modest" idea, nor is descent with modification.

I've also posted peer-reviewed papers and science articles saying as much. But some self-appointed experts think they are clearer and smarter than biologists, Talk Origins, those who actually teach the subject to others, science magazines, etc. Nobody gives them the idea that they are, but in their heads they insist that "evolution is not nonrandom"... they will never concede otherwise. The same people... the same obfuscating nothingness... the same non-points--

Now why in the world would people want to do this when, as illustrated above, it is unclear and is identical to the obfuscation of creationists? Why wouldn't someone interested in conveying information aim to sound like those who actually do convey information?

And how much do you want to bet that these same people will never, ever say that evolution is not random, no matter how much explaining you do, how much evidence there is, or how many peer-reviewed papers say exactly that? Does it matter to them that no respectable scientist anywhere is using "random" the way they are or describing evolution the way they are? Nope, because in their own heads they are smarter than them. They are smarter than you. In their heads they are, anyway, even though it's almost impossible to find the point in their writings.

Randomness is not essential for things to evolve-- only reproduction and a selection process. Even with cloned animals as identical as can be-- some animals will preferentially survive and have more offspring... and these will be the founders of new populations and future genomes. Unless every member of a species fails to reproduce, you will have evolution. If they all die out, you have extinction.
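
As a rough illustration of the claim above that evolution needs only reproduction and a selection process, here is a minimal, fully deterministic selection recursion in Python. The fitness values and starting frequency are my own illustrative assumptions, not figures from the thread.

```python
# A minimal, deterministic selection model (no randomness anywhere),
# sketching the claim that heritable variation plus differential
# reproduction is enough for frequencies to change over time.
# The fitness values and starting frequency are illustrative assumptions.

def next_frequency(p, w_a=1.05, w_b=1.00):
    """One generation of haploid selection: type A has fitness w_a,
    type B has fitness w_b. Entirely deterministic."""
    mean_fitness = p * w_a + (1 - p) * w_b
    return p * w_a / mean_fitness

p = 0.01  # starting frequency of the favoured type
for generation in range(400):
    p = next_frequency(p)

print(f"Frequency of the favoured type after 400 generations: {p:.3f}")
```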
 
Evasion noted. While chimps kill small animals, I don't recall that they and the other apes wipe each other out. Whatever the case, modern society can engage in efficient warfare.

Pony up your evidence for H. sapiens 'wiping out' H. neanderthalensis. Large piles of Neanderthal skeletons with stone-tool wounds? Stone arches with 'Arbeit macht frei' written on them? (Sorry, that is undue sarcasm.)

And then tell me how Homo sapiens sapiens wiped out the large megafauna in North America while you are at it.

Seeing as you ducked the question the first time, I will ask again: do you agree that there is anthropological evidence that the ranges of humans and Neanderthals in Europe overlapped?
 
...The problem in discussing evolution by natural selection as a stochastic process is that there's very little evidence demonstrating that individuals with identical phenotypes must all reproduce, which would make it a deterministic process. Most often, individuals in evolution by natural selection are described as having a probability of reproducing, which is a description of a stochastic process.
You keep on talking about all individuals with identical phenotypes. Natural selection does not stop the reproduction of all individuals with identical phenotypes.
The whole point of natural selection is that not all individuals with identical phenotypes reproduce. Some do and some don't. The ones with a phenotype that the selection favours reproduce more than the ones that the selection does not favour.
 
You keep on talking about all individuals with identical phenotypes. Natural selection does not stop the reproduction of all individuals with identical phenotypes.
The whole point of natural selection is that not all individuals with identical phenotypes reproduce. Some do and some don't. The ones with a phenotype that the selection favours reproduce more than the ones that the selection does not favour.

Well, that is by definition a stochastic process. The "direction" we perceive in evolution by natural selection is a result of multiple rounds of natural selection expressing this "favoritism".
 
Crap, I actually agree with you!

Really?

Come back to my example of the molecule of gas in a box. Ignoring quantum mechanics, the dynamics of those molecules is completely deterministic. And yet it is utterly impossible - even in principle - to predict the location of any of the molecules a significant amount of time - like an hour - later. I think I can show it would require a computer much larger than the observable universe to do so.

So the position of that molecule is unknowable, and there's a perfectly valid and mathematically rigorous sense in which, after an hour, it's equally likely to be anywhere in the box*. You wouldn't call that random?


*Basically if you allow just a little uncertainty in the initial position or velocity of any of the molecules, the position where any particular one ends up mixes and spreads equally over the volume.
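
To make the "deterministic yet unpredictable" point concrete, here is a small Python sketch using the logistic map at r = 4 as a stand-in for the gas-in-a-box example. The map and the tiny initial offset are my own illustrative choices, not part of the original thought experiment.

```python
# A deterministic toy system with the same flavour as the gas-in-a-box example:
# the logistic map at r = 4 is fully deterministic, yet two trajectories that
# start a hair's breadth apart become effectively uncorrelated within a few
# dozen steps.

def logistic(x):
    return 4.0 * x * (1.0 - x)  # the same fixed rule every step, no randomness

x, y = 0.2, 0.2 + 1e-12  # two initial conditions differing by one part in 10^12
for step in range(60):
    x, y = logistic(x), logistic(y)
    if step % 10 == 9:
        print(f"step {step + 1:2d}: x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.2e}")
```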
 
No, just because an environment favors the development of a trait doesn't mean evolution is non-random. All the descriptions I have read of natural selection say that better-adapted individuals have a higher probability of surviving than others. That is a description of a stochastic process.


The only adaptation that matters is the ability to reproduce: surviving to reproduce, not just surviving.

Reproduce.
 
That is a distinction without a difference. Something that is non-random is deterministic and vice versa.

I think again that is rather a blanket statement; we can have deterministic processes that produce results that are indistinguishable from random. We can have mixes: totally causal and deterministic (where the same conditions will always produce the same results), causal and deterministic with variable inputs (where the same conditions will not always produce the same results, but might), and totally acausal/non-deterministic (where there is only a probability that the same conditions will produce the same results).

If I understand correctly

(same conditions, same results) is what you are calling determined

(same conditions, variable inputs) you are calling random; some say stochastic.

(same conditions, a probability distribution of results) you are calling random.

Is this an accurate summation?
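
A toy Python sketch of the three cases listed in the summary above, with functions invented purely for illustration; the labels follow the post's wording rather than any standard textbook taxonomy.

```python
# Three toy functions matching the three cases in the summary above.
import random

def fully_deterministic(x):
    # Same conditions always give the same result.
    return 2 * x + 1

def deterministic_with_variable_inputs(x, noise_source=random.random):
    # The rule itself is fixed, but one input varies from call to call,
    # so identical "visible" conditions need not give identical results.
    return 2 * x + noise_source()

def non_deterministic(x):
    # Same conditions yield a result drawn from a probability distribution.
    return random.gauss(2 * x, 1.0)

print([fully_deterministic(3) for _ in range(3)])
print([round(deterministic_with_variable_inputs(3), 3) for _ in range(3)])
print([round(non_deterministic(3), 3) for _ in range(3)])
```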
 
So the position of that molecule is unknowable, and there's a perfectly valid and mathematically rigorous sense in which, after an hour, it's equally likely to be anywhere in the box*. You wouldn't call that random?

It's still deterministic. The problem is that "random" is being used in different ways. I'd rather stick with one definition.
 
It's still deterministic. The problem is that "random" is being used in different ways. I'd rather stick with one definition.

OK, but then you've defined the word in such a way that the distinction between determinism and randomness is operationally meaningless.

And let me point out again that to the best of our knowledge all fundamental interactions in the physical world are truly random, because of quantum mechanics, so these deterministic models exist only as abstract mathematical models, not as physical processes (like evolution).

I think it's clear that your (and mijo's) definition is bad, because it doesn't correspond to anything with any operational meaning. Some aspects of true random systems are almost perfectly predictable - for example, I can predict the pressure in a real box of gas like that with extreme precision - and some aspects of deterministic systems are totally unpredictable - like the position of the molecule in the thought experiment we are discussing.

What we really care about (apart from semantic or academic debates such as this one) is predictability, and nearly all the definitions of "random" I have found in sources - including technical ones - agree on that. "Random" means "unpredictable", and that is not synonymous with "non-deterministic".
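
A quick Python sketch of the gas-pressure point: a single draw from a genuinely random source is unpredictable, while an average over many draws is almost perfectly predictable. The sample size and seed are arbitrary illustrative choices of mine.

```python
# Individual outcomes of a random process are unpredictable, but bulk
# aggregates over many trials are nearly deterministic in practice.
import random

random.seed(1)

single = random.random()  # one "molecule": anywhere in [0, 1)
bulk = sum(random.random() for _ in range(1_000_000)) / 1_000_000  # the "pressure"

print(f"one draw:             {single:.4f}  (could have been almost anything)")
print(f"average of a million: {bulk:.4f}  (reliably very close to 0.5)")
```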
 
Um, yes.

Even if every phenotype has an equal probability of reproduction, the generalization of the central limit theorem still applies.

If that is true, then all of population genetics is wrong. Is that what you are claiming?
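
For what it's worth, a minimal Wright-Fisher-style sketch in Python shows what "equal probability of reproduction" looks like in a finite population: allele frequencies still wander purely through sampling (genetic drift). Population size, seed, and run length are arbitrary illustrative choices of mine, not anything claimed in the thread.

```python
# Drift with no selection: every individual has exactly the same chance of
# reproducing, yet allele frequencies still change because each generation
# is a finite random sample of the previous one.
import random

random.seed(42)

N = 100   # population size (haploid, for simplicity)
p = 0.5   # starting frequency of allele A

for generation in range(200):
    # Each of the N offspring copies its allele from a uniformly chosen parent.
    p = sum(random.random() < p for _ in range(N)) / N
    if p in (0.0, 1.0):  # allele fixed or lost
        break

print(f"Frequency of allele A after {generation + 1} generations: {p:.2f}")
```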
 
I've got a few thoughts.

1. Regression to the mean works in opposition to evolution. It was one of the central reasons that Lamarckian genetics was rejected. If everything could mix evenly, any variation would regress back to the mean, i.e., no net change over time. But in the discrete system of Mendelian genetics that change cannot always regress. I may be explaining this point poorly, so I encourage you to read about the history of the debate between the two.

2. I think macroscopic traits and complexity are often different things.
a. Evidence from the genome shows that we collect a lot of information in the genome, and the amount of information increases with time (just more junk in the genome). Evolution removes this sort of stuff, so it certainly decreases the complexity of the species (or gene pool) over time (with a very strict definition of complexity).
b. Macroscopic traits like intelligence say very little about complexity. The way that we ended up being intelligent is probably not the simplest way to create a creature of equivalent complexity, nor the most complex way. In other words, evolution has a lot of appendices. This is particularly true in cognition, where we have newer layers of brain (in the evolutionary sense) layered on top of older ones, and it seems a lot of those underlying layers do very little processing in humans. So a lot of extra complexity but very little extra intelligence.

3. Randomness is absolutely not necessary for evolution, but what you do need is variation. To have evolution in the way we generally think of it (i.e., not directed by things other than best fit to the environment), that variation should be a good approximation of uniform variation. Luckily, one of the constraints humans place on random variables (both intuitively and explicitly) is that those variables have a uniform distribution.

4. I personally think that the world is deterministic all the way down. As I understand quantum mechanics, the issue with hidden variables isn't that they are necessarily random; they can be non-random if they are non-local. I'm fine with non-locality; everything is connected, cool.

5. Given that, a fundamental question here is the definition of random. I think things are random if they appear random, even if they have underlying deterministic behavior. If you disagree, okay, but I think it's a matter of opinion, or convention, if you will. I think once we finally define our terms this is a non-issue.

6. Finally, there is a more interesting question beyond this. Does evolution necessarily converge on different forms? I can't say for any particular case, but what I can say for sure is that evolution is essentially a hill-climbing algorithm (a close relative of simulated annealing), which means it can only change from one form to another by small steps (or local variation). This precludes certain possibilities. So despite the fact that our brain is designed (I say designed only in the sense that I am anthropomorphizing an inanimate and non-directed process) by piling on layers that now represent inefficiencies, it seems very likely that we couldn't have done it any other way. Any other intelligent evolved creatures in our universe would necessarily have their histories "built in" to their architecture, even if they end up doing it in a different way. It will never converge on the best way, because those intermediate steps are inseparable from the process (or algorithm). If you look at developmental biology, all creatures on Earth share this developmental commonality (and, incidentally, a common ancestor). A toy sketch of such a hill climber follows below.
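
A bare-bones hill climber in Python, sketching the point in item 6 about small local steps getting stuck on local peaks; the fitness landscape, starting point, and step size are invented purely for illustration (and, unlike simulated annealing, this version never accepts downhill moves).

```python
# Hill climbing by small local variation: only improvements are kept, so the
# climber settles on whichever peak is nearest to the starting point, even if
# a higher peak exists elsewhere.
import math
import random

random.seed(7)

def fitness(x):
    # Two peaks: a lower one near x = -1 and a higher one near x = 2.
    return 2 * math.exp(-(x + 1) ** 2) + 3 * math.exp(-(x - 2) ** 2)

x = -1.5                                          # starting "form"
for _ in range(10_000):
    candidate = x + random.uniform(-0.05, 0.05)   # small local variation
    if fitness(candidate) > fitness(x):           # keep only improvements
        x = candidate

print(f"settled near x = {x:.2f} (fitness {fitness(x):.2f}); "
      f"the higher peak near x = 2 was never reached")
```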

tl;dr

I'd just like to conclude with an interesting, tangentially related thought. In A New Kind of Science, Wolfram claims there are actually 3 kinds of randomness rather than the two mentioned above. (This is actually the central claim of the book.)

Specifically, there are systems that don't have complexity built into their initial conditions (i.e., not chaotic systems) and don't have any "true" randomness, but instead are systems that generate randomness.

They have simple homogeneous initial conditions and simple rules, but because they pass some incredibly low computational threshold they generate randomness. It's quite amazing.
http://mathworld.wolfram.com/Rule30.html
A simple rule applied to a 1-D binary cellular automaton (Rule 30) can generate a pattern that never repeats, starting from a single black square. A closely related rule, Rule 110, has even been proven to be a universal Turing machine. Wow.
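
For the curious, a tiny Python sketch of Rule 30 started from a single black square; the grid width and number of steps are arbitrary illustrative choices of mine.

```python
# Rule 30: a one-dimensional binary cellular automaton. The update rule is
# completely fixed, yet the pattern it produces looks statistically random.

WIDTH, STEPS = 63, 30
row = [0] * WIDTH
row[WIDTH // 2] = 1  # a single black square in the middle

for _ in range(STEPS):
    print("".join("#" if cell else "." for cell in row))
    # Rule 30: new cell = left XOR (centre OR right), with wrap-around edges.
    row = [row[i - 1] ^ (row[i] | row[(i + 1) % WIDTH]) for i in range(WIDTH)]
```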

Well said!
 
OK, but then you've defined the word in such a way that the distinction between determinism and randomness is operationally meaningless.

And let me point out again that to the best of our knowledge all fundamental interactions in the physical world are truly random, because of quantum mechanics, so these deterministic models exist only as abstract mathematical models, not as physical processes (like evolution).

I was under the impression that those random fluctuations averaged out at our scale?

I think it's clear that your (and mijo's) definition is bad, because it doesn't correspond to anything with any operational meaning. Some aspects of true random systems are almost perfectly predictable - for example, I can predict the pressure in a real box of gas like that with extreme precision - and some aspects of deterministic systems are totally unpredictable - like the position of the molecule in the thought experiment we are discussing.

Are you arguing that "deterministic" should mean "predictable", or am I misunderstanding you?
 
What we really care about (apart from semantic or academic debates such as this one) is predictability, and nearly all the definitions of "random" I have found in sources - including technical ones - agree on that. "Random" means "unpredictable", and that is not synonymous with "non-deterministic".
From one of your own links:
Gould said:
In ordinary English, a random event is one without order, predictability or pattern. The word connotes disaggregation, falling apart, formless anarchy, and fear. Yet, ironically, the scientific sense of random conveys a precisely opposite set of associations. A phenomenon governed by chance yields maximal simplicity, order and predictability--at least in the long run. ... Thus, if you wish to understand patterns of long historical sequences, pray for randomness.
 

That is precisely the point I was making above - many "truly random" processes are highly predictable, or at least have aspects which are. In fact if that weren't the case all of science would be an abject failure, because all physical processes are quantum mechanical.

But to say something like, "the day/night cycle is random" is just not even wrong, because that's not how the word is ever used (even in science).
 
