
Randomness in Evolution: Valid and Invalid Usage

It is relevant because it shows a system that is chaotic yet does not show periodic orbits. It demonstrates that the definition of chaos you have provided is too restrictive.

That said, this spat over the definition of chaos is irrelevant insofar as we're talking about evolution. Whether we call them apples or oranges doesn't matter.

OK, I was a little hasty in saying that chaos required a dense set of periodic orbits. It was a published definition by an expert in chaos theory, Robert Devaney, but, as Devaney himself admits, it is not the only definition of chaos and there is no universally agreed upon definition of chaos. Nonetheless, it seems that you are ignoring that the Kolmogorov–Arnold–Moser theorem (which applies to the KAM tori mentioned in the MathWorld article on chaos) mentions only quasiperiodic orbits and says nothing about periodic orbits themselves.
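For concreteness, here is the Devaney definition I had in mind (a standard statement, paraphrased, not quoted from any particular edition):

```latex
% Devaney's definition (standard statement, paraphrased):
% a continuous map f : X -> X on a metric space (X, d) is chaotic if
\begin{enumerate}
  \item $f$ is topologically transitive: for all open $U, V \subseteq X$
        there is an $n \ge 0$ with $f^{n}(U) \cap V \neq \emptyset$;
  \item the periodic points of $f$ are dense in $X$;
  \item $f$ has sensitive dependence on initial conditions: there is a
        $\delta > 0$ such that for every $x \in X$ and every neighbourhood
        $N$ of $x$ there are $y \in N$ and $n \ge 0$ with
        $d\!\left(f^{n}(x), f^{n}(y)\right) > \delta$.
\end{enumerate}
```

If I recall correctly, the first two conditions already imply the third, which is partly why the dense-periodic-orbits clause gets so much attention.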
 
i'd be happy too. but while i see that they claim the tori are lost, i find no claim that the chaotic sea is not littered with unstable periodic orbits. so as far as i can tell mathworld makes no claim that this is an example. i remain happy to follow up pointers to specific text

You're splitting hairs to the point of reinterpreting the English language if you think they didn't mean that statement as an example. But generally I think systems described by the KAM theorem exclude tori with rational-valued periods. Thus KAM systems are only quasi-periodic. Yet KAM systems are still considered chaotic. I'm confused as to how you don't consider this sufficient.

@Mijo: I agree with you that this only relaxes the definition to periodic or quasi-periodic systems, but I think Rule 30 is probably a good example of a chaotic system that is aperiodic. (Although, as you note, not completely disordered.)
I'm not sure why we would want our definition of chaos to exclude a system like this. If you read each row of the system's progression off as a binary number, it actually grows exponentially and without bound, as in the sketch below.
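Here is a minimal sketch of what I mean (Python; the number of steps and the single-live-cell start are just choices for the demo):

```python
# Minimal Rule 30 sketch: evolve from a single live cell and read each
# row off as a binary integer to watch the values grow.
RULE = 30  # Wolfram code: new cell = bit (left*4 + centre*2 + right) of 30

def step(row):
    # Pad with two zeros on each side so the pattern can widen by one
    # cell per side every generation.
    row = [0, 0] + row + [0, 0]
    return [(RULE >> (row[i - 1] * 4 + row[i] * 2 + row[i + 1])) & 1
            for i in range(1, len(row) - 1)]

row = [1]  # single live cell
for n in range(12):
    value = int("".join(map(str, row)), 2)  # the row read as a binary number
    print(n, "".join(".#"[c] for c in row), value)
    row = step(row)
```

The printed integers grow roughly geometrically (about two extra bits per step) because the live region widens each generation, which is the unbounded growth I had in mind.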

Speaking of dissipative systems: aren't all natural systems ultimately dissipative, and only non-dissipative in approximation? This seems to exclude any natural system from being chaotic in theory (not just that we are unsure because we can't model or measure it sufficiently well).
 
OK, I was a little hasty in saying that chaos required a dense set of periodic orbits. It was a published definition by an expert in chaos theory, Robert Devaney, but, as Devaney himself admits, it is not the only definition of chaos and there is no universally agreed upon definition of chaos. Nonetheless, it seems that you are ignoring that the Kolmogorov–Arnold–Moser theorem (which applies to the KAM tori mentioned in the MathWorld article on chaos) mentions only quasiperiodic orbits and says nothing about periodic orbits themselves.

I answered this in my response to lenny. :-)

On the topic of evolution, I'd like to just agree to disagree. You don't like my definition and I don't like yours. If we define a random system as any system that is non-deterministic, then evolution is trivially random. If we define random as uniformly distributed outcomes between uncorrelated trials, then evolution is trivially non-random.

I think we can agree that the true behavior of evolution is somewhere in between, whether we consider it random or not.

If you want to continue this discussion I wouldn't mind talking about some of these issues of convergence and divergence, but I don't think we're gonna make any more progress on this "fundamentally random" definition.
 
Back to the OP (you can see why I started a different thread about chaotic systems...)

It is indisputable that there are many highly nonlinear feedback loops in ecosystems. Because of these interactions between organisms, sometimes small chance changes in one organism can have a large effect on the whole ecosystem, and thus on the "fitness landscape" and course of evolution for many organisms.
I think we've talked about this plenty. Non-linearity doesn't imply chaos or randomness. Some systems are sensitive to perturbation; some aren't.

These chance effects are likely to be more important when the fitness landscape is more plastic, which will be when the environment is changing, or has changed recently*. This assertion is borne out by the fossil evidence: most major "explosions of diversification" occur after mass extinctions, when the environment has changed and many empty niches are available.
So?

I'll reiterate the following post (hidden for brevity) because this is the sort of situation that I would expect to occur (infrequently), whereas denying the probabilistic nature seems to make it hard to explain why only one colony out of many evolved the ability to metabolise citrate.

Can you provide the link to the paper? I trust New Scientist about as far as I can throw it, which quite un-randomly ends up being the distance to the trash bin.

With evolution, chance events become more important over time, they don't average out as in "well behaved systems".

I know you feel that way. Maybe if you close your eyes, click your ruby slippers together, and wish really hard it will come true.
 
zosima said:
With evolution, chance events become more important over time, they don't average out as in "well behaved systems".
I know you feel that way. Maybe if you close your eyes, click your ruby slippers together, and wish really hard it will come true.

Whether or not the KT impact was "random", it certainly was arbitrary and had a huge effect on evolution.

You could claim that the effects are only pseudorandom, but (surely) you can't deny their importance over long timescales.
 
zosima said:
Can you provide the link to the paper? I trust New Scientist about as far as I can throw it, which quite un-randomly ends up being the distance to the trash bin.

The article quoted the original scientist, quite extensively, so you would expect it to be pretty accurate...

Anyway this has been discussed recently:

From another thread...

Found the whole article in a pdf file.

Historical contingency and the evolution of a key innovation in an experimental population of Escherichia coli

And here's Behe's comments from his blog.
I think the results fit a lot more easily into the viewpoint of The Edge of Evolution. One of the major points of the book was that if only one mutation is needed to confer some ability, then Darwinian evolution has little problem finding it. But if more than one is needed, the probability of getting all the right ones grows exponentially worse. “If two mutations have to occur before there is a net beneficial effect — if an intermediate state is harmful, or less fit than the starting state — then there is already a big evolutionary problem.” (4) And what if more than two are needed? The task quickly gets out of reach of random mutation.
Here's your answer about what the Creationists will say. Ha! Talk about being unable to face the fact you have been wrong in a big way.

Behe is saying that the observation of a rare sequence of mutations occurring (a sequence that needed to occur for the species to evolve) proves it can't happen. What an idiot.

From the abstract (EDIT: of the original article):

The role of historical contingency in evolution has been much debated, but rarely tested. Twelve initially identical populations of Escherichia coli were founded in 1988 to investigate this issue. They have since evolved in a glucose-limited medium that also contains citrate, which E. coli cannot use as a carbon source under oxic conditions. No population evolved the capacity to exploit citrate for >30,000 generations, although each population tested billions of mutations. A citrate-using (Cit+) variant finally evolved in one population by 31,500 generations, causing an increase in population size and diversity. The long-delayed and unique evolution of this function might indicate the involvement of some extremely rare mutation. Alternately, it may involve an ordinary mutation, but one whose physical occurrence or phenotypic expression is contingent on prior mutations in that population. We tested these hypotheses in experiments that "replayed" evolution from different points in that population's history. We observed no Cit+ mutants among 8.4 × 10^12 ancestral cells, nor among 9 × 10^12 cells from 60 clones sampled in the first 15,000 generations. However, we observed a significantly greater tendency for later clones to evolve Cit+, indicating that some potentiating mutation arose by 20,000 generations. This potentiating change increased the mutation rate to Cit+ but did not cause generalized hypermutability. Thus, the evolution of this phenotype was contingent on the particular history of that population. More generally, we suggest that historical contingency is especially important when it facilitates the evolution of key innovations that are not easily evolved by gradual, cumulative selection.


I could highlight many phrases in the abstract, as this supports both my general point and the New Scientist article.
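For instance, to put a rough number on "extremely rare", here is my own back-of-the-envelope (a Poisson "rule of three"; this is my calculation, not a figure from the paper):

```latex
% 95% upper bound on the per-cell rate of producing Cit+ directly from the
% ancestor, given zero mutants among N = 8.4 x 10^12 replayed ancestral cells.
\[
\Pr(\text{no Cit}^{+}\text{ mutants}) = e^{-\mu N} \ge 0.05
\quad\Longrightarrow\quad
\mu \;\le\; \frac{-\ln 0.05}{N}
\;\approx\; \frac{3}{8.4 \times 10^{12}}
\;\approx\; 3.6 \times 10^{-13}\ \text{per cell}.
\]
```

That bound is far below what you would expect for a single ordinary point mutation, which is what points towards the multi-step, contingent route the authors describe.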
 
jimbob... they are saying that the environment DETERMINES the outcome.

But that is something quite different from calling evolution a deterministic process, which pertains to how organisms interact with their environment. The continued conflation of the fact that the environment determines the probabilities that different phenotypes will produce reproductively viable offspring with the idea that evolution is a deterministic process only causes confusion, and it seems quite deliberate on the part of those who insist that evolution is non-random.

The point of jimbob's posts on Lenski's research is that it demonstrates quite clearly that identical populations* under identical environmental conditions do in fact evolve differently. In other words, their evolution is random.

This, however, doesn't mean that the evolution of a Cit+ population was completely unexpected, as citrate was in 68-fold excess over glucose, providing an abundant and unutilized resource that would confer some selective advantage on populations that evolved such a trait. Nonetheless, any insistence that evolution is non-random must be accompanied by an explanation as to why the other 11 populations, which were identical to the Cit+ population at the beginning of the experiment, did not evolve to metabolize exogenous citrate.

*Lenski is quite clear in the cited research describing the Long-Term Evolution Experiment that the 12 populations of E. coli were identical to one another at the beginning of the experiment (i.e., there was no standing interpopulation or intrapopulation variation at the beginning of the experiment).
 
But generally I think systems described by the KAM theorem exclude tori with rational-valued periods. Thus KAM systems are only quasi-periodic. Yet KAM systems are still considered chaotic. I'm confused as to how you don't consider this sufficient.
orbits with rationally related periods are NOT tori, they are periodic orbits. and they remain in the "chaotic sea" after the tori are gone. thus the destruction of the tori says nothing about the existence of the periodic orbits. no?
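for concreteness, the usual hamiltonian textbook example is the chirikov standard map. a quick sketch (python; the kick strength and the starting points are just assumptions for the demo, nothing taken from mathworld):

```python
# chirikov standard map: the usual area-preserving (hamiltonian) textbook
# example. for K above ~0.9716 the last rotational KAM torus is destroyed and
# a large "chaotic sea" appears, but periodic orbits survive, e.g. the
# hyperbolic (unstable) fixed point at theta = 0, p = 0 in these units.
import math

K = 1.2  # kick strength, above the critical value

def standard_map(theta, p, n):
    """iterate (theta, p) n times; both coordinates taken mod 1."""
    for _ in range(n):
        p = (p + K / (2 * math.pi) * math.sin(2 * math.pi * theta)) % 1.0
        theta = (theta + p) % 1.0
    return theta, p

# the unstable fixed point is still there after the tori are gone:
print(standard_map(0.0, 0.0, 1000))      # stays exactly at (0.0, 0.0)

# two nearby starts in the chaotic sea decorrelate after enough kicks:
print(standard_map(0.3, 0.6, 200))
print(standard_map(0.3 + 1e-9, 0.6, 200))
```

the point being: push K past the critical value and the rotational tori are gone, but the unstable periodic orbits are still sitting in the chaotic sea.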
Speaking of dissipative systems: aren't all natural systems ultimately dissipative, and only non-dissipative in approximation?
to the extent that we are discussing chaotic mathematical systems (to the extent that KAM is relevant), it is important not to confuse physical systems and mathematical systems.

if you really want to talk about natural systems, you'll find it hard to establish that they are "ultimately" described by mathematics at all. many of us believe them to be, but that is a religious belief, not one based on evidence.
This seems to exclude any natural system from being chaotic in theory (not just that we are unsure because we can't model or measure it sufficiently well).

i do not see how that could follow, unless you are suggesting that dissipative systems cannot be chaotic: the most commonly discussed chaotic systems are dissipative (Lorenz's equations, all systems with strange attractors, or any attractor for that matter!)

did i miss something here? what was the "this" in "this seems to exclude"?
 
OK, I was a little hasty in saying that chaos required a dense set of periodic orbits.
a little hasty, but not much. a dense set of unstable periodic orbits often features in the various definitions of chaos, not just bob devaney's. and some argue that the existence of such a set follows from other definitions of chaos.
Nonetheless, it seems that you are ignoring that the Kolmogorov–Arnold–Moser theorem (which applies to the KAM tori mentioned in the MathWorld article on chaos) mentions only quasiperiodic orbits and says nothing about periodic orbits themselves.
yes, i think that is the main point. no hamiltonian counterexample has been provided at this point.
 
jimbob... they are saying that the environment DETERMINES the outcome.

No they are not, the environment affects the outcome, but it doesn't determine it.

Historical contingency (aka chance) also plays a part:

Thus, the evolution of this phenotype was contingent on the particular history of that population. More generally, we suggest that historical contingency is especially important when it facilitates the evolution of key innovations that are not easily evolved by gradual, cumulative selection.

The conditions were as close to identical as possible in the twelve groups, and only one group out of the twelve evolved the ability to metabolise citrate. This is the only E. coli population anywhere known to metabolise citrate. This ability evolved after 30,000 generations, but depended on a prior mutation that occurred between generations 15,000 and 20,000 in this population.

If the environment determined this, then the other eleven groups would also have evolved this ability.
 
If the environment determined this, then the other eleven groups would also have evolved this ability.

It's been X number of months and you still insist on getting this wrong and playing semantic games:

No they are not, the environment affects the outcome, but it doesn't determine it.

Because this is a deterministic relationship. You've had this explained. You don't like the words. So you reject it.
 
What the shadowing theorem proves is that the computational path has a true analog in the chaotic system, that is, the path of the shadow.
agreed, but the point is that that path is not chaotic: the periodic computer trajectory is shadowed by an unstable periodic orbit of the chaotic system.
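for reference, the statement i have in mind is the usual shadowing lemma for a hyperbolic invariant set (paraphrased, not quoted from anywhere):

```latex
% shadowing lemma, rough statement for a hyperbolic invariant set of f:
% every sufficiently accurate pseudo-orbit stays uniformly close to a true orbit.
\[
\forall \varepsilon > 0 \;\; \exists \delta > 0 :\quad
d\bigl(f(x_n),\, x_{n+1}\bigr) < \delta \ \ \forall n
\quad\Longrightarrow\quad
\exists\, y_0 \ \text{with}\ d\bigl(f^{\,n}(y_0),\, x_n\bigr) < \varepsilon \ \ \forall n .
\]
```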
When we're talking about physical systems, we're really talking about effective definitions and approximations.
agreed. and i think there is value in lorenz's suggestion that we call a physical system chaotic if the best model we have for that system today is chaotic.
In finite memory a system will definitely approach a finite periodic cycle. I would argue that it is effectively chaotic up until the cycle begins to repeat.
the hallmarks of chaos are defined in the limit as time goes to infinity; transient behaviour is almost always explicitly excluded. (not sure if i need that "almost")
The length of that cycle would be limited by the size of the available computational resources. With infinite time and memory (i.e., an abstract computational machine), you could have a period of infinite length.
agreed
In any physical system (analog or digital) there will be finite resources and thus a finite set of possible states (although it might be harder to visualize what these states are in an analog system). Once the system runs out of distinct states, the next step must land in a repeating state by the pigeonhole principle.
there is no evidence for this in an analogue system: the fact that the observation of the state is quantized by the A/D converter implies we have a limited number of values we can observe, not that the system has a limited number of states.
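for the digital half of the claim i agree, and it is easy to watch it happen. a toy sketch (python; the precision and the logistic parameter are just choices for the demo):

```python
# pigeonhole in action for a *digital* system: iterate the logistic map with
# the state rounded to a small, fixed number of decimal digits. with only
# finitely many representable states, the orbit must eventually repeat.
DIGITS = 6          # precision of the toy machine (assumption for the demo)
R = 3.99            # logistic parameter in the chaotic regime

def step(x):
    return round(R * x * (1.0 - x), DIGITS)

def find_cycle(x0):
    """return (step of first repeat, cycle length) for the rounded map."""
    seen = {}
    x, n = x0, 0
    while x not in seen:      # must terminate: only ~10**DIGITS states exist
        seen[x] = n
        x = step(x)
        n += 1
    return n, n - seen[x]

print(find_cycle(round(0.123456, DIGITS)))
```

with six digits there are at most about a million representable states in [0, 1], so a repeat is forced; the interesting question is how the cycle length scales with the precision.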
It can often be difficult to create a controllable chaotic simulation in an analog system because noise will create stability in the system.
i know of no cases in the lab where this has been a problem. there are other problems, of course; and i am not sure what you mean by "controllable": you can certainly damp out chaos if you try, but if you do not actively attempt to, then the observations tend to look as if they came from a chaotic process.
An example of one technique for making an analog system controllable is to use a low pass filter. Then your system is an accurate approximation of the mathematical ideal up to the efficiency of the filter.
how does filtering the observations have any effect on the dynamics of the system?
 
If the environment determined this, then the other eleven groups would also have evolved this ability.

It's been X number of months and you still insist on getting this wrong and playing semantic games:

No they are not, the environment affects the outcome, but it doesn't determine it.

Because this is a deterministic relationship. You've had this explained. You don't like the words. So you reject it.

No I am not playing semantic games.

The differences in the initial populations and their environments were insignificant*. The differences in the outcomes were significant.

As the differences in the initial populations were insignificant with respect to the evolution of this trait, they could as well have been identical, and there would still be no certainty whether the trait would appear or not. Beyond the 20,000th generation (in that particular population), the odds were highly skewed in favour of this trait's emergence.

The evolution of citrate metabolism in E. coli was an unlikely event. It happened once. Other unlikely events will also happen.


*We know this because the significant change in the population occurred somewhere between the 15,000th generation and the 20,000th generation.
 
cyborg said:
No they are not, the environment affects the outcome, but it doesn't determine it.

Because this is a deterministic relationship. You've had this explained. You don't like the words. So you reject it.


So:


The environment affects the outcome but doesn't determine it, "Because this is a deterministic relationship"??

Your addition makes no sense.

Did you mean that the environment does determine the outcome? If this is the case, how can essentially identical populations and environments lead to significantly different outcomes?
 
Jimbob, the parts that were not identical but adaptive to the population became exponentially multiplied over time... How?-- because the environment DETERMINES which of these minute differences in the DNA works best in whatever environment it finds itself in over time. This really isn't contested amongst anyone but a few people determined to call evolution random no matter how unclear that might be. Multiple peer-reviewed articles call evolution "nonrandom" or determined. I haven't seen one call it "random" yet --unless by random they mean "unpredictable" or "unguided"-- and, when that is the case, they are very clear about the definition of random. I have not seen a single reputable source look at evolution or at the information on e. coli and refer to it with the muddled terminology you and mijo use. You may be able to extrapolate your "explanations" into that paper... but to the rest of us, it looks like some semantic twisting to call evolution "random".

You guys wanting to call evolution random have not provided a single peer reviewed paper that actually says what you are saying nor one that defines random as you do. Though you think that the e. coli experiment supports your "description of evolution"-- I assure you that the vast majority would disagree and prefer the many definitions already provided by real experts such as Dawkins and Coyne.
 
Jimbob, the parts that were not identical but adaptive to the population became exponentially multiplied over time... How?-- because the environment DETERMINES which of these minute differences in the DNA works best in whatever environment it finds itself in over time.

this is merely a request for clarification.

is it true you are suggesting that no (relevant) mutations happened during the experiment? that there were slight differences in the initial populations, and the environment then "determined" the best variation.

under these assumptions i think all would agree that everything was effectively "determined" by the environment. no jimbob?
 
Yes, Lenny, if the differences were significant. However, they weren't:

The initial 12 populations came from the same stock. Because the experiment was well thought-out, they also froze samples of the populations at different generations; part of the abstract is quoted below:

We observed no Cit+ mutants among 8.4 × 10^12 ancestral cells, nor among 9 × 10^12 cells from 60 clones sampled in the first 15,000 generations. However, we observed a significantly greater tendency for later clones to evolve Cit+, indicating that some potentiating mutation arose by 20,000 generations.

PDF file of the entire paper here
 
mutations are random with respect to whether an organism benefits, but they are determined by the environment. Translation errors don't happen for "no reason"... the process has high fidelity, but it's prone to errors... these "errors" are the source of our mutations. We can call them random, because we can't predict which will occur and they occur whether they benefit or are a detriment to their host and its progeny.

But the environment DETERMINES which of these changes is multiplied exponentially... we can say that whatever we see in the final populations was determined by what reproduced best in the prior populations in the respective environments over time.
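If it helps, here is a toy sketch of exactly that picture (Python; my own illustration with made-up fitness and mutation-rate values, not a model from any of the papers): the mutation step is random, but which variant gets multiplied exponentially is fixed by the fitness values the "environment" assigns.

```python
# toy illustration of "mutation is random, amplification is determined":
# a constant-size population where the wild type mutates at random, and the
# environment-assigned fitnesses deterministically rescale the frequencies.
import random

random.seed(0)
POP = 10_000                               # toy population size (assumed)
FITNESS = {"wild": 1.00, "mutant": 1.05}   # growth rates fixed by the "environment"
MU = 1e-4                                  # mutation probability per birth (assumed)

freq = {"wild": 1.0, "mutant": 0.0}
for gen in range(301):
    # random step: some wild-type births carry the new mutation this generation
    births = int(freq["wild"] * POP)
    new_mutants = sum(random.random() < MU for _ in range(births)) / POP
    freq["wild"] -= new_mutants
    freq["mutant"] += new_mutants
    # deterministic step: the environment's fitness values rescale each class
    weighted = {g: freq[g] * FITNESS[g] for g in freq}
    total = sum(weighted.values())
    freq = {g: weighted[g] / total for g in weighted}
    if gen % 50 == 0:
        print(gen, round(freq["mutant"], 4))
```

Change the seed and the exact trickle of new mutants changes, but the exponential takeover, once it starts, is set by the fitness ratio the environment imposes.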

That, in essence, is what natural selection is about--what evolution is about. Calling this random or "probabilistic" is obfuscating and useless as far as I can tell. If you want to communicate what is going on in the e. coli experiments with anyone... I'd suggest you use words that scientists actually use and not the explanations of jimbob or mijo. I find them maddeningly obfuscating, misleading, and biased towards "needing" to call evolution a "random process".

Why not use the words of people who actually teach this stuff to tons of people and who know what the most common errors in understanding are? Why do some people imagine themselves as having expertise on how it should be described when no-one has ever intimated that they've gained understanding through their explanations?
 
