
Randomness in Evolution: Valid and Invalid Usage

I don't understand why this disconnect is occurring: if I specify the algorithm:
<snip>

Sure, but that bears little resemblance to the algorithm I proposed - which is to send ALL input sequences to 1.

And how do you propose to specify which sequence this algorithm is to produce without ending up back where you started?

That's your problem - or rather a problem with your proposed definition - not mine.
 
the important part is the deterministic relationships - i.e. what things will have a causal effect on other things and conversely what things will not have a causal effect on other things.

I'd agree with that. ETA: but how do you determine this? Is the first collision deterministic? Yes. Is the nth collision deterministic? Yes with respect to its immediate predecessors, but not with respect to the first if n is more than 12. There have been no other external inputs, so the result of the nth impact isn't determined by the (n-12)th impact. The 12th impact and beyond are random.


Biology is more complex than snooker balls, so it has more random influences.


However, if you are discussing a chaotic system, you might be able to describe the feedback loops, both positive and negative, but you won't be able to describe what the effects of these will be beyond a certain time in the future.

In the snooker ball case, we hypothetically know the inputs to the theoretical (quantum) resolution. We know the laws of motion governing the balls. We can easily know what happens after the first impact... and second... but not the twelfth. There have been no further external inputs.
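A rough toy illustration of that loss of predictability, with the logistic map standing in for the snooker table (my own stand-in, not a model of the balls): two starting values that agree to nine decimal places track each other for a few steps and then end up bearing no resemblance to one another.

    # Logistic map x -> r*x*(1-x): a standard toy chaotic system.
    # Two starting values differing in the ninth decimal place stay close
    # at first, then diverge completely.
    r = 4.0
    x, y = 0.123456789, 0.123456790
    for step in range(1, 31):
        x, y = r * x * (1 - x), r * y * (1 - y)
        if step % 5 == 0:
            print(f"step {step:2d}: x={x:.9f}  y={y:.9f}  |x-y|={abs(x - y):.2e}")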

In biological systems, I am saying it would be theoretically impossible to say that owlet x will breed successfully: all the traits might be in its favour, yet events could overtake it. These events will have immediate causes, but mixed up with them will be random causes.

You can obviously predict some failures with 100% accuracy - the sterile, etc.

Do you see what I am saying: that if the fitness environment is chaotic, then it will also be subject to random changes (probably over long timescales relative to a human lifetime, but short relative to life's tenure on Earth)?

A single mutation could spread, and completely alter the fitness environment for many organisms. A slight change in the structure of H5N1 might be a catastrophic demonstration of that. Without modern medicine, there could be a strong selective pressure arising on humanity from one slight change to a virus.
 
Sure, but that bears little resemblance to the algorithm I proposed - which is to send ALL input sequences to 1.

Which fails for the reasons below.

That's your problem - or rather a problem with your proposed definition - not mine.

Well no it's not. You're not thinking it through.

It's all very well saying, "here's a program that prints all sequences so it's infinitely compressive!" but that's a set of sequences, not a sequence. It is not of the same type so it doesn't fall under the definition.

To fall under the definition it would have to be a sequence such as:

0100011011000001010011100101110111...

Which is the concatenation of the sequences in that set, and is a sequence in its own right.

One and only one sequence encoded by an algorithm and its input.
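For what it's worth, that concatenated string is itself pinned down by one small algorithm with a definite output. A rough sketch, assuming the length-then-numeric ordering the digits above appear to follow:

    from itertools import count, product

    def concatenated_binary_strings():
        # One definite sequence: every binary string, in order of length,
        # concatenated into a single stream of bits.
        for length in count(1):
            for bits in product("01", repeat=length):
                yield from bits

    gen = concatenated_binary_strings()
    print("".join(next(gen) for _ in range(34)))
    # 0100011011000001010011100101110111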

Your "solution" is like you asking me to compress a file and me returning your algorithm and saying:

"Well, it'll generate your file eventually - it's your problem figuring out when that is."

Not acceptable.
 
What about the perfectly compressed works of Shakespeare?

Wouldn't that be a random sequence by your definition, Cyborg?

ETA: Without the algorithm I'd agree that it would be indistinguishable from a random sequence.
 
What about the perfectly compressed works of Shakespeare?

Wouldn't that be a random sequence by that definition?

Yes it would - what are you trying to say?
 
Yes it would - what are you trying to say?

Possibly that such a definition doesn't make a meaningful distinction between random and non-random.

One would think that, since so many people have insisted that my definition of random is meaningless, you would be more careful in coming up with your own.
 
Possibly that such a definition doesn't make a meaningful distinction between random and non-random.

I fail to see how. Compressed data is not expanded data. Shakespeare's compressed works are not Shakespeare's uncompressed works.

One would think that, since so many people have insisted that my definition of random is meaningless, you would be more careful in coming up with your own.

It's not exactly "my own" definition.
 
Which fails for the reasons below.

What reasons? I gave you an algorithm that compresses all sequences, no matter what the length, to one bit. And it doesn't take many bits to specify. That's a clear counterexample to what you were claiming.

As for generating a given sequence, I can do that as well - just define that sequence to be "1", and the algorithm outputs "1".

I think perhaps the best way to see that your definition cannot possibly work is to think of these sequences as representing numbers written in binary. Then each sequence is just some integer. But it is clearly ridiculous to make a rule that calls some integers random, and some not.
 
Sol: Cyborg is right, you can't claim to compress data indefinitely without taking into account the algorithm itself.

Cyborg: Sol is right, you can't compress data at all without SOME algorithm, without which the entire idea is nonsensical.

Can't we just all get along? ;)
 
Sol: Cyborg is right, you can't claim to compress data indefinitely without taking into account the algorithm itself.

Well, one correct statement is the one cyborg quoted above - that you can't losslessly compress every sequence of a given size. But remove the "lossless" or the "every" and there's no such statement.

I've continued this discussion because there is a good idea here. I agree with cyborg that when sequences are hard to compress, they are close to random. I would have phrased it in terms of Shannon entropy (a sequence of N bits is random if its Shannon entropy is maximized), but that's very similar. In fact I've discussed that on this forum in the past.

But all of these definitions rely on limits - I don't think there's any way to define any of them properly for a finite sequence, or a finite quantity of data. If I'm wrong, I'd like to know how to do it. And incidentally, if there is a way, it would define an unambiguous method for measuring information in the genome and give the lie to the creationists who say mutations can't increase information.
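For a finite string, about the nearest thing is the empirical per-symbol entropy. A rough sketch of what that looks like (and of why it isn't fully satisfying, per the paragraph above):

    from collections import Counter
    from math import log2

    def empirical_entropy(bits):
        # Per-symbol Shannon entropy of a finite 0/1 string, in bits (0.0 to 1.0).
        n = len(bits)
        return -sum((c / n) * log2(c / n) for c in Counter(bits).values())

    print(empirical_entropy("0000000011111111"))  # 1.0 - maximal, yet obviously patterned
    print(empirical_entropy("0000000000000001"))  # ~0.34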
 
I fail to see how. Compressed data is not expanded data. Shakespeare's compressed works are not Shakespeare's uncompressed works.

Are you saying that the uncompressed works of Shakespeare are nonrandom, whilst the compressed works are random?

If you were arguing that information with no redundancy would look indistinguishable from random noise, then I would agree with you. However, that isn't what you are saying; you are saying that it is random.

A pretty odd type of random: by your definition, Shakespeare's uncompressed works aren't random, yet you apply a deterministic algorithm that would always give the same result, and that result is random by your definition.
 
What reasons? I gave you an algorithm that compresses all sequences, no matter what the length, to one bit. And it doesn't take many bits to specify. That's a clear counterexample to what you were claiming.

Please re-read what I said carefully. Your types are wrong. It doesn't do the job for the reason I explained.

As for generating a given sequence, I can do that as well - just define that sequence to be "1", and the algorithm outputs "1".

Eh?
But it is clearly ridiculous to make a rule that calls some integers random, and some not.

That is not the rule. Please read the link on the undecidability of that proposition. We can only really compare the "randomness" of our finite analyses - we can't ever prove randomness.
 
A pretty odd type of random: by your definition, Shakespeare's uncompressed works aren't random, yet you apply a deterministic algorithm that would always give the same result, and that result is random by your definition.

If I randomly constructed a Java program, it would still be a deterministic algorithm in spite of that. You are confusing the algorithm with the representation of the algorithm.

Say I had the works represented in ASCII and I wanted to compress them. Because in ASCII the 8th bit of a standard character is always 0, I can reduce the size of any standard ASCII file by an eighth simply by removing this bit. This sequence, however, would most likely still be compressible and hence would not be "random", but it would be more "random" than the original file because there is less potential to compress it, as there is less redundancy in the file.
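A rough sketch of that bit-stripping (my own illustration, packing eight 7-bit characters into seven bytes):

    def strip_high_bits(data):
        # Standard ASCII uses only 7 bits per byte; drop the always-zero top bit
        # and repack, giving roughly 7/8 of the original size.
        bitstring = "".join(f"{b:07b}" for b in data)
        bitstring += "0" * (-len(bitstring) % 8)   # pad to a whole number of bytes
        return bytes(int(bitstring[i:i + 8], 2) for i in range(0, len(bitstring), 8))

    text = b"To be, or not to be, that is the question." * 50
    print(len(text), len(strip_high_bits(text)))   # second number is about 7/8 of the first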

Now, as you have both rightly seen, the meaning of this analysis increases with the size of the sequences you are considering. So the works of Shakespeare would certainly not be considered random, as they can likely be compressed by a dictionary algorithm by quite a high percentage. The compressed file should be considered random, because if it is not, there is still redundancy to exploit. (Not the action, remember, the representation.)

The compressed file and the uncompressed file can both be said to be equivalent algorithms. So we are interested in minimal algorithms to produce a sequence - and we should expect the representation of those minimal algorithms not to be producible by an algorithm which is itself smaller.
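And a rough way to see the "compressed output has little left to exploit" point in practice, using zlib as a stand-in for whatever dictionary algorithm you like (exact sizes will vary with the input):

    import zlib

    text = (b"To be, or not to be, that is the question: "
            b"Whether 'tis nobler in the mind to suffer ") * 200

    once = zlib.compress(text, 9)    # dictionary-style compression of the English text
    twice = zlib.compress(once, 9)   # trying to compress the compressed bytes again

    print(len(text), len(once), len(twice))
    # The first pass shrinks the text dramatically; the second gains essentially
    # nothing, because the compressed representation has little redundancy left.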
 
Wow, know what, Shakespeare is evolving into a random sequence. Du_u_ude.

And the prescription of antibiotics is random.
 
Please re-read what I said carefully. Your types are wrong. It doesn't do the job for the reason I explained.

Sorry cyborg - I did re-read it, and I really don't understand what you're saying. It doesn't coincide with the link you gave (as far as I can tell). I don't see the point in discussing this further with you, particularly in this thread. When I have time I'll start a new one about randomness.
 
Sorry cyborg - I did re-read it, and I really don't understand what you're saying.

A set of sequences is not a sequence - the output you are producing is not type valid. If you are saying: "the algorithm keeps producing sequences actually," then it's not a terminating algorithm and hence it doesn't encode a particular sequence - 0, 1, 00, 01 etc... can only be considered "scratch" results - working memory. There has to be a definite result.
 
A set of sequences is not a sequence - the output you are producing is not type valid. If you are saying: "the algorithm keeps producing sequences actually," then it's not a terminating algorithm and hence it doesn't encode a particular sequence - 0, 1, 00, 01 etc... can only be considered "scratch" results - working memory. There has to be a definite result.

But... your comment (the one my post responded to) was about my proposed compression algorithm, not the production algorithm.

We're obviously not on the same page here. Like I said, there is no point in continuing this conversation.
 
I still say that you have it backwards, Cyborg.

If you remove all redundant information from a datastream, the result will be fully compressed. It would look like random noise, since any remaining patterns could be compressed further. However, it would not be random noise, as it would contain useful information.

Your approach is useless, as you define Shakespeare's uncompressed works as nonrandom, yet a perfectly compressed version of the same works as random*, whilst at the same time you also seem to think that the results of a chaotic system are nonrandom.

In other words, the output of a system where identical inputs can produce significantly different and diverging results can be nonrandom (according to you), whilst one where the same inputs will always produce the same outputs produces a "random" output from a nonrandom input, even though you can transform those outputs in both directions and always get the same results.


*I'll grant you pseudorandom, in that it won't be predictable, but it isn't random.
 
Back to the OP:

Cyborg:

cyborg said:
the important part is the deterministic relationships - i.e. what things will have a causal effect on other things and conversely what things will not have a causal effect on other things.
I'd agree with that. ETA: but how do you determine this? Is the first collision deterministic? Yes. Is the nth collision deterministic? Yes with respect to its immediate predecessors, but not with respect to the first if n is more than 12. There have been no other external inputs, so the result of the nth impact isn't determined by the (n-12)th impact. The 12th impact and beyond are random.


Biology is more complex than snooker balls, so it has more random influences.


However, if you are discussing a chaotic system, you might be able to describe the feedback loops, both positive and negative, but you won't be able to describe what the effects of these will be beyond a certain time in the future.

In the snooker ball case, we hypothetically know the inputs to the theoretical (quantum) resolution. We know the laws of motion governing the balls. We can easily know what happens after the first impact... and second... but not the twelfth. There have been no further external inputs.

In biological systems, I am saying it would be theoretically impossible to say that owlet x will breed successfully: all the traits might be in its favour, yet events could overtake it. These events will have immediate causes, but mixed up with them will be random causes.

You can obviously predict some failures with 100% accuracy - the sterile, etc.

Do you see what I am saying: that if the fitness environment is chaotic, then it will also be subject to random changes (probably over long timescales relative to a human lifetime, but short relative to life's tenure on Earth)?

A single mutation could spread, and completely alter the fitness environment for many organisms. A slight change in the structure of H5N1 might be a catastrophic demonstration of that. Without modern medicine, there could be a strong selective pressure arising on humanity from one slight change to a virus.
 
However, it would not be random noise, as it would contain useful information.

What is useful information, you should realise, is a matter of interpretation.

Your approach is useless, as you define Shakespeare's uncompressed works as nonrandom, yet a perfectly compressed version of the same works as random*, whilst at the same time you also seem to think that the results of a chaotic system are nonrandom.

For one thing, you have talked of chaotic systems as being deterministic, so I don't get what your point is here.

On the other, you seem to have completely missed the point that if you can construct an equivalent representation for a sequence which is a small fraction of its size, then the sequence in question would be considered "non-random". If the sequence you have constructed to represent it wouldn't be considered "random", then it's not a minimal sequence.

I don't really know how else to get you to understand that there is a difference between representation and expression.

In other words, the output of a system where identical inputs can produce significantly different and diverging results can be nonrandom (according to you), whilst one where the same inputs will always produce the same outputs produces a "random" output from a nonrandom input, even though you can transform those outputs in both directions and always get the same results.

You have completely failed to grasp the concept here.
 
