Annoying creationists

Status
Not open for further replies.
And then there's the restriction on memory to be considered...
Yea. I'm not sure what the data structures look like in Ev. RAM may be cheap, but this program isn't exactly designed for this type of demonstration. Java/Pascal are fantastic languages to quickly put together a program to demonstrate a concept, but they're not designed for large scale scientific simulation.

One can very easily design parallel versions of these types of algorithms and run them on a large group of networked computers (the computations for each member of the population are independent, and you can quicksort the portion of the population on each machine and then mergesort the sorted lists to speed up the sorting.) If we had the resources, it's conceivable we could run such a simulation using a parallel Ev.
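The scheme described above (independent per-organism scoring, a local sort on each machine, then a merge of the sorted shards) can be sketched in a few lines. This is purely illustrative Python, not Ev's actual code; the toy fitness function and all names are mine, and a real run would score binding sites instead of counting bits:

```python
import heapq
import random
from multiprocessing import Pool

def score(genome):
    # Stand-in fitness function; Ev's real scoring of binding sites
    # would go here. Counting 1-bits is purely illustrative.
    return sum(genome)

def score_and_sort(shard):
    # Each worker scores its shard independently (the per-organism
    # computations do not interact) and sorts it locally, best first.
    return sorted(((score(g), g) for g in shard), reverse=True)

if __name__ == "__main__":
    random.seed(0)
    population = [[random.randint(0, 1) for _ in range(64)]
                  for _ in range(1000)]
    n_workers = 4
    shards = [population[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        sorted_shards = pool.map(score_and_sort, shards)
    # Merge the pre-sorted shards into one globally ranked list,
    # then keep the best half for reproduction.
    ranked = list(heapq.merge(*sorted_shards, reverse=True))
    best_half = [g for _, g in ranked[: len(ranked) // 2]]
```

On a real cluster the `Pool` would be replaced by message passing between machines, but the structure (scatter, score, local sort, merge) is the same.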
 
What'd be the limit on the number of parallel CPUs that you could get from this? I mean, how efficient is quicksorting? Could you get a linear speed increase with respect to the number of processors? Would this kind of programming be better served with distributed memory?
 
What'd be the limit on the number of parallel CPUs that you could get from this?
Theoretically? None. In real life, it would depend a lot on the actual implementation.
Could you get a linear speed increase with respect to the number of processors?
If we were sufficiently clever, I suspect we could get very, very close to a linear increase.
Would this kind of programming be better served with distributed memory?
I'm imagining a Beowulf-like cluster, so yes.

ETA: I have experience with parallel evolutionary computation, so this isn't all idle speculation on my part.
 
My employer can likely provide as much horsepower as may be reasonably required for any conceivable computer model (we make the chips that make the world go round).

Obviously, I will need some well-reasoned estimate of the exact nature of the computing platform required, and it is preferable that you choose a platform or network of platforms which actually exists or is reasonably easy to put together, because a cost-effective research request will be far more easily received than a request for an exotic computing system.

If you put together a formal proposal, specifying the parameters and time required to conduct the experiment (and it would help if Dr. Schneider or some equally credible researcher(s) would sign on to the proposal), then I can ask our Marketing Department and see if they will be interested in providing a computing platform for the challenge. It seems like this would make pretty good press ("ev vs. id"), which would justify the project's cost.

I make no promises, but what you are all alluding to as desirable is definitely possible to accomplish -- assuming that there is "some" reasonable amount of computing horsepower that can resolve the dispute.
 
Dr. A said:
No, no, you misunderstand me. I meant that the computer time required to simulate one generation is linear w.r.t. population size.

Sorry, I should have made that clearer.
Oh, right, of course.

~~ Paul
 
Joobz said:
What'd be the limit on the number of parallel CPUs that you could get from this? I mean, how efficient is quicksorting? Could you get a linear speed increase with respect to the number of processors? Would this kind of programming be better served with distributed memory?
We could easily parallelize: the reproduction of the best half of the bugs; the mutation of the bugs; the scoring of the bugs. The tie-breaking and sorting phases would be harder.

~~ Paul
 
kjkent1 said:
So, based on your response here, it appears that Dr. Schneider has established at least the mathematical possibility of evolving information from randomness, and what remains to be established is that such evolution could occur within the time available since the creation of this universe.
Correct.
kjkent1 said:
Assuming that I've successfully characterized the issue, then, how do you know that the parameters you are using to confirm your hypothesis that evolution could not have occurred within the available time constraints are realistic?
Mutation rates are commonly measured. You can start with any introductory biochemistry text and get some values for the parameter. You can also do a literature search on this topic and get some idea of the range of known measured mutation rates, or google “mutation rate” and “bacteria” and find many universities that post data like this on their web sites. Genome lengths are also a well-documented topic. Bacteria like Mycoplasma, which are the smallest free-living life forms, have a genome length of about 900,000 bases. There are symbiont bacteria with genome lengths of about 500,000 bases, but they are dependent on their hosts for crucial metabolic processes.

Thus far, neither Paul nor I have been able to get a genome length of greater than 100,000 bases to converge, but the trends obtained from smaller-genome cases show how rapidly the generations for convergence increase as you increase genome length in the model. The limitations on getting these large genomes to run in ev are due in part to the huge memory requirements for these more realistic cases and the huge amounts of computer time required to run them. I ran one case with a genome length of 1000 bases, a population of 1 million, and a mutation rate of 1 mutation per 1000 bases per generation. This case required over 300 megabytes of RAM and took 100 hours of CPU time on a 2.8 GHz processor. If I tried to run this case with a realistic mutation rate, the CPU time would be measured in years.
kjkent1 said:
I doubt that Dr. Schneider would dispute the claim that the original parameters of EV are not a realistic model of a known biological organism. But, in the real world, there are a myriad of ways under which evolution does not occur. That is, extinction "happens." So, the fact that you may be able to defeat EV with a particular set of conditions should be no surprise.
We don’t have to speculate about what Dr Schneider claims. He said the following on his FAQ page for ev:
Dr Schneider said:
Isn't the standard Ev mutation rate of one base change per genome per generation excessive? No. If you think about it (or try it yourself) you will see that if you slow it down you get the same results: Rsequence still will evolve towards Rfrequency. Of course it will take longer to get the results.
I am not defeating ev with a particular set of conditions, I am only demonstrating the behavior of this model with realistic parameters. Dr Schneider is the one who chooses a particular set of conditions in order to support his assertions that a human genome can evolve in a billion years.
kjkent1 said:
It should be equally unsurprising that other sets of parameters cause EV to work as advertised.
I know of no realistic parameters which give results from ev that would predict the evolution of a human genome in a billion years. Paul and Myriad have run many cases, and I don’t think they would make this claim anymore. The position of evolutionists who have investigated the mathematical behavior of ev is that it doesn’t represent all the possible evolutionary mechanisms, and I think this is the only reasonable fallback position to take if you still believe the theory of evolution to be true.
kjkent1 said:
It seems to me that what remains to be done is for both sides of the debate to try to agree on a set of parameters that reasonably models some known biological organism, and then run EV and see what happens.
That is the simple, straightforward approach, but such a computer run would require a supercomputer. Even the largest supercomputers would be taxed by this model. The only way to get an idea of the mathematical behavior of ev is by doing parametric studies. I don’t know of anyone with the computer resources available to do a 1,000,000-base genome with a 10^12 population and a 10^-6 mutation rate in a reasonable length of CPU time. It would be interesting to do such a run, though.
kjkent1 said:
Then, if the outcome does not produce a realistic evolutionary outcome, tweak the parameters until one appears. Then check the parameters to ensure that they continue to reasonably model the original biological organism.
Do you know somebody with the computer?
kjkent1 said:
Eventually, by repeating the above, I would think that you would come upon a set of parameters that will successfully evolve the target organism within the time required.
I don’t think you will be able to tweak the parameters to do it. Dr Schneider has already used a mutation rate close to the mathematical optimum; if you increase the mutation rate from his value, ev will not converge. Increasing the genome length only causes ever-increasing generations for convergence. Increasing the population does help decrease the generations for convergence somewhat, but not to the degree that increasing the genome length causes them to increase. Binding site width does not appear to have much of an effect on the generations for convergence. You simply do not have any parameter that can overcome the effect of the increasing generations for convergence with increasing genome length.
kjkent1 said:
If not, then I'd say you have a reasonable proof that current evolutionary theory has a timing problem. This doesn't necessarily rule out evolution, but it may call for some additional research into the underlying mechanisms of evolutionary change.
kjkent1 said:
Conversely, if the organism evolves within the time available, then I'd say that your ID hypothesis is falsified.
That’s the challenge.
kjkent1 said:
At the moment, it seems to me that EV has demonstrated that evolution is possible, within some limit of measurement, although the limit is questioned as being too great. It further seems to me that engaging in a flame war is a non-productive use of time, unless everyone enjoys casting insults at each other.
kjkent1 said:
So, why not just get together and try to do the research and then offer the results for publication?
Ev only demonstrates that evolution is possible under Dr Schneider’s superficial analysis of his program using unrealistic parameters. I don’t mind the flame war; actually, I haven’t seen any sparks from the evolutionists willing to talk about this topic. Just think of this debate as mud wrestling.

If someone wants to publish these findings based on the use of realistic parameters in ev, feel free. Any type of publication would take years to put together and get peer reviewed. I would rather get this out in the open now.
kjkent1 said:
I'm not a scientist, but I thought that getting published is the "money zone" for a researcher, and I would think that trying to get published would be a more interesting pursuit than merely arguing amongst each other for free.
kjkent1 said:
But, then, I really like money, so maybe that's just my personal prejudice sneaking into this post.
kjkent1 said:
My employer can likely provide as much horsepower as may be reasonably required for any conceivable computer model (we make the chips that make the world go round).
As the program is written now, population affects the memory requirements linearly. The 1000-base genome and 1024k-population case I ran took 300 megabytes of RAM. The memory requirements are also proportional to genome length, so if you want to run a genome length of a million and a population of a million, you are talking about 1000 × 300 megabytes of RAM. You are also confronted with lots of CPU clock cycles. You might be able to distribute these clock cycles over multiple processors; if you can’t, these runs will take years.
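The scaling claim above can be turned into a quick back-of-the-envelope estimator. This is only an extrapolation from the single data point reported in the thread (roughly 300 MB for a 1000-base genome with a 1024k population), assuming, as the post says, that memory grows linearly in both population and genome length; the function name and reference values are mine, not anything from ev itself:

```python
def estimated_ram_mb(genome_length, population,
                     ref_genome=1_000, ref_pop=1_048_576, ref_mb=300):
    """Linear extrapolation from the one reported measurement:
    a 1000-base genome with a 1024k (2**20) population took ~300 MB."""
    return ref_mb * (genome_length / ref_genome) * (population / ref_pop)

# The million-base genome, ~million-member population discussed above:
print(estimated_ram_mb(1_000_000, 1_048_576))  # 300000.0 MB, i.e. ~300 GB
```

That ~300 GB figure matches the "1000 × 300 megabytes" in the post, and it assumes ev's data structures stay as they are; a more compact genome representation would change the constant, not the linear scaling.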
Dr Adequate said:
Well, the problem is that the time ev takes to run depends (linearly, I presume) on the population size. It's taking Paul long enough to simulate what happens with a population size of about 1000000.
Dr Adequate said:
By contrast, there are 100000000000000 bacteria in a single human gut. (Why am I counting bacteria? --- Because the simulation doesn't allow for sexual recombination.)
Adequate, population is the only parameter that gives ev a chance of supporting the theory of evolution. Every population series I have run shows a rapid approach to what appears to be an asymptote, so the preliminary data does not look good for the evolutionist viewpoint, but I will let you cling to this last hope until the computing power becomes available to settle this issue.

Adequate, how does recombination increase the information in the gene pool?
Dr Adequate said:
So Dr Schneider has compensated for an unrealistically small population size by having an unrealistically high mutation rate.
Dr Adequate said:
Kleinman's dodge is to make the mutation rate realistic but not the population size, and then to claim that these parameters are realistic, even though this is a lie, and he knows it, and we know it.
Adequate, I have run the largest population series that my computer could handle. Dr Schneider published a single case with a population of 64; I did a population of 1024k, so either post my lie or stop whimpering. Anyway, there weren’t 100000000000000 of our primate ancestors out there when humans evolved. Would you evolutionarians make up your minds? Is it small sub-populations that rapidly evolve, as Gould proposes with his punctuated equilibrium hypothesis, or is it large populations that are required for evolution?
Kleinman said:
See what I mean? You are much more careful with your statements about ev than you were just one year ago. Dr Schneider asserts that ev represents reality; do you want me to post his quotes again? Dr Schneider also asserts that ev represents punctuated equilibrium according to the definition hypothesized by Gould. This has been published in a prestigious peer-reviewed journal. You never questioned these assertions last year, but now that you have a better idea of the mathematics simulated by ev, you can see some of the problems with Dr Schneider’s assertions.
Paul said:
I think that Schneider was pointing out the fast evolution of the binding sites followed by a period of stasis. If you want to argue that he was claiming that Ev precisely models "Gould's punctuated equilibrium," whatever that is, you'll have to take that up with him. I doubt he was claiming that, since no one agrees on the parameters of punctuated equilibrium anyway.
Kleinman said:
Paul said:
I agree that Ev represents reality accurately to the degree that it models it, which is only a limited degree. Again, if you think that Schneider is making claims about the actual number of generations to evolve binding sites in real life, you'll have to take that up with him. Again, I doubt it, since Ev's binding site model is quite simplistic.
I agree with you that Dr Schneider saw the sigmoid curve when plotting the convergence of ev and thought of punctuated equilibrium, however the references in his paper to specific page numbers in Gould’s work seem to indicate more. Regardless of Dr Schneider’s intended meaning of punctuated equilibrium, ev argues against Gould’s hypothesis of punctuated equilibrium.

I agree with you that ev is limited to modeling random point mutations and natural selection and I think that Dr Schneider got the basic model essentially correct. Dr Schneider does not want to discuss these issues publicly, however he has posted on his web site that he wants to do laboratory experiments to verify the results of his model. I don’t think he has thought through his proposals.
Kleinman said:
Paul, you probably won’t believe me but I do care about what you think. What do you think that ev simulates? Do you think that ev accurately simulates random point mutations and natural selection?
Paul said:
I think it simulates an idealized model of chromosomes, binding sites, point mutations, populations, and natural selection.
So do I, and that is why I think this model causes a mathematical problem for the theory of evolution.
Kleinman said:
If not, which examples of chromosomal doubling represent speciation? Why don’t you expand on this line of thought?
Dr Adequate said:
AFAIK, yes. Species can't interbreed if they have different chromosome numbers.
What does AFAIK mean?

So unless two organisms have homologous chromosomes they can’t breed. So if a plant has say 4 identical copies of its chromosomes, it can only breed with another plant with 4 identical homologous chromosomes?
Kleinman said:
Again Adequate raises the pointer on the whino-meter.
Dr Adequate said:
Are you genuinely too stupid to tell the difference between correcting your ignorance and "whining", or are you just addicted to lying?
Feel free to post my lies you whiner.
Dr Adequate said:
So Dr Schneider has compensated for an unrealistically small population size by having an unrealistically high mutation rate.
Paul said:
My experiments with mutation rate indicate that generations are linear with respect to mutation rate. This should come as no surprise.
Paul, I told you google is watching. Why don’t you post your data and then you can explain this question you asked on the following thread:
http://www.randi.org/forumlive/showthread.php?t=67488
Paul said:
Then, for some reason, I decided to run another series of experiments with a fixed population (64) and a fixed number of mutations per base (1/16,000). I'm varying the chromosome length from 512 bases by factors of 2. I would expect the number of generations to evolve a perfect creature to remain constant, because the probability of mutating any non-junk DNA base in the chromosome remains constant. However, I'm seeing what appears to be a factor of 2 increase in the number of generations required as the chromosome increases in length by a factor of 2.

Does anyone have any thoughts on why this should be the case?
 
Kleinman quoting me said:
Then, for some reason, I decided to run another series of experiments with a fixed population (64) and a fixed number of mutations per base (1/16,000). I'm varying the chromosome length from 512 bases by factors of 2. I would expect the number of generations to evolve a perfect creature to remain constant, because the probability of mutating any non-junk DNA base in the chromosome remains constant. However, I'm seeing what appears to be a factor of 2 increase in the number of generations required as the chromosome increases in length by a factor of 2.
I then responded to my own question:
me said:
I may have opened my big mouth too soon. Nothing to see here until I run more cases...

~~ Paul
 
Kleinman said:
So unless two organisms have homologous chromosomes they can’t breed. So if a plant has say 4 identical copies of its chromosomes, it can only breed with another plant with 4 identical homologous chromosomes?
Differences in chromosome number don't always prevent interbreeding. Horses and donkeys can interbreed. So can wild zebras and horses.

On various methods of speciation:

http://www.talkorigins.org/origins/postmonth/jun06.html

~~ Paul
 
Kleinman said:
...I ran one case with a genome length of 1000 bases, a population of 1 million and a mutation rate of 1 mutation per 1000 bases per generation. This case required over 300 megabytes of RAM and took 100 hours of cpu time on a 2.8gHz processor. If I tried to run this case with a realistic mutation rate, the cpu time would be measured in years.

This may have already been considered, but wasn't there a study just released showing a colony of lizards which have apparently rapidly mutated in response to rapid environmental stress?

See Science, 17 November 2006, Vol. 314, No. 5802, p. 1111.

Doesn't this provide evidence for a substantially different/faster model of evolutionary change?
 
Paul said:
I then responded to my own question:
Paul said:
I may have opened my big mouth too soon. Nothing to see here until I run more cases...
Well, then post your data.
Kleinman said:
So unless two organisms have homologous chromosomes they can’t breed. So if a plant has say 4 identical copies of its chromosomes, it can only breed with another plant with 4 identical homologous chromosomes?
Paul said:
Differences in chromosome number don't always prevent interbreeding. Horses and donkeys can interbreed. So can wild zebras and horses.
Why don’t you explain that to Adequate?
 
Paul said:
We could easily parallelize: the reproduction of the best half of the bugs; the mutation of the bugs; the scoring of the bugs. The tie-breaking and sorting phases would be harder.

~~ Paul
Is the sorting used for anything more than finding the best half of the bugs? If not, we could simply replace the sort routine with a selection algorithm. If so, I suspect the selection algorithm would still be useful in distributing the tie-breaking and sorting phases. If I looked more closely at the code (which I plan on doing once this semester ends), I could probably give a more accurate assessment.
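If the sort really is only there to pick the survivors, a selection algorithm finds the k largest in expected O(n) time instead of the O(n log n) of a full sort. A minimal sketch of the idea in Python (illustrative only, not Ev's actual code; the function name is mine):

```python
import random

def top_k(items, k, key=lambda x: x):
    """Return the k largest items (in no particular order) in expected
    O(n) time via quickselect-style partitioning, avoiding a full sort."""
    if k <= 0:
        return []
    if k >= len(items):
        return list(items)
    # Partition around a random pivot value.
    pivot = key(random.choice(items))
    above = [x for x in items if key(x) > pivot]
    equal = [x for x in items if key(x) == pivot]
    below = [x for x in items if key(x) < pivot]
    if k <= len(above):
        return top_k(above, k, key)       # answer lies entirely above pivot
    if k <= len(above) + len(equal):
        return above + equal[: k - len(above)]
    # Need everything above/equal plus some of the below partition.
    return above + equal + top_k(below, k - len(above) - len(equal), key)

scores = [5, 1, 9, 3, 7, 2]
best_half = top_k(scores, len(scores) // 2)  # some ordering of 9, 7, 5
```

Note the survivors come back unordered, which is fine if the next phase only needs the set of breeders and not their ranks.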

Retooling the code to run in parallel just to simulate an insane population size seems to me like a lot of work for very little return. It is probably possible, though.
 
total pwonage of the Hammegiester
As incisive and witty as I find your commentary, Dr., I must confess that sheer, brute fact is the greatest slam-dunkage of all.

And that you presented these facts unadorned, let them stand in a post entirely by themselves, is proof that we are in the presence of artistic genius.

 
kjkent1 said:
This may have already been considered, but wasn't there a study just released showing a colony of lizards which have apparently rapidly mutated in response to rapid environmental stress?

See Science 17 November 2006:Vol. 314. no. 5802, p. 1111.

Doesn't this provide evidence for a substantially different/faster model of evolutionary change?
I don’t believe this article is talking about random point mutations and natural selection; I believe the authors are talking about recombination and natural selection. The example they describe, of longer-legged lizards having better survival, would be analogous to breeding dogs for a particular quality, whether it be long legs or a better sense of smell. This process is much faster than creating a new gene from scratch; you are not making new genes with recombination. Ev is all about creating new binding sites (the region before a gene that enables the gene to be recognized) from scratch. Even this process is profoundly slow, and you still have not evolved the gene that goes along with the binding site in Dr Schneider’s model. You are comparing apples and oranges here.
 
What does AFAIK mean?

As far as I know...
Feel free to post my lies you whiner.

This is the closest to the truth you've come. You post almost nothing but lies, and we "whine" that you are a liar.

Paul, I told you google is watching. Why don’t you post your data and then you can explain this question you asked on the following thread:
Google is watching, kleinman. Figured out what thermodynamics is yet? Well, anyway, are you still insisting on misquoting and intentionally misrepresenting people's statements?

Paul A. said:
Originally Posted by Kleinman quoting me
Then, for some reason, I decided to run another series of experiments with a fixed population (64) and a fixed number of mutations per base (1/16,000). I'm varying the chromosome length from 512 bases by factors of 2. I would expect the number of generations to evolve a perfect creature to remain constant, because the probability of mutating any non-junk DNA base in the chromosome remains constant. However, I'm seeing what appears to be a factor of 2 increase in the number of generations required as the chromosome increases in length by a factor of 2.
I then responded to my own question:

Originally Posted by me
I may have opened my big mouth too soon. Nothing to see here until I run more cases...
~~ Paul
I guess you are. You can't use facts, so you must use deceitful tactics.
 
Adequate, population is the only parameter that gives ev a chance of supporting the theory of evolution. Every population series I have run shows a rapid approach to what appears to be an asymptote, so the preliminary data does not look good for the evolutionist viewpoint, but I will let you cling to this last hope until the computing power becomes available to settle this issue.
Yes, I think I will indeed "cling to the hope" that what is mathematically certain is also true.

You feel free to cling to the hope that what is certain must be false.

Adequate, how does recombination increase the information in the gene pool?
It ... doesn't ... you ... bleedin' ... halfwit.

Don't you know ANYTHING about genetics?

It does, however, speed up the process of evolution.

Adequate, I have run the largest population series that my computer could handle. Dr Schneider published a single case with a population of 64; I did a population of 1024k, so either post my lie or stop whimpering. Anyway, there weren’t 100000000000000 of our primate ancestors out there when humans evolved.
Of ... course ... there ... weren't ... you ... bleedin' ... halfwit.

Stop trying to change the subject. We are discussing ev.

Ev ... simulates ... bacterial ... reproduction ... you ... bleedin' ... halfwit.

Would you evolutionarians make up your minds? Is it small sub-populations that rapidly evolve, as Gould proposes with his punctuated equilibrium hypothesis, or is it large populations that are required for evolution?
Small ... populations ... often ... evolve ... rapidly ... you ... bleedin' ... halfwit.

Some ... evolution ... takes ... longer ... you ... bleedin' ... halfwit.

You ... have ... had ... all ... this ...explained ... to ... you ... you ... bleedin' ... halfwit.

FIND SOME NEW LIES, YOU BLEEDIN' HALFWIT.

---

And so kleinman's hopeless, futile quest for a clue continues.
 
I sense some hostility in Adequate. We better have him go through the metal detector before he enters this forum.
 