
DNA is intelligently coded?

chipmunk stew

This is an ID angle I hadn't heard before. Dr. Stephen Meyer argued on The Tavis Smiley Show that DNA is evidence of an intelligent designer :jaw-dropp because it's a binary code sequence and we know from working with computers that if we need something coded, we need an intelligent programmer. Or something. :boggled: (Here's an audio-only link for faster downloading.)
 
I'm pretty sure DNA codes in base 4.
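For what it's worth, here's what "base 4" means in this context: each of the four bases carries two bits, so DNA is "digital" without being binary. (The A/C/G/T-to-bits mapping below is arbitrary, just for illustration.)

```python
# Four symbols per position = base 4, i.e. two bits of information per base.
BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}  # mapping is arbitrary/illustrative

seq = "GATTACA"
print("".join(BITS[b] for b in seq))  # -> 10001111000100 (14 bits for 7 bases)
```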
 
To be fair, Meyer actually called it "digital code" not "binary code"--I mis-paraphrased. But still. Sheesh.

Tavis was just eating it up and lobbing softballs. Meyer is an excellent speaker and made the Discovery Institute and ID look very legit.

My favorite part of the clip though is halfway through, when this pops up at the bottom of the screen:

"Coming up...P-Funk's George Clinton"
 

Digital? He obviously hasn't looked at DNA data at more than a high-school introductory-text level.

On the bright side, if that's the new argument for ID, it'll go down in a blaze of glory really fast. Go watch how wrong he is...

http://alphard.ethz.ch/gerber/approx/default.html

No programmer needed for those digital programs. Natural selection does it all.
 
It's trivially demonstrable that computer software can be evolved.

Yes, evolved!

Check out the work of John R. Koza, among many others.

I developed a coding platform in the mid '90s for evolving software to perform specific tasks, so I know how this works.

I'll give a little bit of detail here for those of you who have never heard of this before. I'll keep it as jargon-free as I can.

Imagine, if you will, a computer language in which all programs are syntactically correct. That is, you are able to write anything you like using the 'terms' of the language and you won't get a syntax error. The program won't necessarily do anything useful, but at least it will execute.

OK, now imagine that you have a process that will generate a computer program by selecting, say, 100 or so of the terms of the language and stringing them together. This program is a jumble of instructions and will almost certainly not perform any useful function.

OK, now imagine, say, 100 of these randomly generated programs, all useless.

Call this your initial population.

Now imagine you have a particular task in mind. It could be the XOR function, or it could be to navigate a maze, or to filter out all frequencies above 1 kHz, or anything else that's measurable. In my first experiment, the task was for the 'predator' to find the 'prey'. These were just dots on the screen, btw.

OK, now imagine that you can execute your programs from the initial population. You run each one for, say, a few seconds, or for a few thousand or million cycles. At the end of each run, you perform a fitness test. In my case, it was 'how close did the predator get to the prey?' Measure this for all 100 programs of the initial population. Rank the programs in order of fitness. Most of them will be crap. In fact, they probably all will be. Doesn't matter.

Now, scrap the worst 80. From the best 20, 'breed' 80 new programs. What? Breed? Yeah, kinda. Essentially what you do is grab two of the twenty programs, grab a random chunk of one and a random chunk of the other, stick one chunk on the end of the other chunk, or insert it in the middle, or whatever, and there's a new program.

OK, now we have our next generation of 100. It contains the best 20 from the previous generation plus another 80 that we bred from them.

Now you run these 100, measure their fitness, rank them, cull, breed as before.

Repeat.

Keep repeating.

Guess what?

Eventually, you'll have bred a program that solves your problem.

In my tests, it sometimes only took hundreds of generations to get a very fit program. On my PC in the mid '90s, that may have taken anywhere from 10 minutes to half an hour.
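For anyone who wants to see the shape of that loop in code, here's a minimal sketch. It assumes a toy 'language' of movement opcodes and a stand-in fitness test (how close the predator ends up to a fixed prey point); the opcode names, coordinates, and population numbers are illustrative, not my actual mid-'90s platform.

```python
import math
import random

# Illustrative opcode set: any random string of these 'terms' is a valid program.
OPCODES = ["N", "S", "E", "W", "NOOP"]
MOVES = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0), "NOOP": (0, 0)}
PREY = (30, -12)  # arbitrary prey position for the toy fitness test

def random_program(length=100):
    """A jumble of ~100 instructions: syntactically valid, almost certainly useless."""
    return [random.choice(OPCODES) for _ in range(length)]

def fitness(program):
    """Run the program, then ask: how close did the 'predator' get to the 'prey'?
    Higher (less negative) is better."""
    x = y = 0
    for op in program:
        dx, dy = MOVES[op]
        x, y = x + dx, y + dy
    return -math.hypot(x - PREY[0], y - PREY[1])

def breed(a, b):
    """Grab a random chunk of one parent and stick it onto a chunk of the other."""
    return a[:random.randint(0, len(a))] + b[random.randint(0, len(b)):]

def evolve(generations=200, pop_size=100, survivors=20):
    population = [random_program() for _ in range(pop_size)]    # initial population
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)  # rank by fitness
        best = ranked[:survivors]                               # scrap the worst 80
        bred = [breed(*random.sample(best, 2))                  # breed 80 new programs
                for _ in range(pop_size - survivors)]
        population = best + bred
    return max(population, key=fitness)

if __name__ == "__main__":
    champion = evolve()
    print("best fitness:", fitness(champion))
```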

In addition, there are systems like Thomas Ray's where small programs reproduce and compete for survival. They can mutate. I developed a system like that as well, as part of my research at RMIT in 1995. It's fun watching computer parasites evolve!
 
Dr. Stephen Meyer argued on The Tavis Smiley Show that DNA is evidence of an intelligent designer because it's a binary code sequence and we know from working with computers that if we need something coded, we need an intelligent programmer.
Admitting up front that I didn't bother with the link on the assumption that it's the same tired, trivial bit of bafflegab found in every creationist's bag of favorite tricks:

It reduces to this:

1) I hereby define "code" as something created by an intelligence.
2) I hereby declare DNA to be a code.

All together now, in your very best Gomer Pyle voice: "WELL, SHUZZAAUUM!"


logical muse said:

Imagine, if you will, a computer language in which all programs are syntactically correct. That is, you are able to write anything you like using the 'terms' of the language and you won't get a syntax error. The program won't necessarily do anything useful, but at least it will execute.

OK, now imagine that you have a process that will generate a computer program by selecting, say, 100 or so of the terms of the language and stringing them together. This program is a jumble of instructions and will almost certainly not perform any useful function.
I often wondered how such a thing could be accomplished while avoiding the problem of hard loops. Even avoiding explicit looping constructs, it seems like the whole business is doomed to hang sooner or later on implicit ones.
 
I often wondered how such a thing could be accomplished while avoiding the problem of hard loops. Even avoiding explicit looping constructs, it seems like the whole business is doomed to hang sooner or later on implicit ones.
In my system there are no loop constructs in the language. Each program is executed multiple times in a run.

Consider the program to be like a big formula. You calculate the result, and then apply it to the system. In one of my tests, the task was for a 'critter' to find its 'food'. Each execution resulted in a new heading for the critter. The new heading was then taken and applied to the critter which was always on the move. This was done a fixed number of times, say, 1,000. At the end of the 1,000 executions, the critter would hopefully be somewhat closer to the food.

Due to the setup of this particular scenario (critter always takes a step, heading determined by genetic programming), the fittest programs got the critter to the food in virtually a straight line and then the critter just circled the food.
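Roughly, the execution model looks like the sketch below, assuming the evolved program reduces to a pure function from the critter's and food's positions to a new heading. The step size, the 1,000-execution run length, and the coordinates are illustrative, not the actual system.

```python
import math

STEPS_PER_RUN = 1000  # fixed number of executions per fitness run
STEP_SIZE = 1.0       # the critter is always on the move

def run(program, critter_xy, food_xy):
    """Execute the loop-free program over and over: each execution is just a formula
    that yields a new heading, which is then applied to the moving critter."""
    x, y = critter_xy
    fx, fy = food_xy
    for _ in range(STEPS_PER_RUN):
        heading = program(x, y, fx, fy)     # one execution = one heading (radians)
        x += STEP_SIZE * math.cos(heading)  # critter takes a step on that heading
        y += STEP_SIZE * math.sin(heading)
    # Fitness test: how close did the critter end up to the food?
    return -math.hypot(x - fx, y - fy)

# For comparison, a hand-written 'perfect' program: head straight at the food.
# A fit evolved program behaves much like this -- straight line, then circling the food.
perfect = lambda x, y, fx, fy: math.atan2(fy - y, fx - x)
print(run(perfect, (0.0, 0.0), (50.0, 20.0)))
```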
 
We've had this conversation before. As Dymanic said, it hinges on the definition of code.

First you have to define code crisply. Then you can discuss whether DNA is a code. Then you can discuss whether it could have evolved.

~~ Paul
 
Sure. But what makes the ball roll up the hill by expending energy, and what decides it should?
 
And therein lies the problem. What imagines the goal?
It's a reasonable enough question.

The complaint here appears to be that the goal is an artifice, and that this invalidates the example as a demonstration of the principle that a blind process subject to random mutation can exhibit a ratcheting toward a goal. It seems to me that as long as the process really is blind to the goal, the principle is adequately demonstrated.

It's essentially the same as the argument that artificial selection is not valid as a demonstration of such a blind ratcheting, because the goal state is contrived.

If we expose a bacterial culture to increasing levels of an antibiotic, is it we, the experimenters who define the goal state as "antibiotic resistance", or is that not an implicit feature of the interaction between the bacteria and the antibiotic? After all, we don't have anything to do with the mechanical details of how such resistance is achieved, we're just providing an introduction.
 
There is just as much of a "predefined goal" in natural evolution as there is in artificial evolution. The predefined goal is: surviving and multiplying under the pressure of the current state of the environment. The only difference is that the natural environment contains orders of magnitude more information than the artificial environment in a simulation, resulting in a richer set of evolved solutions.

It's just a question of how vast the fitness landscape is.

~~ Paul
 
Sure. But what makes the ball roll up the hill by expending energy, and what decides it should?
Wow. Again, the argumentum ad nauseam. So what made the ball? And made the maker of the ball? And made that maker and told him to make the maker to make the ball to make it roll uphill?

Yawn.

Gravity works whether folks want it to or not. It doesn't have to be observed, willed, or forced to do so. It, like many other things that appear to be anathema to you, just is. And, we can test it as well. So far, in all cases, it works. That's what we state: to all observation so far, these are the rules for gravity. No motivating gremlin is needed.

Same for evolution. All of the parts are there for it to work without interference.

If life did not mutate, it could not "go towards" any goal we set, or more appropriately would not respond to any selective pressure, man-made or no, by any other mechanism than promptly dying.

If life were perfect as-is, antibiotics would work every time. Given that antibiotics don't, and these are bacteria exposed to fungal bactericides and mechanisms that they have never encountered, either all of the possible information that could ever be required is encoded, or the information changes.

Changing information is the first blow against a perfect creator-driven universe. Why would a perfect creator build an imperfect creation, and if it is due to an anti-creator, how can the creator claim omnipotence without claiming imperfection as the original "goal?"

So, we come to a Creator who made evil on purpose, and to the question of how a set cannot contain itself at a minimum (i.e., God is not all good but partly evil).
 
There is just as much of a "predefined goal" in natural evolution as there is in artificial evolution. The predefined goal is: surviving and multiplying under the pressure of the current state of the environment.
~~ Paul
Indeed. And you accept that the choice of human/human brain as a better survival & reproduction strategy than, say, a bacteria or a slime mold, was occasioned by random mutations, the environment, and a few billion years. Of course you do since you have nothing else. Others are just not so certain.


PatKelley said:
Changing information is the first blow against a perfect creator-driven universe.
Find someone who is arguing for "a perfect creator-driven universe" and feel free to argue your point.
 
Indeed. And you accept that the choice of human/human brain as a better survival & reproduction strategy than, say, a bacteria or a slime mold, was occasioned by random mutations, the environment, and a few billion years. Of course you do since you have nothing else. Others are just not so certain.

Uh, no, I won't say that at all. Bacteria have it all over us in terms of survival. That's why most germ-killing chemicals say "Kills 99.94% of disease-causing bacteria." They can survive dehydration, freezing, suffocation, near-boiling temperatures, and, in some cases, on sunlight alone. The water-bear, when encapsulated in a tun (its hibernation-spore phase), can survive more than 200 times the human lethal dose of radiation, hard vacuum, dehydration, and near-boiling temperatures. There are bacteria that could grow in older space stations (e.g., Mir before it burned up), feeding on metal and glass in hard vacuum and extra-atmospheric radiation levels for years.

Our brains allow us to sequester more energy than a single bacterium, which is how we guarantee success in a less-competitive environment. When the environment is stable, size goes up as a competitive advantage, along with specialization. When the environment is unstable, low energy use, generalism, and low metabolism do best. Humans would not survive animal-kingdom kumite.

Find someone who is arguing for "a perfect creator-driven universe" and feel free to argue your point.
Ah. So, then, the creator had it in mind to make a universe that would look like it could run fine without him, so that we would have no proof and only have faith then?
 
