
carbon dating...

malfunktion
From my understanding, science uses ratios of Carbon-14 or Uranium-238, etc., to date older materials like fossils.

Since scientists obviously can't observe the half-life for millions or billions of years, are there any tests to prove that the half-lives remain constant? Would there be any reason to believe that the decay might be inconsistent over such a long time period?
 
malfunktion said:
From my understanding, science uses ratios of Carbon-14 or Uranium-238, etc., to date older materials like fossils.

Since scientists obviously can't observe the half-life for millions or billions of years, are there any tests to prove that the half-lives remain constant? Would there be any reason to believe that the decay might be inconsistent over such a long time period?

For decay rate to change, fundamental constants of quantum mechanics would have to change. Something that dramatic might make life on earth itself impossible. So it's a very safe bet that decay rates are constant.

What's much more important than decay rates (which we really can safely assume are constant) is starting ratios. In other words, the starting ratio of carbon 14 to carbon 12 need not remain constant, and you need to know what it started at in order to conclude anything from what it's at now.

Fortunately, we have records in the form of tree rings which can be used to calibrate this very precisely going back several thousand years. In fact, you can even cross-reference dead trees with living ones (use early rings on a living tree to establish the date of late rings on a dead one, then count back to the early rings in the dead tree and calibrate from there) to extend this back even further. I'm not sure exactly how far back this calibration goes, but I recall hearing it's at least seven thousand years (don't quote that number).

There are other methods for calibrating such dating techniques (I can't say I know them myself, but they're out there), particularly when dealing with geological dating, and the key really is to find consistency between different techniques.

Is there some particular reason you're curious about this? Are you wondering about the validity of some particular result or claim?
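The arithmetic behind reading an age off a carbon ratio is just exponential decay. Here is a minimal Python sketch (the 5730-year half-life is the standard figure for carbon-14; the sample fractions are made-up illustrations, and real work would apply the tree-ring calibration discussed above):

```python
import math

C14_HALF_LIFE = 5730.0  # years, standard value for carbon-14

def age_from_ratio(remaining_fraction):
    """Age of a sample given the fraction of its original C-14 still present."""
    decay_constant = math.log(2) / C14_HALF_LIFE
    return math.log(1.0 / remaining_fraction) / decay_constant

# A sample retaining half its original C-14 is one half-life old:
print(round(age_from_ratio(0.5)))   # 5730
# A quarter remaining means two half-lives have elapsed:
print(round(age_from_ratio(0.25)))  # 11460
```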
 
Just a few comments in addition to Ziggurat's.

There are many different isotope dating tests using a variety of elements (more than ten I think).

The most common for archeological purposes is carbon 14 dating.

This test is based on the notion that, while it is alive, an organism takes in CO2 directly from the air, eats plants that do, or eats animals that eat plants. The carbon in the air has a higher percentage of carbon 14 than carbon in the ground, because the carbon 14 in the ground is constantly decaying. The carbon 14 concentration in the air is maintained by cosmic ray bombardment.

Originally there was a systematic error in carbon 14 dating. The pine trees in the bristlecone forest in California (both dead and living, as Ziggurat described) were used to calibrate the test as far back as 10,000 years. The test can be used for dates as far back as 50,000 years, but I don't believe an independent method for calibrating those older dates has been found, although it seems likely that simple extrapolation makes the test fairly accurate.

I actually read a good summary of most of the various techniques on a Christian web site once. The author, who was a Christian and a chemist, was explaining why the creationists were wrong about the supposed inaccuracy of the various techniques. Unfortunately, I don't have a link for you right now.

As to your question about inaccuracies, I think Ziggurat is probably right. The errors in carbon 14 dating had to do with not knowing exactly what the concentration of carbon 14 in the atmosphere was in the past, not with any misunderstanding of the decay rate of carbon 14.
 
I only date other carbon-based lifeforms...

Although I did date a fossil once, and my friends never let me hear the end of it!
 
davefoc said:

I actually read a good summary of most of the various techniques on a Christian web site once. The author, who was a Christian and a chemist, was explaining why the creationists were wrong about the supposed inaccuracy of the various techniques. Unfortunately, I don't have a link for you right now.

Is this it?

I believe this also addresses the point about whether we would expect half-lives to be constant. Yes, this is an important assumption in radioactive dating, but one that is testable and well supported by our understanding of the theory behind nuclear decay. The half-lives of most isotopes are pretty much independent of external factors such as temperature or pressure, so we wouldn't expect decay rates to vary with those. The only other way to change a half-life that I can think of is for some of the fundamental constants to have changed, as Ziggurat noted. However, this is at least testable, since important quantities like the fine structure constant can be studied in the past by, for example, spectroscopy of stars and galaxies many light years away. As far as I'm aware, there seems to be little or no change in these constants over the lifetime of the universe.

So, no, there's no reason to believe that the decay rate could be inconsistent over a long time period.

Edited for clarity.
 
Also, every dating technique gives a range of dates, not a specific date. The older the fossil, the bigger the range.
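To put a rough number on why the range widens, here is a Python sketch: near the detection limit, the same absolute uncertainty in the measured carbon-14 fraction spans far more years than it does for a young sample (the 0.001 error figure is an arbitrary stand-in for real instrument uncertainty):

```python
import math

HALF_LIFE = 5730.0             # carbon-14 half-life in years
LAM = math.log(2) / HALF_LIFE  # decay constant

def age(fraction):
    """Age implied by the fraction of original C-14 remaining."""
    return math.log(1.0 / fraction) / LAM

def age_bracket(fraction, abs_error=0.001):
    """Age range implied by a fixed absolute error in the measured fraction."""
    return age(fraction + abs_error), age(fraction - abs_error)

young = age_bracket(0.9)   # young sample: bracket is a couple of decades wide
old = age_bracket(0.01)    # near the 50,000-year limit: bracket spans over a millennium
print(young[1] - young[0])
print(old[1] - old[0])
```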
 
Starting ratio is important for 14C dating because 14C is being produced all the time in the atmosphere.

It's much less important for the common geological chronometers because production took place before the solar system formed. The isotope ratios are expected to have been homogenised during solar system formation and, sure enough, turn out to have been.

Young Earth Creationists like to wave their hands and say half lives may have changed in the past to account for the masses of isotopic data requiring an old Earth. However, they never produce a model that can be tested. They have a big problem because the ages derived from beta decay, alpha decay, electron capture and fission all tell the same story. So not only do they have to speed up radioactive decay, they have to speed up all forms of radioactive decay to the same extent.
 
Great question, malfunktion. I am not a QM specialist, but what I can say is that I got into this great fight with JJ about whether the speed of light was constant for all epochs of the universe. And if you take the idea that we can see back in time through telescopes, it would appear that most physical constants are constant back through time. Now, since it is the weak force (maybe; I could be very wrong) that mediates radioactive decay, I feel that if that constant changed then we would be able to detect it.
 
Brian the Snail Asked:
Is this it?

Yes.

A correction to what I said previously:
I said that I didn't think non-tree-ring methods had been developed for calibrating the carbon 14 test. The author of the site linked above lists a few non-tree-ring tests that can be used to calibrate carbon 14 dating farther back than the tree rings.


Dancing David:
I was interested in your comment about the constancy of the speed of light. Were you arguing that you thought the speed of light was constant for all time and in all places? Except for the recent inflationary theory, which only concerns a very short time following the big bang, I think the generally accepted belief is that the speed of light is constant across time and distance. But I have wondered what the basis for this belief is.

Just an observation:
One thing that I hadn't thought about before with regard to this stuff is that the decay rate is independent of temperature. I know that radioactive decay is supposed to be caused by the weak force which I guess has nothing to do with thermal excitation, but it still surprises me that temperature wouldn't affect it.
 
davefoc said:

One thing that I hadn't thought about before with regard to this stuff is that the decay rate is independent of temperature. I know that radioactive decay is supposed to be caused by the weak force which I guess has nothing to do with thermal excitation, but it still surprises me that temperature wouldn't affect it.

Actually, it's not that surprising. First of all, thermal energy (at least at room temperature) is of the order of fractions of an electron-volt, while nuclear decay energies are of the order of MeVs. The energy scales are completely different. Then you have the fact that temperature really only affects the translational motion of the atoms, and not the internal structure (at least, not directly). So things like pressure or chemical bonds are actually better candidates for changing decay rates. But even these processes mostly affect the outer electrons of the atom, while leaving alone the inner electrons that can actually perturb the nucleus. So in most cases the effect of the environment is going to be negligible. And the effect is small even in the exceptions to this rule (for example, the article I linked to mentions changes on the order of 1% in the decay rate for Be-7, which is very much a special case, and isn't used for dating anyway).
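The scale mismatch is easy to put in numbers. A quick Python check, using the standard value of the Boltzmann constant in eV/K and a representative 1 MeV decay energy (the 1500 K lava temperature is a round illustrative figure):

```python
K_B_EV = 8.617e-5  # Boltzmann constant in eV per kelvin

room = K_B_EV * 300    # thermal energy scale at room temperature, ~0.026 eV
lava = K_B_EV * 1500   # even at lava temperatures, only ~0.13 eV
decay = 1.0e6          # a representative nuclear decay energy: 1 MeV, in eV

# Thermal energy falls short of nuclear energy scales by roughly seven
# orders of magnitude, even in molten rock:
print(f"{decay / lava:.1e}")
```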
 
A small footnote:

As pointed out above, temperature does not affect decay rates, but it does have an effect on geochronology. When dealing with techniques like Sm/Nd, Pb/Pb, K/Ar and others that are used to date rocks and minerals (obtaining ages that cause heart attacks in YECs), one must keep in mind that isotope decay starts below a certain temperature. Isotopic systems may be completely or partially reset by heating. This brings an extra source of error. Some YEC texts point to these errors (as well as others resulting from contamination or superimposed phases of mineral growth) as "proof" that the methods are unreliable.
Just another example that they don't know what they are talking about.
 
davefoc said:
Brian the Snail Asked:

Dancing David:
I was interested in your comment about the constancy of the speed of light. Were you arguing that you thought the speed of light was constant for all time and in all places? Except for the recent inflationary theory, which only concerns a very short time following the big bang, I think the generally accepted belief is that the speed of light is constant across time and distance. But I have wondered what the basis for this belief is.


I had some mistaken beliefs about the potential for the speed of light to vary, but there is this thing called the 'fine structure constant', which is tied up with electron shells and photon emission, and because the spectral lines are constant, the fine structure constant is constant and the speed of light is constant. But there is this Canadian who has measured the fine structure constant at the edge of visible space, and he says that at that distance it varies.

So at least after photon decoupling the fine structure constant is constant.
 
Correa Neto said:
A small footnote:

As pointed out above, temperature does not affect decay rates, but it does have an effect on geochronology. When dealing with techniques like Sm/Nd, Pb/Pb, K/Ar and others that are used to date rocks and minerals (obtaining ages that cause heart attacks in YECs), one must keep in mind that isotope decay starts below a certain temperature. Isotopic systems may be completely or partially reset by heating. This brings an extra source of error. Some YEC texts point to these errors (as well as others resulting from contamination or superimposed phases of mineral growth) as "proof" that the methods are unreliable.
Just another example that they don't know what they are talking about.

Could you point me in the direction of this?

Are you saying that
a. because of the temperature effect, decay will not occur above a certain temperature?

or

b. that high temperatures cause isotopes to "un-decay"?

Sorry for my confusion.
 
Re: Re: carbon dating...

Ziggurat said:

For decay rate to change, fundamental constants of quantum mechanics would have to change. Something that dramatic might make life on earth itself impossible. So it's a very safe bet that decay rates are constant.

Well, the result would most likely be something that also radically changes the way that chemistry would work.

We would probably notice evidence of that in old rocks, and we don't.
 
Above a certain temperature, called the "homogenization temperature" or similar terms, there is no decay. The temperature depends on the isotope.

Think of it this way:
A lump of lava is cooling, and it contains a given isotope. Decay will only happen below a certain temperature, once the lava has crystallized. In this way, we can date the eruption.

A rock is heated, say by metamorphism, but does not melt. The isotopic systems, depending on their closure or resetting temperatures, may be unaffected, partially affected or totally reset. In this way, we can date the metamorphic event, but some error may seep in if the system was only partially reset.

If you want I can point you to some texts on this issue.
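A toy model shows how partial resetting skews an age. This Python sketch assumes a system with no daughter isotope present at crystallization (roughly the K/Ar situation, ignoring the branching of potassium-40 decay); a reheating event drives off some fraction of the accumulated daughter, and the apparent age lands between the crystallization age and the age of the reheating. All ages and retention fractions below are made up for illustration:

```python
import math

def apparent_age(true_age, reset_age, fraction_kept, half_life):
    """Apparent isotopic age of a rock formed true_age years ago whose
    accumulated daughter was partly lost (fraction_kept retained) in a
    reheating event reset_age years ago; closed system otherwise, and
    no daughter present at crystallization."""
    lam = math.log(2) / half_life
    # daughter/parent ratio today: what was made before the event and kept,
    # plus everything made since the event
    dp = (fraction_kept * (math.exp(lam * true_age) - math.exp(lam * reset_age))
          + (math.exp(lam * reset_age) - 1.0))
    return math.log(1.0 + dp) / lam

HL = 1.25e9  # half-life of potassium-40 in years

print(apparent_age(3.0e9, 1.0e9, 1.0, HL))  # no loss: recovers the 3.0e9-year age
print(apparent_age(3.0e9, 1.0e9, 0.0, HL))  # total reset: dates the 1.0e9-year event
print(apparent_age(3.0e9, 1.0e9, 0.5, HL))  # partial reset: somewhere in between
```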
 
Dancing David: "I had some mistaken beliefs about the potential for the speed of light to vary..."

There are a few theorists out there who believe in VSL. One, Joao Magueijo, is a fellow at Imperial College (the place Tez is moving to). He postulates that the SOL was closer to 279,000 MPS during the first inflationary phase, I believe.
 
Correa Neto said:
Above a certain temperature, called the "homogenization temperature" or similar terms, there is no decay. The temperature depends on the isotope.
So, wait a minute ... you're saying that a radioactive isotope will become nonradioactive if kept above a certain temperature? And that for things like Carbon-14, this temperature is within the range of ordinary molten lava?

How on Earth does the thermal energy of an atom prevent its nucleus from decaying?
 
tracer said:

So, wait a minute ... you're saying that a radioactive isotope will become nonradioactive if kept above a certain temperature? And that for things like Carbon-14, this temperature is within the range of ordinary molten lava?

How on Earth does the thermal energy of an atom prevent its nucleus from decaying?

My understanding is that the decay products can vacate the premises if the substance isn't solid.
 
Some of this issue about decay rates and temperature seems to be related to how the clock is reset when the rock is melted and what happens when the rock is only partially remelted.

Could somebody talk a little about the mechanism of resetting the clock? Suppose you're doing one of the tests involving uranium and lead. If you remelt the rock, the uranium and lead are still there in the same ratios, so the clock isn't reset? I'm confused about this, although at one time I thought I understood it.

Is anybody saying that the decay is actually stopped by heating above a certain temperature or just that measurement of the age of the rock is affected by that heating?
 
My understanding is that the decay products can vacate the premises if the substance isn't solid.
If some of the radio-isotopes escape, then you can't make all the necessary measurements, and your results are skewed. Higher rock temperatures can create chemical instabilities that may allow some of the radio-isotopes to sneak out (chemical instabilities such as migrating pore fluids).

The isotope clock is reset when the system becomes closed, i.e. chemically stable.
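Once the system closes with no daughter isotope present (as when argon gas escapes from molten rock), the accumulated daughter-to-parent ratio fixes the age directly. A minimal sketch, assuming a closed system, zero initial daughter, and ignoring decay branching:

```python
import math

def closed_system_age(daughter_per_parent, half_life):
    """Years since closure, assuming a closed system that started with
    no daughter isotope: D/P = exp(lam * t) - 1, solved for t."""
    lam = math.log(2) / half_life
    return math.log(1.0 + daughter_per_parent) / lam

HL = 1.25e9  # potassium-40 half-life in years

# One daughter atom per remaining parent atom (D/P = 1) means half the
# original parent has decayed, so exactly one half-life has elapsed:
print(closed_system_age(1.0, HL))  # ~1.25e9
```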
 
