By Dr Brandon van der Ventel
Many think the decay of radioactive materials can prove that the earth is billions of years old, but this is not the case.
Radiometric dating does not measure age directly; rather, it measures the ratio of the radioactive (unstable) parent nuclei to the stable daughter nuclei, together with the present decay rate. The age is then inferred from the standard decay law, N(t) = N₀e^(−λt), where N₀ is the initial number of parent nuclei and λ is the decay constant. However, several assumptions need to be made to proceed with the calculation:
First, one needs to assume that there were no daughter nuclei present at the start; that is, that any daughter nuclei now present are entirely due to decay of the parent.
Second, there must have been no leakage of either parent or daughter nuclei into or out of the sample. But how can we be sure of any of these assumptions if no one was present when the rocks were formed, and the change in the elements was not monitored over the entire geological history?
Third, the equation is valid only if the decay rate (λ) is a constant, and there is much evidence against this.
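Under these three assumptions, the age calculation reduces to a simple formula: with no initial daughter nuclei and a closed system, the original parent count is the sum of the parent and daughter counts measured today. A minimal sketch (the isotope counts and half-life below are illustrative, not taken from any real sample):

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Age implied by a parent/daughter ratio, ASSUMING:
    1. no daughter nuclei at formation,
    2. a closed system (no leakage in or out), and
    3. a constant decay rate (lambda).
    """
    decay_const = math.log(2) / half_life        # lambda = ln(2) / half-life
    # N(t) = N0 * exp(-lambda * t) with N0 = parent + daughter
    # solves to t = ln(1 + daughter/parent) / lambda
    return math.log(1 + daughter / parent) / decay_const

# Illustrative check: equal parent and daughter counts imply exactly
# one half-life has elapsed, whatever the half-life is.
print(radiometric_age(parent=100, daughter=100, half_life=1000))  # ~1000 years
```

Note that the code can only turn a measured ratio into an age once all three assumptions are granted; it cannot test them.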
A radiometric "date" for rock layers near a fossil is accepted only if it fits into the grand evolutionary scheme of things. If it does not, then either new samples are taken or a different dating method is used. Notice that this is not a system of 'checks and balances' but rather a situation where results are 'reinterpreted' in order to obey the evolutionary dogma. Radioactive 'dating' methods have also been known to give incorrect ages for samples of known age.
Radiocarbon dating of coal samples and diamonds actually points to a young earth. This is because carbon-14 has a half-life of only about 5,730 years, so after even one million years (roughly 175 half-lives) no detectable carbon-14 should remain. Yet the RATE (Radioisotopes and the Age of the Earth) research group has found coal and fossilized wood containing carbon-14, even though these samples are 'dated' at millions of years old. Carbon-14 is also present in diamonds, and since diamond is the hardest known natural substance, it is considered impervious to contamination.
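The arithmetic behind this point can be checked directly. A short sketch using the accepted 5,730-year half-life of carbon-14:

```python
# Fraction of carbon-14 surviving after a given time, using the
# accepted half-life of about 5,730 years.
HALF_LIFE_C14 = 5730  # years

def fraction_remaining(years, half_life=HALF_LIFE_C14):
    """Fraction of the original carbon-14 left after `years` years."""
    return 0.5 ** (years / half_life)

# After one half-life, exactly half remains:
print(fraction_remaining(5730))        # 0.5
# After one million years the surviving fraction is vanishingly small,
# far below any instrument's detection threshold:
print(fraction_remaining(1_000_000))   # on the order of 1e-53
```

This is why any measurable carbon-14 in a sample is taken by the article's argument to be incompatible with an age of millions of years.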