The Foundations of Radioactive Dating Methods

by John Walton
via Sentry Magazine, Vol. 20 No. 4, December 1994

A recent Bible study class focused on the topic of the "age of the Earth." Considerable discrepancy exists between the age of the Earth as determined by some scientific methods and that indicated by the Bible. The accepted age of the Earth based on radioactive dating methods (radiometric dating) is about 4.5 billion years. The Bible gives an age of roughly 6,000 years. [Assuming that the genealogical records are complete and without gaps.]

Radiometric dating methods are widely considered reliable. Witness the various books, articles, museum placards, and the like that routinely state the ages of objects as millions and sometimes billions of years, often without question or controversy. This unquestioned acceptance of radiometric techniques may rest on the relatively simple mathematical mechanics of radioactive decay.

It is a truism that when a thing is accepted without examination of its assumptions, it comes to be treated as truth. Such, it appears, is the case for radiometric dating methods. These methods were developed in the early 1900s and are now the de facto standard for quantitative age determination of rocks, fossils, and other artifacts.

It is also a truism that a thing is only as strong as its foundation. In an effort to learn how radioactive dating is conducted, I discovered that these methods rely on assumptions that call into question the certainty of their age determinations. Two assumptions in particular will be discussed.

A Thumbnail Sketch of Radiometric Dating

All radiometric dating methods work similarly. The atoms of a radioactive (parent) element spontaneously decay into the atoms of a different (daughter) element. For example, uranium decays into lead, rubidium into strontium, potassium into argon, and carbon-14 into nitrogen, among other element pairs commonly used for dating. (The carbon-14 method is useful for determining the age of organic materials, that is, things that were once alive.) The radioactive decay process is generally insensitive to changes in chemical or physical conditions and is thus considered reliable for dating old objects.

Radioactive decay obeys the following mathematical formula:
N = N0e^(-λt) (1)

N0 represents the original number of parent atoms in the sample, λ is the element's decay rate (its decay constant), and t is the elapsed time. After time t, N parent atoms remain and D daughter atoms have been produced. Since every daughter atom is the product of a decayed parent, N0 = N + D. Substituting this into equation (1) and solving for time t yields:

t = (1/λ)loge(D/N + 1) (2)

To determine the age of a rock (or other item), a sample must be found that contains a specific radioactive element with a known decay rate, λ. Then, by counting the number of daughter atoms, D, and the number of parent atoms, N, remaining in the sample, a quantitative age determination can be made using equation (2) above.

The decay rate, λ, is related to the element's half-life, the time it takes for half of the atoms of a parent element to decay to atoms of its daughter element: λ = loge(2)/half-life. Half-lives range from billions of years to millionths of a second. Thus, an element's half-life determines its applicability for dating certain objects. For example, half of a given amount of the uranium U238 isotope decays to the lead Pb206 isotope in almost 4.5 billion years, making it appropriate for dating very old objects.
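To make the mechanics concrete, equation (2) can be evaluated in a few lines of code. The following is a minimal sketch in Python, my own illustration rather than any laboratory's procedure; the atom counts are hypothetical, and the half-life is the rounded U238 figure quoted above.

```python
import math

# Decay rate (decay constant) from the half-life: lambda = loge(2) / half-life.
HALF_LIFE_U238_YEARS = 4.5e9  # rounded figure for U238, as quoted above

decay_rate = math.log(2) / HALF_LIFE_U238_YEARS

def radiometric_age(daughter_atoms, parent_atoms, lam):
    """Equation (2): t = (1/lambda) * loge(D/N + 1)."""
    return (1.0 / lam) * math.log(daughter_atoms / parent_atoms + 1.0)

# Hypothetical sample with equal daughter and parent counts (D/N = 1),
# meaning exactly one half-life has elapsed.
age = radiometric_age(daughter_atoms=1_000_000, parent_atoms=1_000_000,
                      lam=decay_rate)
print(f"Estimated age: {age / 1e9:.2f} billion years")  # prints 4.50
```

With D/N = 1 the formula returns exactly one half-life; larger daughter-to-parent ratios yield correspondingly older dates.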

The Foundations of Radiometric Dating

As shown, the mathematics of radiometric dating are relatively simple and straightforward. Implicit in radiometric methods, however, are several assumptions that can affect the accuracy of the calculated age. Among the assumptions are two that warrant consideration:

  • An element’s decay rate, λ, is constant and has been constant since the beginning of time, when the radioactive element was created.
  • All the daughter atoms are a result of the decay of parent atoms, and none have been added or leached out.

These and other assumptions are rooted in a geologic concept that was first formulated in the mid-1800s called the Principle of Uniformitarianism. This principle holds that nature behaves uniformly and that, by studying the present, one can infer the behavior of the past. While this principle seems reasonable, note that it cannot be proven through scientific experiment. Geologists cannot travel back in time and confirm that details of past processes can indeed be inferred from present processes.

The first assumption of constant decay rate can be thought of in terms of the swinging pendulum of a clock. A clock will run fast if its pendulum swings too fast. Similarly, if the pendulum swings too slowly, the clock will run slowly. If the pendulum does not swing at a constant rate, the clock will run both too fast and too slow; it simply cannot be relied upon to measure time accurately. The decay rate of a radioactive element is analogous to the swinging pendulum of a clock. The constancy of both is necessary for either to be of any value for measuring time.

Decay rates for radioactive elements have been determined only in the last 90 years, since radioactive decay was discovered. For geologists to assume that decay rates have remained constant for more than 4 billion years is a significant extrapolation. The evidence offered for constant radioactive decay is not conclusive. Changing radioactive decay rates would render radiometric dating methods useless. There is no certain way to determine what effect, toward ages too old or too young, changing decay rates would have on age estimates made by radiometric techniques.
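The point can be illustrated with a short Python sketch. This is my own hypothetical illustration, not a physical model; the million-fold faster rate in the second history is an arbitrary number chosen only to show that a measured ratio of daughter to parent atoms fixes the accumulated amount of decay, not the elapsed time, unless the history of the rate is known.

```python
import math

HALF_LIFE_U238_YEARS = 4.5e9                  # rounded figure used above
lam_now = math.log(2) / HALF_LIFE_U238_YEARS  # the decay rate measured today

# A hypothetical sample with equal daughter and parent counts (D/N = 1).
# If the rate varied over time, N = N0 * exp(-[accumulated decay]), so the
# measurement fixes only the accumulated decay, loge(D/N + 1).
accumulated_decay = math.log(1.0 + 1.0)

# History A: the rate has always been today's value (the standard assumption).
age_a = accumulated_decay / lam_now
print(f"History A (constant rate): {age_a / 1e9:.2f} billion years")

# History B (arbitrary hypothetical): the rate was a million times higher in
# the past, so the same D/N ratio is reached in only a few thousand years.
age_b = accumulated_decay / (1e6 * lam_now)
print(f"History B (faster past rate): {age_b:,.0f} years")
```

Both histories leave the rock with identical atom counts; only an assumption about the past rate selects between them.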

The second assumption can also affect the accuracy of age estimates made with radiometric methods. An accurate count of the daughter and parent atoms in an object being dated is required. There is no way to be certain, however, that some of the daughter atoms were not already present in an object when the parent started to decay. For example, there may already have been some lead in a rock containing uranium, which ultimately decays into lead. If all the lead atoms in the rock are counted (including the extra atoms originally there), the radiometric date will overestimate its age; it will appear that more of the uranium has decayed. Similarly, if some of the lead has leached from the rock, possibly due to groundwater, then the radiometric date will underestimate the rock's age.

It is true, however, that relatively small errors in age calculations occur when only small amounts of daughter atoms are already present in (or have been leached out of) a rock. For example, if 10 percent of the daughter atoms in a rock dated at 4.5 billion years were already there when the parent started to decay, the rock's true age would still be slightly more than 4 billion years. For a rock dated at 4.5 billion years to have a true age of 10,000 years or less, almost all of its daughter atoms would have to be pre-existing.

(This point is often omitted in criticism of this assumption. I include it in the interest of objectivity.)
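The arithmetic can be checked with another short Python sketch, again my own illustration with hypothetical counts and the rounded U238 half-life:

```python
import math

HALF_LIFE_U238_YEARS = 4.5e9
lam = math.log(2) / HALF_LIFE_U238_YEARS

def age_from_ratio(ratio):
    """Equation (2): t = (1/lambda) * loge(D/N + 1)."""
    return math.log(ratio + 1.0) / lam

measured_ratio = 1.0  # every daughter atom counted: D/N = 1
print(f"Calculated age: {age_from_ratio(measured_ratio) / 1e9:.2f} billion years")

# If 10% of the daughter atoms were pre-existing, only 90% are radiogenic.
true_age = age_from_ratio(0.9 * measured_ratio)
print(f"True age with 10% inherited: {true_age / 1e9:.2f} billion years")

# For a true age of 10,000 years, the radiogenic share after 10,000 years
# is exp(lambda * t) - 1, which is nearly zero; the rest must be inherited.
radiogenic = math.exp(lam * 10_000) - 1.0
print(f"Inherited fraction required: {1.0 - radiogenic / measured_ratio:.6%}")
```

The 10 percent case shifts the date by only a few hundred million years, while a true age of 10,000 years requires that more than 99.999 percent of the daughter atoms be inherited.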

These and other assumptions of radiometric dating methods can make their age determinations tenuous at best. For example, a certain Swedish shale, called "kolm", contains uranium nodules. This kolm is interesting because it can be dated using two radiometric methods: one employing uranium-lead decay and another comparing two isotopes of lead. The uranium-lead calculation yields an age of 380 million years—an age considered "disconcerting" by geologists (presumably because it is too low). The lead-to-lead calculation yields a figure of 770 million years, an age considered "clearly too large." Instead of rejecting either calculation, a compromise of 440 million years has been accepted for the Swedish shale! Conflicting age determinations are apparently routine for radiometric dating techniques.

All said, it should be noted that geologists are not ignorant of the assumptions made in radiometric dating. Diligent efforts have been made to reduce errors in radiometric dating and ensure accurate results. Recent radiometric dating of meteorites has yielded the generally accepted age of approximately 4.5 billion years (although apparently not without some of the same age contradictions mentioned for the Swedish shale), which underlies the age of the Earth presently accepted by most of science. Even with conservative consideration given to radiometric dating assumptions (conservative for the Christian, radical for the geologist), it appears that ages less than a few tens of millions of years would not likely be calculated.

Other Age Calculations

Several other secular earth-age calculations are often publicized by antagonists of radiometric techniques. Most of these methods give an age of the Earth significantly less than 4.5 billion years, some on the order of 10,000 years or less. These ages are more in line with the Bible, which is probably the primary reason the methods are espoused by some opponents of radiometric dating. Among these calculations are:

  • "The shrinking sun”: estimates of the rate at which the sun is consuming itself indicate that it would have been twice as large as it is now just 1 million years ago; so large that all life on Earth could not exist because of high temperature, so the sun, and consequently the Earth, must be much younger.
  • "Depth of space dust on the moon": measurements of the rate of accumulating interplanetary space dust indicate that the surface of the moon should have a layer of dust at least 200 feet deep. Astronauts found only a thin layer, suggesting a much younger moon and, by extension, a younger Earth.

These and several other secular "young earth-age" estimates are subject to at least one criticism they share with radiometric dating methods: they are based on the Principle of Uniformitarianism. Implicit in each calculation is the assumption that the rate of some present-day process has been the same (or at least determinable) in the past.

Thus, earth-age determinations based on these techniques are not inherently more reliable than radiometric methods and should be met with the same skepticism.

An Apparent Age of the Earth

So, is the Earth 4.5 billion years old, as indicated by radiometric dating techniques, or 6,000 years old, as indicated by the Bible? Could it be that both ages are correct?

A reasonable notion is that God's creation has an "apparent age." It is likely that God created the universe as a functioning system. For example, God evidently created Adam as a grown man. If we had inspected Adam on the second day of his existence, would we have estimated his true age to be two days? More likely, we would have guessed he was an adult. Using this information to infer the age of the Earth would produce an erroneous estimate measured in years (based on Adam's apparent age) instead of the true age measured in days.

The Word of God indicates that the Earth is on the order of 6,000 years old. "God said it, and that settles it," is the justifiable sentiment of most Christians on the age of the Earth and other controversial matters. However, the concept of creation with an apparent age provides a basis for reconciling the Earth's extreme age from radiometric techniques with the younger age in the Bible. Note that because of the previously discussed problems associated with radiometric dating, there is no particularly compelling reason to reconcile it with the true word of God. But to many Christians, God is the creator of Earth and all of its processes, including radioactive decay. The apparent age concept provides a self-consistent explanation for the disparate ages.

References:

  1. The Genesis Flood: The Biblical Record and Its Scientific Implications, by John C. Whitcomb, Jr., Th.D., and Henry M. Morris, Ph.D., 1961.
  2. McGraw-Hill Encyclopedia of Science & Technology, 7th Edition.
  3. The Illustrated Origins Answer Book, by Paul S. Taylor, 1992.
  4. "The Age-of-the-Earth Debate," by Lawrence Badash, Scientific American, August 1989.
  5. Earth, by Frank Press and Raymond Siever, 1974.