Carbon dating is used every day in historical research. It's not flawed, because it works.
Why ask questions about dating the earth if you yourself disbelieve that the earth is 6,000 years old? Sounds like you don't know what you believe.
Read carefully........
People who ask about carbon-14 (14C) dating usually want to know about the radiometric[1] dating methods that are claimed to give millions and billions of years—carbon dating can only give thousands of years. People wonder how millions of years could be squeezed into the biblical account of history.
Clearly, such huge time periods cannot be fitted into the Bible without compromising what the Bible says about the goodness of God and the origin of sin, death and suffering—the reason Jesus came into the world (See Six Days? Honestly!).
Christians, by definition, take the statements of Jesus Christ seriously. He said,
“But from the beginning of the creation God made them male and female” (Mark 10:6).
This only makes sense with a time-line beginning with the creation week thousands of years ago. It makes no sense at all if man appeared at the end of billions of years.
We will deal with carbon dating first and then with the other dating methods.
How the Carbon Clock Works
Carbon has unique properties that are essential for life on earth. Familiar to us as the black substance in charred wood, as diamonds, and as the graphite in “lead” pencils, carbon comes in several physical forms. It also comes in several isotopes, which differ in atomic mass. One rare isotope has atoms that are about 14 times as heavy as hydrogen atoms: carbon-14, or 14C, or radiocarbon.
Carbon-14 is made when cosmic rays knock neutrons out of atomic nuclei in the upper atmosphere. These displaced neutrons, now moving fast, hit ordinary nitrogen (14N) at lower altitudes, converting it into 14C. Unlike common carbon (12C), 14C is unstable and slowly decays back into nitrogen, releasing energy. This instability makes it radioactive.
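In symbols, the production and decay steps just described look like this (a standard way of writing the reactions, added here for clarity):

```latex
% Production in the upper atmosphere: a cosmic-ray neutron strikes nitrogen-14
n \;+\; {}^{14}_{7}\mathrm{N} \;\longrightarrow\; {}^{14}_{6}\mathrm{C} \;+\; p

% Decay: carbon-14 slowly beta-decays back to nitrogen-14, releasing energy
{}^{14}_{6}\mathrm{C} \;\longrightarrow\; {}^{14}_{7}\mathrm{N} \;+\; \beta^{-} \;+\; \bar{\nu}
```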
Ordinary carbon (12C) is found in the carbon dioxide (CO2) in the air, which is taken up by plants, which in turn are eaten by animals. So a bone, or a leaf or a tree, or even a piece of wooden furniture, contains carbon. Once 14C has been formed, it combines with oxygen, just like ordinary carbon (12C), to give carbon dioxide (14CO2), and so it also gets cycled through the cells of plants and animals.
We can take a sample of air, count how many 12C atoms there are for every 14C atom, and calculate the 14C/12C ratio. Because 14C is so well mixed up with 12C, we expect to find that this ratio is the same if we sample a leaf from a tree, or a part of your body.
In living things, although 14C atoms are constantly decaying back to 14N, the organism is still exchanging carbon with its surroundings, so the mixture remains about the same as in the atmosphere. However, as soon as a plant or animal dies, the 14C atoms which decay are no longer replaced, so the amount of 14C in that once-living thing decreases as time goes on. In other words, the 14C/12C ratio gets smaller. So, we have a “clock” which starts ticking the moment something dies.
Obviously, this works only for things which were once living. It cannot be used to date volcanic rocks, for example.
The rate of decay of 14C is such that half of an amount will convert back to 14N in 5,730 years (plus or minus 40 years). This is the “half-life.” So, in two half-lives, or 11,460 years, only one-quarter of the original 14C will be left. If the 14C/12C ratio in a sample is one-quarter of that in living organisms at present, then it has a theoretical age of 11,460 years. Anything over about 50,000 years old should theoretically have no detectable 14C left. That is why radiocarbon dating cannot give millions of years. In fact, if a sample contains 14C, it is good evidence that it is not millions of years old.
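For readers who like to see the arithmetic, here is a minimal sketch of that “clock” calculation, assuming the textbook half-life of 5,730 years and simple exponential decay (the function name and sample values are ours, purely for illustration):

```python
import math

HALF_LIFE_C14 = 5730.0  # years (conventional half-life of carbon-14)

def theoretical_age(ratio_sample_to_living: float) -> float:
    """Theoretical radiocarbon age from the fraction of 14C remaining.

    ratio_sample_to_living: the sample's 14C/12C ratio divided by the
    ratio assumed for living organisms today (1.0 = just died).
    """
    # Exponential decay: fraction = (1/2) ** (t / half_life).
    # Solving for t gives: t = half_life * log2(1 / fraction).
    return HALF_LIFE_C14 * math.log2(1.0 / ratio_sample_to_living)

print(theoretical_age(0.5))    # one half-life   -> 5,730 years
print(theoretical_age(0.25))   # two half-lives  -> 11,460 years
print(theoretical_age(0.125))  # three half-lives -> 17,190 years
```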
However, things are not quite so simple. First, plants discriminate against carbon dioxide containing 14C. That is, they take up less than would be expected and so they test older than they really are. Furthermore, different types of plants discriminate differently. This also has to be corrected for.[2]
Second, the ratio of 14C/12C in the atmosphere has not been constant—for example, it was higher before the industrial era when the massive burning of fossil fuels released a lot of carbon dioxide that was depleted in 14C. This would make things which died at that time appear older in terms of carbon dating. Then there was a rise in 14CO2 with the advent of atmospheric testing of atomic bombs in the 1950s.[3] This would make things carbon-dated from that time appear younger than their true age.
Measurement of 14C in historically dated objects (e.g., seeds in the graves of historically dated tombs) enables the level of 14C in the atmosphere at that time to be estimated, and so partial calibration of the “clock” is possible. Accordingly, carbon dating carefully applied to items from historical times can be useful. However, even with such historical calibration, archaeologists do not regard 14C dates as absolute because of frequent anomalies. They rely more on dating methods that link into historical records.
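To picture how such partial calibration works in practice, here is a minimal sketch: pair objects of known historical age with their raw radiocarbon ages, then interpolate between those points for a new sample. The pairs below are hypothetical placeholders, not real calibration data:

```python
# Hypothetical (raw radiocarbon age, known historical age) pairs, in years.
# Real calibration curves are built from many independently dated objects;
# these numbers exist only to show the idea.
calibration_points = [
    (0, 0),
    (1000, 950),
    (2000, 2050),
    (3000, 3200),
]

def calibrate(raw_age: float) -> float:
    """Linearly interpolate a raw radiocarbon age onto the calibration curve."""
    pts = sorted(calibration_points)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= raw_age <= x1:
            t = (raw_age - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("raw age outside the calibrated range")

print(calibrate(1600))  # -> an interpolated historical age
```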
Outside the range of recorded history, calibration of the 14C “clock” is not possible.[4]
Other Factors Affecting Carbon Dating
The amount of cosmic rays penetrating the earth's atmosphere affects the amount of 14C produced and therefore the dating system. The amount of cosmic rays reaching the earth varies with the sun's activity, and with the earth's passage through magnetic clouds as the solar system travels around the Milky Way galaxy.
The strength of the earth's magnetic field affects the amount of cosmic rays entering the atmosphere. A stronger magnetic field deflects more cosmic rays away from the earth. Overall, the energy of the earth's magnetic field has been decreasing,[5] so more 14C is being produced now than in the past. This will make old things look older than they really are.
Also, the Genesis flood would have greatly upset the carbon balance. The flood buried a huge amount of carbon, which became coal, oil, etc., lowering the total 12C in the biosphere (including the atmosphere—plants regrowing after the flood absorb CO2, which is not replaced by the decay of the buried vegetation). Total 14C is also proportionately lowered at this time, but whereas no terrestrial process generates any more 12C, 14C is continually being produced, and at a rate which does not depend on carbon levels (it comes from nitrogen). Therefore, the 14C/12C ratio in plants/animals/the atmosphere before the flood had to be lower than what it is now.
Unless this effect (which is additional to the magnetic field issue just discussed) were corrected for, carbon dating of fossils formed in the flood would give ages much older than the true ages.
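To see roughly how large such an effect could be, here is a sketch under a purely illustrative assumption about the pre-flood 14C/12C ratio (the fraction used below is arbitrary, not a figure claimed in this article):

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def apparent_age_offset(initial_ratio_fraction: float) -> float:
    """Extra apparent age added if an organism started with a lower
    14C/12C ratio than the present-day ratio assumed by the method.

    initial_ratio_fraction: the earlier atmospheric 14C/12C ratio expressed
    as a fraction of today's ratio (hypothetical input).
    """
    # Starting with a fraction f of today's 14C looks the same as having
    # already decayed for log2(1/f) half-lives before the clock started.
    return HALF_LIFE_C14 * math.log2(1.0 / initial_ratio_fraction)

# Example: if the earlier ratio were one-tenth of today's (an arbitrary
# value), every such sample would look roughly 19,000 years too old.
print(apparent_age_offset(0.1))
```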
Creationist researchers have suggested that dates of 35,000 - 45,000 years should be re-calibrated to the biblical date of the flood.[6] Such a re-calibration makes sense of anomalous data from carbon dating—for example, very discordant “dates” for different parts of a frozen musk ox carcass from Alaska and an inordinately slow rate of accumulation of ground sloth dung pellets in the older layers of a cave where the layers were carbon dated.[7]
Also, volcanoes emit much CO2 depleted in 14C. Since the flood was accompanied by much volcanism (see Noah's Flood…, How did animals get from the Ark to isolated places?, and What About Continental Drift?), fossils formed in the early post-flood period would give radiocarbon ages older than they really are.
In summary, the carbon-14 method, when corrected for the effects of the flood, can give useful results, but needs to be applied carefully. It does not give dates of millions of years and, when corrected properly, fits well with the biblical flood.
Other Radiometric Dating Methods
There are various other radiometric dating methods used today to give ages of millions or billions of years for rocks. These techniques, unlike carbon dating, mostly use the relative concentrations of parent and daughter products in radioactive decay chains. For example, potassium-40 decays to argon-40; uranium-238 decays to lead-206 via other elements like radium; uranium-235 decays to lead-207; rubidium-87 decays to strontium-87; etc. These techniques are applied to igneous rocks, and are normally seen as giving the time since solidification.
The isotope concentrations can be measured very accurately, but isotope concentrations are not dates. To derive ages from such measurements, unprovable assumptions have to be made (the sketch after this list shows where each one enters the standard age equation), such as:
The starting conditions are known (for example, that there was no daughter isotope present at the start, or that we know how much was there).
Decay rates have always been constant.
Systems were closed or isolated so that no parent or daughter isotopes were lost or added.
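As a sketch of how an “age” is computed from such measurements, and where each of the assumptions listed above enters, consider the standard relation for a single parent-daughter decay (a generic textbook formula, illustrated here with rubidium-87 to strontium-87 and made-up concentrations):

```python
import math

def radiometric_age(parent_now: float, daughter_now: float,
                    half_life_years: float,
                    initial_daughter: float = 0.0) -> float:
    """Age implied by parent/daughter concentrations for a single decay step.

    Assumed (the three assumptions listed above):
    - initial_daughter is known (default: none present when the rock formed)
    - half_life_years, and hence the decay constant, has never changed
    - the rock was a closed system, so nothing was added or lost
    """
    decay_constant = math.log(2) / half_life_years      # lambda
    radiogenic = daughter_now - initial_daughter        # daughter made in place
    # parent_now = original_parent * exp(-lambda * t), where
    # original_parent = parent_now + radiogenic; solving for t gives:
    return math.log(1.0 + radiogenic / parent_now) / decay_constant

# Example: rubidium-87 -> strontium-87 (half-life roughly 49 billion years),
# with made-up concentrations in arbitrary units.
print(radiometric_age(parent_now=100.0, daughter_now=1.0,
                      half_life_years=4.9e10))
```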
There Are Patterns in the Isotope Data
There is plenty of evidence that the radioisotope dating systems are not the infallible techniques many think, and that they are not measuring millions of years. However, there are still patterns to be explained. For example, deeper rocks often tend to give older “ages.” Creationists agree that the deeper rocks are generally older, but not by millions of years. Geologist John Woodmorappe, in his devastating critique of radioactive dating,[8] points out that there are other large-scale trends in the rocks that have nothing to do with radioactive decay.
“Bad” Dates
When a “date” differs from that expected, researchers readily invent excuses for rejecting the result. The common application of such posterior reasoning shows that radiometric dating has serious problems. Woodmorappe cites hundreds of examples of excuses used to explain “bad” dates.[9]
For example, researchers applied posterior reasoning to the dating of Australopithecus ramidus fossils.[10] Most samples of basalt closest to the fossil-bearing strata give dates of about 23 Ma (Mega annum, million years) by the argon-argon method. The authors decided that was “too old,” according to their beliefs about the place of the fossils in the evolutionary grand scheme of things. So they looked at some basalt further removed from the fossils and selected 17 of 26 samples to get an acceptable maximum age of 4.4 Ma. The other nine samples again gave much older dates but the authors decided they must be contaminated and discarded them. That is how radiometric dating works. It is very much driven by the existing long-age world view that pervades academia today.
A similar story surrounds the dating of the primate skull known as KNM-ER 1470.[11] This started with an initial 212 to 230 Ma, which, according to the fossils, was considered way off the mark (humans “weren't around then”). Various other attempts were made to date the volcanic rocks in the area. Over the years an age of 2.9 Ma was settled upon because of the agreement between several different published studies (although the studies involved selection of “good” from “bad” results, just like Australopithecus ramidus, above).
However, preconceived notions about human evolution could not cope with a skull like 1470 being “that old.” A study of pig fossils in Africa readily convinced most anthropologists that the 1470 skull was much younger. After this was widely accepted, further studies of the rocks brought the radiometric age down to about 1.9 Ma—again several studies “confirmed” this date. Such is the dating game.
Are we suggesting that evolutionists are conspiring to massage the data to get what they want? No, not generally. It is simply that all observations must fit the prevailing paradigm. The paradigm, or belief system, of molecules-to-man evolution over eons of time, is so strongly entrenched it is not questioned—it is a “fact.” So every observation must fit this paradigm. Unconsciously, the researchers, who are supposedly “objective scientists” in the eyes of the public, select the observations to fit the basic belief system.
We must remember that the past is not open to the normal processes of experimental science, that is, repeatable experiments in the present. A scientist cannot do experiments on events that happened in the past. Scientists do not measure the age of rocks; they measure isotope concentrations, and these can be measured extremely accurately. However, the “age” is calculated using assumptions about the past that cannot be proven.
We should remember God's admonition to Job, “Where were you when I laid the foundations of the earth?” (Job 38:4).
Those involved with unrecorded history gather information in the present and construct stories about the past. The level of proof demanded for such stories seems to be much less than for studies in the empirical sciences, such as physics, chemistry, molecular biology, physiology, etc.
Williams, an expert in the environmental fate of radioactive elements, identified 17 flaws in the isotope dating reported in just three widely respected seminal papers that supposedly established the age of the earth at 4.6 billion years.[12] John Woodmorappe has produced an incisive critique of these dating methods.[13] He exposes hundreds of myths that have grown up around the techniques. He shows that the few “good” dates left after the “bad” dates are filtered out could easily be explained as fortunate coincidences.