What does the constant decay rate have to do with the age of the earth? And if the decay rate did change, what would happen?
First, I agree with those who say that the decay rate has not changed; we have good experimental and theoretical evidence for this. Second, even if it had changed in the past (e.g. a hypothesized "accelerated nuclear decay"), this would not affect calibrated radiocarbon dates.
Why is this? Let's consider how radiocarbon calibration is done. We start with tree species that have pronounced annual rings (e.g. N. American Bristlecone Pine, Irish Oak). We both count the rings (to get a calendar date) and radiocarbon date them assuming a constant radiocarbon ratio in the atmosphere and today's decay rate (to get an "uncalibrated" date). This gives us a calibration curve, allowing us to convert uncalibrated dates to calendar dates.
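To make the bookkeeping concrete, here is a minimal Python sketch of the curve-building step. The tree-ring ages and measured ratios below are illustrative stand-ins, not real calibration data; the one genuine convention I'm relying on is that radiocarbon ages are quoted using the Libby half-life of 5568 years.

```python
import math

# Conventional radiocarbon ages are computed with the Libby half-life
# (5568 years); the decay constant follows from it.
LIBBY_HALF_LIFE = 5568.0
DECAY_CONSTANT = math.log(2) / LIBBY_HALF_LIFE

def uncalibrated_age(ratio):
    """Radiocarbon age from the measured 14C/12C ratio (relative to the
    modern standard), assuming a constant atmospheric ratio and today's
    decay rate: t = -ln(ratio) / lambda."""
    return -math.log(ratio) / DECAY_CONSTANT

# Hypothetical tree-ring samples: (calendar age from ring counting,
# measured ratio). These numbers are made up for illustration.
tree_ring_data = [
    (1000, 0.883),
    (3000, 0.689),
    (5000, 0.537),
]

# The calibration curve pairs the uncalibrated radiocarbon age of each
# piece of wood with its ring-counted calendar age.
calibration_curve = [
    (uncalibrated_age(ratio), calendar_age)
    for calendar_age, ratio in tree_ring_data
]
```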
Now suppose you find a piece of old wood and have it dated. How is this done? First, a date is calculated assuming a constant radiocarbon ratio in the atmosphere and today's decay rate (exactly the same way the calibration curve was built). Then we use the calibration curve to get the actual calendar date.
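Continuing the sketch above, dating an unknown sample reuses the same age formula and then reads the curve. The straight-line interpolation here is deliberately crude; real calibration software (e.g. OxCal, CALIB) also propagates measurement uncertainty.

```python
def calibrate(uncal_age, curve):
    """Convert an uncalibrated radiocarbon age to a calendar age by
    linear interpolation between neighboring curve points."""
    points = sorted(curve)                     # sort by uncalibrated age
    for (u0, c0), (u1, c1) in zip(points, points[1:]):
        if u0 <= uncal_age <= u1:
            frac = (uncal_age - u0) / (u1 - u0)
            return c0 + frac * (c1 - c0)
    raise ValueError("uncalibrated age outside the calibration range")

# Date a hypothetical piece of old wood: measure its ratio, compute the
# uncalibrated age exactly as the curve was built, then calibrate.
sample_uncal = uncalibrated_age(0.60)          # ~4100 radiocarbon years
print(calibrate(sample_uncal, calibration_curve))
```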
How would a change in decay rate (perhaps due to a hypothesized "accelerated nuclear decay" about 5000 years ago) have affected the situation? It would have changed the uncalibrated dates, both for our unknown sample and for the calibration curve. Thus the calibration curve would be different. But after calibration, the date for the unknown sample would come out the same. In essence, the calibration procedure cancels out any changes in decay rate or in atmospheric radiocarbon concentration; a numerical demonstration follows the list below. A calibrated date depends only on two assumptions:
1) the atmospheric concentration of radiocarbon was the same for the unknown wood as for the trees used in calibration, and
2) tree rings are annual and we can count them accurately.
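Here is the cancellation demonstrated numerically, continuing the same sketch. The decay history below (decay running three times faster before 2500 years ago) is entirely made up; the point is that wood of the same true age, whether it is the sample or a calibration tree, records the same ratio, so whatever the history was, it drops out after calibration.

```python
def measured_ratio(true_age, fast_factor=3.0, switch=2500.0):
    """Ratio left in wood of a given true calendar age under a made-up
    decay history: decay ran `fast_factor` times faster before `switch`
    years ago. Pure illustration, not a physical model."""
    exposure = (DECAY_CONSTANT * min(true_age, switch)
                + fast_factor * DECAY_CONSTANT * max(true_age - switch, 0.0))
    return math.exp(-exposure)

# Rebuild the calibration curve under the altered history: the ring
# counts are unchanged, but every measured ratio is now different.
altered_curve = [
    (uncalibrated_age(measured_ratio(calendar_age)), calendar_age)
    for calendar_age, _ in tree_ring_data
]

# Date a sample whose true calendar age is 4100 years. Its uncalibrated
# age shifts (to ~7300 radiocarbon years), but the curve shifts with it,
# and the calibrated result is still ~4100: the change cancels out.
sample_uncal = uncalibrated_age(measured_ratio(4100.0))
print(calibrate(sample_uncal, altered_curve))
```

The calibrated result matches the ring-counted age under either decay history, which is exactly the cancellation described above: only the two assumptions in the list survive.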