The most widely used tool to measure the age of the Earth is radioactive decay. The great scientist Ernest Rutherford was the first to define the concept of 'half-life,' that is, the time it takes for one half of the atoms in a given quantity of a radioactive element or isotope to decay into its daughter product, as plutonium decays into uranium, or as the isotope carbon-14 decays into nitrogen-14.
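In standard notation, the fraction of a sample remaining after time t is N(t)/N0 = (1/2)^(t / half-life). A minimal Python sketch of this textbook formula follows; the function name and the printed examples are illustrative, not drawn from the article:

```python
def fraction_remaining(t, half_life):
    """Fraction of a radioactive sample left after time t,
    using the standard exponential-decay model:
    N(t) / N0 = (1/2) ** (t / half_life)."""
    return 0.5 ** (t / half_life)

# Carbon-14 has a half-life of about 5,730 years.
print(fraction_remaining(5730, 5730))   # 0.5  -- one half-life elapsed
print(fraction_remaining(11460, 5730))  # 0.25 -- two half-lives elapsed
```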
Moreover, Rutherford and all scientists since him have declared that the radioactive decay of a given element or isotope occurs 'at a specific, universal, immutable rate' (Castelvecchi 2008: 21). Based on this assumption, scientists use the decay rates of certain substances to date rock formations, fossils, and the Earth itself.
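Dating works by inverting the half-life formula: given the fraction of the parent isotope still present, and assuming the decay rate has always been constant, the age follows as t = half-life × log2(N0/N). A sketch under those assumptions (the names and numbers here are illustrative):

```python
import math

def age_from_fraction(fraction_left, half_life):
    """Age implied by the fraction of parent isotope remaining,
    assuming a constant decay rate: t = half_life * log2(N0 / N)."""
    return half_life * math.log2(1.0 / fraction_left)

# A sample retaining 25% of its carbon-14 dates to two
# half-lives, about 11,460 years, under these assumptions.
print(age_from_fraction(0.25, 5730))  # 11460.0
```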
However, this assumption has recently come under doubt. The November 22, 2008, issue of the journal Science News reported that, 'when researchers suggested in August [2008] that the sun causes variations in the decay rates of isotopes of silicon, chlorine, radium and manganese, the physics community reacted with curiosity, but mostly with skepticism' (Ibid.).
Despite this skepticism, there is evidence that the effect is real. For example, a team at Purdue University in Indiana was monitoring a lump of manganese-54 in a radiation detector box to measure the isotope's half-life. At 9:37 PM on December 12, 2006, the instruments recorded a sudden dip in radioactivity. At that same moment, satellites on the other side of the Earth (the daylight side) detected X-rays coming from the sun, signaling the beginning of a solar flare (Ibid.).
This was not the only evidence for such a change in the radioactive decay rate. As far back as the 1980s, a study of silicon-32 at the Brookhaven National Laboratory in New York State, and another of radium-226 at the Physikalisch-Technische Bundesanstalt (PTB), Germany's national metrology institute, made similar findings. Both studies were long-term, and, according to Science News, 'both had seen seasonal variations of a few tenths of a percent in the decay rates of the respective isotopes' (Ibid.). The journal went on to point out:
A change of less than a percent may not sound like a lot. But if the change is real, rather than an anomaly in the detector, it would challenge the entire concept of half-life and even force physicists to rewrite their nuclear physics textbooks (Ibid.).
Because the decay rates in the two studies from the 1980s varied with the seasons, physicists suspect that the sun was affecting the rates of decay, 'possibly through some physical mechanism that had never before been observed' (Ibid.). The Brookhaven study, for example, which ran from 1982 to 1986, showed that samples of silicon-32 and chlorine-36 'had rates of decay that varied with the seasons, by about 0.3 percent' (Ibid.: 22). Science News went on to report:
The samples were kept at constant temperature and humidity, so the changing seasons should have had no effect on the experiment. The team tried all the fixes it could to get rid of the fluctuations, but, in the end, decided to publish the results (Ibid.).
The results were ignored by the scientific community. 'People just sort of forgot about it, I guess,' commented David Alburger, the Brookhaven scientist who had conducted the experiment (Ibid.). Alburger was unaware that, at around the same time, the German scientists at the PTB had found the same thing, with 'yearly oscillations in a decay rate, in a 15-year experiment with radium-226' (Ibid.). Again, the finding made no splash in the scientific community.
Such small fluctuations in the rate of radioactive decay may not seem like much, but, as Science News noted, they are great enough to force physicists to rethink the entire concept of half-life and the accuracy with which it measures ancient ages. Moreover, if solar activity was greater in the past, before humanity began measuring it, then the changes in radioactive decay might actually have been greater than those measured by the scientists at Brookhaven, PTB, and Purdue.
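To put a rough number on that sensitivity: for a fixed measured isotope ratio, the computed age scales inversely with the assumed decay constant, so shifting the rate by 0.3 percent shifts the age by roughly 0.3 percent. A back-of-the-envelope sketch (the 4.5-billion-year figure and the choice of ratio are illustrative, not taken from the article):

```python
import math

def age(fraction_left, decay_constant):
    """Age from the standard model N / N0 = exp(-lambda * t)."""
    return math.log(1.0 / fraction_left) / decay_constant

# Pick a decay constant so that half the sample remaining
# corresponds to a nominal age of 4.5 billion years.
lam = math.log(2) / 4.5e9              # per year
nominal = age(0.5, lam)                # 4.5e9 years
shifted = age(0.5, lam * 1.003)        # same ratio, rate raised 0.3%

print(nominal - shifted)  # ~1.35e7: about 13.5 million years
```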
Reference:
Castelvecchi, D. 2008. 'Half-Life (More or Less).' Science News 174, no. 11.
Editorial note: The Institute for Creation Research published detailed scientific evidence to show that these dating methods have several flaws, and produced evidence to show there was billion-fold accelerated decay in the past, most likely occurring at the time of the Flood. ABR hosted a RATE Conference (Radioisotopes and the Age of the Earth) with over 700 attendees in the Fall of 2006. For more on this important research, visit: http://store.icr.org/prodinfo.asp?number=BRATE1