Radioactivity, also known as radioactive decay, is the process by which an unstable isotope emits subatomic particles from its nucleus, either alpha particles (helium nuclei) or beta particles (electrons), often accompanied by gamma radiation, and thereby becomes a different element.
Radioactive atoms are inherently unstable; over time, radioactive “parent atoms” decay into stable “daughter atoms.” When molten rock cools, forming what are called igneous rocks, radioactive atoms are trapped inside. By measuring the quantity of unstable atoms left in a rock and comparing it to the quantity of stable daughter atoms in the rock, scientists can estimate the amount of time that has passed since that rock formed.
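The parent/daughter arithmetic described above can be sketched in a few lines. This is a minimal illustration, not any lab's actual software; the function name is made up, and it assumes the rock trapped no daughter atoms when it formed and that the decay rate has been constant:

```python
import math

def age_from_ratio(parent_atoms, daughter_atoms, half_life_years):
    """Estimate time since the rock solidified, assuming zero initial
    daughter atoms and a constant decay rate throughout."""
    decay_constant = math.log(2) / half_life_years
    # N_parent(t) = N_total * exp(-lambda * t)  =>  t = ln(1 + D/P) / lambda
    return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

# Potassium-40 has a half-life of about 1.25 billion years.
# Equal parent and daughter counts imply one half-life has elapsed.
print(age_from_ratio(1000, 1000, 1.25e9))  # ~1.25e9 years
```

Note that equal parent and daughter counts always come out to exactly one half-life, whatever the isotope.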
Bracketing the fossils

Fossils are generally found in sedimentary rock, not igneous rock.
Sedimentary rocks can be dated using radioactive carbon, but because carbon decays relatively quickly, this only works for rocks younger than about 50,000 years.
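The 50,000-year limit follows from carbon-14's short half-life of about 5,730 years. A quick back-of-the-envelope check (the numbers below are standard textbook values, not taken from this article):

```python
C14_HALF_LIFE = 5730  # years, the accepted half-life of carbon-14

def fraction_remaining(years):
    """Fraction of the original carbon-14 left after a given time."""
    return 0.5 ** (years / C14_HALF_LIFE)

# After 50,000 years (~8.7 half-lives), under 0.3% of the original
# carbon-14 remains -- too little to measure reliably.
print(fraction_remaining(50_000))  # ~0.0024
```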
So in order to date most older fossils, scientists look for layers of igneous rock or volcanic ash above and below the fossil.
Scientists date igneous rock using elements that are slow to decay, such as uranium and potassium.
By dating these surrounding layers, they can determine the youngest and oldest ages the fossil might have; this is known as “bracketing” the age of the sedimentary layer in which the fossils occur.
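The bracketing logic itself is simple enough to sketch directly. The function and the example ages here are illustrative only, chosen to show the reasoning:

```python
def bracket_fossil_age(layer_above_age, layer_below_age):
    """Given dated igneous or ash layers above and below a fossil,
    the fossil's age must fall between them (younger layers lie on top)."""
    youngest, oldest = sorted((layer_above_age, layer_below_age))
    return youngest, oldest

# Hypothetical: ash above dated to 150 million years, lava below to 160.
# The fossil is then between 150 and 160 million years old.
print(bracket_fossil_age(150e6, 160e6))  # (150000000.0, 160000000.0)
```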
Geologists do not directly measure the age of a rock.
They choose rocks containing radioactive “parent” isotopes, which emit particles and radiation as they become a different “daughter” element, and measure the ratio of parent to daughter isotopes in the sample.
The difficulty arises in transforming these measured ratios into dates.
Assigning a date requires that the rate at which the parent decays into the daughter element has been the same throughout the rock’s history.
It is similar to assuming that the constriction in an hourglass has always had the same diameter, and that the same number of sand grains has passed through it every minute.