Chronometry

The Chronometric Revolution

Three centuries ago our collective human knowledge of ancient history – meaning the period to about 1000 BCE – was restricted to four written sources: the Indian Vedas, the Chinese Five Classics, the Hebrew Bible and the Greek poet Homer.

The Bible, it was assumed, presented literal historical truth, and with little science to prove otherwise, the claim of Archbishop James Ussher of Armagh (1581-1656) that the world was created in 4004 BCE was widely believed.

Archaeology changed all this when French linguists in the 1820s unlocked the secrets of Egyptian hieroglyphics, taking (Western) history back to about 3000 BCE. British scholars of the 1840s then mastered the Old Persian, Assyrian, and Babylonian languages of Mesopotamia. The geological evidence of rock strata, long dismissed on religious grounds, gained scientific acceptance through nineteenth-century stratigraphy, and the same century unearthed pre-classical civilizations and began the investigation of human evolution. There has, since WWII, been a little-acknowledged Chronometric Revolution, as scientific technology has expanded our biologically given senses at both the macro- and micro-scales.

‘. . . at the end of the nineteenth century it was still impossible to assign reliable absolute dates to any events before the appearance of the first written records’, but ‘There now exist no serious intellectual or scientific or philosophical barriers to a broad unification of historical scholarship’.

The Chronometric Revolution over the last few decades has allowed us to date the age of the universe, individual rocks and fossils, archaeological remains, and the divergence of lineages in biological evolution.

In 1905 Ernest Rutherford pioneered the study of radioactive decay over time, work that would develop, after 1945, into what we now call radiometric dating. Geochronology, the dating of rocks, uses the constant rate of decay of radioactive impurities. The half-life of the radioactive isotope carbon-14 (C14) is relatively short at 5,730 years, which makes it reliable for dating organic remains up to about 50,000 years old, a span that conveniently covers much of the period during which anatomically modern humans migrated across the globe.
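To make the decay arithmetic concrete, here is a minimal sketch in Python; the function name and sample values are illustrative assumptions, not drawn from any particular study. The age follows directly from the measured fraction of C14 remaining and the half-life quoted above.

```python
import math

# Half-life of carbon-14 in years, as quoted in the text above.
C14_HALF_LIFE = 5730.0

# Decay constant: lambda = ln(2) / half-life.
DECAY_CONSTANT = math.log(2) / C14_HALF_LIFE

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years of a sample whose C14/C12 ratio is
    `fraction_remaining` times that of living matter.
    Solves N(t) = N0 * exp(-lambda * t) for t."""
    if not 0.0 < fraction_remaining <= 1.0:
        raise ValueError("fraction_remaining must be in (0, 1]")
    return -math.log(fraction_remaining) / DECAY_CONSTANT

# Two half-lives leave a quarter of the original C14:
print(round(radiocarbon_age(0.25)))   # 11460 years

# Around 0.2% remaining, roughly the 50,000-year practical limit:
print(round(radiocarbon_age(0.002)))  # about 51,400 years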

Willard Libby pioneered radiocarbon dating in the 1940s and received the Nobel Prize for this work in 1960. It has transformed the study of archaeology. Accelerator mass spectrometry has extended the datable range to about 80,000 years, and the time span was expanded again with the advent of thermoluminescence, which can date fired and heated materials over several hundred thousand years.

Potassium–argon dating measures ages on the scale of the Earth itself and is therefore an invaluable tool for geology.
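The same logic stretches to geological timescales. As a hedged illustration, not a method described in this article, the standard K–Ar age equation can be coded directly; the decay constants below are the conventional Steiger and Jäger (1977) values, and the sample ratio is hypothetical.

```python
import math

# Decay constants of potassium-40, per year (Steiger & Jäger, 1977):
LAMBDA_TOTAL = 5.543e-10  # total decay of 40K (to 40Ca and 40Ar)
LAMBDA_EC = 0.581e-10     # electron-capture branch yielding 40Ar

def k_ar_age(ar40_over_k40: float) -> float:
    """Age in years from the measured ratio of radiogenic 40Ar to
    remaining 40K, assuming the rock retained all its argon since it
    last cooled (the standard closed-system assumption)."""
    return math.log1p((LAMBDA_TOTAL / LAMBDA_EC) * ar40_over_k40) / LAMBDA_TOTAL

# A hypothetical ratio of 0.06 corresponds to roughly 0.8 billion years:
print(f"{k_ar_age(0.06):.2e} years")
```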

Since the elucidation of the structure of DNA by Watson and Crick in 1953, genetic analysis has improved vastly, such that we can now date with increasing accuracy the times at which branches on the tree of evolution diverged.
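As a rough illustration of how such divergence dates are obtained (the numbers and function below are hypothetical, and real analyses add corrections for multiple substitutions and rate variation), a naive molecular-clock estimate divides the observed differences per site by twice the substitution rate, since both lineages accumulate changes independently after the split. The timeline entries for 1962-1969 below describe how this idea took hold.

```python
def divergence_time_years(differences: int,
                          sites_compared: int,
                          subs_per_site_per_year: float) -> float:
    """Naive molecular-clock estimate: differences per site divided by
    2 * rate, because each lineage evolves away from the common ancestor."""
    return (differences / sites_compared) / (2.0 * subs_per_site_per_year)

# Hypothetical inputs: 12 differences across 1,000 aligned sites at an
# assumed rate of 1e-9 substitutions per site per year:
print(f"{divergence_time_years(12, 1000, 1e-9):,.0f} years")  # 6,000,000
```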

The discovery of the cosmic microwave background radiation in 1964 has enabled us to estimate the age of the universe at 13.8 billion years.

The psychological and intellectual leap that has occurred since the 19th century is vast, moving from a ‘biblical framework’ to one in which:

. . . the universe began 13.72 billion years ago, life 4 billion years ago, hominins 6 million years ago, Homo 2 million years ago, anatomically modern humans c. 300 kya, behaviourally modern humans (about) 100 kya, settled agriculture 10 kya, cities 5 kya – none of these formerly known or even guessed.[7]

Timeline of Chronometry

1650 – Archbishop James Ussher of Armagh (1581-1656) dates the Creation to around 6 pm on 22 October 4004 BCE according to the proleptic Julian calendar
1800 – collective human knowledge of ancient history rests, essentially, on four written sources: the Indian Vedas, the Chinese Five Classics, the Hebrew Bible, and the Greek poet Homer
1820s – Egyptian hieroglyphics deciphered, taking known history back to about 3000 BCE
1840s – British scholars master Old Persian, Assyrian, and Babylonian languages of Mesopotamia
1900 – still impossible to assign reliable dates to any events prior to the written record
1905 – Ernest Rutherford pioneers the use of radioactive decay for dating
1939 – Americans L.W. Alvarez and Robert Cornog first use accelerator mass spectrometry (AMS) with a cyclotron to demonstrate that 3He is stable and that the other mass-3 isotope, tritium (3H), is radioactive. AMS uses small sample sizes (c. 50 mg) and has become an important advance on conventional radiocarbon dating, covering samples from around 100 to 80,000 years old
1940s – radiocarbon dating pioneered by Willard Libby (Nobel Prize, 1960). This transforms archaeology, allowing key transitions in prehistory to be dated – e.g. the end of the last ice age and the commencement of the Neolithic and Bronze Age in different regions – and underpinning geochronology, the dating of rocks from the constant decay rate of their radioactive impurities. The half-life of the radioactive isotope carbon-14 (C14) is relatively short at 5,730 years and is reliable for dating organic remains up to about 50,000 years old, conveniently covering much of the period when anatomically modern humans migrated across the globe
1962 – Émile Zuckerkandl and Linus Pauling notice that the number of amino acid differences in hemoglobin between different lineages changes roughly linearly with time, as estimated from fossil evidence. They generalise this observation to assert that the rate of evolutionary change of any specified protein is approximately constant over time and across lineages (the molecular clock hypothesis, also known as the gene clock or evolutionary clock). The molecular clock is an important tool in molecular systematics, used to determine scientific classifications and to study variation in selective forces; knowledge of rates of molecular evolution in particular lineages also helps estimate the dates of phylogenetic events, including those not documented by fossils. As yet, however, the method is limited, and estimates may vary by 50% or more
1964 – discovery of the cosmic microwave background radiation enables the age of the universe to be estimated at 13.8 billion years
1967 – A.C. Wilson promotes the idea of a ‘molecular clock’ – a technique that uses the mutation rate of biomolecules (usually nucleotide sequences for DNA, RNA, or amino acid sequences for proteins) to deduce the time in prehistory when two or more life forms (lineages) diverged. Essentially a way of dating the branching points of the evolutionary tree
1969 – molecular clocking applied to anthropoid evolution: V. Sarich and A.C. Wilson find albumin and hemoglobin have comparable rates of evolution, indicating chimps and humans split about 4 to 5 million years ago; molecular anthropology has since been extremely useful in establishing the evolutionary tree of humans and other primates
late 1970s – mitochondrial DNA becomes an area of research in phylogenetics; unlike nuclear DNA, it offers the advantage of not undergoing recombination
