The Chronometric Revolution
Three centuries ago our collective human knowledge of ancient history – meaning the period to about 1000 BCE – was restricted to four written sources: the Indian Vedas, the Chinese Five Classics, the Hebrew Bible and the Greek poet Homer.
The Bible, it was assumed, presented literal historical truth, and with little science to prove otherwise, the claim by James Ussher (1581-1656), Archbishop of Armagh in Ireland, that the world was created in 4004 BCE was widely believed.
Archaeology changed all this when French linguists in the 1820s deciphered Egyptian hieroglyphics, taking (Western) history back to about 3000 BCE. British scholars of the 1840s then mastered the Old Persian, Assyrian, and Babylonian languages of Mesopotamia. The geological evidence of rocks, long dismissed on religious grounds, gained scientific acceptance with the stratigraphy of the 1820s. The 19th century thus unearthed pre-classical civilizations and opened the investigation of human evolution.
There has been, since WWII, a little-acknowledged Chronometric Revolution, as scientific technology has extended our biologically given senses at both the macro- and micro-scale.
As one account puts it, ‘. . . at the end of the nineteenth century it was still impossible to assign reliable absolute dates to any events before the appearance of the first written records’, but ‘There now exist no serious intellectual or scientific or philosophical barriers to a broad unification of historical scholarship’.
The Chronometric Revolution over the last few decades has allowed us to date the age of the universe, individual rocks and fossils, archaeological remains, and the divergence of lineages in biological evolution.
In 1905 Ernest Rutherford pioneered the study of radioactive decay over time, work that would develop after 1945 into what we now call radiometric dating. Geochronology, the dating of rocks, exploits the constant rate of decay of radioactive impurities. The half-life of the radioactive isotope carbon-14 (C14) is relatively short at 5,730 years, which makes it reliable for dating organic remains up to 50,000 years old, a span that conveniently covers much of the period during which anatomically modern humans migrated across the globe.
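The decay law behind radiocarbon dating can be sketched as a short calculation. This is an illustrative example only, assuming the standard exponential decay relation N(t) = N0 × (1/2)^(t / half-life); the function name and the 25% sample are hypothetical:

```python
import math

C14_HALF_LIFE_YEARS = 5_730  # half-life of carbon-14, as cited above


def radiocarbon_age(fraction_remaining: float) -> float:
    """Estimate a sample's age in years from the fraction of its
    original carbon-14 that remains, by inverting
    N(t) = N0 * (1/2) ** (t / half_life)."""
    if not 0 < fraction_remaining <= 1:
        raise ValueError("fraction_remaining must be in (0, 1]")
    # log base 1/2 gives the number of half-lives elapsed
    return C14_HALF_LIFE_YEARS * math.log(fraction_remaining, 0.5)


# A sample retaining 25% of its carbon-14 is two half-lives old:
print(round(radiocarbon_age(0.25)))  # 11460
```

The 50,000-year practical ceiling mentioned above corresponds to roughly nine half-lives, by which point only a fraction of a percent of the original carbon-14 survives, too little to measure reliably by conventional counting.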
Willard Libby pioneered radiocarbon dating in the 1940s and received the Nobel Prize for this work in 1960; it has transformed the study of archaeology. Accelerator mass spectrometry extended this range to about 80,000 years, and the span was expanded again with the advent of thermoluminescence, which can date objects several hundred thousand years old.
Potassium-argon dating is used to measure the age of the earth and is therefore an invaluable tool for geology.
Since Watson and Crick elucidated the structure of DNA in 1953, genetic analysis has improved to the point that we can now date, with increasing accuracy, the divergence times of branches on the tree of evolution.
The discovery of the cosmic microwave background radiation in 1964 has enabled us to date the age of the universe at about 13.8 billion years.
The psychological and intellectual leap since the 19th century is vast, moving from a ‘biblical framework’ to one in which:
. . . the universe began 13.72 billion years ago, life 4 billion years ago, hominins 6 million years ago, Homo 2 million years ago, anatomically modern humans c. 300 kya, behaviourally modern humans (about) 100 kya, settled agriculture 10 kya, cities 5 kya – none of these formerly known or even guessed.