Computing today is almost entirely digital. The vast informational catacombs of the internet, the algorithms that power AI, the screen you’re reading this on: all run on electronic circuits manipulating binary digits, 0 and 1, off and on. We live, it has been said, in the digital age.
But it’s not obvious why a system that operates using discrete chunks of information would be good at modeling our continuous, analog world. And indeed, for millennia humans have used analog computing devices to understand and predict the ebbs and flows of nature.
Among the earliest known analog computers is the Antikythera mechanism from ancient Greece, which used dozens of gears to predict eclipses and calculate the positions of the sun and moon. Slide rules, invented in the 17th century, executed the mathematical operations that would one day send men to the moon. (The abacus, however, doesn’t count as analog: Its discrete “counters” make it one of the earliest digital computers.) And in the late 19th century, William Thomson, who later became Lord Kelvin, designed a machine that used shafts, cranks and pulleys to model the influence of celestial bodies on the tides. Its successors were used decades later to plan the D-Day landings on the beaches of Normandy.
What do these devices have in common? They are all physical systems set up to obey the same mathematical equations that govern the phenomena you want to understand. Thomson’s tide-calculating computer, for example, was inspired by 19th-century mathematical advances that turned the problem of predicting the tide into a complex trigonometric expression. Calculating that expression by hand was both laborious and error-prone. The cranks and pulleys in Thomson’s machine were configured so that spinning them produced an output identical to the result of the expression that needed to be solved.
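To make the arithmetic concrete, the expression Thomson mechanized is, in essence, a sum of cosine terms, one per tidal constituent, added to the mean sea level. Below is a minimal Python sketch of that harmonic sum. The constituent amplitudes and phases are invented for illustration (in practice they are fitted to observations at a particular port), and names like tide_height are placeholders, not anything from the original machine or article.

```python
import math

# Illustrative harmonic tide model:
#   height(t) = mean level + sum of A_i * cos(speed_i * t + phase_i)
# The amplitudes and phases below are made up for demonstration; real values
# are fitted to water-level observations at a specific location.
CONSTITUENTS = [
    # (amplitude in meters, speed in degrees per hour, phase in degrees)
    (1.20, 28.984, 40.0),   # principal lunar semidiurnal term (M2-like)
    (0.45, 30.000, 95.0),   # principal solar semidiurnal term (S2-like)
    (0.25, 15.041, 170.0),  # lunisolar diurnal term (K1-like)
]

MEAN_SEA_LEVEL = 2.0  # meters, illustrative


def tide_height(hours: float) -> float:
    """Predicted water level (meters) at a time given in hours."""
    height = MEAN_SEA_LEVEL
    for amplitude, speed_deg_per_hr, phase_deg in CONSTITUENTS:
        angle = math.radians(speed_deg_per_hr * hours + phase_deg)
        height += amplitude * math.cos(angle)
    return height


if __name__ == "__main__":
    # Tabulate one day of predicted tide, the sum the machine evaluated mechanically.
    for h in range(0, 25, 3):
        print(f"t = {h:2d} h  ->  {tide_height(h):5.2f} m")
```

Each gear-and-pulley assembly in the machine played the role of one cosine term: its radius set the amplitude, its gearing set the speed, and its starting position set the phase, so turning the crank summed the terms continuously instead of point by point as the loop above does.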