In 1977, David Mills, an eccentric engineer and computer scientist, took a job at COMSAT, a satellite corporation headquartered in Washington, D.C. Mills was an inveterate tinkerer: he’d once built a hearing aid for a girlfriend’s uncle, and had consulted for Ford on how paper-tape computers might be put into cars. Now, at COMSAT, Mills became involved in the ARPANET, the computer network that was the precursor to the Internet. A handful of researchers were already using the network to connect their distant computers and trade information. But the fidelity of that exchanged data was threatened by a distinct deficiency: the machines did not share a single, reliably synchronized time.
Over decades, Mills had gained wide-ranging expertise in mathematics, engineering, and computer science. In the early seventies, as a lecturer at the University of Edinburgh, he’d written programs that decoded shortwave radio and telegraph signals. Later, largely for fun, he’d studied how the clocks in a power grid could wander several seconds in the course of a hot summer’s day. (The extent of their shifts depended not just on the temperature but on whether the grid used coal or hydropower.) Now he concentrated on the problem of keeping time across a far-flung computer network. Clock time, Mills learned, is the result of an unending search for consensus. Even the times told by the world’s most precise government-maintained “master clocks” are composites of the readings of several atomic clocks. The master clocks, in turn, are averaged to help create international civil time, known as Coördinated Universal Time and initialized as U.T.C.