Ever since I started programming, I’ve thought of milliseconds as a distinct boundary between human experience and—I’m not sure what to call it—machine experience? Code experience?
You can’t feel a millisecond. You don’t have a sense, not really, of how long it lasts. (At least I don’t.) It’s too tiny. And yet, for a computer, a millisecond might as well be one of those big blocks of time in school—fourth period, fifth period. You can cram all sorts of stuff into fifth period.
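You can even make the point empirical. A minimal sketch (in Python, my choice of language; the essay names none) that counts how many trivial operations a machine squeezes into a single millisecond — the exact number is hardware-dependent, but it is reliably large:

```python
import time

def ops_per_millisecond(budget_s: float = 0.001) -> int:
    """Count loop iterations completed within a ~1 ms budget.

    A rough, machine-dependent measurement, not a benchmark:
    each iteration does an increment plus a clock read.
    """
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < budget_s:
        count += 1
    return count

if __name__ == "__main__":
    # Typically tens or hundreds of thousands of iterations
    # on modern hardware -- a whole "fifth period" of work.
    print(ops_per_millisecond())
```

Even with the overhead of an interpreted loop, a millisecond holds thousands upon thousands of steps; compiled code fits in orders of magnitude more.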
We can talk and think about milliseconds—you do it all the time with computer graphics—but once we dispatch our commands into the millisecond regime, it’s like sending them off into a hostile environment where we cannot tread. The bottom of the ocean or the inside of a volcano. Our super-fast agents do their business, then return to us with news of what’s transpired.
“Sir: I have successfully drawn a 3D picture of the Sistine Chapel. Take a look. What’s that? Another one, from a slightly different angle? Very good, sir.”
The milliseconds were always there! We just couldn’t use them!
In the late sixteenth century, clocks acquired minute hands. A century later, second hands appeared. But it wasn’t until the 1850s that instruments could register a tenth of a second, and, once they did, the impact on modern science and society was profound.
The book that tells this story, by Jimena Canales, is called A Tenth of a Second. What a cool idea.