William Poundstone: Review of George Dyson's "Turing's Cathedral: The Origins of the Digital Universe"
It’s anyone’s guess whether our digital world ends with a bang, a whimper or a singularity. One thing’s for sure: It began with a double entendre.
The digital age can be traced to a machine built circa 1951 in Princeton, N.J. That machine was given the bureaucratic-sounding name the Mathematical and Numerical Integrator and Computer, and was known by the acronym Maniac, meaning something wild and uncontrollable — which it proved to be. But the crucial double entendre was contained in the computer’s memory. For the first time, numbers could mean numbers or instructions. Data could be a noun or a verb.
That turned out to be incredibly important, as George Dyson makes clear in his latest book, “Turing’s Cathedral,” a groundbreaking history of the Princeton computer. Though the English mathematician Alan Turing gets title billing, Dyson’s true protagonist is the Hungarian-American John von Neumann, presented here as the Steve Jobs of early computers — a man who invented almost nothing, yet whose vision changed the world.
Von Neumann was no stereotypical mathematician. He was urbane, witty, wealthy and (literally) entitled. At his 1926 doctoral exam, the mathematician David Hilbert is said to have asked but one question: “Pray, who is the candidate’s tailor?” He had never seen such beautiful evening clothes....