William Poundstone: Review of George Dyson's "Turing's Cathedral: The Origins of the Digital Universe"
It’s anyone’s guess whether our digital world ends with a bang, a whimper or a singularity. One thing’s for sure: It began with a double entendre.
The digital age can be traced to a machine built circa 1951 in Princeton, N.J. That machine was given the bureaucratic-sounding name Mathematical and Numerical Integrator and Computer, and was known by the acronym Maniac — fitting for something wild and uncontrollable, which it proved to be. But the crucial double entendre was contained in the computer’s memory. For the first time, numbers could mean numbers or instructions. Data could be a noun or a verb.
That turned out to be incredibly important, as George Dyson makes clear in his latest book, “Turing’s Cathedral,” a groundbreaking history of the Princeton computer. Though the English mathematician Alan Turing gets title billing, Dyson’s true protagonist is the Hungarian-American John von Neumann, presented here as the Steve Jobs of early computers — a man who invented almost nothing, yet whose vision changed the world.
Von Neumann was no stereotypical mathematician. He was urbane, witty, wealthy and (literally) entitled. At his 1926 doctoral exam, the mathematician David Hilbert is said to have asked but one question: “Pray, who is the candidate’s tailor?” He had never seen such beautiful evening clothes....