Wrapped in an Enigma
The Imitation Game cemented Turing’s close association with British wartime efforts at Bletchley Park to crack the Nazi Enigma code. But he was far from alone, despite the impression left by the movie. The film’s most blatant historical oversight is the lack of any real credit given to the Polish mathematicians who paved the way for British efforts.
Enigma had existed in some form since 1918. By the 1920s, it was employed to encipher virtually all official German communication, and by 1928 Polish code breakers were studying it to learn the intentions of their potential enemy. The original device contained three wheels, which rotated like those in an old-fashioned adding machine, encrypting the letters typed in by the operator. With an additional plugboard capable of swapping letters, Enigma had about 159 quintillion settings.
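The 159 quintillion figure can be reproduced with a short calculation. The sketch below assumes one common military configuration: three rotors chosen from a box of five, 26 starting positions per rotor, and a plugboard with ten cables (the exact counts varied across Enigma models and years).

```python
from math import factorial

# Ordered choice of 3 rotors from a box of 5 (later military Enigmas)
rotor_orders = 5 * 4 * 3                      # 60

# Each of the 3 rotors can start at any of 26 positions
rotor_positions = 26 ** 3                     # 17,576

# Plugboard: 10 cables pair up 20 of the 26 letters
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2 ** 10)

total = rotor_orders * rotor_positions * plugboard
print(f"{total:,}")  # 158,962,555,217,826,360,000 -- about 159 quintillion
```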
Using inspired guesswork and elementary group theory, three Polish mathematicians — Jerzy Różycki, Henryk Zygalski and Marian Rejewski — fully cracked this version of Enigma in 1932. And in 1938 they invented the bombes: electromechanical rotating drums that, ticking noisily, replicated possible Enigma settings.
Eventually, the trio lacked the resources to proceed further — being on the run after the Nazi invasion of Poland in 1939 certainly didn’t help. However, they were able to turn over all their results to the British, including every decipherment technique glimpsed in the film, as well as perhaps their major contribution: persuading the British to employ mathematicians rather than linguists as code breakers. Turing definitely improved on the Polish efforts, including the bombes, but it is unlikely that efforts at Bletchley Park would have gone as far and as fast without their contributions.
By the way, Enigma was not the only Nazi code in use during WWII. Beginning in 1941, the Germans also used Lorenz, a much more sophisticated cipher than Enigma, and cracking it required the first fully electronic computers, the famous Colossus machines, each of which employed several thousand vacuum tubes. But these were the brainchild of Turing’s colleague Tommy Flowers, and Turing played no significant role in their development.
Man and Machine
While the layperson may now be most familiar with Turing’s work on Enigma and the later test that bears his name, Turing’s most important mathematical contribution is his 1936 paper “On Computable Numbers,” which introduced the famous Turing machine.
The question Turing was addressing — inspired by logician Kurt Gödel’s legendary incompleteness theorem (that any consistent mathematical system rich enough for arithmetic contains true statements that cannot be proved within it) — was whether an algorithm composed of a finite number of instructions can compute any function to any desired precision. (Turing chose numbers rather than functions for simplicity. Pi, for example, is computable; even the ancient Greeks knew simple algorithms that can churn out pi to any number of decimal places.)
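As an illustration of what "computable" means, here is a sketch that grinds out decimal digits of pi using exact rational arithmetic. For brevity it uses Machin's 1706 formula rather than the Greeks' polygon method; the point is only that a finite recipe yields as many correct digits as you care to ask for.

```python
from fractions import Fraction

def arctan_inv(x, terms):
    """arctan(1/x) via its alternating power series, in exact rationals."""
    return sum(Fraction((-1) ** k, (2 * k + 1) * x ** (2 * k + 1))
               for k in range(terms))

# Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)
pi = 16 * arctan_inv(5, 25) - 4 * arctan_inv(239, 10)

# First 30 decimal digits; using more series terms yields more digits
print(pi.numerator * 10 ** 30 // pi.denominator)
# 3141592653589793238462643383279
```

Because the series alternates, the truncation error is smaller than the first omitted term, so 25 and 10 terms are already far more than enough for 30 digits.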
As it happens, the question had already been answered by Alonzo Church, who soon became Turing’s graduate adviser at Princeton. But Turing’s version is more widely remembered because Church couched his response in purely mathematical terms. Turing introduced the notion of a “machine” — today’s algorithms — that could decide the question. His ideas provided the foundation for modern computer architecture, first realized in machines like the one at Princeton’s Institute for Advanced Study, completed in 1952.
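A Turing machine is simply a finite table of rules read by a head moving along a tape. The toy simulator below (the state names and the increment program are my own illustration, not Turing's notation) runs one such table that adds 1 to a binary number:

```python
# Each rule maps (state, symbol read) to (symbol to write, head move, next state).
# This program increments a binary number; the head starts on its leftmost bit.
RULES = {
    ("scan", "0"): ("0", +1, "scan"),   # move right to the end of the number
    ("scan", "1"): ("1", +1, "scan"),
    ("scan", "_"): ("_", -1, "add"),    # hit a blank: step back and start adding
    ("add",  "1"): ("0", -1, "add"),    # 1 + 1 = 0, carry left
    ("add",  "0"): ("1",  0, "halt"),   # absorb the carry and stop
    ("add",  "_"): ("1",  0, "halt"),   # carry past the left edge
}

def run(tape_string):
    tape, pos, state = dict(enumerate(tape_string)), 0, "scan"
    while state != "halt":
        symbol, move, state = RULES[(state, tape.get(pos, "_"))]
        tape[pos] = symbol
        pos += move
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, "_") for i in cells).strip("_")

print(run("1011"))  # binary 11 + 1 -> "1100"
```

Despite its austerity, this rule-table model captures everything any modern computer can compute, which is exactly what made it so useful as a foundation.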
This work has occasionally led the popular media to suggest that Turing invented the computer. Here the copywriters should blush, or their noses grow. The earliest binary machine builders — such as John Atanasoff, George Stibitz and Konrad Zuse, who introduced many of the features incorporated into today’s computers — were working almost simultaneously with Turing and could not have been aware of his results.
What seems universally overlooked in casual conversation is that, much as with Gödel’s theorem, Church and Turing answered the computability question in the negative: Algorithms cannot compute all numbers or functions to arbitrary precision. Indeed, one can prove there are many more functions that cannot be computed algorithmically than those that can: there are only countably many possible algorithms, but uncountably many functions.
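That counting argument can be made concrete with Cantor's diagonal trick. The sketch below (a toy of my own, not a proof from Turing's paper) takes any list of 0/1-valued functions and builds a function guaranteed to disagree with every entry on the list; since algorithms can at least be listed out, some functions must escape them all.

```python
def diagonal(enumeration):
    # Given the i-th listed function via enumeration(i), build a function
    # that differs from function i at input i: flip its answer on the diagonal.
    return lambda n: 1 - enumeration(n)(n)

# Toy enumeration of 0/1-valued functions: f_i(n) = 1 if n >= i else 0
f = lambda i: (lambda n: 1 if n >= i else 0)
d = diagonal(f)

# d disagrees with every listed function at its own index
print([d(i) != f(i)(i) for i in range(5)])  # [True, True, True, True, True]
```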
In other words, computers — much to the present generation’s surprise — cannot solve all problems.