Testing Turing's Legacy

Decoding popular myth to discover what the mathematician and computer science pioneer really did — and didn’t — do.

By Tony Rothman|Thursday, August 27, 2015
Brilliant though he was, Alan Turing had lots of help breaking the Enigma code.
Granger NYC

Alan Turing has had a good run in recent decades. The great British mathematician, who died in 1954, was the focus of last year’s acclaimed biopic, The Imitation Game. The film, ostensibly based on Andrew Hodges’ 1983 biography Alan Turing: The Enigma, was only the latest in a string of plays, TV dramas and novels that told Turing’s story primarily through the lens of his wartime code-breaking activities at Bletchley Park and his struggles with his homosexuality.

Popular reductions, however, tend to attribute every scientific development in a field to a single individual. When you read, “It is only a slight exaggeration to say that the British mathematician Alan Turing saved the Allies from the Nazis” — to quote from the book-jacket copy and, one supposes, the mission statement of the film — the immediate reaction is to wonder what the Russians, who lost many times more people than the British or Americans during World War II, would say about that. Or the Poles, whose particular contributions to Allied decryption efforts are criminally ignored in the film and frequently overlooked by armchair historians. Without detracting from Turing’s reputation in the slightest, it is nevertheless worth taking a brief look at what he did, and what a few others did, too.

The Real Imitation Game

Hollywood productions aside, Turing is probably best known for the Turing Test, a proposal designed to provide a criterion for deciding whether a machine is thinking. Turing was by no means the only one considering the question: During the 1940s, when the early electronic computers were being created, it was a popular topic of debate among mathematicians. The question, “Can a machine think?” though, was invariably confused — as it is today — with, “Can a machine be conscious?”

In 1950, Turing published “Computing Machinery and Intelligence,” written largely in response not to fellow mathematicians but to the neurologist Geoffrey Jefferson, who in 1949 famously declared, “Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain.”

Turing proposed an “imitation game” with which he intended to create an operational definition of “to think.” In its original form, the game consisted of three players, an interrogator and two subjects, a man and a woman. The interrogator can interact with the subjects only through a teletype (today, a computer screen). The interrogator’s job is to decide, by asking questions and getting (possibly deceptive) answers, which subject is the man and which is the woman.

Turing then proposed replacing one of the subjects with a machine and asked whether the machine could dupe the interrogator into believing it was human. The vague, “Can machines think?” was replaced by the specific, “Are there imaginable [machines] which would do well in the imitation game?”
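The protocol Turing describes can be sketched as a short program. This is only an illustration, not anything from Turing’s paper: all the function names (`ask`, `guess`, and the two subject strategies) are ours, standing in for the interrogator’s questioning, the final judgment, and the hidden subjects’ replies.

```python
import random

def imitation_game(ask, guess, human, machine, rounds=3):
    """Sketch of Turing's protocol (names are illustrative): the
    interrogator sees only paired text replies from two unlabeled
    channels and must finally say which channel hides the machine."""
    subjects = [human, machine]
    random.shuffle(subjects)              # hide which channel is which
    transcript = []
    for _ in range(rounds):
        question = ask(transcript)
        transcript.append((question, [s(question) for s in subjects]))
    machine_channel = subjects.index(machine)
    # the machine "wins" this run if the interrogator guesses wrong
    return guess(transcript) != machine_channel
```

With a perfect imitator, the two channels produce identical transcripts, so no interrogator strategy can beat chance; that, in essence, is the criterion Turing proposed.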

Turing’s paper was philosophical, not mathematical, and his answer non-rigorous: He believed that by century’s end, computers would think, in the sense of being able to pass the test. Whether that has been achieved, even by a chatbot posing as a Ukrainian teenager, we leave to the never-ending debate.

Two Machines
Left, Germany had used Enigma devices to encrypt military communications since the 1920s. Later devices had about 159 quintillion settings, making them almost uncrackable. Right, the now-famous bombe machines enabled Allied cryptographers to decode more than 3,000 enemy messages per day.
Left: Bonhams/Splash News/Corbis; Right: ©Crown. Reproduced by kind permission, Director, GCHQ

Wrapped in an Enigma

The Imitation Game cemented Turing’s close association with British wartime efforts at Bletchley Park to crack the Nazi Enigma code. But he was far from alone, despite the impression left by the movie. The film’s most blatant historical oversight is the lack of any real credit given to the Polish mathematicians who paved the way for British efforts.

Enigma had existed in some form since 1918. By the 1920s, it was employed to encipher virtually all official German communication, and by 1928 Polish code breakers were studying it to learn the intentions of their potential enemy. The original device contained three wheels, which rotated like those in an old-fashioned adding machine, encrypting the letters typed in by the operator. With an additional plugboard capable of swapping letters, Enigma had about 159 quintillion settings.
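The roughly 159 quintillion figure can be reproduced with a short count. This is a sketch assuming the later three-rotor military configuration the caption refers to: three rotors chosen, in order, from a box of five, plus a plugboard wired with 10 cables.

```python
from math import factorial

# Three rotors chosen, in order, from a set of five
rotor_orders = 5 * 4 * 3                    # 60
# Each rotor can start at any of its 26 positions
start_positions = 26 ** 3                   # 17,576
# Plugboard: 10 cables pair up 20 of the 26 letters;
# divide out the 6 unplugged letters, the order of the 10 cables,
# and the two ends of each cable
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2 ** 10)

total = rotor_orders * start_positions * plugboard
print(f"{total:,}")   # 158,962,555,217,826,360,000 -- about 159 quintillion
```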

Using inspired guesswork and elementary group theory, three Polish mathematicians — Jerzy Różycki, Henryk Zygalski and Marian Rejewski — fully cracked this version of Enigma in 1932. And in 1938 they invented the bombes: electromechanical rotating drums that, ticking noisily, ran through possible Enigma settings.

Eventually, the trio lacked the resources to proceed further — being on the run after the Nazi invasion of Poland in 1939 certainly didn’t help. However, they were able to turn over all their results to the British, including every decipherment technique glimpsed in the film, as well as perhaps their major contribution: persuading the British to employ mathematicians rather than linguists as code breakers. Turing definitely improved on the Polish efforts, including the bombes, but it is unlikely that efforts at Bletchley Park would have gone as far and as fast without their contributions.

By the way, Enigma was not the only Nazi code in use during WWII. Beginning in 1941, the Germans also used Lorenz, a far more sophisticated cipher than Enigma; cracking it required the first fully electronic computers, the famous Colossus machines, each of which employed several thousand vacuum tubes. But these were the brainchild of Turing’s colleague Tommy Flowers, and Turing played no significant role in their development.

Man and Machine

While the layperson may now be most familiar with Turing’s work on Enigma and the later test that bears his name, Turing’s most important mathematical contribution is his 1936 paper “On Computable Numbers,” which introduced the famous Turing machine.

The question Turing was addressing — inspired by logician Kurt Gödel’s legendary incompleteness theorem (that any consistent mathematical system rich enough to include arithmetic contains statements that can be neither proved nor disproved within it) — was whether an algorithm composed of a finite number of instructions can compute any function to any desired precision. (Turing chose numbers rather than functions for simplicity. Pi, for example, is computable; even the ancient Greeks knew simple algorithms that can churn out pi to any number of decimal places.)
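To make “computable” concrete, here is one such algorithm in exact integer arithmetic. It is not a Greek method but John Machin’s 1706 formula, pi/4 = 4·arctan(1/5) − arctan(1/239); the helper names are ours, and guard digits absorb truncation error.

```python
def arctan_inv(x, one):
    # arctan(1/x) scaled by the big integer `one`, summed from the
    # alternating Taylor series 1/x - 1/(3x^3) + 1/(5x^5) - ...
    total = term = one // x
    x2 = x * x
    n, sign = 1, 1
    while term:
        term //= x2
        n += 2
        sign = -sign
        total += sign * (term // n)
    return total

def pi_digits(digits):
    """Pi to `digits` decimal places, returned as one big integer 3141..."""
    one = 10 ** (digits + 10)                  # ten guard digits
    pi = 4 * (4 * arctan_inv(5, one) - arctan_inv(239, one))
    return pi // 10 ** 10                      # drop the guard digits

print(pi_digits(20))   # -> 314159265358979323846
```

Crank `digits` as high as you like: the loop always terminates, and every decimal place comes out exact — which is precisely what it means for pi to be a computable number.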

As it happens, the question had already been answered by Alonzo Church, who soon became Turing’s graduate adviser at Princeton. But Turing’s version is more widely remembered because Church couched his answer in purely mathematical terms, whereas Turing introduced the notion of a “machine” — a formal model of what we now call an algorithm — that could settle the question. His ideas provided the foundation for modern computer architecture, first realized in machines like the one at Princeton’s Institute for Advanced Study, completed in 1952.
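Turing’s machine is simple enough to simulate in a few lines: a finite table of rules drives a head that reads and writes one symbol at a time on an unbounded tape. The sketch below, with a rule table of our own devising, runs such a machine on the classic task of adding one to a binary number.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Minimal Turing machine: `rules` maps (state, symbol) to
    (new_state, symbol_to_write, head_move), with head_move in {-1, 0, +1}."""
    tape = dict(enumerate(tape))          # sparse tape; blank everywhere else
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = rules[(state, symbol)]
        head += move
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Example rule table (ours): increment a binary number, head starting
# on its leftmost digit.
increment = {
    ("start", "0"): ("start", "0", +1),   # scan right over the digits
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),   # past the right edge; turn around
    ("carry", "1"): ("carry", "0", -1),   # 1 + carry = 10: write 0, carry on
    ("carry", "0"): ("halt",  "1",  0),   # 0 + carry = 1: done
    ("carry", "_"): ("halt",  "1",  0),   # carried past the left edge
}

print(run_turing_machine(increment, "1011"))   # -> "1100"
```

Everything a modern computer does reduces, in principle, to tables and tapes like these — which is why the model still anchors computer science.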

This work has occasionally led the popular media to suggest that Turing invented the computer. Here the copywriters should blush, or their noses grow. The earliest binary machine builders — such as John Atanasoff, George Stibitz and Konrad Zuse, who introduced many of the features incorporated into today’s computers — were working almost simultaneously with Turing and could not have been aware of his results.

What seems universally overlooked in casual conversation is that, much as with Gödel’s theorem, Church and Turing answered the computability question in the negative: Algorithms cannot compute all numbers or functions to arbitrary precision. Indeed, since every algorithm is a finite list of instructions, there are only countably many algorithms, while the functions form an uncountable set; the overwhelming majority of functions cannot be computed algorithmically at all.

In other words, computers — much to the present generation’s surprise — cannot solve all problems.
