This past summer, at an undisclosed location in a northeastern metropolis, the U.S. Department of Homeland Security (DHS) was trying to predict the future. There were no psychics or crystal balls, just a battery of sensors designed to determine human intention through the subtlest of changes in heart rate, gaze, and other physiological markers.
Together, the sensors are called Future Attribute Screening Technology, or FAST, a $20 million federal project that aims to highlight airport passengers whose bodies betray hostile intentions. In theory, FAST has the potential to detect terrorists in the final minutes before they act, but critics warn that the system may have other consequences, such as flagging innocent travelers through false positives while letting some with ill intent sneak by through false negatives. The DHS, for its part, maintains that FAST is merely improving on a far older and more fallible crime predictor: human judgment.
About 3,000 DHS officers already roam the nation’s airports scanning for suspicious behavior and facial expressions in a program called Screening of Passengers by Observation Techniques, or SPOT. The automated FAST system is intended to supplement SPOT by catching signals that are undetectable to the naked eye. FAST is not designed to replace the decision-making of human screeners, but government officials hope it will eventually be able to passively scan airport passengers and single out those worth pulling aside for additional screening.
In recent trials, DHS recruited subjects and had them attend a mock event, such as a technology expo. Some of the subjects, chosen at random, were asked to perform an objectionable action at the event—not bring in a bomb, obviously, but perhaps steal a CD. Before entering the expo, subjects reported to a kiosk containing a suite of body sensors, each able to take precise measurements from about 20 feet away: a cardiovascular and respiratory sensor measured heart rate and breathing, an eye tracker followed gaze and position of the eyes, thermal cameras measured heat on the face, and floor sensors and a high-resolution video system tracked body movement.
That first round of measurements is an essential step, says FAST program manager Robert Middleton, since it ensures that individuals are measured against their own baseline rather than some universal standard of agitation. “The system was designed this way from the beginning to avoid simply identifying individuals who enter screening already anxious or angry,” he says.
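The per-subject baseline idea can be illustrated with a simple anomaly check: flag only readings that deviate sharply from that same person's resting measurements, not from a population-wide norm. This is a hypothetical sketch with invented numbers and thresholds, not DHS's actual algorithm:

```python
from statistics import mean, stdev

def baseline_zscores(baseline, readings):
    """Score each later reading by how many standard deviations
    it sits from the subject's own baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [(x - mu) / sigma for x in readings]

def flag(baseline, readings, threshold=3.0):
    """Flag readings that stray far from the personal baseline.
    The threshold of 3 standard deviations is illustrative."""
    return [z > threshold for z in baseline_zscores(baseline, readings)]

# A resting heart-rate baseline (beats per minute), then readings
# taken during questioning. All values are invented for illustration.
calm = [64, 66, 65, 63, 67, 65]
during_interview = [66, 68, 90]
print(flag(calm, during_interview))  # only the sharp spike is flagged
```

A naturally jittery subject would have a wider baseline spread, so the same absolute spike would produce a smaller z-score—which is the point of measuring each person against themselves.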
After the baseline was established, volunteers were then asked a series of questions ranging from innocuous (“Have you been in the area all day?”) to direct (“Are you planning to commit a crime?”). The interview acts like a stimulus: Theoretically, it should trigger a more robust physiological response from conspirators than from innocent passengers. John Verrico, the DHS spokesman for the project, acknowledges the impracticality of a screening system that relies on interrogation but suggests commercial versions will be better. Ultimately, he says, the measurements would be taken in a process akin to passing through a metal detector, and only people with suspicious vital signs would be taken aside for questioning.
DHS’s faith in the technology is based on the controversial theory of malintent, developed in 2007 by clinical psychologist and FAST research consultant Daniel Martin. By combining ideas from neuroscience, psychophysiology, psychology, and counterterrorism, Martin concluded that the physiological signs of a future hostile actor would increase with the severity of the impending act and as the moment of the crime approaches. If so, a terrorist who plans to blow up a plane in an hour should be easier to detect than a man who plans to cheat on his wife during a business trip. Martin also concluded that the physiological signs, such as heart rate and skin temperature, would be too minute to manipulate, eliminating the possibility that terrorists would outsmart the system. “The system analyzes responses that people have little or no control over,” he claims. “And even if someone can avoid detection on one sensor, it is unlikely he can avoid detection on all.”
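Martin's multi-sensor argument rests on a simple probability point: if each sensor had some independent chance of catching a deceiver, the odds of evading all of them at once shrink multiplicatively. A back-of-the-envelope sketch—the per-sensor detection rates are invented, and real physiological sensors are unlikely to be truly independent:

```python
def evasion_probability(detection_rates):
    """Chance of slipping past every sensor, assuming each one
    detects independently with its own probability."""
    p_evade = 1.0
    for p in detection_rates:
        p_evade *= (1.0 - p)
    return p_evade

# Invented per-sensor detection rates: cardiovascular/respiratory,
# eye tracker, thermal camera, floor/video movement tracking.
rates = [0.5, 0.4, 0.3, 0.3]
print(evasion_probability(rates))  # roughly 0.15
```

The same arithmetic cuts the other way for critics: independent false-positive rates also compound, so a stack of imperfect sensors can flag many innocent travelers.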