Criminals should start investing their ill-gotten gains in a new pair of kicks. That’s because computer scientists have just gotten a lot better at linking the prints of well-worn shoes back to their owners.
With each step you take, you not only leave an impression on your surroundings, you also alter the sole of the shoe itself. That can be simple wear on the tread caused by your weight and walking gait, or it can be something as distinctive as a slash or a puncture.
Unlike fingerprint analysis, however, most footwear forensics is still done manually, and with thousands of tread patterns on the market, that's an eye-straining job. Basically, an investigator sits in a dark room wading through thousands of images of soles, hunting for similarities between, say, the bloody shoe prints photographed at one crime scene and those from another.
Now, computer vision researcher Yi Tang and his colleagues at the University at Buffalo in Amherst, N.Y., have developed a way to automate this process. Tang's system detects the geometric features of a shoe print, such as a tread pattern's ellipses and lines, and records their positions relative to one another. Potential matches are sorted by similarity, so forensic experts need only review a handful of candidate prints by hand to identify the culprit.
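To make the general idea concrete, here is a minimal sketch in Python of that kind of pipeline. It is not Tang's actual algorithm: it assumes each print has already been reduced to a set of detected feature centers (ellipses, line segments), captures their relative geometry as pairwise distances, and ranks the database prints by how closely that geometry agrees with the crime-scene print. All feature coordinates and print names below are hypothetical.

```python
# Illustrative sketch only -- assumed feature representation, not the published method.
from itertools import combinations
import math

def pairwise_distances(features):
    """Relative geometry of a print: sorted distances between every feature pair."""
    return sorted(math.dist(a, b) for a, b in combinations(features, 2))

def similarity(query, candidate):
    """Crude similarity: compare the two sorted distance lists element by element."""
    dq, dc = pairwise_distances(query), pairwise_distances(candidate)
    n = min(len(dq), len(dc))
    if n == 0:
        return 0.0
    error = sum(abs(a - b) for a, b in zip(dq[:n], dc[:n])) / n
    return 1.0 / (1.0 + error)   # 1.0 means identical relative geometry

def rank_database(query, database):
    """Sort known prints so the most similar ones appear first for expert review."""
    return sorted(database.items(),
                  key=lambda item: similarity(query, item[1]),
                  reverse=True)

# Hypothetical feature centers (in millimeters) for a crime-scene print
# and three known tread patterns.
crime_scene = [(10, 12), (40, 15), (25, 60), (30, 90)]
known_prints = {
    "brand_A_model_1": [(11, 13), (41, 14), (24, 61), (31, 89)],  # near match
    "brand_B_model_7": [(5, 5), (80, 10), (50, 70), (20, 95)],
    "brand_C_model_3": [(12, 40), (60, 42), (33, 70), (45, 20)],
}

for name, _ in rank_database(crime_scene, known_prints):
    print(name)   # near match should be listed first
```

Ranking by relative positions rather than raw pixels is what lets a system like this tolerate partial, smeared, or rotated prints: the distances between tread features survive even when the overall impression is messy.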
To test the system's reliability, Tang took 300 real crime-scene prints and tried to match each against a database of 2,660 known prints. In 99 percent of cases, the correct match appeared within the top 5 percent of results. Tang says the principles underlying his method could also be applied to other forensic problems, such as analyzing handwriting or car tire treads.
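A quick back-of-the-envelope sketch of what that figure means, assuming the metric is the rank of the true match in the sorted results: the top 5 percent of a 2,660-print database is a shortlist of about 133 candidates, and roughly 297 of the 300 queries would have to land inside it. The ranks below are made up for illustration.

```python
# Assumed top-5-percent hit-rate metric, reconstructed from the numbers in the text.
def top_fraction_hit_rate(ranks, database_size=2660, fraction=0.05):
    """ranks: 1-based rank of the correct match for each query print."""
    cutoff = int(database_size * fraction)      # 5% of 2,660 -> 133 prints
    hits = sum(1 for r in ranks if r <= cutoff)
    return hits / len(ranks)

# Hypothetical ranks for 300 crime-scene queries: 297 land in the shortlist.
example_ranks = [3] * 297 + [500] * 3
print(top_fraction_hit_rate(example_ranks))     # -> 0.99
```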
[This article originally appeared in print as "The Shoe Fits the Crime."]