Creating a Touch-Screen on a Countertop

By translating shapes into computerized images, this system can turn any surface into a touch-screen.

By Gregory Mone | Monday, March 18, 2013

Graduate students use hand gestures and motions on a simple tabletop to input various commands into the computer at Purdue’s C-Design lab. Infrared depth sensors (in the white bar on the table's far side) measure the position of hands and fingers in 3-D space. An image-processing algorithm tracks and interprets hand gestures and motions over time.

C-Design Lab/Purdue University

The next step in touch-screens may be to ditch the actual screen, according to researchers at Purdue University. Engineers Karthik Ramani and Niklas Elmqvist and their colleagues recently unveiled a projector-based computer input system that can transform desks, office walls, or even kitchen islands into interactive surfaces. 

The system, called extended multitouch, consists of a computer, a projector, and a Microsoft Kinect depth-sensing camera. (Also used for the Xbox 360, Kinect enables users to interact with games and devices optically, without touching anything.) The projector displays the contents of a typical computer screen onto a surface like your refrigerator or stone countertop, while the Kinect’s infrared laser and receiver estimate the distance to the surface. 
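The setup implies a simple calibration step before any interaction: the system needs a per-pixel estimate of the distance to the bare surface. The sketch below, in Python, is illustrative only; get_depth_frame() is a hypothetical stand-in for whatever Kinect driver supplies depth images (in millimeters), and the Purdue group's actual code is not shown here.

    import numpy as np

    def capture_surface_baseline(get_depth_frame, num_frames=60):
        """Average depth frames of the empty surface into a baseline map.

        get_depth_frame is a hypothetical callable returning one Kinect
        depth image (a 2-D array of distances in millimeters).
        """
        frames = [get_depth_frame().astype(np.float32) for _ in range(num_frames)]
        return np.mean(frames, axis=0)  # per-pixel distance to the bare surface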


Any surface transforms into a touch-screen with the extended multitouch system. It can decipher several hands (left), translating their shapes into computerized images (center) and into maps of each hand's touchpoints (right).

C-Design Lab/Purdue University

As a user moves his hand into view, the Kinect snaps 30 frames per second, and an image-processing algorithm in the computer tracks and deciphers the changes from frame to frame. “Basically, we know where your hand is and what your hand is doing,” Ramani says. If the user’s finger comes within a few centimeters of the projected image (i.e., the surface), the system interprets this as a touch.
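That touch test can be reconstructed roughly as follows. This is a sketch of the idea, not the published algorithm: using the surface baseline from calibration, any pixel whose depth puts it within a few centimeters of the surface is treated as a touch. The thresholds are illustrative.

    import numpy as np

    TOUCH_THRESHOLD_MM = 30   # "within a few centimeters" of the surface
    NOISE_FLOOR_MM = 5        # ignore sensor noise right at the surface

    def find_touch_pixels(depth_frame, baseline):
        """Return a boolean mask of pixels hovering just above the surface."""
        # Depth is distance from the camera, so a fingertip above the
        # surface reads closer (smaller) than the baseline.
        height = baseline - depth_frame.astype(np.float32)
        return (height > NOISE_FLOOR_MM) & (height < TOUCH_THRESHOLD_MM)

Running this test on each of the Kinect's 30 frames per second, and tracking how the resulting touch regions move, gives the frame-to-frame picture of where the hand is and what it is doing.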

Other researchers have demonstrated technology that can make use of a similar kind of virtual click, but the Purdue system is unique: It can distinguish between thumbs and fingers, pick out a right hand from a left, and even track the hands of several users at once. In one experiment, several subjects drew virtual pictures on a table, and the computer marked each individual’s rendering in a unique color.
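The color-coded drawing experiment suggests a simple rendering scheme, sketched here under the assumption that the image-processing stage has already assigned a stable ID to each tracked hand; the function and palette are invented for illustration.

    import numpy as np

    PALETTE = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

    def paint_touches(canvas, touches_by_hand):
        """Stamp each hand's touchpoints onto a shared canvas in its color.

        touches_by_hand maps a tracked hand ID to a list of (row, col)
        touchpoints; hand tracking itself is assumed to happen upstream.
        """
        for hand_id, points in touches_by_hand.items():
            color = PALETTE[hand_id % len(PALETTE)]
            for row, col in points:
                canvas[row, col] = color
        return canvas

A shared canvas matching the camera's resolution, such as np.zeros((480, 640, 3), dtype=np.uint8), would then accumulate each user's strokes in a distinct color.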


The system derives each hand's touchpoints from computerized snapshots of hand positions.

C-Design Lab/Purdue University

The extended multitouch system can pick up 16 different hand gestures, which could be translated into more complex commands—the equivalent of having a mouse with 16 buttons within easy reach. Elmqvist envisions engineers and architects designing virtual 3-D structures purely with their hands—no mouse clicks or keyboard strokes necessary. 
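One way to picture that "16-button mouse" is a dispatch table mapping recognized gestures to commands. The gesture names below are invented for illustration; recognizing them is the system's job, and the bindings here are placeholders.

    # Hypothetical gesture labels; the recognizer would supply these.
    GESTURE_COMMANDS = {
        "pinch":  lambda: print("zoom out"),
        "spread": lambda: print("zoom in"),
        "fist":   lambda: print("grab object"),
        "point":  lambda: print("select"),
        # ...up to 16 distinct gestures in the extended multitouch system
    }

    def handle_gesture(label):
        """Run the command bound to a recognized gesture, if any."""
        command = GESTURE_COMMANDS.get(label)
        if command is not None:
            command()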

Consumers can also benefit from the technology, turning almost any surface into a gigantic iPad; a virtual game of air hockey on the kitchen table could be wiped away to read an interactive edition of the newspaper or recipes on Pinterest. “We’re going through a revolution in the space of humans interacting with physical and virtual things,” says Ramani. “I see a whole new world emerging.”
