Creating a Touch-Screen on a Countertop

By translating hand shapes into computerized images, this system can turn any surface into a touch-screen.

By Gregory Mone
Mar 18, 2013
Graduate students use hand gestures and motions on a simple tabletop to input various commands into the computer at Purdue’s C-Design lab. Infrared depth sensors (in the white bar on the table's far side) measure the position of hands and fingers in 3-D space. An image-processing algorithm tracks and interprets hand gestures and motions over time. | C-Design Lab/Purdue University


The next step in touch-screens may be to ditch the actual screen, according to researchers at Purdue University. Engineers Karthik Ramani and Niklas Elmqvist and their colleagues recently unveiled a projector-based computer input system that can transform desks, office walls, or even kitchen islands into interactive surfaces. 

The system, called extended multitouch, consists of a computer, a projector, and a Microsoft Kinect depth-sensing camera. (Originally built for the Xbox 360, the Kinect lets users interact with games and devices optically, without touching anything.) The projector displays the contents of a typical computer screen onto a surface like your refrigerator or stone countertop, while the Kinect’s infrared laser and receiver estimate the distance to the surface.
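To make the idea concrete, here is a minimal sketch of the first step such a system might take: averaging depth frames of the empty surface to build a per-pixel baseline map. The Purdue team's code is not published in this article, so the get_depth_frame() helper and the NumPy representation below are assumptions for illustration only.

```python
import numpy as np

def estimate_surface_depth(get_depth_frame, num_frames=30):
    """Average depth frames of the empty surface to build a per-pixel
    baseline map of the countertop's distance from the camera.

    get_depth_frame is a hypothetical helper assumed to return one
    Kinect depth image as a 2-D NumPy array of distances in millimeters.
    """
    frames = [get_depth_frame().astype(np.float32) for _ in range(num_frames)]
    return np.mean(frames, axis=0)
```

With that baseline in hand, anything that later appears closer to the camera than the stored surface depth is a candidate hand or finger.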

Any surface transforms into a touch-screen with the extended multitouch system. It can decipher several hands (left), translating their shapes into computerized images (center) and maps of each hand's touchpoints (right). | C-Design Lab/Purdue University

As a user moves his hand into view, the Kinect snaps 30 frames per second, and an image-processing algorithm in the computer tracks and deciphers the changes from frame to frame. “Basically, we know where your hand is and what your hand is doing,” Ramani says. If the user’s finger comes within a few centimeters of the projected image (i.e., the surface), the system interprets this as a touch.
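In code, that touch test could be as simple as comparing each tracked fingertip's depth reading against the baseline surface map. This is a sketch under the assumptions of the previous snippet, not the team's actual algorithm; the fingertip_pixels list stands in for the output of the hand-tracking step.

```python
TOUCH_THRESHOLD_MM = 30  # "within a few centimeters" of the surface

def detect_touches(depth_frame, surface_depth, fingertip_pixels):
    """Return the fingertips close enough to the surface to count as
    touches. fingertip_pixels is a list of (row, col) image coordinates
    produced by the hand-tracking step (hypothetical interface)."""
    touches = []
    for (row, col) in fingertip_pixels:
        # A fingertip hovering above the surface is closer to the camera,
        # so its depth reading is smaller than the baseline at that pixel.
        height_mm = surface_depth[row, col] - depth_frame[row, col]
        if 0 <= height_mm <= TOUCH_THRESHOLD_MM:
            touches.append((row, col))
    return touches
```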

Other researchers have demonstrated technology that can make use of a similar kind of virtual click, but the Purdue system is unique: It can distinguish between thumbs and fingers, pick out a right hand from a left, and even track the hands of several users at once. In one experiment, several subjects drew virtual pictures on a table, and the computer marked each individual’s rendering in a unique color.

The touchpoints of hands are based on computerized snapshots of hand positions. | C-Design Lab/Purdue University

The extended multitouch system can pick up 16 different hand gestures, which could be translated into more complex commands—the equivalent of having a mouse with 16 buttons within easy reach. Elmqvist envisions engineers and architects designing virtual 3-D structures purely with their hands—no mouse clicks or keyboard strokes necessary. 
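One way to picture that 16-button mouse is a dispatch table mapping recognized gesture labels to application commands. The gesture names and bindings below are invented for illustration; the system's actual gesture vocabulary is not listed in this article.

```python
# Hypothetical gesture vocabulary -- the real system recognizes 16
# gestures, but these names and bindings are illustrative only.
GESTURE_COMMANDS = {
    "one_finger_tap": "select",
    "two_finger_pinch": "zoom",
    "flat_palm_swipe": "clear_canvas",
    "fist_rotate": "rotate_model",
}

def dispatch(gesture_label, app):
    """Translate a recognized gesture into an application command,
    much as a mouse button click triggers an action."""
    command = GESTURE_COMMANDS.get(gesture_label)
    if command is not None:
        getattr(app, command)()  # e.g. app.zoom()
```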

Consumers can also benefit from the technology, turning almost any surface into a gigantic iPad; a virtual game of air hockey on the kitchen table could be wiped away to read an interactive edition of the newspaper or recipes on Pinterest. “We’re going through a revolution in the space of humans interacting with physical and virtual things,” says Ramani. “I see a whole new world emerging.”
