“Press F1 for help.” When you’re working in a computer program and you get stuck, you can press a button and a handy guide pops up alongside whatever you’re puzzling over to clue you in. Wouldn’t it be great if real life worked that way? Out on the town, you could look at a restaurant, press a button, and get advice on whether the staff is surly or what wine to order. Motorola Labs, the research arm of the Schaumburg, Illinois, telecommunications giant, is developing a method to make that possible. The key is augmented reality (AR), a technology that combines stored digital data with information from the real world. A car navigation system is a simple form of AR: it works out your physical location using the Global Positioning System (GPS), consults a map database, figures out a route to your destination, and provides a constant stream of directions to guide you.
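In software terms, that navigation loop is nothing exotic. The sketch below, in Python, shows the basic pattern of layering stored data onto a live position; the gps and map_db objects are made-up stand-ins for a real receiver and map database, not any vendor's actual code.

```python
# Illustrative sketch of car navigation as a simple form of augmented reality.
# The gps and map_db objects are hypothetical stand-ins, not a real API.

def guide_driver(destination, gps, map_db):
    # Stored digital data: the map and the planned route.
    route = map_db.plan_route(gps.position(), destination)
    # Real-world data: the car's current GPS fix, checked continuously.
    while not route.reached(gps.position()):
        yield route.next_instruction(gps.position())  # the "augmentation"
```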
Motorola is taking the next step by combining MySpace-style social networking with your cell phone’s camera and display. The idea is that you’ll be able to snap a picture of, say, a neighborhood bar counter with your cell phone and send it to Motorola’s service (called MARMS, for Mobile Augmented Reality Messaging System). The phone will attach data about where it is and which way its camera is pointed. MARMS will send back your bar-scene photo with new, computer-generated objects pasted into it. For example, a foaming beer stein could appear sitting on the counter (as depicted in the simulated image to the left), or a brightly colored banner might materialize over the jukebox.
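Motorola hasn't published the format of those messages, but conceptually the round trip between phone and service looks something like the sketch below; the class and field names are invented for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MarmsRequest:
    photo_jpeg: bytes      # the picture snapped by the phone's camera
    latitude: float        # GPS fix of the phone, in degrees
    longitude: float
    heading_deg: float     # compass direction the camera was pointing

@dataclass
class MarmsResponse:
    annotated_jpeg: bytes  # the same scene with virtual objects pasted in
    object_ids: List[str]  # one clickable region per object, each tied to a stored note
```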
These objects are visual pointers to notes that members of your online social network left during earlier visits to that bar. Clicking on the beer stein could call up a note recommending a particular microbrew—or a warning about bad service. The jukebox banner might be tied to video from a friend’s party. You could then continue the process, adding an object to the scene and linking it to a note of your own. In essence, MARMS turns a cell phone into a mobile window on a virtual world filled with a rich collection of location-specific comments, video clips, and more. Motorola believes a basic version of the cell phone camera system could be available as soon as 2010.
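Behind each pasted-in object, then, sits little more than a record tying a place and an icon to someone's note. A plausible shape for such a record is sketched below; the fields are assumptions for illustration, not Motorola's actual schema.

```python
from dataclasses import dataclass

@dataclass
class VirtualTag:
    latitude: float    # where the object was planted
    longitude: float
    icon: str          # e.g. "beer_stein" or "banner": what gets drawn into the photo
    author: str        # the friend in your social network who left it
    note_id: str       # key to the comment, photo, or video clip stored on the server
```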
How It Works
MARMS uses GPS-based technology to get a fix on a cell phone’s location. Additional sensors built into the phone, such as a compass, determine which way the camera is pointing when it takes a photograph. The location and orientation data are sent, along with the image, to a central server, which then checks its catalog of virtual objects for any associated with the photographed location.
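How the server narrows its catalog to the photographed spot isn't specified, but a simple distance filter over the tags' coordinates would do the job. The sketch below assumes the VirtualTag records from the earlier example and a hypothetical search radius.

```python
import math

EARTH_RADIUS_M = 6_371_000

def tags_near(catalog, lat, lon, radius_m=50.0):
    """Return catalogued tags within radius_m metres of the phone's GPS fix.

    Uses a flat-earth (equirectangular) approximation, which is fine
    over the tens of metres that matter here.
    """
    nearby = []
    for tag in catalog:
        dx = math.radians(tag.longitude - lon) * math.cos(math.radians(lat)) * EARTH_RADIUS_M
        dy = math.radians(tag.latitude - lat) * EARTH_RADIUS_M
        if math.hypot(dx, dy) <= radius_m:
            nearby.append(tag)
    return nearby
```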
These virtual objects are entered into the system by other users, who create them with a MARMS-enabled phone when they visit a location in person; they can also add objects later through a map-based website. The objects act as markers, or tags, linking to location-specific messages that are also stored on the central server. If any objects have geographic coordinates that fall within the camera’s field of view, the server superimposes them on the picture from the phone, much as the director of a science fiction movie might use special effects to add a spacecraft to a filmed scene. The server then sends the modified image, with its overlay of virtual objects, back to the cell phone. Clicking on one of these objects calls up the specific snippet of information tied to that tag.
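The field-of-view test itself amounts to comparing the compass bearing from the phone to each nearby tag against the direction the camera is facing. A minimal version of that check is sketched below, assuming a typical camera field of view of about 55 degrees; the real system's geometry has not been published.

```python
import math

def in_field_of_view(phone_lat, phone_lon, heading_deg, tag, fov_deg=55.0):
    """True if the tag's bearing from the phone falls within the camera's view cone."""
    # Bearing from the phone to the tag, measured clockwise from north.
    dx = math.radians(tag.longitude - phone_lon) * math.cos(math.radians(phone_lat))
    dy = math.radians(tag.latitude - phone_lat)
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    # Signed angular difference between that bearing and the camera heading.
    offset = (bearing - heading_deg + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2
```

Tags that pass both the distance filter and this view-cone test would be the ones the server draws into the photo it returns.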
[Photo: This car navigation system not only speaks to you but can also listen to your spoken requests for directions to things like the nearest fuel station or hospital.]