May 18, 2015
Opening the way for new applications of smart devices, Dartmouth researchers have created the first form of real-time communication that allows screens and cameras to talk to each other without the user knowing it.
Using off-the-shelf smart devices, the new system supports an unobtrusive, flexible and lightweight communication channel between screens (of TVs, laptops, tablets, smartphones and other electronic devices) and cameras. The system, called HiLight, will enable new context-aware applications for smart devices. Such applications include smart glasses communicating with screens to realize augmented reality or acquire personalized information without affecting the content that users are currently viewing. The system also provides far-reaching implications for new security and graphics applications.
The findings will be presented May 20 at ACM MobiSys '15, a top conference on mobile systems, applications and services. A PDF of the study, further information and demonstration videos are available at the HiLight project website.
In a world of ever-increasing smart devices, enabling screens and cameras to communicate has been attracting growing interest. The idea is simple: information is encoded into a visual frame shown on a screen, and any camera-equipped device can turn to the screen and immediately fetch the information. Operating on the visible light spectrum band, screen-camera communication is free of electromagnetic interference, offering a promising alternative for acquiring short-range information. But these efforts commonly require displaying visible coded images, which interfere with the content the screen is playing and create unpleasant viewing experiences.
The Dartmouth team studied how to enable screens and cameras to communicate without showing any coded images such as QR codes, the machine-readable barcodes commonly scanned by mobile phones. In the HiLight system, screens display content as they normally do, and the content can change as users interact with the screens. At the same time, screens transmit dynamic data behind the scenes, unobtrusively and in real time, to any device equipped with a camera.
HiLight supports communication atop any screen content, such as an image, movie, video clip, game, web page or any other application window, so that camera-equipped devices can fetch the data by turning their cameras to the screen. HiLight leverages the alpha channel, a well-known concept in computer graphics, to encode bits into changes in pixel translucency. By removing the need to directly modify pixel color values, HiLight overcomes the key bottleneck of existing designs: it decouples the communication layer from the screen-content image layer.
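As a rough illustration of the alpha-channel idea, the Python sketch below hides one bit per grid cell by compositing a nearly transparent overlay whose translucency differs slightly between a 0 and a 1 bit. This is not HiLight's actual encoding scheme (which is described in the paper); the grid size, the alpha values, and the decoder's access to the unmodified frame are all simplifying assumptions made for this example.

import numpy as np

# Toy sketch only: not the HiLight protocol, just the general idea of hiding
# bits in small alpha (translucency) changes layered over screen content.
# GRID, the alpha values, and the decoder's access to the original frame are
# assumptions made for illustration.

ALPHA_0 = 0.00   # overlay translucency used for a 0 bit (assumed)
ALPHA_1 = 0.02   # overlay translucency used for a 1 bit (assumed, near-imperceptible)
GRID = 8         # one bit per cell of an 8 x 8 grid over the frame (assumed)

def encode_bits(frame, bits):
    """Composite a near-invisible white overlay whose per-cell alpha carries bits.
    frame: float array (H, W, 3) with values in [0, 1]; bits: GRID*GRID values of 0/1."""
    h, w, _ = frame.shape
    ch, cw = h // GRID, w // GRID
    out = frame.copy()
    for idx, bit in enumerate(bits):
        r, c = divmod(idx, GRID)
        alpha = ALPHA_1 if bit else ALPHA_0
        ys, xs = slice(r * ch, (r + 1) * ch), slice(c * cw, (c + 1) * cw)
        # Standard alpha blending against a white overlay layer:
        out[ys, xs] = alpha * 1.0 + (1 - alpha) * out[ys, xs]
    return out

def decode_bits(original, received):
    """Recover bits by comparing per-cell brightness with the unmodified frame
    (a simplification: a real receiver only sees what its camera captures)."""
    h, w, _ = original.shape
    ch, cw = h // GRID, w // GRID
    bits = []
    for r in range(GRID):
        for c in range(GRID):
            ys, xs = slice(r * ch, (r + 1) * ch), slice(c * cw, (c + 1) * cw)
            delta = (received[ys, xs] - original[ys, xs]).mean()
            expected = ALPHA_1 * (1.0 - original[ys, xs]).mean()
            bits.append(1 if delta > expected / 2 else 0)
    return bits

# Example: the screen content stays visually unchanged, yet 64 bits ride on top of it.
rng = np.random.default_rng(0)
frame = rng.uniform(0.2, 0.8, size=(480, 640, 3))     # stand-in for screen content
payload = rng.integers(0, 2, size=GRID * GRID).tolist()
assert decode_bits(frame, encode_bits(frame, payload)) == payload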
“Our work provides an additional way for devices to communicate with one another without sacrificing their original functionality,” says senior author Xia Zhou, an assistant professor of computer science and co-director of the DartNets (Dartmouth Networking and Ubiquitous Systems) Lab. “It works on off-the-shelf smart devices. Existing screen-camera work either requires showing coded images obtrusively or cannot support arbitrary screen content that can be generated on the fly. Our work advances the state-of-the-art by pushing screen-camera communication to the maximal flexibility.”
Assistant Professor Xia Zhou is available to comment at Xia.Zhou@dartmouth.edu.