Steve Hodges and his colleagues in the Sensors and Devices group at Microsoft Research Cambridge spend their time pursuing novel sensing technologies and new devices that make it easier for people to interact with computer systems and digital content.
The team’s successes have been many, and among the most notable have been SenseCam—a wearable camera that takes photos automatically, thereby enabling users to review a series of snapshots and recall events as they transpired—and .NET Gadgeteer, a rapid prototyping platform for small electronic gadgets and embedded hardware devices.
Now, these creative researchers have unveiled their latest concept in a note presented during the Association for Computing Machinery’s 2013 SIGCHI Conference on Human Factors in Computing Systems, being held in Paris through May 2.
The note was co-written by Norman Pohl of the University of Stuttgart; Hodges and his Microsoft Research Cambridge colleagues Nicolas Villar and John Helmes; and Tim Paek of Microsoft Research Redmond.
“We interact with digital content more and more—such as electronic diaries, emails, traffic status, and weather info,” says Hodges, principal hardware engineer at the Cambridge lab. “But even if you have your mobile phone in your pocket, it can be a pain to interact with this content in many cases.
“The badge is always on hand and lets you navigate to the content you want simply by moving it to the right place relative to your body, using your spatial muscle memory. It’s much easier for quick ‘snacking’ on small amounts of digital content.”
The lightweight, interactive badge prototype includes an embedded LCD that presents dynamic information to the wearer. Sensing-based input capabilities are built into the badge’s retractable string, enabling single-hand interaction.
“When you pull the badge away from your body,” Hodges explains, “the sensor detects how far the badge is pulled out and at what angle, enabling the system to know where the badge is in relation to your body. Depending on this location, it displays different content. If the content to be displayed is too big to fit on the badge display, it’s possible to pan around it.”
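To make that idea concrete, here is a minimal sketch, in Python, of how a pull distance and angle from the retractable string might be mapped to on-badge content and a panning offset. The zone boundaries, content names, and pixel-scaling constants are illustrative assumptions, not details from the researchers’ note.

```python
# Hypothetical sketch (not the authors' implementation): map the badge's
# retractable-string readings -- pull distance and angle relative to the
# body -- to a content "zone", and pan within a zone if the content is
# larger than the badge display.

# Assumed layout: (min_distance_cm, max_distance_cm,
#                  min_angle_deg, max_angle_deg, content_name)
ZONES = [
    (5, 20, -45, 45, "clock"),
    (20, 40, -45, 45, "calendar"),
    (5, 40, 45, 135, "weather"),
    (5, 40, -135, -45, "email summary"),
]

def select_content(distance_cm: float, angle_deg: float) -> str:
    """Return the content zone the badge currently sits in, if any."""
    for d_min, d_max, a_min, a_max, name in ZONES:
        if d_min <= distance_cm < d_max and a_min <= angle_deg < a_max:
            return name
    return "default badge face"

def pan_offset(distance_cm: float, angle_deg: float,
               zone_d_min: float, zone_a_min: float,
               px_per_cm: int = 20, px_per_deg: int = 4):
    """Pan across oversized content: the offset grows as the badge moves
    farther past the near edge of the current zone."""
    dx = (angle_deg - zone_a_min) * px_per_deg
    dy = (distance_cm - zone_d_min) * px_per_cm
    return dx, dy

if __name__ == "__main__":
    # Example: badge pulled 25 cm out, roughly straight ahead.
    print(select_content(25, 10))       # -> "calendar"
    print(pan_offset(25, 10, 20, -45))  # -> (220, 100)
```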
Paek, in particular, served as a catalyst for moving the research project to where it is today.
“Our work on augmented-reality systems—like HoloDesk and the mobile projector,” Hodges says, “got us thinking about a lightweight display which could sense its location relative to the body and which could act like a ‘lens’ onto virtual digital content.
“But it was a conversation with Tim that actually spurred us to turn all these thoughts into a full-fledged research project, when we realized that his vision of displaying automatically mined context information on a low-power, wearable device like a badge was similar to ours.”
Of course, the exploration of a new, intriguing research project has its own attractions.
“This research still has many open questions on design, form factor, and content to be explored, but the challenge of working on technology that has promising, unexplored potential is what makes this exciting,” he enthused. “So many people already wear a badge on a regular basis—in offices, hospitals, or schools and universities—yet they are currently just pieces of plastic with static images on them.
“Let’s turn them into interactive devices!”