Hinckley Paper Makes Lasting Impact

Posted by Rob Knies

Can you remember details about what you were doing 10 years ago? Ken Hinckley certainly can.

In 2000, he and co-authors Jeff Pierce, Mike Sinclair, and Eric Horvitz had a paper called Sensing Techniques for Mobile Interaction accepted for the 13th annual Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology (UIST). In fact, the paper won the conference’s Best Paper Award that year.

That alone would make it memorable, but the stream of research leading to the paper gained fresh acclaim last month when, during UIST 2011, the paper received the Lasting Impact Award, given to papers at least 10 years old that have been the most influential since publication.

“It was with great pleasure that I attended the 2011 24th annual ACM UIST Symposium,” Hinckley wrote in a subsequent blog post, “and received a Lasting Impact Award, presented to me by Stanford professor Dr. Scott Klemmer, for the contributions of our paper.”

The award’s citation reads: “Awarded for its scientific exploration of mobile interaction, investigating new interaction techniques for handheld mobile devices supported by hardware sensors, and laying the groundwork for new research and industrial applications.”

Hinckley, now a principal researcher at Microsoft Research Redmond, generously allowed me to excerpt portions of his blog post, which proved illuminating, as did the video, also called Sensing Techniques for Mobile Interaction.

He and his colleagues had been exploring various ways to use sensors to make mobile devices smarter, and while thinking about screen orientation, they devised the idea of using an accelerometer to reorient the display automatically based on how the device was held. The co-authors also showed how the accelerometer could be used to provide new kinds of “physicality” that merged the digital and real worlds in new ways—for example, enabling users to tilt a device in different ways to move a digital pinball through a maze, just as if the pinball were on a surface being tilted by the user.
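
To make that tilt-based “physicality” concrete, here is a minimal sketch of the tilt-to-roll interaction in Python. The read_accelerometer stub, the gain and friction constants, and the update loop are all assumptions made for illustration, not the paper’s implementation:

```python
import time

def read_accelerometer():
    """Stand-in for a real sensor read: returns the gravity components
    along the device's x and y axes, in g units. A steady rightward
    tilt is simulated here."""
    return 0.2, 0.0

def roll_marble(steps=100, dt=0.02, friction=0.98, gain=500.0):
    """Advance a digital marble by treating device tilt as gravity,
    in the spirit of the paper's tilt-a-maze demonstration."""
    x = y = vx = vy = 0.0  # position (pixels) and velocity (pixels/s)
    for _ in range(steps):
        ax, ay = read_accelerometer()
        vx = (vx + gain * ax * dt) * friction  # tilt accelerates the marble
        vy = (vy + gain * ay * dt) * friction  # friction slowly damps it
        x += vx * dt
        y += vy * dt
        # (maze-wall collisions and screen rendering omitted)
        time.sleep(dt)
    return x, y

if __name__ == "__main__":
    print(roll_marble())
```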

“The accelerometer gave us a constant signal of ‘this way up,’” he explains, “and, at some point, we realized it would make a great way to switch between portrait and landscape display formats without any need for buttons or menus—or, indeed, without even explicitly having to think about the interaction at all. … The user could simply move the device to the desired orientation, and our sensors and our software would automatically optimize the display accordingly.”
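
In code, that “this way up” signal might be consumed roughly as follows. This is a minimal sketch rather than the paper’s algorithm: the classify_orientation function, the g-unit inputs, and the deadband around the 45-degree boundary are assumptions for illustration (the deadband stands in for hysteresis, keeping the display from flickering when the device is held diagonally):

```python
import math

def classify_orientation(ax, ay, previous, deadband=10.0):
    """Choose portrait vs. landscape from the gravity vector.
    ax, ay: gravity components along the device's x and y axes (g units).
    Readings near the 45-degree boundary keep the previous format."""
    angle = abs(math.degrees(math.atan2(ax, ay)))  # 0 = upright portrait
    if angle < 45.0 - deadband:
        return "portrait"
    if angle > 45.0 + deadband:
        return "landscape"
    return previous  # inside the deadband: keep the current format

# Example: device held upright, then rotated onto its side.
state = "portrait"
for ax, ay in [(0.05, 0.99), (0.60, 0.70), (0.98, 0.10)]:
    state = classify_orientation(ax, ay, state)
    print(state)  # portrait, portrait (deadband), landscape
```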

In the paper, Hinckley and his co-authors also showed how the accelerometer, along with onboard infrared and touch sensors, could give mobile devices new perceptual abilities, enabling them to detect human activities. For example, they showed how sequences of signals from multiple sensors could be interpreted to infer that a user was picking up the device or wished to record a voice message.
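
A rule of that kind can be sketched as a predicate over simultaneous sensor readings. Everything here is hypothetical for illustration, including the wants_voice_memo helper, the sensor names, and the thresholds; the paper’s actual inference combined sequences of signals over time rather than a single snapshot:

```python
def wants_voice_memo(holding, proximity_cm, tilt_deg):
    """Infer intent to record a voice memo from combined signals:
    the user is gripping the device (touch sensor), holding it close
    to the face (infrared proximity), and tilting it toward the
    mouth (accelerometer)."""
    return holding and proximity_cm < 8.0 and 20.0 <= tilt_deg <= 80.0

# Example sensor snapshots: (touch, proximity in cm, tilt in degrees)
samples = [
    (False, 40.0, 5.0),   # lying flat on a desk -> no
    (True,  30.0, 10.0),  # picked up but held away -> no
    (True,   5.0, 45.0),  # gripped, near the face, tilted -> record
]
for touch, prox, tilt in samples:
    print(wants_voice_memo(touch, prox, tilt))
```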

It took about five years before such concepts began to reach the marketplace, demonstrating how far ahead of its time the paper was. Today, numerous handheld devices include accelerometers and other sensing capabilities, underscoring the paper’s ongoing utility. The perceptual competencies that Hinckley and his co-authors first created and demonstrated to the world have become all but expected behaviors of mobile devices.

“We had a total blast working on the sensing mobile device,” recalls Horvitz, a Microsoft distinguished scientist and deputy managing director of Microsoft Research Redmond. “I remember how magical it felt to pick up the device and to play a game with a little digital marble simply by tilting the device to and fro, or to scroll an Excel document by tilting the device. It was pretty clear that the unsightly prototype—with all of that duct tape and glue holding the sensors and microprocessors together—would soon evolve into an elegant mobile-computing platform.

“I felt as if the team was inventing the future—and that would soon be shared by many.”

Pierce, who in 2000 held a Microsoft Research Graduate Fellowship, worked on the software infrastructure and contributed many ideas and nuances to the project. Sinclair planted the seed of thinking about accelerometers and helped build the sensing hardware. And Horvitz helped shape the effort through ongoing collaboration and brainstorming on directions and possibilities.

“It took about five years longer than I expected,” Hinckley concludes, “but we have finally reached an age where clever uses of sensors abound.”