“The way we design computers today,” Microsoft researcher Hong Tan says, “it would seem that people only use their eyes.”
Sure, we tap on our device screens, slide our fingertips across the glass, and type on on-screen keyboards. Sometimes, we give voice commands and listen back. Our phones vibrate when a text arrives, and we feel a rumble from a joystick when we play a video game.
But still, it’s almost entirely about the eyes.
Tan believes we have barely begun to engage our other senses, particularly the one in which she specializes: touch. Her field of expertise is computer haptics—developing hardware and software to add tactile sensations to the computing experience.
Imagine experiencing a clicking sensation when pressing an on-screen button, sensing the weight of folders when dragging and dropping, and perhaps even feeling the texture of a sweater for sale online.
“With sight alone, most people are perfectly fine interacting with computing devices today,” Tan says, “but how much more efficiently, how much more enjoyably, can we interact with computers? How much more accessible can we make them? We won’t know until this becomes taken for granted.”
Based at Microsoft Research Asia, Tan recently introduced her work and the field of computer haptics to her largest audience yet via a popular technology column on Sina.com, the largest Chinese-language web portal.
Tan begins with the basic assumption that we are multisensory beings—a perspective she absorbed at the start of her research career. As a graduate student in electrical engineering at the Massachusetts Institute of Technology, she worked with an adviser, Nat Durlach, who was studying psychoacoustics—the psychology of hearing—and performing work on hearing-aid technology.
“He was studying people who were both deaf and blind,” Tan says. “They have a way of communicating by placing their hands on speakers’ faces. They can feel how your mouth opens and closes, the air flow, the muscle tension in the cheeks. They can also pick up laryngeal vibration.”
Tan explains that people who are both deaf and blind also place their hands back on their own faces and try to articulate the same sounds until the sensation in their hands feels similar, much the way Helen Keller learned to talk a century earlier.
“We were really amazed and inspired by that method,” she says. “That’s how I all of a sudden got interested in touch, because I just realized how little we knew about touch, about the richness of touch, and also how much information touch can transmit. Touch can convey speech—that’s a lot of information.”
Almost Magic
Tan spent 13 years on the Purdue University faculty before joining Microsoft Research in 2011 on a leave of absence. She is working in two areas of computer haptics: providing key-click feedback on flat keyboards and creating feelings of texture and traction on glass surfaces using electrovibration.
“The thing that’s really, really cool,” she says, “is to take a smooth piece of glass but make it feel different—it’s almost magic.”
The way users currently interact with device screens is flat and almost entirely visual, Tan notes, rather than a genuinely multisensory experience.
Tan’s vision, and that of the small but growing field of computer haptics, is to “fully engage the sense of touch in user-device interaction.”
Haptics researchers are investigating a range of tactile-feedback mechanisms that involve both hardware and software. Some are based on skin-receptor stimulation, others on muscle-receptor stimulation. The work is heavily interdisciplinary. In the case of vibration feedback, for example, it involves electronic engineering and mechanics, as well as knowledge about human sensitivity to various vibration frequencies.
In the case of haptic keyboards, the key-click feedback is not just symbolic or abstract: It simulates the feeling of pressing a key. One method is to put a layer of material under the glass that bends under electric voltage, which, in turn, bends the glass ever so slightly to simulate a key click.
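To make that mechanism concrete, here is a minimal Python sketch of how such feedback might be wired up in software: a key-down event triggers a brief voltage pulse to the bending layer under the glass. The waveform shape, the sample rate, and the send_to_actuator callback are illustrative assumptions; the article does not describe the drive signals Tan's prototypes actually use.

```python
import numpy as np

SAMPLE_RATE_HZ = 8000       # drive-signal sample rate (illustrative choice)
CLICK_DURATION_S = 0.01     # a key click is a very brief pulse

def key_click_waveform(amplitude_v: float = 1.0) -> np.ndarray:
    """Return a short, decaying pulse to send to the layer under the glass.

    The layer deflects with the applied voltage, so a brief pulse bends the
    glass slightly and lets it relax, which the fingertip perceives as a
    click. The damped-sinusoid shape here is an illustrative assumption,
    not the waveform used in Tan's prototypes.
    """
    t = np.arange(0, CLICK_DURATION_S, 1.0 / SAMPLE_RATE_HZ)
    envelope = np.exp(-t / 0.003)             # fast decay
    carrier = np.sin(2 * np.pi * 500 * t)     # brief transient
    return amplitude_v * envelope * carrier

def on_key_down(key: str, send_to_actuator) -> None:
    """Fire the click waveform whenever the on-screen keyboard registers a press."""
    send_to_actuator(key_click_waveform())

# Example: print the first few samples instead of driving real hardware.
if __name__ == "__main__":
    on_key_down("a", lambda w: print(np.round(w[:5], 3)))
```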
Tan’s Human-Computer Interaction group has developed a number of prototypes that produce haptic keyboard feedback. In a recent research paper, she and Jin Ryong Kim, a graduate student at Purdue, showed that haptic keyboards are far superior to conventional on-screen keyboards in terms of typing speed and accuracy.
Tan’s group is also working on haptic approaches that simulate sticky and smooth sensations. This is accomplished through electrovibration: an alternating voltage applied to the glass surface changes the friction between the fingertip and the screen.
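For readers curious about the physics, the sketch below uses the simplified parallel-plate model commonly used to describe electrovibration: the alternating voltage creates an electrostatic pull on the fingertip, and that pull adds to the finger's own pressure in determining friction. Every parameter value here (contact area, effective gap, friction coefficient, drive amplitude) is a placeholder for illustration, not a measurement from Tan's devices.

```python
import numpy as np

EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_force(voltage_v, area_m2=1e-4, gap_m=50e-6, eps_r=3.0):
    """Parallel-plate estimate of the attraction between fingertip and screen.

    F_e = eps_0 * eps_r * A * V^2 / (2 * d^2). Because the force scales with
    the square of the voltage, a sinusoidal drive modulates friction at twice
    the drive frequency. All parameter values are illustrative placeholders.
    """
    return EPS_0 * eps_r * area_m2 * voltage_v ** 2 / (2.0 * gap_m ** 2)

def apparent_friction(voltage_v, normal_force_n=0.5, mu=0.8, **kwargs):
    """Friction felt by the sliding finger: mu * (finger pressure + electrostatic pull)."""
    return mu * (normal_force_n + electrostatic_force(voltage_v, **kwargs))

if __name__ == "__main__":
    t = np.linspace(0, 0.02, 200)                 # 20 ms of a 100 Hz drive signal
    drive = 200.0 * np.sin(2 * np.pi * 100 * t)   # +/-200 V alternating voltage
    friction = apparent_friction(drive)
    print(f"friction swings between {friction.min():.3f} N and {friction.max():.3f} N")
```

In this toy model, raising or lowering the drive amplitude changes how "sticky" the glass feels under a sliding finger, while switching the voltage off returns the surface to its ordinary smoothness.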
Searching for the Killer App
So how will haptics change the computing experience?
“It’s all up to the imagination right now,” Tan says. “We haven’t really designed anything that has become mainstream enough.”
The field has come a long way since 1996, when Tan earned her Ph.D. and had to explain the term “haptics” in her job talks at various computer-science and engineering departments. That was only a couple of years after the first computer-haptics symposium.
“We’ve made a lot of progress since then, because now everybody has some notion of what haptics is,” she says. “At the time, pursuing a career in haptics research was a huge risk.”
But in many ways, the field is still in its infancy.
“It’s not growing as fast as I would hope,” Tan says. “The burden is on those of us who are working on it to really show the value of the work.”
One group likely to welcome advances in computer haptics: people with sensory impairments. Tan sees tremendous prospects for haptics to help not only people living with sight loss, but everyone.
“My philosophy,” she says, “is to design something that’s so universal—that’s useful to everybody—that everybody benefits.”
Tan cites the example of corner cutouts on sidewalk curbs, which were intended to help wheelchair users but have proven valuable to many other people, including delivery drivers with dollies and parents pushing strollers.
“If we really design multimodally, I think everybody benefits,” she says. “Then it’s truly useful to people with hearing or visual impairments because it’s all built into the same product.”
Tan has started interacting closely with the Accessibility group at Microsoft, which provides her with insights into how blind people read maps, for example, or navigate a space. She also communicates with designers throughout the company on how haptics might help improve everyday interactions between people and their loved ones over a distance.
“I don’t think anybody has the killer app yet,” Tan says. “We’re just trying lots of different things and seeing how people respond to them.”