Toward Realistic Hands Gesture Interface: Keeping it Simple for Developers and Machines

  • Kfir Karmon ,
  • Noam Bloom ,
  • Daniel Freedman ,
  • Ilya Gurvich ,
  • Ido Leichter ,
  • Yoni Smolin ,
  • Yuval Tzairi ,
  • Alon Vinnikov ,
  • Aharon Bar Hillel

CHI'17 Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems

Published by ACM

Developing a rich hand-gesture-based interface is currently a tedious process, requiring expertise in computer vision and/or machine learning. We address this problem by introducing a simple language for pose and gesture description, a set of development tools for using it, and an algorithmic pipeline that recognizes it with high accuracy. The language is based on a small set of basic propositions, obtained by applying four predicate types to the fingers and to the palm center: direction, relative location, finger touching, and finger folding state. This enables easy development of a gesture-based interface using coding constructs, gesture definition files, or an editing GUI. The language is recognized from 3D camera input by an algorithmic pipeline composed of multiple classification/regression stages, trained on a large annotated dataset. Our experimental results indicate that the pipeline enables successful gesture recognition at a very low computational load, thus enabling a gesture-based interface on low-end processors.
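To make the idea of composing poses from basic propositions concrete, the following is a minimal sketch of what a developer-facing definition might look like. All class, method, and pose names here are illustrative assumptions for exposition; they are not the actual API described in the paper.

```python
# Hypothetical sketch: a pose as a conjunction of basic propositions built
# from the four predicate types named in the abstract (direction, relative
# location, finger touching, finger folding). Names are assumptions only.
from dataclasses import dataclass, field


@dataclass
class Pose:
    """A pose is a conjunction of basic propositions over fingers and palm."""
    propositions: list = field(default_factory=list)

    def direction(self, part, towards):
        # e.g. the palm pointing toward the camera
        self.propositions.append(("direction", part, towards))
        return self

    def touching(self, finger_a, finger_b):
        # two fingertips in contact
        self.propositions.append(("touching", finger_a, finger_b))
        return self

    def folded(self, finger):
        # a finger in the folded state
        self.propositions.append(("folded", finger))
        return self


# Example: a "pinch" pose -- thumb and index touching, palm facing forward,
# remaining fingers folded.
pinch = (
    Pose()
    .touching("thumb", "index")
    .direction("palm", "forward")
    .folded("middle")
    .folded("ring")
    .folded("pinky")
)
print(len(pinch.propositions))  # 5 propositions
```

A gesture would then be a temporal sequence of such poses; the same propositions could equally be expressed in a definition file or an editing GUI, as the abstract describes.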

Publication Downloads

Tagged Hands Data Set

August 1, 2017

This file contains the datasets accompanying the paper Krupka and Karmon et al. (2017), "Toward Realistic Hands Gesture Interface: Keeping it Simple for Developers and Machines." It includes two datasets:

  • FingersData: 3,500 labeled depth frames of various hand poses.
  • GestureClips: 140 gesture clips, of which 100 contain labeled hand gestures and 40 contain non-gesture activity.