Computer vision and inertial measurement have made it possible for people to interact with computers using whole-body gestures. Although applications of these systems have grown rapidly, their ubiquity has been limited by the high cost of heavily instrumenting either the environment or the user. In this paper, we use the human body as an antenna for sensing whole-body gestures. This approach requires no instrumentation of the environment and only minimal instrumentation of the user, thus enabling truly mobile applications. We show robust gesture recognition with an average accuracy of 93% across 12 whole-body gestures, as well as promising results for location classification within a building. In addition, we demonstrate a real-time interactive system that allows a user to interact with a computer using whole-body gestures.
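
To make the sensing idea concrete, the sketch below shows one plausible recognition pipeline under this approach: the body, acting as an antenna, couples ambient power-line noise (e.g., 60 Hz and its harmonics) whose spectral signature changes with body pose, and a standard classifier is trained on those spectral features. This is a minimal illustration, not the paper's implementation; the sample rate, signal model, harmonic features, and SVM classifier are all assumptions, and the gesture signals here are synthetic.

```python
# Illustrative sketch (not the paper's implementation): classifying
# whole-body gestures from ambient power-line noise coupled through
# the body. Signal model, features, and classifier are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 4096          # sample rate (Hz), assumed
WINDOW = 1024      # samples per gesture window, assumed
MAINS = 60.0       # power-line fundamental (Hz)

def synth_gesture(label, rng):
    """Synthesize a body-antenna signal: mains noise plus harmonics
    whose amplitudes vary with the (hypothetical) gesture class."""
    t = np.arange(WINDOW) / FS
    sig = np.zeros(WINDOW)
    for h in range(1, 6):                       # first five harmonics
        amp = 1.0 / h + 0.2 * label * h         # gesture-dependent mix
        sig += amp * np.sin(2 * np.pi * MAINS * h * t)
    return sig + 0.3 * rng.standard_normal(WINDOW)  # ambient noise

def features(sig):
    """Windowed FFT magnitude spectrum as the feature vector."""
    return np.abs(np.fft.rfft(sig * np.hanning(len(sig))))

rng = np.random.default_rng(0)
X, y = [], []
for label in range(12):            # 12 gesture classes, as in the paper
    for _ in range(40):            # synthetic examples per class
        X.append(features(synth_gesture(label, rng)))
        y.append(label)

clf = SVC(kernel="linear")
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

In a real system, the feature vector would come from signals measured on the user's body rather than a synthetic model, but the overall structure (spectral features of coupled electrical noise feeding a multi-class classifier) is the kind of pipeline the reported 12-gesture recognition task implies.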