Gestures for Mobile Interaction

The Ruiz HCI Lab has led several projects examining novel gesturing paradigms for interacting with smartphones and smartwatches. This work includes motion gestures (i.e., physically moving a device), gestures on the back of a device, and gestures around a device.

Hand motion -- pointing, gesturing, grasping, shaking, tapping -- is a rich channel of communication. We point and gesture while we talk; we grasp tools to extend our capabilities; we grasp, rotate, and shake items to explore them. Yet this rich repertoire of hand motion is largely ignored in interfaces to mobile computation: the user of a modern smartphone generally holds the device stationary while tapping or swiping its surface. Why are so many possible affordances ignored? Certainly not for technical reasons, as smartphones contain an evolving set of sensors for recognizing movement of the phone, including accelerometers, gyroscopes, and cameras. However, beyond rotating the device to change screen orientation or shaking it to shuffle songs, little has been done to enable rich gestural input through device motion. This research project includes our ongoing work in the design, recognition, and characterization of motion gestures to control modern smartphones.
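To illustrate the kind of sensing the paragraph above refers to, here is a minimal sketch of shake detection from raw accelerometer readings. It is not the lab's method; the threshold, sample format, and function names are illustrative assumptions. The idea is simply to flag a shake when the acceleration magnitude, after removing gravity, stays high for several consecutive samples.

```python
from math import sqrt

GRAVITY = 9.81           # m/s^2; assumes readings include the gravity vector
SHAKE_THRESHOLD = 12.0   # hypothetical net-acceleration threshold (m/s^2)
MIN_SHAKE_EVENTS = 3     # consecutive high-magnitude samples required

def detect_shake(samples, threshold=SHAKE_THRESHOLD, min_events=MIN_SHAKE_EVENTS):
    """Return True if a stream of (x, y, z) accelerometer samples,
    in m/s^2, contains a shake-like burst of motion."""
    count = 0
    for x, y, z in samples:
        # Net acceleration beyond the steady pull of gravity.
        magnitude = abs(sqrt(x * x + y * y + z * z) - GRAVITY)
        if magnitude > threshold:
            count += 1
            if count >= min_events:
                return True
        else:
            count = 0  # burst interrupted; start over
    return False
```

Real recognizers for richer motion-gesture vocabularies go well beyond thresholding (e.g., filtering, segmentation, and classification), but this captures the basic pipeline from sensor samples to a recognized event.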

Exploring User-Defined Back-Of-Device Gestures for Mobile Devices
Shaikh Shawon Arefin Shimon, Sarah Morrison-Smith, Noah John, Ghazal Fahimi, and Jaime Ruiz. 2015. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '15). ACM, New York, NY, USA, 227-232.

Exploring Non-touchscreen Gestures for Smartwatches
Shaikh Shawon Arefin Shimon, Courtney Lutton, Zichun Xu, Sarah Morrison-Smith, Christina Boucher, and Jaime Ruiz. 2016. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 3822-3833.