How Hand Constraints Influence User-Defined Gestures in Mixed Reality
Alexander Barquero, Niriksha Regmi, Oluwatomisin Obajemu, Rohith Verkatakrishnan, Christina Boucher, Lisa Anthony, and Jaime Ruiz
How do user-defined gestures for mixed reality change when users’ hands are engaged in tasks? To address this question, we conducted a gesture elicitation study to understand user preferences and the characteristics of gestures conceptualized in three scenarios with varying levels of hand constraints, namely “both hands free”, “one hand fixed”, and “both hands busy”. We analyzed these gestures across multiple dimensions and compared our findings with those from prior research. Our results indicate that when both hands are occupied, users favor head gestures over those involving other body parts, such as the eyes or legs. Additionally, we found that most of the proposed gestures were metaphorical, with many influenced by legacy bias. These insights deepen our understanding of how hand constraints shape gesture choices in mixed reality scenarios.
Citation
Alexander Barquero, Niriksha Regmi, Oluwatomisin Obajemu, Rohith Verkatakrishnan, Christina Boucher, Lisa Anthony, & Jaime Ruiz. (2025). How Hand Constraints Influence User-Defined Gestures in Mixed Reality. In Proceedings of Graphics Interface 2025.