The main research agenda of the Ruiz HCI Lab at UF is to establish an understanding of users in order to develop technology that supports user interaction. As such, we have a wide variety of current projects.
A main focus of the Ruiz HCI Lab's research has been exploring the use of gestures in various types of interactions. Projects in this area have examined gestures in mobile interaction, interaction with wearable computers, and large-display interaction.
To support research on gesture-based interactions, part of our work focuses on developing tools for gestural analysis. This includes tools that assist in the annotation, analysis, and visualization of gesture data, with an emphasis on automating part of the work and thus leveraging both machine and human capabilities to complete tasks.
The Ruiz HCI Lab has undertaken several projects examining novel gesturing paradigms for interacting with smartphones and smartwatches. This includes our work on motion gestures (i.e., physically moving the device), gestures on the back of a device, and gestures around a device.
Augmented Reality (AR) keeps users situated in reality while allowing them to interact with virtual objects. It therefore has the potential to increase users’ situational awareness by providing a secondary channel of information overlaid on the real world. Our work focuses on how information should be designed and presented in AR headsets to increase users’ situational awareness.
Personal assistants such as Siri have changed the way people interact with computers by introducing virtual assistants that collaborate with humans through natural speech-based interfaces. However, relying on speech alone as the medium of communication can be a limitation; non-verbal aspects of communication also play a vital role in natural human discourse. It is therefore necessary to identify how gesture and other non-verbal cues are used so that they can be applied to the development of computer systems. The goal of this project is to understand how systems can model non-verbal communication in order to better mimic human-human communication.
In collaboration with Dr. Lisa Anthony's INIT Lab, this work explores the challenges children face when interacting with Natural User Interfaces (NUIs), i.e., interfaces that use touch, speech, and gesture. Our goal is to build better technology and interactions for children by understanding how children's interactions differ from adults'.
We have established a research program that aims to support life science research requiring extensive use of computational tools (often referred to as e-Science in the literature). Our research in this area consists of two main thrusts: supporting user engagement with genomic analysis tools, and understanding how technology can support large multidisciplinary research projects. Research projects in this domain include collaborators in computer science, clinical sciences, veterinary medicine, microbiology and immunology, and food science.