
Natural Multimodal Authentication in Smart Environments

In collaboration with the INIT Lab, this project examines user preferences, attitudes, expectations, and needs regarding natural multimodal authentication in smart environments. The research emerged from the idea of leveraging multimodal interactions seamlessly and unobtrusively for authentication and security. It draws inspiration from the passive use of personal biometrics, such as fingerprint and facial recognition, which require no explicit user input. Through this work, the Ruiz HCI Lab aims to advance natural multimodal authentication in future smart environments.

Collaborative Research: SaTC: CORE: Medium: Toward Age-Aware Continuous Authentication on Personal Computing Devices

MMGatorAuth: A Novel Multimodal Dataset for Authentication Interactions in Gesture and Voice

Sarah Morrison-Smith, Aishat Aloba, Hangwei Lu, Brett Benda, Shaghayegh Esmaeili, Gianne Flores, Jesse Smith, Nikita Soni, Isaac Wang, Rejin Joy, Damon L. Woodard, Jaime Ruiz, and Lisa Anthony. 2020. In Proceedings of the 2020 International Conference on Multimodal Interaction (ICMI ’20). Association for Computing Machinery, New York, NY, USA, 370–377. DOI: https://doi.org/10.1145/3382507.3418881