
Electromyography (EMG)-based Assistive Virtual Reality Human-Machine Interface

Over 100,000 people in the United States have upper-limb amputations, which are usually associated with substantial disability: activities of daily living may become impossible or may require additional effort and time. To restore functional ability as fully as possible, patients use upper-limb prostheses controlled by electromyography (EMG)-based human-machine interfaces (HMIs). However, over 50% of prosthesis users report device dissatisfaction and abandonment due to frustration in use. To increase the accessibility and utility of EMG-based assistive HMIs, it is critical to investigate the cognitive workload these systems impose when supporting motor skill rehabilitation and activities of daily living. Moreover, because individuals differ in muscle composition and control, EMG-based assistive HMIs must be customized to particular patient conditions. At present, however, there is little to no general guidance on which interface control features are more or less conducive to learning and performing psychomotor tasks.

In collaboration with Drs. Maryam Zahabi (Texas A&M) and David Kaber (UF), we developed a virtual reality human-machine interface (HMI) to evaluate cognitive workload and compare performance metrics across three prosthetic control configurations using virtual dexterity exercises that reflect activities of daily living (ADLs). The interface provides a research and development environment that offers insight into the usability and functional performance of recognized myoelectric upper-limb prosthesis control methods without the need for a physical prosthetic device. The EMG-based HMI developed in this project allows researchers to rapidly and cost-effectively evaluate task performance and cognitive workload across current prosthetic control configurations.


Jaime Ruiz

Associate Professor

jaime.ruiz@ufl.edu

Austin Music

URA and MS Student (2019-2022)

CHS: Medium: Collaborative Research: Electromyography (EMG)-based Assistive Human-Machine Interface Design: Cognitive Workload and Motor Skill Learning Assessment