2025 SHORTLISTED PARTICIPANTS

Miss Amber Hsiao-Yang Chou

PhD student

University of Washington

Human-machine interfaces (HMIs) record biosignals generated by humans – such as neural activity (brain), gaze (eye movements), or limb movements – and decode them into control inputs for assistive devices such as prosthetic limbs. However, their clinical adoption remains limited because most conventional interfaces cannot adapt to diverse users, resulting in low usability and high abandonment among people with motor impairments. There is a need to seamlessly tailor biosignal-based HMIs to diverse users and tasks, providing an “out-of-the-box” solution that requires no expert setup. This problem has traditionally been challenging due to the high variability of biosignals – humans interact with interfaces differently and adapt over time – requiring extensive interface calibration.

My research addresses this challenge by modeling complex human behaviors in HMIs and personalizing interfaces to individual needs and abilities. I leverage theoretical frameworks from control theory, neuroengineering, and data-driven methods to (1) model humans as control systems interacting with machines, investigating user strategies and adaptations in multimodal HMIs, and use these insights to (2) design interfaces that co-adapt in real time – that is, adapt in response to the user’s ongoing adaptation. Understanding human control strategies not only reveals the underlying dynamics of behavior, but also provides interpretable objective functions for use in optimization tools. For instance, in robot-assisted rehabilitation, modeling how patients interact with the robot will help caregivers predict recovery and deliver optimized robotic interventions.

During my PhD, I integrated multimodal inputs, including electromyography (EMG), eye-tracking, gestures, and haptics, and conducted experiments to model user strategies and adaptation. I then applied the model to design interfaces that adapt machine dynamics to optimize assistance for the user.
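To make the co-adaptation idea concrete, here is a minimal illustrative sketch, not the author's actual model: a 1-D tracking task in which a simulated human and the machine each adjust their own gain by gradient descent on the tracking error, so the machine adapts while the user is still adapting. All names, gains, and learning rates are hypothetical choices for illustration.

```python
# Hypothetical co-adaptation sketch: human gain h and machine gain m both
# descend the squared tracking error e^2, where the device output is h*m*target.
def coadapt(target=1.0, steps=200, lr_h=0.05, lr_m=0.02):
    h, m = 0.2, 1.0          # arbitrary starting human and machine gains
    errors = []
    for _ in range(steps):
        output = h * m * target   # device output the user observes this trial
        err = target - output     # tracking error driving both adaptations
        # Each agent nudges its own gain along the negative error gradient:
        # d(e^2)/dh = -2*e*m*target, d(e^2)/dm = -2*e*h*target
        h += lr_h * err * m * target
        m += lr_m * err * h * target
        errors.append(abs(err))
    return h, m, errors

h, m, errors = coadapt()
# The combined gain h*m converges toward 1, i.e. the pair jointly
# eliminates the tracking error without either agent acting alone.
```

The point of the toy model is the interaction: because the machine's gain changes the plant the human experiences, the human's "optimal" gain is a moving target, which is why real co-adaptive interfaces need models of user adaptation rather than one-shot calibration.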
These studies provide new theoretical and experimental frameworks to personalize multimodal HMI; moving forward, I plan to implement the frameworks in rehabilitation programs, such as robot-guided therapy, to deliver personalized health interventions.