On-body Sensing and Signal Analysis for User Experience Recognition in HMI Project

This project explores the development of intelligent devices and AI algorithms that recognize user experience through emotion detection from physiological signals. The designed intelligent device would continuously recognize the quality and intensity of the user's emotion in a two-dimensional emotion space. Continuous recognition of the user's emotion during human-machine interaction (HMI) will enable the machine to adapt its activity to the user's emotional state in real time, thus improving user experience.

The experience of emotion is a key aspect of user experience, affecting all aspects of HMI, including utility, ease of use, and efficiency. A machine that can recognize the user's emotion during interaction could adapt features of its activity, such as speed, to the user's emotional state, improving overall usability. This project focuses on emotion recognition through physiological signals, which bypass social masking and therefore yield more reliable predictions.

The key advantage of this project over work presented to date is that it uses the fewest modalities (only two physiological signals) to predict the quality and intensity of emotion continuously in time, while adopting the most recent widely accepted emotion model.
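As a rough illustration of the approach described above, the sketch below extracts simple sliding-window features from two physiological signals and maps them to a point in a two-dimensional (valence, arousal) emotion space. This is only a minimal sketch under stated assumptions: the choice of signals (electrodermal activity and heart rate), the window parameters, and the mapping weights are all illustrative placeholders, not the project's actual signals or trained model.

```python
from statistics import mean, stdev

def window_features(signal, window, step):
    """Slide a window over a 1-D signal and return (mean, stdev) per window.

    Windowing gives a feature vector per time step, which is what allows
    the emotion estimate to be updated continuously during interaction.
    """
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        segment = signal[start:start + window]
        feats.append((mean(segment), stdev(segment)))
    return feats

def estimate_emotion(eda_feat, hr_feat):
    """Map features from two signals to a (valence, arousal) point.

    The linear weights below are illustrative placeholders; a real system
    would learn this mapping from labelled physiological data.
    """
    eda_mean, eda_sd = eda_feat   # electrodermal activity (assumed signal)
    hr_mean, hr_sd = hr_feat      # heart rate (assumed signal)
    arousal = 0.6 * eda_sd + 0.4 * (hr_mean - 70.0) / 30.0
    valence = 0.5 * (75.0 - hr_mean) / 30.0 - 0.5 * eda_sd
    return valence, arousal
```

In use, each new window of sensor samples would produce one (valence, arousal) estimate, so the machine receives a continuously updated point in the emotion space rather than a single discrete emotion label.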

If you are interested in collaborating on or learning more about this project, please contact Roya Haratian, Lecturer in the Department of Design and Engineering, Science and Technology Faculty.