Comparing Eye-Tracking versus EEG Features for Four-Class Emotion Classification in VR Predictive Analytics

  • Lim Jia Zheng, James Mountstephens, Jason Teo

Abstract

This paper presents a novel emotion recognition approach that augments electroencephalography (EEG) brainwave signals with eye-tracking data in virtual reality (VR) to classify emotions according to the four-quadrant circumplex model. 360° videos, presented through a VR headset with a pair of earphones, are used as stimuli to evoke the four target emotions (happy, angry, bored, calm). EEG signals are recorded via a wearable EEG brain-computer interfacing (BCI) device, and pupil diameter is collected from a wearable, portable eye-tracker. We extract five frequency bands (Delta, Theta, Alpha, Beta, and Gamma) from the EEG data and use pupil diameter from the eye-tracker as the chosen eye-related feature for this investigation. A Support Vector Machine (SVM) with a Radial Basis Function (RBF) kernel is used as the classifier. The best accuracies based on EEG brainwave signals and pupil diameter are 98.44% and 58.30%, respectively.
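The pipeline described in the abstract (five-band EEG feature extraction followed by an SVM-RBF classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, channel count, window length, and the use of Welch band power as the band feature are all assumptions, and the data below are random stand-ins for the recorded signals.

```python
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical recording parameters -- not specified in the abstract.
FS = 256                     # sampling rate in Hz (assumed)
BANDS = {                    # conventional EEG band edges in Hz
    "delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}

def band_powers(epoch, fs=FS):
    """Per-channel band power via Welch PSD; epoch shape: (channels, samples)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the PSD over each band to get one power value per channel.
        feats.append(np.trapz(psd[:, mask], freqs[mask], axis=-1))
    return np.concatenate(feats)  # 5 bands x n_channels features

# Toy stand-in data: 200 epochs, 14 channels, 2-second windows, 4 emotion labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 14, FS * 2))
labels = rng.integers(0, 4, size=200)  # 0=happy, 1=angry, 2=bored, 3=calm

X = np.stack([band_powers(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("Mean 5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

A pupil-diameter baseline would follow the same pattern, replacing the band-power features with summary statistics of the pupil trace per trial before fitting the same SVM-RBF pipeline.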

Published
2020-04-14
How to Cite
Lim Jia Zheng, James Mountstephens, Jason Teo. (2020). Comparing Eye-Tracking versus EEG Features for Four-Class Emotion Classification in VR Predictive Analytics. International Journal of Advanced Science and Technology, 29(6s), 1492–1497. Retrieved from http://sersc.org/journals/index.php/IJAST/article/view/9289