Abstract:
An atypical behavioral viewing pattern is one of the core deficits of individuals with Autism Spectrum Disorder (ASD). It diminishes their ability to understand a communicator's facial emotional expression and often leads them to misinterpret the intended emotion. Here, we investigated the feasibility of using gaze-related indices to detect distinctive changes corresponding to various emotions. We designed a usability study in which nine individuals with ASD and Typically Developing (TD) individuals were exposed to Virtual Reality (VR)-based social scenarios. The VR scenes presented virtual characters who narrated their social experiences as short stories with context-relevant emotional expressions. Simultaneously, we collected participants' gaze-related physiological indices (PIs) and behavioral looking-pattern indices (BIs) using a technologically enhanced eye-tracker. These PIs and BIs were then used to classify the presented emotional expressions both within and across the ASD and TD groups. Results of the usability study indicate that emotions can be discriminated from gaze-related indices with 97% accuracy in the intra-group analysis and 100% accuracy in the inter-group analysis.