1 code implementation • 18 Nov 2018 • Yu Yin, Mohsen Nabian, Miolin Fan, Chun-An Chou, Maria Gendron, Sarah Ostadabbas
In this paper, we present a multimodal approach that jointly analyzes facial movements and several peripheral physiological signals to decode individualized affective experiences under positive and negative emotional contexts, while accounting for each subject's personalized resting dynamics.