We present a data-driven deep neural network algorithm for detecting deceptive walking behavior using nonverbal cues such as gait and gestures.
For the annotated data, we also train a classifier that maps the latent embeddings to emotion labels.
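The embedding-to-label mapping described above can be sketched as a small classifier head on top of precomputed latent embeddings. The embedding dimension, the four emotion classes, and the linear softmax model below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 32-dim latent embeddings from some encoder,
# and 4 emotion labels (e.g., happy, sad, angry, neutral).
EMB_DIM, NUM_LABELS = 32, 4

# Toy annotated data standing in for real embeddings and labels.
X = rng.normal(size=(200, EMB_DIM))
y = rng.integers(0, NUM_LABELS, size=200)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Linear softmax classifier trained by gradient descent on cross-entropy.
W = np.zeros((EMB_DIM, NUM_LABELS))
b = np.zeros(NUM_LABELS)
for _ in range(100):
    grad = softmax(X @ W + b)              # predicted probabilities
    grad[np.arange(len(y)), y] -= 1.0      # subtract one-hot labels
    W -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean(axis=0)

pred = np.argmax(softmax(X @ W + b), axis=1)  # predicted emotion labels
```

In practice the classifier head could equally be a small MLP; the key point is that it consumes frozen latent embeddings rather than raw gait data.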
We also investigate user perception in an AR setting and observe that a friendly virtual agent (FVA) yields a statistically significant improvement in perceived friendliness and social presence compared to an agent without friendliness modeling.
We also present EWalk (Emotion Walk), a dataset consisting of videos of individuals walking, with extracted gaits and labeled emotions.
We present a Pedestrian Dominance Model (PDM) to identify the dominance characteristics of pedestrians for robot navigation.
We also present a novel interactive multi-agent simulation algorithm to model entitative groups and conduct a VR user study to validate the socio-emotional predictive power of our algorithm.
We present a novel approach to automatically identify driver behaviors from vehicle trajectories and use them for safe navigation of autonomous vehicles.
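Identifying driver behavior from trajectories typically starts by summarizing a position sequence into motion statistics. The sketch below is a hypothetical example of such features (speed and acceleration statistics); the function name, sampling rate, and feature set are assumptions for illustration, not the paper's feature design:

```python
import numpy as np

def driver_features(traj, dt=0.1):
    """Summarize a vehicle trajectory into simple behavior features.

    traj: (T, 2) array of 2D positions sampled every dt seconds.
    Returns speed/acceleration statistics; jerky acceleration profiles
    loosely suggest aggressive driving, smooth ones conservative driving.
    """
    vel = np.diff(traj, axis=0) / dt        # per-step velocity vectors
    speed = np.linalg.norm(vel, axis=1)     # scalar speeds (m/s)
    acc = np.diff(speed) / dt               # longitudinal acceleration
    return {
        "mean_speed": speed.mean(),
        "max_speed": speed.max(),
        "acc_std": acc.std(),               # variability of acceleration
    }

# Toy straight-line trajectory at a constant 10 m/s.
t = np.linspace(0.0, 5.0, 51)
traj = np.stack([10.0 * t, np.zeros_like(t)], axis=1)
feats = driver_features(traj)
```

A behavior classifier for navigation would then consume such per-vehicle feature vectors rather than raw trajectories.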