Search Results for author: Angelica Lim

Found 10 papers, 3 papers with code

MotionScript: Natural Language Descriptions for Expressive 3D Human Motions

no code implementations • 19 Dec 2023 • Payam Jome Yazdian, Eric Liu, Li Cheng, Angelica Lim

This paper proposes MotionScript, a motion-to-text conversion algorithm and natural language representation for human body motions.

Emotional Theory of Mind: Bridging Fast Visual Processing with Slow Linguistic Reasoning

no code implementations • 30 Oct 2023 • Yasaman Etesam, Ozge Nilay Yalcin, Chuxuan Zhang, Angelica Lim

Nevertheless, a gap remains in the zero-shot emotional theory of mind task compared to prior work trained on the EMOTIC dataset.

Emotion Recognition · Language Modelling

Contextual Emotion Estimation from Image Captions

no code implementations • 22 Sep 2023 • Vera Yang, Archita Srivastava, Yasaman Etesam, Chuxuan Zhang, Angelica Lim

In this paper, we explore whether Large Language Models (LLMs) can support the contextual emotion estimation task, by first captioning images, then using an LLM for inference.

Image Captioning · Language Modelling +1

Towards Inclusive HRI: Using Sim2Real to Address Underrepresentation in Emotion Expression Recognition

no code implementations • 15 Aug 2022 • Saba Akhyani, Mehryar Abbasi Boroujeni, Mo Chen, Angelica Lim

Robots and artificial agents that interact with humans should be able to do so without bias and inequity, but facial perception systems have notoriously been found to work more poorly for certain groups of people than others.

Read the Room: Adapting a Robot's Voice to Ambient and Social Contexts

no code implementations • 10 May 2022 • Paige Tuttosi, Emma Hughson, Akihiro Matsufuji, Angelica Lim

By designing robots to speak in a more social and ambient-appropriate manner we can improve perceived awareness and intelligence for these agents.

Speech Synthesis · Voice Conversion

Data-driven emotional body language generation for social robotics

1 code implementation • 2 May 2022 • Mina Marmpena, Fernando Garcia, Angelica Lim, Nikolas Hemion, Thomas Wennekers

In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration, since humans attribute such traces to an agent, and perhaps subconsciously anticipate them, in perceiving it as engaging, trustworthy, and socially present.

Attribute · Text Generation

The Many Faces of Anger: A Multicultural Video Dataset of Negative Emotions in the Wild (MFA-Wild)

1 code implementation • 10 Dec 2021 • Roya Javadi, Angelica Lim

The portrayal of negative emotions such as anger can vary widely between cultures and contexts, depending on the acceptability of expressing full-blown emotions rather than suppression to maintain harmony.

Cultural Vocal Bursts Intensity Prediction · Emotion Classification +1

Gesture2Vec: Clustering Gestures using Representation Learning Methods for Co-speech Gesture Generation

1 code implementation • 29 Sep 2021 • Payam Jome Yazdian, Mo Chen, Angelica Lim

We propose a vector-quantized variational autoencoder structure as well as training techniques to learn a rigorous representation of gesture sequences.
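The core of the vector-quantized approach described above is mapping each continuous latent vector to its nearest entry in a learned codebook, yielding discrete gesture codes. A minimal sketch of that quantization step follows; the codebook size, latent dimension, and function name are illustrative assumptions, not the paper's actual architecture.

```python
# Toy sketch of VQ-VAE nearest-neighbour quantization.
# Codebook size (K=8) and latent dimension (D=4) are arbitrary
# illustrative choices, not values from Gesture2Vec.
import numpy as np

def quantize(latents, codebook):
    """Replace each latent vector with its nearest codebook entry.

    latents:  (T, D) encoder outputs for T gesture frames
    codebook: (K, D) learned embedding vectors
    Returns (quantized latents of shape (T, D), code indices of shape (T,)).
    """
    # Pairwise squared Euclidean distances, shape (T, K)
    dists = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = dists.argmin(axis=1)     # discrete code per frame
    return codebook[indices], indices

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))     # K=8 codes, D=4
latents = rng.normal(size=(16, 4))     # 16 latent gesture frames
quantized, codes = quantize(latents, codebook)
```

In a full VQ-VAE the codebook itself is trained (e.g. with a commitment loss and straight-through gradient estimation); this sketch only shows the discrete lookup that turns gesture sequences into token-like codes suitable for clustering.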

Clustering · Gesture Generation +3

SFU-Store-Nav: A Multimodal Dataset for Indoor Human Navigation

no code implementations • 28 Oct 2020 • Zhitian Zhang, Jimin Rhim, Taher Ahmadi, Kefan Yang, Angelica Lim, Mo Chen

This article describes a dataset collected in a set of experiments that involves human participants and a robot.

Robot Navigation

The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling

no code implementations • 30 Aug 2019 • Pablo Barros, Nikhil Churamani, Angelica Lim, Stefan Wermter

In this paper, we propose a novel dataset composed of dyadic interactions designed, collected and annotated with a focus on measuring the affective impact that eight different stories have on the listener.
