Search Results for author: Mihai Burzo

Found 7 papers, 0 papers with code

In-the-Wild Video Question Answering

no code implementations · COLING 2022 · Santiago Castro, Naihao Deng, Pingxuan Huang, Mihai Burzo, Rada Mihalcea

Existing video understanding datasets mostly focus on human interactions, with little attention being paid to the “in the wild” settings, where the videos are recorded outdoors.

Evidence Selection · Question Answering · +2

WildQA: In-the-Wild Video Question Answering

no code implementations · 14 Sep 2022 · Santiago Castro, Naihao Deng, Pingxuan Huang, Mihai Burzo, Rada Mihalcea

Existing video understanding datasets mostly focus on human interactions, with little attention being paid to the "in the wild" settings, where the videos are recorded outdoors.

Evidence Selection · Question Answering · +2

MUSER: MUltimodal Stress Detection using Emotion Recognition as an Auxiliary Task

no code implementations · NAACL 2021 · Yiqun Yao, Michalis Papakostas, Mihai Burzo, Mohamed Abouelenien, Rada Mihalcea

The capability to automatically detect human stress can benefit artificially intelligent agents involved in affective computing and human-computer interaction.

Emotion Recognition · Multi-Task Learning
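
The MUSER entry above pairs stress detection with emotion recognition as an auxiliary task. The snippet below is a minimal multi-task learning sketch of that general idea only, not the paper's implementation: the class and function names, feature dimensions, number of emotion classes, and the auxiliary loss weight are all illustrative assumptions.

```python
# Minimal multi-task sketch (illustrative, not the MUSER model): a shared encoder
# feeds a primary stress-detection head and an auxiliary emotion-recognition head.
import torch
import torch.nn as nn

class SharedEncoderMTL(nn.Module):
    def __init__(self, input_dim=128, hidden_dim=64, num_emotions=4):
        super().__init__()
        # Shared representation over (already fused) multimodal features.
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.stress_head = nn.Linear(hidden_dim, 2)            # stressed vs. not stressed
        self.emotion_head = nn.Linear(hidden_dim, num_emotions)  # auxiliary task

    def forward(self, x):
        h = self.encoder(x)
        return self.stress_head(h), self.emotion_head(h)

def mtl_loss(stress_logits, emotion_logits, stress_y, emotion_y, aux_weight=0.3):
    # Primary loss plus a down-weighted auxiliary loss; aux_weight is an assumed value.
    ce = nn.CrossEntropyLoss()
    return ce(stress_logits, stress_y) + aux_weight * ce(emotion_logits, emotion_y)

# Example usage with random features standing in for fused multimodal inputs.
model = SharedEncoderMTL()
x = torch.randn(8, 128)
stress_logits, emotion_logits = model(x)
loss = mtl_loss(stress_logits, emotion_logits,
                torch.randint(0, 2, (8,)), torch.randint(0, 4, (8,)))
```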

MuSE: a Multimodal Dataset of Stressed Emotion

no code implementations · LREC 2020 · Mimansa Jaiswal, Cristian-Paul Bara, Yuanhang Luo, Mihai Burzo, Rada Mihalcea, Emily Mower Provost

Endowing automated agents with the ability to provide support, entertainment and interaction with human beings requires sensing of the users' affective state.

Emotion Classification · General Classification

A Multimodal Dataset for Deception Detection

no code implementations · LREC 2014 · Verónica Pérez-Rosas, Rada Mihalcea, Alexis Narvaez, Mihai Burzo

This paper presents the construction of a multimodal dataset for deception detection, including physiological, thermal, and visual responses of human subjects under three deceptive scenarios.

Deception Detection