Search Results for author: Sarthak Malik

Found 3 papers, 3 papers with code

Synthesizing Sentiment-Controlled Feedback For Multimodal Text and Image Data

1 code implementation • 12 Feb 2024 • Puneet Kumar, Sarthak Malik, Balasubramanian Raman, Xiaobai Li

The proposed system implements an interpretability technique to analyze the contribution of textual and visual features during the generation of uncontrolled and controlled feedback.

Marketing • Sentiment Analysis • +1
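
The abstract above refers to analysing how much the textual versus visual features contribute to the generated feedback. As a rough illustration only, the sketch below shows a generic gradient-x-input attribution over two modality feature vectors; it is not the paper's technique, and the model, dimensions, and fusion scheme are placeholder assumptions.

```python
# Minimal sketch (not the paper's method): gradient-x-input attribution that
# compares how much text vs. image features contribute to a model output.
# All module names, dimensions, and the fusion scheme here are assumptions.
import torch
import torch.nn as nn

class ToyFusionModel(nn.Module):
    """Hypothetical stand-in for a multimodal feedback/sentiment model."""
    def __init__(self, text_dim=768, image_dim=512, hidden=256, n_out=3):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, hidden)
        self.image_proj = nn.Linear(image_dim, hidden)
        self.head = nn.Linear(hidden, n_out)  # e.g. sentiment logits

    def forward(self, text_feat, image_feat):
        fused = torch.relu(self.text_proj(text_feat)) + torch.relu(self.image_proj(image_feat))
        return self.head(fused)

model = ToyFusionModel()
text_feat = torch.randn(1, 768, requires_grad=True)   # placeholder text-encoder output
image_feat = torch.randn(1, 512, requires_grad=True)  # placeholder image-encoder output

logits = model(text_feat, image_feat)
target = logits[0, logits.argmax()]  # score of the predicted class
target.backward()

# Gradient-x-input gives a crude per-modality contribution score.
text_score = (text_feat.grad * text_feat).abs().sum().item()
image_score = (image_feat.grad * image_feat).abs().sum().item()
total = text_score + image_score
print(f"text contribution ~{text_score / total:.2%}, image ~{image_score / total:.2%}")
```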

Interpretable Multimodal Emotion Recognition using Hybrid Fusion of Speech and Image Data

1 code implementation • 25 Aug 2022 • Puneet Kumar, Sarthak Malik, Balasubramanian Raman

A new interpretability technique has been developed to identify the important speech and image features leading to the prediction of particular emotion classes.

Multimodal Emotion Recognition
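
One simple way to illustrate identifying which modality drives a predicted emotion class, as described above, is modality occlusion: zero out the speech or image features and measure the drop in the predicted-class probability. This is an assumed stand-in, not the authors' interpretability technique, and the toy classifier below is entirely hypothetical.

```python
# Minimal sketch (an assumption, not the paper's technique): modality-occlusion
# importance for an emotion classifier over speech and image features.
import torch
import torch.nn as nn

class ToyEmotionClassifier(nn.Module):
    """Hypothetical hybrid-fusion classifier over speech and image features."""
    def __init__(self, speech_dim=128, image_dim=512, hidden=128, n_classes=6):
        super().__init__()
        self.speech_proj = nn.Linear(speech_dim, hidden)
        self.image_proj = nn.Linear(image_dim, hidden)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, speech, image):
        return self.head(torch.relu(self.speech_proj(speech)) + torch.relu(self.image_proj(image)))

model = ToyEmotionClassifier().eval()
speech = torch.randn(1, 128)  # placeholder speech features (e.g. spectrogram embedding)
image = torch.randn(1, 512)   # placeholder image features

with torch.no_grad():
    base = torch.softmax(model(speech, image), dim=-1)
    pred = base.argmax(dim=-1)
    p_full = base[0, pred].item()
    p_no_speech = torch.softmax(model(torch.zeros_like(speech), image), dim=-1)[0, pred].item()
    p_no_image = torch.softmax(model(speech, torch.zeros_like(image)), dim=-1)[0, pred].item()

print(f"predicted class {pred.item()}: p={p_full:.3f}, "
      f"drop without speech={p_full - p_no_speech:+.3f}, without image={p_full - p_no_image:+.3f}")
```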

Hybrid Fusion Based Interpretable Multimodal Emotion Recognition with Limited Labelled Data

1 code implementation • 24 Aug 2022 • Puneet Kumar, Sarthak Malik, Balasubramanian Raman, Xiaobai Li

This paper proposes a multimodal emotion recognition system, VIsual Spoken Textual Additive Net (VISTA Net), to classify emotions reflected by multimodal input containing image, speech, and text into discrete classes.

Multimodal Emotion Recognition
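
The abstract above describes an additive fusion of image, speech, and text inputs into discrete emotion classes. The sketch below is only a rough reading of that description: it is not the actual VISTA Net, and the encoders, dimensions, shared-space sum, and class count are all assumptions.

```python
# Rough sketch of an additive three-modality fusion classifier, to illustrate
# the kind of architecture the abstract describes. NOT the actual VISTA Net;
# encoders, dimensions, and the fusion rule are assumptions.
import torch
import torch.nn as nn

class AdditiveFusionNet(nn.Module):
    def __init__(self, image_dim=512, speech_dim=128, text_dim=768,
                 hidden=256, n_classes=6):
        super().__init__()
        # Per-modality projections into a shared space (placeholder encoders).
        self.image_proj = nn.Sequential(nn.Linear(image_dim, hidden), nn.ReLU())
        self.speech_proj = nn.Sequential(nn.Linear(speech_dim, hidden), nn.ReLU())
        self.text_proj = nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU())
        self.classifier = nn.Linear(hidden, n_classes)  # discrete emotion classes

    def forward(self, image_feat, speech_feat, text_feat):
        # "Additive" fusion assumed here as a simple sum in the shared space.
        fused = (self.image_proj(image_feat)
                 + self.speech_proj(speech_feat)
                 + self.text_proj(text_feat))
        return self.classifier(fused)

model = AdditiveFusionNet()
logits = model(torch.randn(2, 512), torch.randn(2, 128), torch.randn(2, 768))
print(logits.shape)  # torch.Size([2, 6]) -> one emotion distribution per sample
```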
