Search Results for author: Rishi Hazra

Found 9 papers, 5 papers with code

SayCanPay: Heuristic Planning with Large Language Models using Learnable Domain Knowledge

no code implementations • 24 Aug 2023 • Rishi Hazra, Pedro Zuidberg Dos Martires, Luc De Raedt

Large Language Models (LLMs) have demonstrated impressive planning abilities due to their vast "world knowledge".

World Knowledge

Deep Explainable Relational Reinforcement Learning: A Neuro-Symbolic Approach

no code implementations • 17 Apr 2023 • Rishi Hazra, Luc De Raedt

By resorting to a neuro-symbolic approach, DERRL combines relational representations and constraints from symbolic planning with deep learning to extract interpretable policies.

Reinforcement Learning

EgoTV: Egocentric Task Verification from Natural Language Task Descriptions

1 code implementation • ICCV 2023 • Rishi Hazra, Brian Chen, Akshara Rai, Nitin Kamra, Ruta Desai

The goal in EgoTV is to verify the execution of tasks from egocentric videos based on the natural language description of these tasks.

Zero-Shot Generalization using Intrinsically Motivated Compositional Emergent Protocols

1 code implementation • 11 May 2021 • Rishi Hazra, Sonu Dixit, Sayambhu Sen

Human language has been described as a system that makes "use of finite means to express an unlimited array of thoughts".

Zero-shot Generalization

gComm: An environment for investigating generalization in Grounded Language Acquisition

1 code implementation • 9 May 2021 • Rishi Hazra, Sonu Dixit

gComm comprises a 2-d grid environment with a set of agents (a stationary speaker and a mobile listener connected via a communication channel) exposed to a continuous array of tasks in a partially observable setting.

Language Acquisition
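The speaker–listener setup described above can be sketched as a minimal toy version. This is a hypothetical illustration, not gComm's actual API: the grid size, the message format, and both agents' hand-coded policies are assumptions made for clarity (in gComm itself the communication protocol is learned).

```python
GRID = 4  # hypothetical 4x4 grid; gComm's actual configuration may differ


def speaker_message(target):
    # Stationary speaker observes the target cell and emits a discrete
    # message over the channel (here a fixed encoding of the coordinates;
    # in emergent-communication training this protocol would be learned).
    return ("x%d" % target[0], "y%d" % target[1])


def listener_policy(pos, message):
    # Mobile listener decodes the message and takes one greedy step
    # toward the communicated target cell.
    tx, ty = int(message[0][1:]), int(message[1][1:])
    dx = (tx > pos[0]) - (tx < pos[0])
    dy = (ty > pos[1]) - (ty < pos[1])
    if dx:
        return (pos[0] + dx, pos[1])
    return (pos[0], pos[1] + dy)


def rollout(start, target, max_steps=10):
    # The listener never sees the target directly -- only the message,
    # mimicking the partially observable setting described above.
    msg = speaker_message(target)
    pos = start
    for _ in range(max_steps):
        if pos == target:
            return True
        pos = listener_policy(pos, msg)
    return pos == target
```

With the hand-coded encoding the listener reaches any target within the step budget, e.g. `rollout((0, 0), (3, 3))` succeeds; the interesting question gComm studies is whether a *learned* protocol generalizes to unseen task compositions.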

Active² Learning: Actively reducing redundancies in Active Learning methods for Sequence Tagging and Machine Translation

no code implementations • NAACL 2021 • Rishi Hazra, Parag Dutta, Shubham Gupta, Mohammed Abdul Qaathir, Ambedkar Dukkipati

We empirically demonstrate that the proposed approach is further able to reduce the data requirements of state-of-the-art AL strategies by an absolute percentage reduction of ≈3–25% on multiple NLP tasks while achieving the same performance with no additional computation overhead.

Active Learning • Machine Translation • +1

Intrinsically Motivated Compositional Language Emergence

1 code implementation • 9 Dec 2020 • Rishi Hazra, Sonu Dixit, Sayambhu Sen

To deal with this, existing works have proposed a limited channel capacity as an important constraint for learning highly compositional languages.

Language Acquisition

Active² Learning: Actively reducing redundancies in Active Learning methods for Sequence Tagging and Machine Translation

1 code implementation • 1 Nov 2019 • Rishi Hazra, Parag Dutta, Shubham Gupta, Mohammed Abdul Qaathir, Ambedkar Dukkipati

We empirically demonstrate that the proposed approach is further able to reduce the data requirements of state-of-the-art AL strategies by ≈3–25% on an absolute scale on multiple NLP tasks while achieving the same performance with virtually no additional computation overhead.

Active Learning • Machine Translation • +1
