1 code implementation • 10 Apr 2024 • Rajarshi Haldar, Julia Hockenmaier
We show that for the task of code summarization, the performance of these models on individual examples often depends on the amount of (subword) token overlap between the code and the corresponding reference natural language descriptions in the dataset.
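The "subword token overlap" mentioned above can be illustrated with a minimal sketch. This is not the paper's exact metric; the character n-gram subword splitting and Jaccard measure below are illustrative assumptions, and the function names are hypothetical:

```python
def subwords(text, n=4):
    """Split whitespace tokens into character n-gram 'subword' pieces."""
    pieces = set()
    for tok in text.lower().split():
        if len(tok) <= n:
            pieces.add(tok)
        else:
            pieces.update(tok[i:i + n] for i in range(len(tok) - n + 1))
    return pieces

def token_overlap(code, description, n=4):
    """Jaccard overlap between the subword sets of a code snippet
    and its natural-language description."""
    a, b = subwords(code, n), subwords(description, n)
    return len(a & b) / len(a | b) if a | b else 0.0

if __name__ == "__main__":
    code = "def sort_list(xs): return sorted(xs)"
    desc = "sort a list of items"
    print(token_overlap(code, desc))
```

A high score from a measure like this would indicate that the description largely reuses identifiers from the code, which is the kind of per-example dependence the paper examines.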
no code implementations • 7 Feb 2024 • Ikhyun Cho, Changyeon Park, Julia Hockenmaier
Machine unlearning (MUL) is an emerging field in machine learning that seeks to erase the learned information of specific training data points from a trained model.
no code implementations • 17 Jan 2024 • Yoonhwa Jung, Ikhyun Cho, Shun-Hsiang Hsu, Julia Hockenmaier
With growing concerns surrounding privacy and regulatory compliance, the concept of machine unlearning has gained prominence, aiming to selectively forget or erase specific learned information from a trained model.
1 code implementation • 21 May 2023 • Marc E. Canby, Julia Hockenmaier
Transformer-based encoder-decoder models that generate outputs in a left-to-right fashion have become standard for sequence-to-sequence tasks.
1 code implementation • 25 Aug 2022 • Qingyun Wang, Manling Li, Hou Pong Chan, Lifu Huang, Julia Hockenmaier, Girish Chowdhary, Heng Ji
Goal-oriented generative script learning aims to generate subsequent steps to reach a particular goal, an essential task for assisting robots or humans in performing stereotypical activities.
no code implementations • 19 Jul 2022 • Harsha Kokel, Mayukh Das, Rakibul Islam, Julia Bonn, Jon Cai, Soham Dan, Anjali Narayan-Chen, Prashant Jayannavar, Janardhan Rao Doppa, Julia Hockenmaier, Sriraam Natarajan, Martha Palmer, Dan Roth
We consider the problem of human-machine collaborative problem solving as a planning task coupled with natural language communication.
1 code implementation • Findings (ACL) 2021 • Liliang Ren, Chenkai Sun, Heng Ji, Julia Hockenmaier
Text-to-Graph extraction aims to automatically extract information graphs consisting of mentions and types from natural language texts.
Ranked #1 on Relation Extraction on ACE 2005 (Sentence Encoder metric)
no code implementations • WS 2020 • Marc Canby, Aidana Karipbayeva, Bryan Lunt, Sah Mozaffari, Charlotte Yoder, Julia Hockenmaier
The objective of this shared task is to produce an inflected form of a word, given its lemma and a set of tags describing the attributes of the desired form.
no code implementations • ACL 2020 • Prashant Jayannavar, Anjali Narayan-Chen, Julia Hockenmaier
The Minecraft Collaborative Building Task is a two-player game in which an Architect (A) instructs a Builder (B) to construct a target structure in a simulated Blocks World Environment.
no code implementations • ACL 2020 • Rajarshi Haldar, Lingfei Wu, JinJun Xiong, Julia Hockenmaier
The ability to match pieces of code to their corresponding natural language descriptions and vice versa is fundamental for natural language search interfaces to software repositories.
1 code implementation • IJCNLP 2019 • Jiacheng Liu, Julia Hockenmaier
In this paper, we formulate phrase grounding as a sequence labeling task where we treat candidate regions as potential labels, and use neural chain Conditional Random Fields (CRFs) to model dependencies among regions for adjacent mentions.
Ranked #8 on Phrase Grounding on Flickr30k Entities Test
no code implementations • ACL 2019 • Anjali Narayan-Chen, Prashant Jayannavar, Julia Hockenmaier
We wish to develop interactive agents that can communicate with humans to collaboratively solve tasks in grounded scenarios.
no code implementations • IJCNLP 2017 • Alice Lai, Yonatan Bisk, Julia Hockenmaier
We define a novel textual entailment task that requires inference over multiple premise sentences.
no code implementations • WS 2017 • Anjali Narayan-Chen, Colin Graber, Mayukh Das, Md. Rakibul Islam, Soham Dan, Sriraam Natarajan, Janardhan Rao Doppa, Julia Hockenmaier, Martha Palmer, Dan Roth
Agents that communicate back and forth with humans to help them execute non-linguistic tasks are a long-sought goal of AI.
no code implementations • EACL 2017 • Alice Lai, Julia Hockenmaier
We propose a framework that captures the denotational probabilities of words and phrases by embedding them in a vector space, and present a method to induce such an embedding from a dataset of denotational probabilities.
1 code implementation • ICCV 2017 • Bryan A. Plummer, Arun Mallya, Christopher M. Cervantes, Julia Hockenmaier, Svetlana Lazebnik
This paper presents a framework for localization or grounding of phrases in images using a large collection of linguistic and visual cues.
1 code implementation • EMNLP 2016 • Yonatan Bisk, Siva Reddy, John Blitzer, Julia Hockenmaier, Mark Steedman
We compare the effectiveness of four different syntactic CCG parsers for a semantic slot-filling task to explore how much syntactic supervision is required for downstream semantic analysis.
2 code implementations • ICCV 2015 • Bryan A. Plummer, Li-Wei Wang, Chris M. Cervantes, Juan C. Caicedo, Julia Hockenmaier, Svetlana Lazebnik
The Flickr30k dataset has become a standard benchmark for sentence-based image description.
Ranked #17 on Image Retrieval on Flickr30K 1K test
no code implementations • TACL 2014 • Peter Young, Alice Lai, Micah Hodosh, Julia Hockenmaier
We propose to use the visual denotations of linguistic expressions (i.e., the set of images they describe) to define novel denotational similarity metrics, which we show to be at least as beneficial as distributional similarities for two tasks that require semantic inference.
no code implementations • TACL 2013 • Yonatan Bisk, Julia Hockenmaier
We introduce a novel nonparametric Bayesian model for the induction of Combinatory Categorial Grammars from POS-tagged text.