Search Results for author: Julia Hockenmaier

Found 26 papers, 8 papers with code

Analyzing the Performance of Large Language Models on Code Summarization

1 code implementation • 10 Apr 2024 • Rajarshi Haldar, Julia Hockenmaier

We show that for the task of code summarization, the performance of these models on individual examples often depends on the amount of (subword) token overlap between the code and the corresponding reference natural language descriptions in the dataset.

Code Generation • Code Summarization
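The finding above can be illustrated with a small sketch of subword token overlap between a code snippet and its reference summary. This is a minimal stand-in, not necessarily the exact metric used in the paper, and the toy tokens below are hypothetical.

```python
def subword_overlap(code_tokens, summary_tokens):
    """Fraction of reference-summary subword tokens that also appear
    in the code snippet (one simple way to measure code/NL overlap)."""
    code_set = set(code_tokens)
    shared = [t for t in summary_tokens if t in code_set]
    return len(shared) / len(summary_tokens) if summary_tokens else 0.0

# Toy example: a BPE-style tokenizer would split an identifier such as
# "getFileName" into subwords like ["get", "File", "Name"].
code = ["def", "get", "file", "name", "(", "path", ")", ":"]
summary = ["return", "the", "file", "name", "of", "a", "path"]
print(subword_overlap(code, summary))  # 3 of 7 summary tokens appear in the code
```

Under a view like this, examples whose summaries reuse many identifier subwords from the code are "easier" than those whose summaries are phrased independently of the code.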

ViT-MUL: A Baseline Study on Recent Machine Unlearning Methods Applied to Vision Transformers

no code implementations • 7 Feb 2024 • Ikhyun Cho, Changyeon Park, Julia Hockenmaier

Machine unlearning (MUL) is an emerging field in machine learning that seeks to erase the learned information of specific training data points from a trained model.

Machine Unlearning

Attack and Reset for Unlearning: Exploiting Adversarial Noise toward Machine Unlearning through Parameter Re-initialization

no code implementations • 17 Jan 2024 • Yoonhwa Jung, Ikhyun Cho, Shun-Hsiang Hsu, Julia Hockenmaier

With growing concerns surrounding privacy and regulatory compliance, the concept of machine unlearning has gained prominence, aiming to selectively forget or erase specific learned information from a trained model.

Machine Unlearning

A Framework for Bidirectional Decoding: Case Study in Morphological Inflection

1 code implementation • 21 May 2023 • Marc E. Canby, Julia Hockenmaier

Transformer-based encoder-decoder models that generate outputs in a left-to-right fashion have become standard for sequence-to-sequence tasks.

Decoder • LEMMA +1

Multimedia Generative Script Learning for Task Planning

1 code implementation • 25 Aug 2022 • Qingyun Wang, Manling Li, Hou Pong Chan, Lifu Huang, Julia Hockenmaier, Girish Chowdhary, Heng Ji

Goal-oriented generative script learning aims to generate subsequent steps to reach a particular goal, which is an essential task to assist robots or humans in performing stereotypical activities.

Contrastive Learning • Decoder +4

HySPA: Hybrid Span Generation for Scalable Text-to-Graph Extraction

1 code implementation • Findings (ACL) 2021 • Liliang Ren, Chenkai Sun, Heng Ji, Julia Hockenmaier

Text-to-Graph extraction aims to automatically extract information graphs consisting of mentions and types from natural language texts.

Ranked #1 on Relation Extraction on ACE 2005 (Sentence Encoder metric)

Decoder • Joint Entity and Relation Extraction

Learning to execute instructions in a Minecraft dialogue

no code implementations • ACL 2020 • Prashant Jayannavar, Anjali Narayan-Chen, Julia Hockenmaier

The Minecraft Collaborative Building Task is a two-player game in which an Architect (A) instructs a Builder (B) to construct a target structure in a simulated Blocks World Environment.

Learning to Execute

A Multi-Perspective Architecture for Semantic Code Search

no code implementations • ACL 2020 • Rajarshi Haldar, Lingfei Wu, JinJun Xiong, Julia Hockenmaier

The ability to match pieces of code to their corresponding natural language descriptions and vice versa is fundamental for natural language search interfaces to software repositories.

Code Search • Text Matching
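The matching task described above can be sketched as ranking code snippets by their similarity to a natural language query. The bag-of-words cosine below is a deliberately simple stand-in for the learned multi-perspective representations such a model would actually use; the corpus entries are invented for illustration.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query_tokens, corpus):
    """Rank (name, tokens) code snippets by similarity to the query."""
    q = Counter(query_tokens)
    scored = [(cosine(q, Counter(toks)), name) for name, toks in corpus]
    return sorted(scored, reverse=True)

corpus = [
    ("read_file", ["open", "path", "read", "file", "return", "text"]),
    ("sort_list", ["sorted", "items", "key", "return", "list"]),
]
print(search(["read", "a", "file"], corpus)[0][1])  # read_file ranks first
```

The same scoring function also supports the reverse direction (matching a snippet against candidate descriptions), which is the "vice versa" the abstract mentions.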

Phrase Grounding by Soft-Label Chain Conditional Random Field

1 code implementation • IJCNLP 2019 • Jiacheng Liu, Julia Hockenmaier

In this paper, we formulate phrase grounding as a sequence labeling task where we treat candidate regions as potential labels, and use neural chain Conditional Random Fields (CRFs) to model dependencies among regions for adjacent mentions.

Phrase Grounding • Structured Prediction
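The sequence-labeling formulation above can be illustrated with Viterbi decoding over a linear chain: each mention is a position, candidate regions are the labels, and transition scores model dependencies between adjacent mentions. This is a generic chain decoder, not the soft-label CRF variant the paper introduces, and the region names and scores below are made up.

```python
def viterbi(emissions, transitions):
    """Highest-scoring label sequence in a linear-chain model.
    emissions: list of {label: score} per position;
    transitions: {(prev_label, label): score} (missing pairs score 0).
    Scores are unnormalized log-potentials."""
    labels = list(emissions[0])
    best = [dict(emissions[0])]  # best[i][y]: best prefix score ending in y
    back = []
    for em in emissions[1:]:
        cur, ptr = {}, {}
        for y in labels:
            prev, score = max(
                ((p, best[-1][p] + transitions.get((p, y), 0.0)) for p in labels),
                key=lambda t: t[1],
            )
            cur[y] = score + em[y]
            ptr[y] = prev
        best.append(cur)
        back.append(ptr)
    # Trace back from the best final label.
    y = max(best[-1], key=best[-1].get)
    path = [y]
    for ptr in reversed(back):
        y = ptr[y]
        path.append(y)
    return list(reversed(path))

# Two candidate regions R1/R2 as labels for three adjacent mentions:
em = [{"R1": 2.0, "R2": 0.5}, {"R1": 0.4, "R2": 0.6}, {"R1": 1.0, "R2": 0.9}]
tr = {("R1", "R1"): 1.0, ("R2", "R2"): 1.0}  # reward staying on one region
print(viterbi(em, tr))  # ['R1', 'R1', 'R1']
```

Note how the transition bonus overrides the locally better label "R2" at the second position, which is exactly the kind of cross-mention dependency an independent per-mention classifier cannot capture.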

Collaborative Dialogue in Minecraft

no code implementations • ACL 2019 • Anjali Narayan-Chen, Prashant Jayannavar, Julia Hockenmaier

We wish to develop interactive agents that can communicate with humans to collaboratively solve tasks in grounded scenarios.

Natural Language Inference from Multiple Premises

no code implementations • IJCNLP 2017 • Alice Lai, Yonatan Bisk, Julia Hockenmaier

We define a novel textual entailment task that requires inference over multiple premise sentences.

Natural Language Inference

Learning to Predict Denotational Probabilities For Modeling Entailment

no code implementations • EACL 2017 • Alice Lai, Julia Hockenmaier

We propose a framework that captures the denotational probabilities of words and phrases by embedding them in a vector space, and present a method to induce such an embedding from a dataset of denotational probabilities.

Coreference Resolution • Natural Language Inference

Evaluating Induced CCG Parsers on Grounded Semantic Parsing

1 code implementation • EMNLP 2016 • Yonatan Bisk, Siva Reddy, John Blitzer, Julia Hockenmaier, Mark Steedman

We compare the effectiveness of four different syntactic CCG parsers for a semantic slot-filling task to explore how much syntactic supervision is required for downstream semantic analysis.

Semantic Parsing • slot-filling +1

From image descriptions to visual denotations: New similarity metrics for semantic inference over event descriptions

no code implementations • TACL 2014 • Peter Young, Alice Lai, Micah Hodosh, Julia Hockenmaier

We propose to use the visual denotations of linguistic expressions (i.e., the set of images they describe) to define novel denotational similarity metrics, which we show to be at least as beneficial as distributional similarities for two tasks that require semantic inference.

Descriptive • Semantic Textual Similarity
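The core idea above, comparing descriptions by the sets of images they are true of, can be sketched as set overlap between denotations. The paper defines several denotational metrics; the Jaccard version below is a minimal illustration, and the captions and image IDs are invented.

```python
def denotational_similarity(images_a, images_b):
    """Jaccard overlap between the visual denotations (sets of images)
    of two descriptions: |A ∩ B| / |A ∪ B|."""
    a, b = set(images_a), set(images_b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Toy denotations: the image IDs each caption describes.
dogs_running = {"img1", "img2", "img3"}
animals_moving = {"img2", "img3", "img4", "img5"}
print(denotational_similarity(dogs_running, animals_moving))  # 2/5 = 0.4
```

Two phrases that never co-occur in text can still score high here if they describe many of the same images, which is what makes denotational similarity a useful complement to distributional similarity.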

An HDP Model for Inducing Combinatory Categorial Grammars

no code implementations • TACL 2013 • Yonatan Bisk, Julia Hockenmaier

We introduce a novel nonparametric Bayesian model for the induction of Combinatory Categorial Grammars from POS-tagged text.

