Search Results for author: Julia Hockenmaier

Found 20 papers, 4 papers with code

HySPA: Hybrid Span Generation for Scalable Text-to-Graph Extraction

1 code implementation · 30 Jun 2021 · Liliang Ren, Chenkai Sun, Heng Ji, Julia Hockenmaier

Text-to-Graph extraction aims to automatically extract information graphs consisting of mentions and types from natural language texts.

Joint Entity and Relation Extraction

Learning to execute instructions in a Minecraft dialogue

no code implementations · ACL 2020 · Prashant Jayannavar, Anjali Narayan-Chen, Julia Hockenmaier

The Minecraft Collaborative Building Task is a two-player game in which an Architect (A) instructs a Builder (B) to construct a target structure in a simulated Blocks World Environment.

Learning to Execute · Minecraft

A Multi-Perspective Architecture for Semantic Code Search

no code implementations · ACL 2020 · Rajarshi Haldar, Lingfei Wu, JinJun Xiong, Julia Hockenmaier

The ability to match pieces of code to their corresponding natural language descriptions and vice versa is fundamental for natural language search interfaces to software repositories.

Code Search · Text Matching
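The matching task described in this entry can be illustrated with a toy retrieval baseline: embed queries and code snippets as vectors and rank snippets by cosine similarity. This is a minimal sketch with made-up vectors, not the paper's multi-perspective architecture, which learns the representations jointly.

```python
import numpy as np

def rank_snippets(query_vec, snippet_vecs):
    """Rank code snippets by cosine similarity to a query embedding.

    query_vec: (d,) embedding of the natural language query.
    snippet_vecs: (n, d) embeddings of n candidate code snippets.
    Returns snippet indices from best to worst match.
    (Toy baseline; the embeddings here are hypothetical inputs.)
    """
    q = query_vec / np.linalg.norm(query_vec)
    s = snippet_vecs / np.linalg.norm(snippet_vecs, axis=1, keepdims=True)
    sims = s @ q                      # cosine similarity per snippet
    return list(np.argsort(-sims))    # descending similarity
```

With pre-trained sentence and code encoders in place of the toy vectors, the same ranking loop supports both directions of the task (code-to-description and description-to-code).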

Phrase Grounding by Soft-Label Chain Conditional Random Field

1 code implementation · IJCNLP 2019 · Jiacheng Liu, Julia Hockenmaier

In this paper, we formulate phrase grounding as a sequence labeling task where we treat candidate regions as potential labels, and use neural chain Conditional Random Fields (CRFs) to model dependencies among regions for adjacent mentions.

Phrase Grounding · Structured Prediction
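The sequence-labeling formulation in this entry treats candidate image regions as the label set of a linear-chain CRF over mentions. A minimal sketch of the decoding step is Viterbi search over toy unary and transition scores (the scores below are hypothetical; the paper learns them with neural networks and uses soft labels):

```python
import numpy as np

def viterbi(unary, trans):
    """Most likely region sequence under a linear-chain CRF.

    unary: (T, R) score of assigning each of R candidate regions
           to each of T mentions.
    trans: (R, R) compatibility score between regions chosen for
           adjacent mentions.
    Returns the highest-scoring list of region indices.
    """
    T, R = unary.shape
    score = unary[0].copy()
    back = np.zeros((T, R), dtype=int)
    for t in range(1, T):
        # cand[i, j]: best score ending in region i, then moving to j
        cand = score[:, None] + trans + unary[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]
```

The transition scores are what let the model capture dependencies among regions for adjacent mentions, which is the point of using a chain CRF rather than classifying each mention independently.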

Collaborative Dialogue in Minecraft

no code implementations · ACL 2019 · Anjali Narayan-Chen, Prashant Jayannavar, Julia Hockenmaier

We wish to develop interactive agents that can communicate with humans to collaboratively solve tasks in grounded scenarios.


Natural Language Inference from Multiple Premises

no code implementations · IJCNLP 2017 · Alice Lai, Yonatan Bisk, Julia Hockenmaier

We define a novel textual entailment task that requires inference over multiple premise sentences.

Natural Language Inference

Learning to Predict Denotational Probabilities For Modeling Entailment

no code implementations · EACL 2017 · Alice Lai, Julia Hockenmaier

We propose a framework that captures the denotational probabilities of words and phrases by embedding them in a vector space, and present a method to induce such an embedding from a dataset of denotational probabilities.

Coreference Resolution · Natural Language Inference

Evaluating Induced CCG Parsers on Grounded Semantic Parsing

1 code implementation · EMNLP 2016 · Yonatan Bisk, Siva Reddy, John Blitzer, Julia Hockenmaier, Mark Steedman

We compare the effectiveness of four different syntactic CCG parsers for a semantic slot-filling task to explore how much syntactic supervision is required for downstream semantic analysis.

Semantic Parsing · Slot Filling

From image descriptions to visual denotations: New similarity metrics for semantic inference over event descriptions

no code implementations · TACL 2014 · Peter Young, Alice Lai, Micah Hodosh, Julia Hockenmaier

We propose to use the visual denotations of linguistic expressions (i.e., the set of images they describe) to define novel denotational similarity metrics, which we show to be at least as beneficial as distributional similarities for two tasks that require semantic inference.

Semantic Textual Similarity
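Since a denotation here is the set of images a description applies to, one natural denotational similarity is the overlap between two descriptions' image sets. This is a minimal sketch over toy image-id sets; the paper defines its metrics over a large captioned-image corpus, and the specific overlap measure shown (Jaccard) is an illustrative choice, not necessarily the one the paper uses.

```python
def denotational_jaccard(den_x, den_y):
    """Similarity of two descriptions as the Jaccard overlap of
    their denotations (sets of image ids each description fits).
    Toy sets stand in for a real image corpus.
    """
    union = den_x | den_y
    if not union:
        return 0.0
    return len(den_x & den_y) / len(union)
```

Descriptions of the same event share most of their images and score near 1; unrelated descriptions share none and score 0, which is what makes such metrics usable for semantic inference tasks like entailment.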

An HDP Model for Inducing Combinatory Categorial Grammars

no code implementations · TACL 2013 · Yonatan Bisk, Julia Hockenmaier

We introduce a novel nonparametric Bayesian model for the induction of Combinatory Categorial Grammars from POS-tagged text.

