Search Results for author: Daniel Gildea

Found 58 papers, 9 papers with code

Efficient Outside Computation

no code implementations CL (ACL) 2020 Daniel Gildea

Weighted deduction systems provide a framework for describing parsing algorithms that can be used with a variety of operations for combining the values of partial derivations.
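The inside/outside values that this line of work computes can be illustrated with a toy CKY-style deduction system. The grammar, probabilities, and function below are invented for illustration and are not taken from the paper; they show the standard recursions that an efficient outside algorithm must reproduce.

```python
from collections import defaultdict

# Hypothetical toy PCFG in Chomsky normal form (probabilities made up).
binary = {("S", ("NP", "VP")): 1.0, ("VP", ("V", "NP")): 1.0}
lexical = {("NP", "she"): 0.5, ("NP", "fish"): 0.5, ("V", "eats"): 1.0}

def inside_outside(words):
    n = len(words)
    beta = defaultdict(float)   # inside values beta[(i, j, A)]
    for i, w in enumerate(words):
        for (A, word), p in lexical.items():
            if word == w:
                beta[(i, i + 1, A)] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (A, (B, C)), p in binary.items():
                    beta[(i, j, A)] += p * beta[(i, k, B)] * beta[(k, j, C)]
    alpha = defaultdict(float)  # outside values alpha[(i, j, A)]
    alpha[(0, n, "S")] = 1.0
    for span in range(n, 1, -1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (A, (B, C)), p in binary.items():
                    alpha[(i, k, B)] += p * alpha[(i, j, A)] * beta[(k, j, C)]
                    alpha[(k, j, C)] += p * alpha[(i, j, A)] * beta[(i, k, B)]
    return beta, alpha

beta, alpha = inside_outside(["she", "eats", "fish"])
```

For every constituent used in a derivation, the product of its inside and outside values equals the total sentence probability, which is the invariant the outside pass is designed to deliver.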

Strictly Breadth-First AMR Parsing

no code implementations 8 Nov 2022 Chen Yu, Daniel Gildea

AMR parsing is the task of automatically mapping a sentence to an AMR semantic graph.

AMR Parsing Sentence

Hierarchical Context Tagging for Utterance Rewriting

1 code implementation 22 Jun 2022 Lisa Jin, Linfeng Song, Lifeng Jin, Dong Yu, Daniel Gildea

HCT (i) tags the source string with token-level edit actions and slotted rules and (ii) fills in the resulting rule slots with spans from the dialogue context.

TAG

Tree Decomposition Attention for AMR-to-Text Generation

no code implementations 27 Aug 2021 Lisa Jin, Daniel Gildea

Text generation from AMR requires mapping a semantic graph to a string that it annotates.

AMR-to-Text Generation Text Generation

Latent Tree Decomposition Parsers for AMR-to-Text Generation

no code implementations 27 Aug 2021 Lisa Jin, Daniel Gildea

Graph encoders in AMR-to-text generation models often rely on neighborhood convolutions or global vertex attention.

AMR-to-Text Generation Clustering +4

Outside Computation with Superior Functions

no code implementations NAACL 2021 Parker Riley, Daniel Gildea

We show that a general algorithm for efficient computation of outside values under the minimum of superior functions framework proposed by Knuth (1977) would yield a sub-exponential time algorithm for SAT, violating the Strong Exponential Time Hypothesis (SETH).

Generalized Shortest-Paths Encoders for AMR-to-Text Generation

no code implementations COLING 2020 Lisa Jin, Daniel Gildea

Instead of feeding shortest paths to the vertex self-attention module, we train a model to learn them using generalized shortest-paths algorithms.

AMR-to-Text Generation Text Generation
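As a rough illustration of what a "generalized shortest-paths algorithm" means, here is Floyd-Warshall parameterized by an arbitrary semiring; instantiated with (min, +) it computes ordinary shortest paths. The function name and the example graph are this sketch's own assumptions, not the paper's encoder.

```python
# Floyd-Warshall generalized to any semiring (plus, times, zero, one).
# With plus=min, times=+, zero=inf, one=0 it is the classic shortest-paths
# algorithm; other semirings yield reachability, max-capacity paths, etc.
def semiring_floyd_warshall(n, edges, plus, times, zero, one):
    d = [[zero] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = one
    for u, v, w in edges:
        d[u][v] = plus(d[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                d[i][j] = plus(d[i][j], times(d[i][k], d[k][j]))
    return d

INF = float("inf")
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0)]
dist = semiring_floyd_warshall(3, edges, min, lambda a, b: a + b, INF, 0.0)
```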

Tensors over Semirings for Latent-Variable Weighted Logic Programs

no code implementations WS 2020 Esma Balkir, Daniel Gildea, Shay Cohen

Semiring parsing is an elegant framework for describing parsers by using semiring weighted logic programs.
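The core idea of semiring parsing, one deduction system evaluated over many value spaces, can be sketched with a CKY chart parameterized by the semiring operations; the ambiguous toy grammar below is invented for illustration and is not from the paper.

```python
# CKY parameterized by a semiring (plus, times, zero, one): the same chart
# code does Boolean recognition, derivation counting, Viterbi, etc.
def cky(words, lexical, binary, plus, times, zero, one):
    n = len(words)
    chart = {}
    def get(i, j, A):
        return chart.get((i, j, A), zero)
    for i, w in enumerate(words):
        for (A, word), wt in lexical.items():
            if word == w:
                chart[(i, i + 1, A)] = plus(get(i, i + 1, A), wt)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (A, (B, C)), wt in binary.items():
                    val = times(wt, times(get(i, k, B), get(k, j, C)))
                    chart[(i, j, A)] = plus(get(i, j, A), val)
    return get(0, n, "S")

lex = {("S", "a"): 1}
rules = {("S", ("S", "S")): 1}
# Counting semiring: number of derivations of "a a a" (ambiguous grammar).
count = cky(["a"] * 3, lex, rules, lambda x, y: x + y,
            lambda x, y: x * y, 0, 1)
# Boolean semiring: recognition only.
ok = cky(["a"] * 3, {k: True for k in lex}, {k: True for k in rules},
         lambda x, y: x or y, lambda x, y: x and y, False, True)
```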

Unsupervised Bilingual Lexicon Induction Across Writing Systems

no code implementations 31 Jan 2020 Parker Riley, Daniel Gildea

Recent embedding-based methods in unsupervised bilingual lexicon induction have shown good results, but generally have not leveraged orthographic (spelling) information, which can be helpful for pairs of related languages.

Bilingual Lexicon Induction

AMR-to-Text Generation with Cache Transition Systems

no code implementations 3 Dec 2019 Lisa Jin, Daniel Gildea

To enforce a sentence-aligned graph traversal and provide local graph context, we predict transition-based parser actions in addition to English words.

AMR-to-Text Generation Sentence +1

Ordered Tree Decomposition for HRG Rule Extraction

no code implementations CL 2019 Daniel Gildea, Giorgio Satta, Xiaochang Peng

Our algorithms are based on finding a tree decomposition of smallest width, relative to the vertex order, and then extracting one rule for each node in this structure.

Tree Decomposition
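The width of a tree decomposition relative to a fixed vertex order can be computed by vertex elimination: eliminating each vertex connects its not-yet-eliminated neighbors, and the width is the largest such neighbor set. This is a minimal generic sketch of that standard construction, not the paper's HRG rule-extraction code.

```python
# Width induced by a vertex order: eliminate vertices in order, adding
# fill-in edges among each vertex's later (uneliminated) neighbors.
def elimination_width(n, edges, order):
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    pos = {v: i for i, v in enumerate(order)}
    width = 0
    for v in order:
        later = {u for u in adj[v] if pos[u] > pos[v]}
        width = max(width, len(later))
        for a in later:            # fill-in: later neighbors form a clique
            for b in later:
                if a != b:
                    adj[a].add(b)
    return width
```

On a 4-cycle every order gives width 2 (its treewidth); on a path, the natural left-to-right order gives width 1, illustrating why finding a good order matters.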

Predicting TED Talk Ratings from Language and Prosody

no code implementations 21 May 2019 Md. Iftekhar Tanveer, Md Kamrul Hassan, Daniel Gildea, M. Ehsan Hoque

We use the largest open repository of public speaking, TED Talks, to predict the ratings of online viewers.

BIG-bench Machine Learning

A Causality-Guided Prediction of the TED Talk Ratings from the Speech-Transcripts using Neural Networks

no code implementations 21 May 2019 Md. Iftekhar Tanveer, Md. Kamrul Hasan, Daniel Gildea, M. Ehsan Hoque

Automated prediction of public speaking performance enables novel systems for tutoring public speaking skills.

Semantic Neural Machine Translation using AMR

1 code implementation TACL 2019 Linfeng Song, Daniel Gildea, Yue Zhang, Zhiguo Wang, Jinsong Su

It is intuitive that semantic representations can be useful for machine translation, mainly because they can help in enforcing meaning preservation and handling data sparsity (many sentences correspond to one meaning) of machine translation models.

Machine Translation NMT +1

N-ary Relation Extraction using Graph-State LSTM

no code implementations EMNLP 2018 Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea

Cross-sentence $n$-ary relation extraction detects relations among $n$ entities across multiple sentences.

Relation Relation Extraction +1

Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks

no code implementations 6 Sep 2018 Linfeng Song, Zhiguo Wang, Mo Yu, Yue Zhang, Radu Florian, Daniel Gildea

Multi-hop reading comprehension focuses on one type of factoid question, where a system needs to properly integrate multiple pieces of evidence to correctly answer a question.

Multi-Hop Reading Comprehension Question Answering

Feature-Based Decipherment for Machine Translation

no code implementations CL 2018 Iftekhar Naim, Parker Riley, Daniel Gildea

The existing decipherment models, however, are not well suited for exploiting these orthographic similarities.

Decipherment Machine Translation +2

N-ary Relation Extraction using Graph State LSTM

2 code implementations 28 Aug 2018 Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea

Cross-sentence $n$-ary relation extraction detects relations among $n$ entities across multiple sentences.

Relation Relation Extraction +1

Orthographic Features for Bilingual Lexicon Induction

no code implementations ACL 2018 Parker Riley, Daniel Gildea

Recent embedding-based methods in bilingual lexicon induction show good results, but do not take advantage of orthographic features, such as edit distance, which can be helpful for pairs of related languages.

Bilingual Lexicon Induction Multilingual Word Embeddings +2
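Edit distance, the orthographic feature named in the abstract, is standard Levenshtein distance between candidate translation pairs; a compact, space-efficient sketch (not the paper's implementation):

```python
# Levenshtein edit distance with a two-row dynamic program.
def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # (mis)match
        prev = cur
    return prev[-1]
```

For related language pairs, small distances (e.g. Spanish "noche" vs Italian "notte") are a strong signal that two words are translations.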

Sequence-to-sequence Models for Cache Transition Systems

1 code implementation ACL 2018 Xiaochang Peng, Linfeng Song, Daniel Gildea, Giorgio Satta

In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.

AMR Parsing Hard Attention +1

A Graph-to-Sequence Model for AMR-to-Text Generation

1 code implementation ACL 2018 Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.

Ranked #1 on Graph-to-Sequence on LDC2015E86 (using extra training data)

AMR-to-Text Generation Graph-to-Sequence +1

Weighted DAG Automata for Semantic Graphs

no code implementations CL 2018 David Chiang, Frank Drewes, Daniel Gildea, Adam Lopez, Giorgio Satta

Graphs have a variety of uses in natural language processing, particularly as representations of linguistic meaning.

A Notion of Semantic Coherence for Underspecified Semantic Representation

no code implementations CL 2018 Mehdi Manshadi, Daniel Gildea, James F. Allen

The general problem of finding satisfying solutions to constraint-based underspecified representations of quantifier scope is NP-complete.

Sentence

Cache Transition Systems for Graph Parsing

no code implementations CL 2018 Daniel Gildea, Giorgio Satta, Xiaochang Peng

Motivated by the task of semantic parsing, we describe a transition system that generalizes standard transition-based dependency parsing techniques to generate a graph rather than a tree.

Semantic Parsing Transition-Based Dependency Parsing +1

AMR-to-text generation as a Traveling Salesman Problem

no code implementations EMNLP 2016 Linfeng Song, Yue Zhang, Xiaochang Peng, Zhiguo Wang, Daniel Gildea

The task of AMR-to-text generation is to generate grammatical text that sustains the semantic meaning for a given AMR graph.

AMR-to-Text Generation Text Generation +2

Exploring phrase-compositionality in skip-gram models

no code implementations 21 Jul 2016 Xiaochang Peng, Daniel Gildea

In this paper, we introduce a variation of the skip-gram model which jointly learns distributed word vector representations and their way of composing to form phrase embeddings.

Dependency Parsing
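A hedged sketch of the idea: compose a phrase embedding as the sum of its component word vectors inside skip-gram with negative sampling, so that each word shares the gradient of the composed vector. The vocabulary, dimensions, learning rate, and training loop below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["new", "york", "city", "is", "big"]
idx = {w: i for i, w in enumerate(vocab)}
dim = 8
W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # word vectors
W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(phrase, context, negatives, lr=0.1):
    """One skip-gram/negative-sampling update where the center
    representation is the sum of the phrase's word vectors."""
    rows = [idx[w] for w in phrase]
    v = W_in[rows].sum(axis=0)               # composed phrase embedding
    grad_v = np.zeros(dim)
    for c, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        u = W_out[idx[c]]
        g = sigmoid(v @ u) - label           # logistic-loss gradient
        grad_v += g * u
        W_out[idx[c]] -= lr * g * v
    for r in rows:                           # words share the phrase gradient
        W_in[r] -= lr * grad_v

for _ in range(500):
    sgns_step(["new", "york"], "city", ["big"])
```

After training, the composed vector for "new york" scores high against its observed context "city" and low against the sampled negative "big".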

Human languages order information efficiently

no code implementations 9 Oct 2015 Daniel Gildea, T. Florian Jaeger

Most languages use the relative order between words to encode meaning relations.

Feature-based Decipherment for Large Vocabulary Machine Translation

no code implementations 10 Aug 2015 Iftekhar Naim, Daniel Gildea

Our results show that the proposed log-linear model with contrastive divergence scales to large vocabularies and outperforms the existing generative decipherment models by exploiting the orthographic features.

Decipherment Machine Translation +1

Parsing Linear Context-Free Rewriting Systems with Fast Matrix Multiplication

no code implementations CL 2016 Shay B. Cohen, Daniel Gildea

Our result provides another proof for the best known result for parsing mildly context-sensitive formalisms such as combinatory categorial grammars, head grammars, linear indexed grammars, and tree adjoining grammars, which can be parsed in time $O(n^{4.76})$.

Automated Analysis and Prediction of Job Interview Performance

1 code implementation 14 Apr 2015 Iftekhar Naim, M. Iftekhar Tanveer, Daniel Gildea, Mohammed Ehsan Hoque

We present a computational framework for automatically quantifying verbal and nonverbal behaviors in the context of job interviews.

Synchronous Context-Free Grammars and Optimal Linear Parsing Strategies

no code implementations 25 Nov 2013 Pierluigi Crescenzi, Daniel Gildea, Andrea Marino, Gianluca Rossi, Giorgio Satta

Synchronous Context-Free Grammars (SCFGs), also known as syntax-directed translation schemata, are unlike context-free grammars in that they do not have a binary normal form.

Translation
