Search Results for author: Jason Eisner

Found 68 papers, 13 papers with code

Online Semantic Parsing for Latency Reduction in Task-Oriented Dialogue

no code implementations ACL 2022 Jiawei Zhou, Jason Eisner, Michael Newman, Emmanouil Antonios Platanios, Sam Thomson

Standard conversational semantic parsing maps a complete user utterance into an executable program, after which the program is executed to respond to the user.

Machine Translation · Semantic Parsing +1

Transformer Embeddings of Irregularly Spaced Events and Their Participants

1 code implementation ICLR 2022 Chenghao Yang, Hongyuan Mei, Jason Eisner

The neural Hawkes process (Mei & Eisner, 2017) is a generative model of irregularly spaced sequences of discrete events.
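
For intuition, here is a minimal sketch of the classical (non-neural) Hawkes intensity that the neural Hawkes process generalizes: a baseline rate plus exponentially decaying excitation from each past event. The constants `mu`, `alpha`, and `beta` are illustrative placeholders, not values from the paper; the neural version replaces this closed form with an LSTM-defined intensity.

```python
import math

def hawkes_intensity(t, history, mu=0.2, alpha=0.8, beta=1.0):
    """Self-exciting intensity at time t: baseline mu plus a bump of
    height alpha, decaying at rate beta, for each earlier event."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in history if ti < t)

events = [1.0, 1.5, 4.0]
print(round(hawkes_intensity(2.0, events), 4))
```

Note that only events strictly before `t` contribute, so the intensity jumps up right after each event and then relaxes back toward the baseline.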

Searching for More Efficient Dynamic Programs

no code implementations Findings (EMNLP) 2021 Tim Vieira, Ryan Cotterell, Jason Eisner

To this end, we describe a set of program transformations, a simple metric for assessing the efficiency of a transformed program, and a heuristic search procedure to improve this metric.

Learning How to Ask: Querying LMs with Mixtures of Soft Prompts

2 code implementations NAACL 2021 Guanghui Qin, Jason Eisner

We explore the idea of learning prompts by gradient descent -- either fine-tuning prompts taken from previous work, or starting from random initialization.
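
The core idea (continuous prompt parameters tuned by gradient descent while the language model stays frozen) can be sketched in toy form. Here the frozen "model" is just a dot product and the learning loop is hand-written; real soft-prompt tuning optimizes prompt embeddings prepended to a transformer's input, which this example only caricatures.

```python
def tune_soft_prompt(key, target, lr=0.1, steps=200):
    """Gradient descent on a single soft-prompt vector. The frozen
    'model' scores the prompt via a dot product with `key`; only the
    prompt parameters are updated."""
    prompt = [0.0] * len(key)
    for _ in range(steps):
        pred = sum(p * k for p, k in zip(prompt, key))
        grad = 2 * (pred - target)  # d(squared loss)/d(pred)
        prompt = [p - lr * grad * k for p, k in zip(prompt, key)]
    return prompt

key = [1.0, 2.0]
prompt = tune_soft_prompt(key, target=3.0)
pred = sum(p * k for p, k in zip(prompt, key))
print(round(pred, 3))  # converges to 3.0
```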

Language Modelling · Pretrained Language Models

Limitations of Autoregressive Models and Their Alternatives

no code implementations NAACL 2021 Chu-Cheng Lin, Aaron Jaech, Xin Li, Matthew R. Gormley, Jason Eisner

Standard autoregressive language models perform only polynomial-time computation to compute the probability of the next symbol.
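
The "next-symbol probability" an autoregressive model computes is one factor in the chain rule: the probability of a string is the product of per-position conditionals. A count-based bigram model makes this concrete (the paper's point is about what such locally normalized models *cannot* express, which this toy does not attempt to show):

```python
from collections import defaultdict

def train_bigram(corpus):
    """Count bigram transitions, with <s>/</s> boundary symbols."""
    counts = defaultdict(lambda: defaultdict(int))
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        for a, b in zip(toks, toks[1:]):
            counts[a][b] += 1
    return counts

def string_prob(counts, sent):
    """Chain rule: multiply the next-symbol conditionals p(b | a)."""
    toks = ["<s>"] + sent.split() + ["</s>"]
    p = 1.0
    for a, b in zip(toks, toks[1:]):
        total = sum(counts[a].values())
        p *= counts[a][b] / total if total else 0.0
    return p

counts = train_bigram(["a b", "a b", "a c"])
print(round(string_prob(counts, "a b"), 4))  # 2/3
```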

Language Modelling

Evaluation of Logic Programs with Built-Ins and Aggregation: A Calculus for Bag Relations

1 code implementation 20 Oct 2020 Matthew Francis-Landau, Tim Vieira, Jason Eisner

We present a scheme for translating logic programs, which may use aggregation and arithmetic, into algebraic expressions that denote bag relations over ground terms of the Herbrand universe.
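
A bag (multiset) relation keeps duplicate tuples with multiplicities, so a conjunctive rule multiplies multiplicities and an aggregate sums them. This toy uses `Counter` as the bag; the relation names and rule are invented for illustration and are far simpler than the paper's algebraic translation scheme.

```python
from collections import Counter

# A tiny bag relation edge(x, y): duplicate tuples carry multiplicity.
edges = Counter({("a", "b"): 2, ("a", "c"): 1, ("b", "c"): 1})

# Rule path(x, z) :- edge(x, y), edge(y, z).  Under bag semantics the
# join multiplies the multiplicities of the matched tuples.
paths = Counter()
for (x, y1), m1 in edges.items():
    for (y2, z), m2 in edges.items():
        if y1 == y2:
            paths[(x, z)] += m1 * m2

# Aggregation: out-degree per source, summing multiplicities.
out_degree = Counter()
for (x, _), m in edges.items():
    out_degree[x] += m

print(paths[("a", "c")], out_degree["a"])  # 2 3
```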

Programming Languages · Symbolic Computation

Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification

1 code implementation ICML 2020 Hongyuan Mei, Guanghui Qin, Minjie Xu, Jason Eisner

Learning how to predict future events from patterns of past events is difficult when the set of possible event types is large.

A Corpus for Large-Scale Phonetic Typology

no code implementations ACL 2020 Elizabeth Salesky, Eleanor Chodroff, Tiago Pimentel, Matthew Wiesner, Ryan Cotterell, Alan W. Black, Jason Eisner


A major hurdle in data-driven research on typology is having sufficient data in many languages to draw meaningful conclusions.

Spelling-Aware Construction of Macaronic Texts for Teaching Foreign-Language Vocabulary

no code implementations IJCNLP 2019 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We present a machine foreign-language teacher that modifies text in a student's native language (L1) by replacing some word tokens with glosses in a foreign language (L2), in such a way that the student can acquire L2 vocabulary simply by reading the resulting macaronic text.

Language Modelling

Specializing Word Embeddings (for Parsing) by Information Bottleneck

no code implementations IJCNLP 2019 Xiang Lisa Li, Jason Eisner

Pre-trained word embeddings like ELMo and BERT contain rich syntactic and semantic information, resulting in state-of-the-art performance on various tasks.

Dimensionality Reduction · POS +2

Simple Construction of Mixed-Language Texts for Vocabulary Learning

no code implementations WS 2019 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We accomplish this by modifying a cloze language model to incrementally learn new vocabulary items, and use this language model as a proxy for the word guessing and learning ability of real students.

Language Modelling

A Generative Model for Punctuation in Dependency Trees

no code implementations TACL 2019 Xiang Lisa Li, Dingquan Wang, Jason Eisner

When the tree's yield is rendered as a written sentence, a string rewriting mechanism transduces the underlying marks into "surface" marks, which are part of the observed (surface) string but should not be regarded as part of the tree.

What Kind of Language Is Hard to Language-Model?

no code implementations ACL 2019 Sabrina J. Mielke, Ryan Cotterell, Kyle Gorman, Brian Roark, Jason Eisner

Trying to answer the question of what features difficult languages have in common, we try and fail to reproduce our earlier (Cotterell et al., 2018) observation about morphological complexity and instead reveal far simpler statistics of the data that seem to drive complexity in a much larger sample.

Language Modelling

Neural Finite-State Transducers: Beyond Rational Relations

no code implementations NAACL 2019 Chu-Cheng Lin, Hao Zhu, Matthew R. Gormley, Jason Eisner

We introduce neural finite state transducers (NFSTs), a family of string transduction models defining joint and conditional probability distributions over pairs of strings.

Imputing Missing Events in Continuous-Time Event Streams

1 code implementation 14 May 2019 Hongyuan Mei, Guanghui Qin, Jason Eisner

On held-out incomplete sequences, our method is effective at inferring the ground-truth unobserved events, with particle smoothing consistently improving upon particle filtering.

The CoNLL-SIGMORPHON 2018 Shared Task: Universal Morphological Reinflection

no code implementations CONLL 2018 Ryan Cotterell, Christo Kirov, John Sylak-Glassman, Géraldine Walther, Ekaterina Vylomova, Arya D. McCarthy, Katharina Kann, Sabrina J. Mielke, Garrett Nicolai, Miikka Silfverberg, David Yarowsky, Jason Eisner, Mans Hulden

Apart from extending the number of languages involved in earlier supervised tasks of generating inflected forms, this year the shared task also featured a new second task which asked participants to inflect words in sentential context, similar to a cloze task.

Inference of unobserved event streams with neural Hawkes particle smoothing

no code implementations27 Sep 2018 Hongyuan Mei, Guanghui Qin, Jason Eisner

Particle smoothing is an extension of particle filtering in which proposed events are conditioned on the future as well as the past.
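
The filtering-versus-smoothing distinction can be shown with a deliberately tiny example: particles are candidate Bernoulli rates, and each particle is weighted by the likelihood of the observed events. Smoothing simply conditions on future observations as well. This is a caricature of the method, with made-up particles and data; the paper proposes learned proposal distributions over continuous-time event streams.

```python
def particle_weights(particles, observations):
    """Weight each particle (a candidate Bernoulli rate) by the
    likelihood of the observed 0/1 events under that rate, then
    normalize the weights to sum to 1."""
    weights = []
    for p in particles:
        w = 1.0
        for obs in observations:
            w *= p if obs else (1.0 - p)
        weights.append(w)
    total = sum(weights)
    return [w / total for w in weights]

particles = [0.2, 0.5, 0.8]
past, future = [1, 1], [1]
filtered = particle_weights(particles, past)           # conditions on the past only
smoothed = particle_weights(particles, past + future)  # also conditions on the future
print(smoothed[2] > filtered[2])  # True: future evidence favors the high rate
```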

Are All Languages Equally Hard to Language-Model?

no code implementations NAACL 2018 Ryan Cotterell, Sabrina J. Mielke, Jason Eisner, Brian Roark

For general modeling methods applied to diverse languages, a natural question is: how well should we expect our models to work on languages with differing typological profiles?

Language Modelling

Neural Particle Smoothing for Sampling from Conditional Sequence Models

no code implementations NAACL 2018 Chu-Cheng Lin, Jason Eisner

We introduce neural particle smoothing, a sequential Monte Carlo method for sampling annotations of an input string from a given probability model.

Spell Once, Summon Anywhere: A Two-Level Open-Vocabulary Language Model

1 code implementation 23 Apr 2018 Sabrina J. Mielke, Jason Eisner

By invoking the second RNN to generate spellings for novel words in context, we obtain an open-vocabulary language model.

Language Modelling

On the Diachronic Stability of Irregularity in Inflectional Morphology

no code implementations 23 Apr 2018 Ryan Cotterell, Christo Kirov, Mans Hulden, Jason Eisner

Many languages' inflectional morphological systems are replete with irregulars, i.e., words that do not seem to follow standard inflectional rules.

Surface Statistics of an Unknown Language Indicate How to Parse It

no code implementations TACL 2018 Dingquan Wang, Jason Eisner

We show experimentally across multiple languages: (1) Features computed from the unparsed corpus improve parsing accuracy.

Dependency Parsing · POS

Fine-Grained Prediction of Syntactic Typology: Discovering Latent Structure with Supervised Learning

no code implementations TACL 2017 Dingquan Wang, Jason Eisner

We show how to predict the basic word-order facts of a novel language given only a corpus of part-of-speech (POS) sequences.

POS

The Galactic Dependencies Treebanks: Getting More Data by Synthesizing New Languages

1 code implementation TACL 2016 Dingquan Wang, Jason Eisner

We release Galactic Dependencies 1.0, a large set of synthetic languages not found on Earth, but annotated in Universal Dependencies format.

Knowledge Tracing in Sequential Learning of Inflected Vocabulary

no code implementations CONLL 2017 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We present a feature-rich knowledge tracing method that captures a student's acquisition and retention of knowledge during a foreign language phrase learning task.

Knowledge Tracing · Structured Prediction

CoNLL-SIGMORPHON 2017 Shared Task: Universal Morphological Reinflection in 52 Languages

no code implementations CONLL 2017 Ryan Cotterell, Christo Kirov, John Sylak-Glassman, Géraldine Walther, Ekaterina Vylomova, Patrick Xia, Manaal Faruqui, Sandra Kübler, David Yarowsky, Jason Eisner, Mans Hulden

In sub-task 2, systems were given a lemma and some of its specific inflected forms, and asked to complete the inflectional paradigm by predicting all of the remaining inflected forms.

Data Augmentation

Approximation-Aware Dependency Parsing by Belief Propagation

no code implementations TACL 2015 Matthew R. Gormley, Mark Dredze, Jason Eisner

We show how to adjust the model parameters to compensate for the errors introduced by this approximation, by following the gradient of the actual loss on training data.

Dependency Parsing

Modeling Word Forms Using Latent Underlying Morphs and Phonology

no code implementations TACL 2015 Ryan Cotterell, Nanyun Peng, Jason Eisner

Given some surface word types of a concatenative language along with the abstract morpheme sequences that they express, we show how to recover consistent underlying forms for these morphemes, together with the (stochastic) phonology that maps each concatenation of underlying forms to a surface form.

Imitation Learning by Coaching

no code implementations NeurIPS 2012 He He, Jason Eisner, Hal Daumé III

However, it is important to note that these guarantees depend on how well the policy we found can imitate the oracle on the training data.

Imitation Learning · Online Learning

Learned Prioritization for Trading Off Accuracy and Speed

no code implementations NeurIPS 2012 Jiarong Jiang, Adam Teichert, Jason Eisner, Hal Daumé III

Users want natural language processing (NLP) systems to be both fast and accurate, but quality often comes at the cost of speed.

Imitation Learning
