Search Results for author: Ethan A. Chi

Found 7 papers, 2 papers with code

Stanford MLab at SemEval-2021 Task 8: 48 Hours Is All You Need

no code implementations · SEMEVAL 2021 · Patrick Liu, Niveditha Iyer, Erik Rozi, Ethan A. Chi

This paper presents our system for the Quantity span identification, Unit of measurement identification, and Value modifier classification subtasks of the MeasEval 2021 task.

Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT

1 code implementation · EACL 2021 · Isabel Papadimitriou, Ethan A. Chi, Richard Futrell, Kyle Mahowald

Further examining the characteristics that our classifiers rely on, we find that features such as passive voice, animacy, and case strongly correlate with classification decisions. This suggests that mBERT does not encode subjecthood purely syntactically; rather, subjecthood is encoded continuously and depends on semantic and discourse factors, as proposed in much of the functional linguistics literature.

Align-Refine: Non-Autoregressive Speech Recognition via Iterative Realignment

no code implementations · NAACL 2021 · Ethan A. Chi, Julian Salazar, Katrin Kirchhoff

Non-autoregressive models greatly improve decoding speed over typical sequence-to-sequence models, but suffer from degraded performance.

Speech Recognition

Finding Universal Grammatical Relations in Multilingual BERT

1 code implementation · ACL 2020 · Ethan A. Chi, John Hewitt, Christopher D. Manning

Recent work has found evidence that Multilingual BERT (mBERT), a transformer-based multilingual masked language model, is capable of zero-shot cross-lingual transfer, suggesting that some aspects of its representations are shared cross-lingually.

Language Modelling · Zero-Shot Cross-Lingual Transfer

SGVAE: Sequential Graph Variational Autoencoder

no code implementations · 17 Dec 2019 · Bowen Jing, Ethan A. Chi, Jillian Tang

Generative models of graphs are well-known, but many existing models are limited in scalability and expressivity.
