Search Results for author: Shijie Wu

Found 18 papers, 11 papers with code

Zero-shot Cross-lingual Transfer is Under-specified Optimization

no code implementations RepL4NLP (ACL) 2022 Shijie Wu, Benjamin Van Durme, Mark Dredze

Pretrained multilingual encoders enable zero-shot cross-lingual transfer, but often produce unreliable models that exhibit high performance variance on the target language.

Zero-Shot Cross-Lingual Transfer
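As a rough illustration of the zero-shot cross-lingual transfer setup this paper studies (a sketch, not the authors' code; the checkpoint name, task, and toy data below are assumptions), one fine-tunes a multilingual encoder on source-language data only and then evaluates it directly on the target language:

```python
# Minimal sketch of zero-shot cross-lingual transfer (not the paper's code).
# Assumes the HuggingFace `transformers` library and an English-only training set.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"  # assumed multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# 1) Fine-tune on source-language (English) examples; the real training loop is elided.
english_batch = tokenizer(["a great movie", "a dull movie"],
                          return_tensors="pt", padding=True)
labels = torch.tensor([1, 0])
loss = model(**english_batch, labels=labels).loss
loss.backward()  # one illustrative update step

# 2) Evaluate directly on the target language, with no target-language labels seen in training.
target_batch = tokenizer(["una película aburrida"], return_tensors="pt", padding=True)
with torch.no_grad():
    pred = model(**target_batch).logits.argmax(dim=-1)
print(pred)  # zero-shot prediction on the target language
```

The performance variance mentioned in the abstract shows up when this procedure is repeated across fine-tuning runs and evaluated on the target language.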

Differentiable Generative Phonology

1 code implementation 10 Feb 2021 Shijie Wu, Edoardo Maria Ponti, Ryan Cotterell

As the main contribution of our work, we implement the phonological generative system as an end-to-end differentiable neural model, rather than as a set of rules or constraints.

Do Explicit Alignments Robustly Improve Multilingual Encoders?

1 code implementation EMNLP 2020 Shijie Wu, Mark Dredze

Multilingual BERT (mBERT), XLM-RoBERTa (XLMR) and other unsupervised multilingual encoders can effectively learn cross-lingual representation.

The SIGMORPHON 2020 Shared Task on Multilingual Grapheme-to-Phoneme Conversion

no code implementations WS 2020 Kyle Gorman, Lucas F.E. Ashby, Aaron Goyzueta, Arya McCarthy, Shijie Wu, Daniel You

We describe the design and findings of the SIGMORPHON 2020 shared task on multilingual grapheme-to-phoneme conversion.

Applying the Transformer to Character-level Transduction

1 code implementation EACL 2021 Shijie Wu, Ryan Cotterell, Mans Hulden

The transformer has been shown to outperform recurrent neural network-based sequence-to-sequence models in various word-level NLP tasks.

Morphological Inflection, Transliteration

Are All Languages Created Equal in Multilingual BERT?

1 code implementation WS 2020 Shijie Wu, Mark Dredze

Multilingual BERT (mBERT) trained on 104 languages has shown surprisingly good cross-lingual performance on several NLP tasks, even without explicit cross-lingual signals.

Cross-Lingual Transfer, Dependency Parsing +2

The Paradigm Discovery Problem

1 code implementation ACL 2020 Alexander Erdmann, Micha Elsner, Shijie Wu, Ryan Cotterell, Nizar Habash

Our benchmark system first makes use of word embeddings and string similarity to cluster forms by cell and by paradigm.

Word Embeddings
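A minimal sketch of the clustering idea described in the abstract, combining string similarity with embedding similarity to group word forms (an illustrative assumption-laden example, not the authors' benchmark system; the 0.5/0.5 weighting, toy embeddings, and cluster count are made up here):

```python
# Sketch: cluster word forms by combining string similarity with embedding similarity.
# Requires scikit-learn >= 1.2 (for the `metric` argument).
from difflib import SequenceMatcher
import numpy as np
from sklearn.cluster import AgglomerativeClustering

forms = ["walk", "walks", "walked", "run", "runs", "running"]
# Toy embeddings; a real system would use pretrained word vectors.
rng = np.random.default_rng(0)
emb = {w: rng.normal(size=16) for w in forms}

def similarity(a, b):
    string_sim = SequenceMatcher(None, a, b).ratio()
    ea, eb = emb[a], emb[b]
    cos = float(ea @ eb / (np.linalg.norm(ea) * np.linalg.norm(eb)))
    return 0.5 * string_sim + 0.5 * (cos + 1) / 2  # both terms scaled to [0, 1]

n = len(forms)
dist = np.array([[1.0 - similarity(forms[i], forms[j]) for j in range(n)]
                 for i in range(n)])

# Group forms into candidate paradigms from the pairwise distance matrix.
clusters = AgglomerativeClustering(n_clusters=2, metric="precomputed",
                                   linkage="average").fit_predict(dist)
print(dict(zip(forms, clusters)))
```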

Emerging Cross-lingual Structure in Pretrained Language Models

no code implementations ACL 2020 Shijie Wu, Alexis Conneau, Haoran Li, Luke Zettlemoyer, Veselin Stoyanov

We study the problem of multilingual masked language modeling, i.e., the training of a single model on concatenated text from multiple languages, and present a detailed study of several factors that influence why these models are so effective for cross-lingual transfer.

Cross-Lingual Transfer, Language Modelling +3
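For readers unfamiliar with the setup, the sketch below shows the multilingual masked language modeling objective described in the abstract: a single model trained on text pooled from several languages with tokens randomly masked (illustrative only; the xlm-roberta-base checkpoint, 15% masking rate, and toy sentences are assumptions, not the paper's configuration):

```python
# Sketch of a multilingual masked-LM training step (not the paper's setup).
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# A single model sees text from multiple languages with no explicit alignment.
texts = ["The cat sat on the mat.",
         "Le chat est sur le tapis.",
         "Die Katze sitzt auf der Matte."]
encodings = [tokenizer(t) for t in texts]

# Randomly mask ~15% of tokens and train the model to reconstruct them.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
batch = collator(encodings)
loss = model(**batch).loss
loss.backward()  # one illustrative step of the shared multilingual objective
```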

The SIGMORPHON 2019 Shared Task: Morphological Analysis in Context and Cross-Lingual Transfer for Inflection

no code implementations WS 2019 Arya D. McCarthy, Ekaterina Vylomova, Shijie Wu, Chaitanya Malaviya, Lawrence Wolf-Sonkin, Garrett Nicolai, Christo Kirov, Miikka Silfverberg, Sabrina J. Mielke, Jeffrey Heinz, Ryan Cotterell, Mans Hulden

The SIGMORPHON 2019 shared task on cross-lingual transfer and contextual analysis in morphology examined transfer learning of inflection between 100 language pairs, as well as contextual lemmatization and morphosyntactic description in 66 languages.

Cross-Lingual Transfer, Lemmatization +3

Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT

2 code implementations IJCNLP 2019 Shijie Wu, Mark Dredze

Pretrained contextual representation models (Peters et al., 2018; Devlin et al., 2018) have pushed forward the state-of-the-art on many NLP tasks.

Cross-Lingual NER, Dependency Parsing +5

Hard Non-Monotonic Attention for Character-Level Transduction

2 code implementations EMNLP 2018 Shijie Wu, Pamela Shapiro, Ryan Cotterell

We compare soft and hard non-monotonic attention experimentally and find that the exact algorithm significantly improves performance over the stochastic approximation and outperforms soft attention.

Hard Attention, Image Captioning
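The sketch below shows the generic soft-attention baseline the abstract refers to; the paper's contribution is an exact marginalization over discrete (hard, non-monotonic) alignments rather than this weighted average (the tensor shapes and dot-product scoring here are illustrative assumptions, not the authors' model):

```python
# Sketch: soft attention over encoder states for character-level transduction.
import torch
import torch.nn.functional as F

batch, src_len, dim = 2, 7, 32
encoder_states = torch.randn(batch, src_len, dim)   # one state per source character
decoder_state = torch.randn(batch, dim)             # current decoder hidden state

# Soft attention: a weighted average of encoder states under a softmax distribution.
scores = torch.bmm(encoder_states, decoder_state.unsqueeze(-1)).squeeze(-1)   # (batch, src_len)
weights = F.softmax(scores, dim=-1)                                           # attention distribution
context = torch.bmm(weights.unsqueeze(1), encoder_states).squeeze(1)          # (batch, dim)

# A hard-attention model instead treats the attended position as a latent alignment a
# and computes p(y | x) = sum_a p(a) p(y | x, a); the paper shows this sum can be
# computed exactly rather than approximated by sampling.
```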
