no code implementations • LREC 2022 • Daniel Edmiston, Phillip Keung, Noah A. Smith
Cross-lingual transfer learning without labeled target language data or parallel text has been surprisingly effective in zero-shot cross-lingual classification, question answering, unsupervised machine translation, etc.
1 code implementation • 6 Apr 2020 • Daniel Edmiston
This work describes experiments which probe the hidden representations of several BERT-style models for morphological content.
1 code implementation • ICLR 2020 • Taeuk Kim, Jihun Choi, Daniel Edmiston, Sang-goo Lee
With the recent success and popularity of pre-trained language models (LMs) in natural language processing, there has been a rise in efforts to understand their inner workings.
2 code implementations • 7 Sep 2018 • Taeuk Kim, Jihun Choi, Daniel Edmiston, Sanghwan Bae, Sang-goo Lee
Most existing recursive neural network (RvNN) architectures utilize only the structure of parse trees, ignoring syntactic tags which are provided as by-products of parsing.
no code implementations • WS 2018 • Daniel Edmiston, Karl Stratos
Our architecture, StAffNet, performs competitively with the state of the art on this task.