2 code implementations • 11 Feb 2020 • Jeremy Howard, Sylvain Gugger
These abstractions can be expressed concisely and clearly by leveraging the dynamism of the underlying Python language and the flexibility of the PyTorch library.
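A minimal sketch of what that layered API looks like in practice, assuming fastai v2 and the bundled Oxford-IIIT Pet sample; the calls below illustrate the high-level layer rather than quoting the paper:

```python
# Short end-to-end example using fastai's high-level vision layer.
# Dataset, architecture and schedule are illustrative choices.
from fastai.vision.all import *

path = untar_data(URLs.PETS)                    # download a small labelled image dataset
dls = ImageDataLoaders.from_name_re(
    path, get_image_files(path/"images"),
    pat=r'(.+)_\d+.jpg$',                       # class label is encoded in the file name
    item_tfms=Resize(224))

learn = cnn_learner(dls, resnet34, metrics=error_rate)  # transfer learning from ImageNet
learn.fine_tune(1)                                      # one-cycle fine-tuning with library defaults
```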
4 code implementations • IJCNLP 2019 • Julian Martin Eisenschlos, Sebastian Ruder, Piotr Czapla, Marcin Kardas, Sylvain Gugger, Jeremy Howard
Pretrained language models are particularly promising for low-resource languages, as they only require unlabelled data.
Ranked #1 on Zero-shot Cross-Lingual Document Classification on Cross-Lingual Sentiment (CLS) - English to German - DVD
1 code implementation • 6 Jul 2019 • Bobak Farzin, Piotr Czapla, Jeremy Howard
Our entry into the HAHA 2019 Challenge placed $3^{rd}$ in the classification task and $2^{nd}$ in the regression task.
2 code implementations • 24 Oct 2018 • Piotr Czapla, Jeremy Howard, Marcin Kardas
Universal Language Model Fine-tuning [arXiv:1801.06146] (ULMFiT) is one of the first NLP methods for efficient inductive transfer learning.
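A minimal sketch of the language-model fine-tuning stage of ULMFiT using the fastai v2 text API; the sample dataset, learning rate, and file name below are illustrative assumptions rather than the paper's setup:

```python
# Sketch of the language-model fine-tuning stage: adapt a pretrained AWD-LSTM
# language model to the target corpus, then save its encoder for a downstream
# classifier. Dataset, learning rate and file name are illustrative.
from fastai.text.all import *

path = untar_data(URLs.IMDB_SAMPLE)
df = pd.read_csv(path/'texts.csv')

# Language-model DataLoaders: labels are ignored, every text is used (is_lm=True).
dls_lm = TextDataLoaders.from_df(df, text_col='text', is_lm=True, valid_pct=0.1)

learn = language_model_learner(dls_lm, AWD_LSTM, metrics=Perplexity())
learn.fit_one_cycle(1, 1e-2)       # adapt the pretrained LM to the target corpus
learn.save_encoder('ft_encoder')   # encoder weights to reuse in the classifier stage
```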
5 code implementations • 5 Feb 2018 • Terence Parr, Jeremy Howard
This paper is an attempt to explain all the matrix calculus you need in order to understand the training of deep neural networks.
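As an illustrative example of the kind of rule the paper builds up (a standard identity, not quoted from the paper): for a single unit $u = f(\mathbf{w} \cdot \mathbf{x} + b)$ with scalar activation $f$, writing gradients in numerator (row-vector) layout, the chain rule gives
$$\frac{\partial u}{\partial \mathbf{w}} = f'(\mathbf{w} \cdot \mathbf{x} + b)\,\mathbf{x}^{\top}, \qquad \frac{\partial u}{\partial b} = f'(\mathbf{w} \cdot \mathbf{x} + b),$$
which is the shape of computation that backpropagation repeats layer by layer.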
65 code implementations • ACL 2018 • Jeremy Howard, Sebastian Ruder
Inductive transfer learning has greatly impacted computer vision, but existing approaches in NLP still require task-specific modifications and training from scratch.
Ranked #4 on Text Classification on AG News
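The classifier stage of ULMFiT (gradual unfreezing with discriminative learning rates) can be sketched as follows with the fastai v2 text API; the dataset and learning rates are illustrative, and the intermediate language-model fine-tuning stage from the paper is omitted for brevity:

```python
# Sketch of ULMFiT's classifier stage: start from a pretrained AWD-LSTM,
# train the new classifier head first, then gradually unfreeze with
# discriminative (layer-wise decreasing) learning rates.
# Dataset and hyperparameters are illustrative choices.
from fastai.text.all import *

path = untar_data(URLs.IMDB_SAMPLE)
df = pd.read_csv(path/'texts.csv')
dls = TextDataLoaders.from_df(df, text_col='text', label_col='label', valid_col='is_valid')

learn = text_classifier_learner(dls, AWD_LSTM, drop_mult=0.5, metrics=accuracy)

learn.fit_one_cycle(1, 2e-2)                          # train only the new classifier head
learn.freeze_to(-2)                                   # unfreeze the last recurrent layer group
learn.fit_one_cycle(1, slice(1e-2 / (2.6**4), 1e-2))  # discriminative learning rates
learn.unfreeze()                                      # finally unfreeze the whole model
learn.fit_one_cycle(2, slice(1e-3 / (2.6**4), 1e-3))
```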