no code implementations • 30 Mar 2023 • Daniel Campos, Alexandre Marques, Mark Kurtz, ChengXiang Zhai
In this paper, we introduce the range of oBERTa language models, an easy-to-use set of language models that allows Natural Language Processing (NLP) practitioners to obtain between 3.8 and 24.3 times faster models without expertise in model compression.
no code implementations • 25 May 2022 • Daniel Campos, Alexandre Marques, Tuan Nguyen, Mark Kurtz, ChengXiang Zhai
Our experimentation shows that models that are pruned during pretraining using general domain masked language models can transfer to novel domains and tasks without extensive hyperparameter exploration or specialized approaches.
no code implementations • NeurIPS 2018 • Alexandre Marques, Remi Lam, Karen Willcox
We introduce an algorithm to locate contours of functions that are expensive to evaluate.