no code implementations • RANLP 2021 • Koichi Nagatsuka, Clifford Broni-Bediako, Masayasu Atsumi
Recently, pre-trained language representation models such as BERT and RoBERTa have achieved significant results on a wide range of natural language processing (NLP) tasks; however, they require extremely high computational costs.
1 code implementation • 27 May 2020 • Clifford Broni-Bediako, Yuki Murata, Luiz Henrique Mormille, Masayasu Atsumi
The renaissance of neural architecture search (NAS) has seen classical methods such as genetic algorithms (GA) and genetic programming (GP) being exploited for convolutional neural network (CNN) architectures.