14 Jul 2022 • Javier de la Rosa, Eduardo G. Ponferrada, Paulo Villegas, Pablo Gonzalez de Prado Salas, Manu Romero, María Grandury
The pre-training of large language models usually requires massive amounts of resources, both in terms of computation and data.