Search Results for author: Piotr Nawrot

Found 5 papers, 4 papers with code

nanoT5: A PyTorch Framework for Pre-training and Fine-tuning T5-style Models with Limited Resources

1 code implementation • 5 Sep 2023 • Piotr Nawrot

With the introduction of this open-source framework, we hope to widen access to language-modelling research and meet the community's demand for more user-friendly T5 (encoder-decoder) implementations.

Tasks: Decoder · Language Modelling

Efficient Transformers with Dynamic Token Pooling

1 code implementation • 17 Nov 2022 • Piotr Nawrot, Jan Chorowski, Adrian Łańcucki, Edoardo M. Ponti

Transformers achieve unrivalled performance in modelling language, but remain inefficient in terms of memory and time complexity.
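The inefficiency comes from self-attention's quadratic cost in sequence length; pooling groups of token embeddings into shorter sequences reduces that cost. The sketch below shows only the simplest fixed-window variant of this idea in plain Python: the paper's actual contribution is learning segment boundaries dynamically, which this sketch does not implement, and the function name `pool_tokens` is illustrative, not from the paper's codebase.

```python
def pool_tokens(embeddings, segment_size):
    """Average consecutive token embeddings in windows of `segment_size`.

    embeddings: list of token vectors (each a list of floats).
    Returns a shorter list of pooled vectors; shrinking a length-n
    sequence by factor k cuts quadratic attention cost from O(n^2)
    to roughly O((n/k)^2) in the layers that run on pooled tokens.
    """
    pooled = []
    for start in range(0, len(embeddings), segment_size):
        segment = embeddings[start:start + segment_size]
        dim = len(segment[0])
        # Mean-pool each dimension across the tokens in this window.
        pooled.append([sum(vec[d] for vec in segment) / len(segment)
                       for d in range(dim)])
    return pooled

# Example: 6 tokens with 2-d embeddings, pooled in windows of 3.
tokens = [[1.0, 0.0], [2.0, 0.0], [3.0, 0.0],
          [4.0, 1.0], [5.0, 1.0], [6.0, 1.0]]
print(pool_tokens(tokens, 3))  # → [[2.0, 0.0], [5.0, 1.0]]
```

Fixed windows can split semantic units (e.g. a word across two segments); making the boundaries data-dependent is what distinguishes the dynamic pooling studied in the paper.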
