Wikipedia Summarization

2 papers with code • 0 benchmarks • 0 datasets


Most implemented papers

IT5: Text-to-text Pretraining for Italian Language Understanding and Generation

gsarti/it5 • 7 Mar 2022

We introduce IT5, the first family of encoder-decoder transformer models pretrained specifically on Italian.