Search Results for author: Dimitris Mamakas

Found 1 paper, 0 papers with code

Processing Long Legal Documents with Pre-trained Transformers: Modding LegalBERT and Longformer

no code implementations • 2 Nov 2022 • Dimitris Mamakas, Petros Tsotsi, Ion Androutsopoulos, Ilias Chalkidis

Even sparse-attention models, such as Longformer and BigBird, which increase the maximum input length to 4,096 sub-words, severely truncate texts in three of the six datasets of LexGLUE.

Document Classification
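
To illustrate the truncation issue mentioned in the abstract snippet above, here is a minimal sketch using the Hugging Face transformers library and the public allenai/longformer-base-4096 checkpoint. It is an illustration only, not code from the paper, and the sample document is a stand-in rather than a LexGLUE example.

```python
# Minimal sketch (not from the paper): shows how a Longformer tokenizer
# truncates a long document to its 4,096 sub-word limit.
from transformers import AutoTokenizer

# Public Longformer checkpoint with a 4,096-token window.
tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")

# Hypothetical stand-in for a long legal document; real LexGLUE documents
# in several datasets exceed this limit and get cut off.
long_document = "The court finds that the appellant's claim is unfounded. " * 2000

encoded = tokenizer(
    long_document,
    truncation=True,   # everything beyond max_length is dropped
    max_length=4096,
)
print(len(encoded["input_ids"]))  # capped at 4,096 sub-words
```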
