Search Results for author: Brandon R. Anderson

Found 1 paper, 1 paper with code

When Does Pretraining Help? Assessing Self-Supervised Learning for Law and the CaseHOLD Dataset

2 code implementations • 18 Apr 2021 • Lucia Zheng, Neel Guha, Brandon R. Anderson, Peter Henderson, Daniel E. Ho

While a Transformer architecture (BERT) pretrained on a general corpus (Google Books and Wikipedia) improves performance, domain pretraining (using a corpus of approximately 3.5M decisions across all courts in the U.S. that is larger than BERT's) with a custom legal vocabulary exhibits the most substantial performance gains on CaseHOLD (a gain of 7.2% on F1, representing a 12% improvement over BERT) and consistent performance gains across two other legal tasks.

Tasks: Multiple-choice Question Answering (+3 more)
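The abstract frames CaseHOLD as a multiple-choice task: given a citing context, pick the correct holding statement among candidates. Below is a minimal sketch (not the authors' reference code) of how such an instance could be scored with a domain-pretrained BERT via Hugging Face Transformers; the checkpoint name and the example text are assumptions, and any BERT-style checkpoint can be substituted.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

# Assumed checkpoint name for a legal-domain BERT; swap in your own model.
MODEL_NAME = "casehold/custom-legalbert"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# Note: if the checkpoint lacks a multiple-choice head, one is freshly
# initialized and would need fine-tuning before the scores are meaningful.
model = AutoModelForMultipleChoice.from_pretrained(MODEL_NAME)

# One citing context and five candidate holdings (toy placeholders).
context = "The court granted summary judgment, holding that ..."
candidates = [f"holding that option {i} applies" for i in range(5)]

# Pair the same context with each candidate; the model expects
# input tensors of shape (batch, num_choices, seq_len).
enc = tokenizer(
    [context] * len(candidates),
    candidates,
    truncation=True,
    padding=True,
    return_tensors="pt",
)
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_choices)
print("predicted holding index:", logits.argmax(dim=-1).item())
```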
