Search Results for author: Shean Wang

Found 1 paper, 1 paper with code

LoRA: Low-Rank Adaptation of Large Language Models

48 code implementations ICLR 2022 Edward J. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen

We propose Low-Rank Adaptation, or LoRA, which freezes the pre-trained model weights and injects trainable rank decomposition matrices into each layer of the Transformer architecture, greatly reducing the number of trainable parameters for downstream tasks.

Language Modelling
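
To make the summary above concrete, here is a minimal sketch of a LoRA-adapted linear layer in PyTorch. The class name LoRALinear and the hyperparameters r and alpha are illustrative choices for this sketch, not the authors' reference implementation (https://github.com/microsoft/LoRA):

```python
# Minimal sketch of the LoRA idea: freeze the pre-trained weight W and
# train only a low-rank update B @ A added to it. Names and defaults here
# are assumptions for illustration, not the official implementation.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen linear layer W plus a trainable rank-r update.

    Forward pass computes h = W x + (alpha / r) * (B A) x, where W is
    frozen and only A (r x in_features) and B (out_features x r) train.
    """

    def __init__(self, in_features: int, out_features: int,
                 r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)  # freeze pre-trained weight

        # A gets a small Gaussian init, B starts at zero, so the update
        # B @ A is zero at the start and training begins exactly from the
        # pre-trained model.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


if __name__ == "__main__":
    layer = LoRALinear(64, 64, r=4)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable params: {trainable} / {total}")  # 512 / 4608
```

With rank r much smaller than the layer width, the trainable parameter count drops sharply (here 512 of 4608), which is the reduction in trainable parameters the abstract refers to.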
