1 code implementation • ICCV 2023 • Dongjun Lee, Seokwon Song, Jihee Suh, Joonmyung Choi, Sanghyeok Lee, Hyunwoo J. Kim
RPO (Read-only Prompt Optimization) leverages masked attention to prevent internal representation shift in the pre-trained model: learnable prompts can read from the frozen features, but the original tokens cannot attend to the prompts, so their representations remain unchanged.
Ranked #6 on Prompt Engineering on Caltech-101
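The read-only masking idea can be illustrated with a minimal NumPy sketch (not the paper's implementation; the function names and the toy single-head attention with identity projections are illustrative assumptions): original tokens attend only to each other, while appended prompt tokens attend to everything.

```python
import numpy as np

def read_only_attention_mask(n_tokens: int, n_prompts: int) -> np.ndarray:
    """Additive attention mask for read-only prompts (illustrative sketch).

    Rows 0..n_tokens-1 are the original tokens: they may attend only to
    other original tokens, so the prompts cannot shift their
    representations. The last n_prompts rows are prompt tokens, which
    may attend to everything, i.e. they "read" the frozen features.
    """
    total = n_tokens + n_prompts
    mask = np.zeros((total, total))
    # Block original-token rows from attending to prompt columns.
    mask[:n_tokens, n_tokens:] = -np.inf
    return mask

def masked_self_attention(x: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Toy single-head self-attention with identity Q/K/V projections."""
    scores = x @ x.T / np.sqrt(x.shape[-1]) + mask
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x
```

With this mask, the outputs for the original tokens are identical to running attention on the original tokens alone, regardless of the prompt values, which is the "no internal representation shift" property.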