Search Results for author: Sara Kangaslahti

Found 2 papers, 1 paper with code

Continuous Language Model Interpolation for Dynamic and Controllable Text Generation

2 code implementations · 10 Apr 2024 · Sara Kangaslahti, David Alvarez-Melis

We empirically show that varying the interpolation weights yields predictable and consistent change in the model outputs with respect to all of the controlled attributes.

Language Modelling · Text Generation
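The abstract snippet above describes sweeping interpolation weights between models to control output attributes. As a minimal, hypothetical sketch of linear weight interpolation between two fine-tuned models (parameters shown as plain dicts of floats; the paper's actual method operates on full language model weights and multiple attribute directions):

```python
# Hedged sketch: linear interpolation between two parameter sets,
# theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b.
# Toy dicts stand in for real model state dicts; names are illustrative.

def interpolate_weights(theta_a, theta_b, alpha):
    """Linearly interpolate two parameter sets with weight alpha in [0, 1]."""
    assert theta_a.keys() == theta_b.keys()
    return {k: (1 - alpha) * theta_a[k] + alpha * theta_b[k] for k in theta_a}

# Two toy "models" fine-tuned toward opposite ends of some attribute.
model_low = {"w1": 0.0, "w2": 2.0}
model_high = {"w1": 1.0, "w2": 4.0}

# Sweeping alpha from 0 to 1 moves the merged parameters smoothly
# between the two endpoints, which is what makes the control continuous.
for alpha in (0.0, 0.5, 1.0):
    merged = interpolate_weights(model_low, model_high, alpha)
    print(alpha, merged)
```

Because the map from alpha to each parameter is linear, intermediate alphas produce intermediate parameter values, consistent with the "predictable and consistent change" the abstract reports.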

Can You Label Less by Using Out-of-Domain Data? Active & Transfer Learning with Few-shot Instructions

no code implementations · 21 Nov 2022 · Rafal Kocielnik, Sara Kangaslahti, Shrimai Prabhumoye, Meena Hari, R. Michael Alvarez, Anima Anandkumar

Finally, we find that not all transfer scenarios yield a positive gain, which seems related to the PLM's initial performance on the target-domain task.

Active Learning · Transfer Learning
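For context on the active learning side of this paper, a generic uncertainty-sampling loop looks like the following. This is a hedged sketch of the standard technique, not the paper's exact protocol (which uses few-shot instructions with pretrained language models); the function names and toy data are illustrative:

```python
# Hedged sketch: uncertainty sampling, the classic active learning
# selection rule. Examples with the highest predictive entropy are
# chosen for labeling. Names and toy data are hypothetical.
import math

def entropy(probs):
    """Shannon entropy of a probability distribution (natural log)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_to_label(unlabeled, predict_proba, budget):
    """Pick the `budget` examples the model is least certain about."""
    scored = sorted(unlabeled, key=lambda x: entropy(predict_proba(x)), reverse=True)
    return scored[:budget]

# Toy pool: examples are (id, prob_positive) pairs, with a predictor
# that just reads off the stored probability.
pool = [("a", 0.95), ("b", 0.55), ("c", 0.10), ("d", 0.48)]
proba = lambda x: (x[1], 1 - x[1])

picked = select_to_label(pool, proba, budget=2)
print([x[0] for x in picked])  # → ['d', 'b'] (probabilities closest to 0.5)
```

Examples with predicted probability near 0.5 have the highest entropy, so they are labeled first; the transfer-learning question in the paper is whether out-of-domain labels can substitute for some of these in-domain queries.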
