Search Results for author: Afra Amini

Found 14 papers, 6 papers with code

Conditional Poisson Stochastic Beams

no code implementations EMNLP 2021 Clara Meister, Afra Amini, Tim Vieira, Ryan Cotterell

Beam search is the default decoding strategy for many sequence generation tasks in NLP.
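For context, here is a minimal sketch of the deterministic beam search baseline that this paper turns into a stochastic process; the `score_next` interface and token handling are illustrative assumptions, not the paper's implementation.

```python
def beam_search(score_next, bos, eos, beam_size=4, max_len=20):
    """Standard (deterministic) beam search over a toy scoring interface.

    `score_next(prefix)` is assumed to return a list of (token, log_prob)
    pairs for the next position; `bos`/`eos` are start / end tokens.
    """
    beams = [([bos], 0.0)]  # (token sequence, cumulative log-probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, logp in beams:
            if seq[-1] == eos:
                finished.append((seq, logp))
                continue
            for tok, lp in score_next(seq):
                candidates.append((seq + [tok], logp + lp))
        if not candidates:
            break
        # Keep only the top-K highest-scoring hypotheses (the "beam").
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    return sorted(finished + beams, key=lambda c: c[1], reverse=True)
```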

Variational Best-of-N Alignment

no code implementations 8 Jul 2024 Afra Amini, Tim Vieira, Ryan Cotterell

To the extent this fine-tuning is successful and we end up with a good approximation, we have reduced the inference cost by a factor of N. Our experiments on a controlled generation task suggest that, while variational BoN is not as effective as BoN at aligning language models, it comes close to BoN performance: vBoN appears more often on the Pareto frontier of reward and KL divergence than models trained with a KL-constrained RL objective.
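For reference, a minimal sketch of the Best-of-N (BoN) procedure that variational BoN is trained to imitate, which makes the factor-of-N inference cost explicit; `generate` and `reward` are placeholder callables, not the paper's code.

```python
def best_of_n(prompt, generate, reward, n=16):
    """Best-of-N (BoN) sampling: the inference-time procedure that
    variational BoN aims to distill into a single fine-tuned model.

    `generate(prompt)` is assumed to draw one sample from the base LM and
    `reward(prompt, response)` to score it; both are placeholders.
    """
    samples = [generate(prompt) for _ in range(n)]        # N forward passes
    return max(samples, key=lambda y: reward(prompt, y))  # keep the best one
```

Fine-tuning a model to match the distribution of these selected samples is what removes the N-fold sampling cost at inference time.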

Language Modelling, Variational Inference

The Role of $n$-gram Smoothing in the Age of Neural Networks

no code implementations 25 Mar 2024 Luca Malagutti, Andrius Buinovskij, Anej Svete, Clara Meister, Afra Amini, Ryan Cotterell

For nearly three decades, language models derived from the $n$-gram assumption held the state of the art in language modelling.
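As a concrete example of that classical setting, here is a minimal sketch of a bigram model with add-$k$ (Lidstone) smoothing, one representative member of the smoothing family the paper revisits; the toy corpus is purely illustrative.

```python
from collections import Counter

def addk_bigram_lm(corpus, k=1.0):
    """Bigram language model with add-k (Lidstone) smoothing, one classical
    member of the n-gram smoothing family."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        vocab.update(toks)
        unigrams.update(toks[:-1])            # context counts
        bigrams.update(zip(toks[:-1], toks[1:]))
    V = len(vocab)

    def prob(w, prev):
        # Smoothing guarantees a nonzero probability for unseen bigrams.
        return (bigrams[(prev, w)] + k) / (unigrams[prev] + k * V)

    return prob

p = addk_bigram_lm([["the", "cat", "sat"], ["the", "dog", "sat"]])
print(p("cat", "the"))   # seen bigram
print(p("dog", "cat"))   # unseen bigram, still > 0 thanks to smoothing
```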

Language Modelling, Machine Translation

Direct Preference Optimization with an Offset

1 code implementation 16 Feb 2024 Afra Amini, Tim Vieira, Ryan Cotterell

DPO, as originally formulated, relies on binary preference data and fine-tunes a language model to increase the likelihood of a preferred response over a dispreferred response.
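For reference, a minimal sketch of the standard DPO loss described above, computed from summed sequence log-probabilities under the policy and a frozen reference model; the offset variant introduced in the paper is not reproduced here.

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_logp_w, policy_logp_l, ref_logp_w, ref_logp_l, beta=0.1):
    """Standard DPO loss on a batch of binary preference pairs.

    Inputs are summed log-probabilities of the preferred (w) and
    dispreferred (l) responses under the policy and a frozen reference
    model. The paper's offset variant is deliberately not shown.
    """
    # Implicit reward margin: beta * (log-ratio of preferred - dispreferred).
    logits = beta * ((policy_logp_w - ref_logp_w) - (policy_logp_l - ref_logp_l))
    # Maximise the probability that the preferred response wins.
    return -F.logsigmoid(logits).mean()
```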

Language Modelling

Principled Gradient-based Markov Chain Monte Carlo for Text Generation

no code implementations 29 Dec 2023 Li Du, Afra Amini, Lucas Torroba Hennigen, Xinyan Velocity Yu, Jason Eisner, Holden Lee, Ryan Cotterell

Recent papers have demonstrated the possibility of energy-based text generation by adapting gradient-based sampling algorithms, a paradigm of MCMC algorithms that promises fast convergence.
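As background, a minimal sketch of the unadjusted Langevin update that gradient-based text samplers adapt, applied to a continuous relaxation of the text under a placeholder differentiable energy; the corrections for discreteness that the paper analyzes are omitted.

```python
import torch

def langevin_sample(energy, x0, step=0.1, n_steps=200):
    """Unadjusted Langevin dynamics on a continuous relaxation x of the text.

    `energy(x)` is assumed to be a differentiable scalar (e.g. negative LM
    log-probability plus constraint penalties); this is only the generic
    gradient-based MCMC step, not the paper's principled sampler.
    """
    x = x0.detach().clone()
    for _ in range(n_steps):
        x.requires_grad_(True)
        (grad,) = torch.autograd.grad(energy(x), x)
        # Gradient step toward low energy plus Gaussian exploration noise.
        x = (x - step * grad + (2 * step) ** 0.5 * torch.randn_like(x)).detach()
    return x
```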

Language Modelling, Text Generation

Assessing Large Language Models on Climate Information

no code implementations 4 Oct 2023 Jannis Bulian, Mike S. Schäfer, Afra Amini, Heidi Lam, Massimiliano Ciaramita, Ben Gaiarin, Michelle Chen Hübscher, Christian Buck, Niels G. Mede, Markus Leippold, Nadine Strauß

As Large Language Models (LLMs) rise in popularity, it is necessary to assess their capability in critically relevant domains.

Which Spurious Correlations Impact Reasoning in NLI Models? A Visual Interactive Diagnosis through Data-Constrained Counterfactuals

no code implementations 21 Jun 2023 Robin Chan, Afra Amini, Mennatallah El-Assady

We present a human-in-the-loop dashboard tailored to diagnosing potential spurious features that NLI models rely on for predictions.

Logical Fallacies

Hexatagging: Projective Dependency Parsing as Tagging

1 code implementation 8 Jun 2023 Afra Amini, Tianyu Liu, Ryan Cotterell

We introduce a novel dependency parser, the hexatagger, that constructs dependency trees by tagging the words in a sentence with elements from a finite set of possible tags.

Computational Efficiency, Dependency Parsing +2

Structured Voronoi Sampling

1 code implementation NeurIPS 2023 Afra Amini, Li Du, Ryan Cotterell

In this paper, we take an important step toward building a principled approach for sampling from language models with gradient-based methods.

Text Generation

Linear-Time Modeling of Linguistic Structure: An Order-Theoretic Perspective

no code implementations 24 May 2023 Tianyu Liu, Afra Amini, Mrinmaya Sachan, Ryan Cotterell

We show that these exhaustive comparisons can be avoided, and, moreover, the complexity of such tasks can be reduced to linear by casting the relation between tokens as a partial order over the string.

coreference-resolution, Dependency Parsing +1

In-Context Probing: Toward Building Robust Classifiers via Probing Large Language Models

no code implementations 23 May 2023 Afra Amini, Massimiliano Ciaramita

However, the effectiveness of in-context learning is dependent on the provided context, and the performance on a downstream task can vary considerably, depending on the instruction.

In-Context Learning

On Parsing as Tagging

1 code implementation 14 Nov 2022 Afra Amini, Ryan Cotterell

There have been many proposals to reduce constituency parsing to tagging in the literature.

Constituency Parsing

Naturalistic Causal Probing for Morpho-Syntax

1 code implementation 14 May 2022 Afra Amini, Tiago Pimentel, Clara Meister, Ryan Cotterell

Probing has become a go-to methodology for interpreting and analyzing deep neural models in natural language processing.
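For context, a minimal sketch of the standard correlational probing setup: fit a small classifier on frozen model representations and test whether a linguistic property is decodable. The naturalistic causal interventions this paper contributes are not shown, and the toy data is purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def linear_probe(train_reps, train_labels, test_reps, test_labels):
    """Fit a linear probe on frozen representations and report accuracy.

    `train_reps`/`test_reps` are assumed to be arrays of hidden states
    extracted from a frozen model; labels encode the linguistic property
    being probed (e.g. number or gender).
    """
    probe = LogisticRegression(max_iter=1000).fit(train_reps, train_labels)
    return probe.score(test_reps, test_labels)

# Toy usage with random "representations" in which dimension 0 encodes the label.
rng = np.random.default_rng(0)
reps = rng.normal(size=(200, 64))
labels = (reps[:, 0] > 0).astype(int)
print(linear_probe(reps[:150], labels[:150], reps[150:], labels[150:]))
```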

Sentence

Conditional Poisson Stochastic Beam Search

1 code implementation 22 Sep 2021 Clara Meister, Afra Amini, Tim Vieira, Ryan Cotterell

In this work, we propose a new method for turning beam search into a stochastic process: Conditional Poisson stochastic beam search.
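As background, a minimal sketch of conditional Poisson sampling, the fixed-size subset design underlying the method: independent Bernoulli draws conditioned (here by simple rejection) on exactly $k$ successes. The integration into beam search and the calibration of inclusion probabilities from the paper are not shown, and the odds-based parameterization is an illustrative choice.

```python
import random

def conditional_poisson_sample(weights, k, max_tries=100_000):
    """Draw a size-k subset via conditional Poisson sampling.

    Each candidate i is included independently with probability
    w_i / (1 + w_i); we condition on the subset having exactly k elements
    by rejection, which is simple but not how one would implement it at
    scale inside a beam search.
    """
    probs = [w / (1.0 + w) for w in weights]
    for _ in range(max_tries):
        subset = [i for i, p in enumerate(probs) if random.random() < p]
        if len(subset) == k:     # condition the Poisson design on |S| = k
            return subset
    raise RuntimeError("rejection sampling failed; adjust weights or k")

# Example: pick 2 of 5 beam candidates, favouring the higher-weight ones.
print(conditional_poisson_sample([3.0, 2.0, 1.0, 0.5, 0.1], k=2))
```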
