Search Results for author: Michael Franke

Found 6 papers, 1 paper with code

Predictions from language models for multiple-choice tasks are not robust under variation of scoring methods

no code implementations • 1 Mar 2024 • Polina Tsvilodub, Hening Wang, Sharon Grosch, Michael Franke

This paper systematically compares different methods of deriving item-level predictions of language models for multiple-choice tasks.

Multiple-choice
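
For the scoring-method comparison above, here is a minimal sketch of two common ways of scoring answer options from an LM's per-token log-probabilities (summed vs. length-normalized log-likelihood). The option strings, numbers, and function names are illustrative assumptions, not the paper's materials; the point is only that the induced ranking can change with the scoring rule.

```python
import math

# Hypothetical per-token log-probabilities of each answer option given the
# question context, e.g. obtained from any autoregressive LM. The numbers
# below are made up for illustration only.
option_token_logprobs = {
    "yes":       [-0.9],
    "no":        [-1.4],
    "maybe not": [-2.1, -0.6],
}

def sum_logprob(logprobs):
    # Raw (summed) log-likelihood of the full answer string.
    return sum(logprobs)

def mean_logprob(logprobs):
    # Length-normalized score: average log-likelihood per token,
    # which removes the bias against longer answer options.
    return sum(logprobs) / len(logprobs)

def rank(options, score_fn):
    # Rank options from best to worst under a given scoring rule.
    return sorted(options, key=lambda o: score_fn(options[o]), reverse=True)

# The two rules disagree on the relative order of "no" and "maybe not",
# illustrating how item-level predictions can shift with the scoring method.
print("summed log-prob ranking:", rank(option_token_logprobs, sum_logprob))
print("mean log-prob ranking:  ", rank(option_token_logprobs, mean_logprob))
```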

Evaluating Pragmatic Abilities of Image Captioners on A3DS

no code implementations • 22 May 2023 • Polina Tsvilodub, Michael Franke

Evaluating grounded neural language model performance with respect to pragmatic qualities, such as the trade-off between truthfulness, contrastivity, and overinformativity of generated utterances, remains a challenge in the absence of data collected from humans.

Language Modelling

Overinformative Question Answering by Humans and Machines

no code implementations • 11 May 2023 • Polina Tsvilodub, Michael Franke, Robert D. Hawkins, Noah D. Goodman

When faced with a polar question, speakers often provide overinformative answers going beyond a simple "yes" or "no".

Question Answering

A practical introduction to the Rational Speech Act modeling framework

no code implementations • 20 May 2021 • Gregory Scontras, Michael Henry Tessler, Michael Franke

Recent advances in computational cognitive science (i.e., simulation-based probabilistic programs) have paved the way for significant progress in formal, implementable models of pragmatics.
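
As a companion to this tutorial entry, here is a minimal sketch of the vanilla Rational Speech Act recursion (literal listener, pragmatic speaker, pragmatic listener) for a simple reference game. The objects, utterances, and rationality parameter below are illustrative assumptions, not the paper's own examples or code.

```python
import numpy as np

# Minimal vanilla RSA for a simple reference game (illustrative setup).
# States are objects a speaker may refer to; utterances are one-word
# descriptions with a Boolean literal semantics.
states = ["blue_square", "blue_circle", "green_square"]
utterances = ["blue", "green", "square", "circle"]

# meaning[u][s] = 1 if utterance u is literally true of state s
meaning = np.array([
    [1, 1, 0],  # "blue"
    [0, 0, 1],  # "green"
    [1, 0, 1],  # "square"
    [0, 1, 0],  # "circle"
], dtype=float)

prior = np.ones(len(states)) / len(states)  # uniform prior over states
alpha = 1.0                                 # speaker rationality parameter

def normalize(m, axis):
    return m / m.sum(axis=axis, keepdims=True)

# Literal listener: L0(s | u) proportional to [[u]](s) * P(s)
L0 = normalize(meaning * prior, axis=1)

# Pragmatic speaker: S1(u | s) proportional to exp(alpha * log L0(s | u)),
# assuming zero utterance cost here.
with np.errstate(divide="ignore"):
    S1 = normalize(np.exp(alpha * np.log(L0)), axis=0)

# Pragmatic listener: L1(s | u) proportional to S1(u | s) * P(s)
L1 = normalize(S1 * prior, axis=1)

print("L1(state | 'blue') =",
      dict(zip(states, L1[utterances.index("blue")].round(2))))
# Hearing "blue", the pragmatic listener favors the blue square, since a
# speaker intending the blue circle would more likely have said "circle".
```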

Probabilistic modeling of rational communication with conditionals

no code implementations • 12 May 2021 • Britta Grusdt, Daniel Lassiter, Michael Franke

While a large body of work has scrutinized the meaning of conditional sentences, considerably less attention has been paid to formal models of their pragmatic use and interpretation.

From partners to populations: A hierarchical Bayesian account of coordination and convention

1 code implementation • 12 Apr 2021 • Robert D. Hawkins, Michael Franke, Michael C. Frank, Adele E. Goldberg, Kenny Smith, Thomas L. Griffiths, Noah D. Goodman

Languages are powerful solutions to coordination problems: they provide stable, shared expectations about how the words we say correspond to the beliefs and intentions in our heads.

Continual Learning
