Search Results for author: Michael Hahn

Found 9 papers, 3 with code

Sensitivity as a Complexity Measure for Sequence Classification Tasks

1 code implementation • 21 Apr 2021 • Michael Hahn, Dan Jurafsky, Richard Futrell

We introduce a theoretical framework for understanding and predicting the complexity of sequence classification tasks, using a novel extension of the theory of Boolean function sensitivity.

Tasks: Classification, General Classification, +1
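The paper above builds on classical Boolean function sensitivity: the sensitivity of f at an input x is the number of bit positions whose flip changes f(x), and the sensitivity of f is the maximum over all inputs. A minimal sketch of that classical notion (not the paper's extension to sequence classification) might look like:

```python
from itertools import product

def sensitivity_at(f, x):
    """Count bit positions i where flipping x[i] changes f(x)."""
    base = f(x)
    count = 0
    for i in range(len(x)):
        flipped = x[:i] + (1 - x[i],) + x[i + 1:]
        if f(flipped) != base:
            count += 1
    return count

def sensitivity(f, n):
    """Sensitivity of f: the maximum pointwise sensitivity over all n-bit inputs."""
    return max(sensitivity_at(f, x) for x in product((0, 1), repeat=n))

# Parity flips its output whenever any single bit flips,
# so its sensitivity equals n at every input.
parity = lambda x: sum(x) % 2
```

Parity is the standard maximally sensitive example; low-sensitivity functions correspond, in the paper's framework, to easier sequence classification tasks.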

Theoretical Limitations of Self-Attention in Neural Sequence Models

no code implementations • TACL 2020 • Michael Hahn

These limitations seem surprising given the practical success of self-attention and the prominent role assigned to hierarchical structure in linguistics, suggesting that natural language can be approximated well with models that are too weak for the formal languages typically assumed in theoretical linguistics.

Tasks: Hierarchical structure

Character-based Surprisal as a Model of Reading Difficulty in the Presence of Error

no code implementations • 2 Feb 2019 • Michael Hahn, Frank Keller, Yonatan Bisk, Yonatan Belinkov

Also, transpositions are more difficult than misspellings, and a high error rate increases difficulty for all words, including correct ones.

Tasks: Eye Tracking

Modeling Task Effects in Human Reading with Neural Attention

no code implementations • 31 Jul 2018 • Michael Hahn, Frank Keller

We propose a neural architecture that combines an attention module (deciding whether to skip words) and a task module (memorizing the input).

Tasks: Eye Tracking, Question Answering
