Hard Attention

28 papers with code • 0 benchmarks • 0 datasets

Hard attention mechanisms attend to a discrete subset of the input, such as a single image region or source-sequence position, at each step, in contrast to soft attention, which computes a differentiable weighted average over all positions. Because the discrete selection is non-differentiable, hard attention models are typically trained with reinforcement-learning estimators such as REINFORCE, or by marginalizing exactly over the latent choices.
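
A minimal sketch of the two regimes, assuming PyTorch and toy tensor shapes (none of this is taken from the repositories below): soft attention averages over all positions, while hard attention samples one and keeps the sampled log-probability around for a REINFORCE-style gradient.

```python
import torch
import torch.nn.functional as F

def soft_attention(scores, values):
    # scores: (batch, n); values: (batch, n, d)
    weights = F.softmax(scores, dim=-1)                 # dense weights over all positions
    return (weights.unsqueeze(-1) * values).sum(dim=1)  # differentiable weighted average

def hard_attention(scores, values):
    # Sample a single position per example; the choice is discrete, so
    # gradients must come from REINFORCE (via log_prob) or a relaxation.
    probs = F.softmax(scores, dim=-1)
    idx = torch.multinomial(probs, num_samples=1)       # (batch, 1)
    picked = values.gather(1, idx.unsqueeze(-1).expand(-1, -1, values.size(-1)))
    log_prob = probs.gather(1, idx).log()               # score of the sampled choice
    return picked.squeeze(1), log_prob.squeeze(1)
```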

Most implemented papers

Recurrent Models of Visual Attention

kevinzakka/recurrent-visual-attention NeurIPS 2014

Applying convolutional neural networks to large images is computationally expensive because the amount of computation scales linearly with the number of image pixels.
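
The core trick, sketched below with assumed shapes (this is not the kevinzakka implementation): only a small fixed-size glimpse of the image is processed per step, so per-step compute is independent of image resolution, and the location policy is trained with REINFORCE.

```python
import torch

def extract_glimpse(image, loc, size=8):
    # image: (C, H, W); loc: (row, col) glimpse center in pixels.
    C, H, W = image.shape
    r = max(0, min(loc[0] - size // 2, H - size))  # clamp so the patch fits
    c = max(0, min(loc[1] - size // 2, W - size))
    # The returned patch is (C, size, size) regardless of H and W, which is
    # why a recurrent attention model's cost does not grow with image size.
    return image[:, r:r + size, c:c + size]
```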

Deep Attention Recurrent Q-Network

5vision/DARQN 5 Dec 2015

A deep learning approach to reinforcement learning led to a general learner able to train on visual input to play a variety of arcade games at the human and superhuman levels.

Overcoming catastrophic forgetting with hard attention to the task

joansj/hat ICML 2018

In this paper, we propose a task-based hard attention mechanism that preserves previous tasks' information without affecting the current task's learning.
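
A rough sketch of the gating idea (layer sizes and the scale s are illustrative, not the joansj/hat code): each task learns a per-unit embedding, and a scaled sigmoid of that embedding gates the layer's outputs, approaching a near-binary mask as s grows.

```python
import torch
import torch.nn as nn

class GatedLayer(nn.Module):
    def __init__(self, n_tasks, in_dim, out_dim):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)
        self.task_emb = nn.Embedding(n_tasks, out_dim)  # one gate vector per task

    def forward(self, x, task_id, s=50.0):
        # task_id: LongTensor of task indices. sigmoid(s * e) approaches a
        # binary mask for large s; annealing s during training lets gradients
        # flow early and hardens the mask later, protecting old-task units.
        gate = torch.sigmoid(s * self.task_emb(task_id))  # values in (0, 1)
        return torch.relu(self.fc(x)) * gate
```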

Hard Non-Monotonic Attention for Character-Level Transduction

shijie-wu/neural-transducer EMNLP 2018

We compare soft and hard non-monotonic attention experimentally and find that the exact algorithm significantly improves performance over the stochastic approximation and outperforms soft attention.
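
When each output character's alignment to a source position is conditionally independent, the marginal likelihood can be computed exactly instead of approximated by sampling; a toy log-space sketch (tensor names assumed, not the shijie-wu code):

```python
import torch

def exact_nll(log_align, log_emit):
    # log_align: (T, J) log attention weights over J source positions per step.
    # log_emit:  (T, J) log p(y_t | x_j), the emission under each alignment.
    # Exact marginalization: log p(y|x) = sum_t logsumexp_j [log a_t(j) + log p(y_t|x_j)].
    return -torch.logsumexp(log_align + log_emit, dim=1).sum()
```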

Exact Hard Monotonic Attention for Character-Level Transduction

shijie-wu/neural-transducer ACL 2019

Our models achieve state-of-the-art performance on morphological inflection.
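
In the monotonic case the latent alignments are non-decreasing, and the exact marginal can be computed with a forward-algorithm dynamic program; a log-space sketch with uniform transition scores for brevity (not the repository's implementation):

```python
import torch

def monotonic_log_marginal(log_emit):
    # log_emit: (T, J) with log p(y_t | aligned to source position j).
    # alpha[j] marginalizes all non-decreasing alignments of y[:t+1]
    # whose step-t alignment is j.
    T, J = log_emit.shape
    alpha = log_emit[0]
    for t in range(1, T):
        # logcumsumexp enforces monotonicity: step t may extend any
        # alignment ending at a position j' <= j.
        alpha = torch.logcumsumexp(alpha, dim=0) + log_emit[t]
    return torch.logsumexp(alpha, dim=0)  # log p(y | x) over all alignments
```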

Saccader: Improving Accuracy of Hard Attention Models for Vision

google-research/google-research NeurIPS 2019

Although deep convolutional neural networks achieve state-of-the-art performance across nearly all image classification tasks, their decisions are difficult to interpret.

Progressive Attention Networks for Visual Attribute Prediction

hworang77/PAN 8 Jun 2016

We propose a novel attention model that can accurately attend to target objects of various scales and shapes in images.

Morphological Inflection Generation with Hard Monotonic Attention

roeeaharoni/morphological-reinflection ACL 2017

We present a neural model for morphological inflection generation which employs a hard attention mechanism, inspired by the nearly-monotonic alignment commonly found between the characters in a word and the characters in its inflection.

Reinforced Self-Attention Network: a Hybrid of Hard and Soft Attention for Sequence Modeling

taoshen58/DiSAN 31 Jan 2018

In this paper, we integrate both soft and hard attention into one context fusion model, "reinforced self-attention (ReSA)", for the mutual benefit of each other.
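
A toy sketch of the hybrid pattern (a deterministic top-k stands in for the RL-trained hard selector; shapes assumed, not the taoshen58/DiSAN code): hard attention prunes the token set, then soft self-attention operates only over the survivors.

```python
import torch
import torch.nn.functional as F

def hybrid_attention(x, select_logits, k):
    # x: (n, d) token states; select_logits: (n,) hard-selection scores.
    idx = select_logits.topk(k).indices          # hard step: keep k tokens
    sel = x[idx]                                 # (k, d)
    # Soft step: every token attends only over the selected subset,
    # reducing the n x n attention map to n x k.
    attn = F.softmax(x @ sel.t() / x.size(-1) ** 0.5, dim=-1)
    return attn @ sel                            # (n, d) fused context
```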

Sequence-to-sequence Models for Cache Transition Systems

xiaochang13/CacheTransition-Seq2seq ACL 2018

In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.