Hard Attention

34 papers with code • 0 benchmarks • 0 datasets




Most implemented papers

Recurrent Models of Visual Attention

kevinzakka/recurrent-visual-attention NeurIPS 2014

Applying convolutional neural networks to large images is computationally expensive because the amount of computation scales linearly with the number of image pixels.
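
A minimal sketch (illustrative, not the paper's code) of the hard-attention step behind glimpse models such as RAM: instead of processing the whole image, a discrete location is sampled and only a small patch ("glimpse") around it is extracted, so compute no longer scales with image size. The uniform location policy and `size` are assumptions; in RAM the location policy is learned with REINFORCE.

```python
import numpy as np

def extract_glimpse(image, center, size):
    """Crop a size x size patch centred at `center` (row, col), zero-padded."""
    half = size // 2
    padded = np.pad(image, half, mode="constant")
    r, c = center[0] + half, center[1] + half
    return padded[r - half:r + half + 1, c - half:c + half + 1]

def hard_attention_step(image, size=5, rng=None):
    """Sample a location (a discrete, non-differentiable choice) and crop a glimpse."""
    rng = rng or np.random.default_rng(0)
    h, w = image.shape
    center = (rng.integers(h), rng.integers(w))  # stochastic hard choice
    return center, extract_glimpse(image, center, size)

image = np.arange(100, dtype=float).reshape(10, 10)
center, glimpse = hard_attention_step(image, size=5)
```

Because the location choice is discrete, gradients cannot flow through it; this is why such models are typically trained with policy-gradient methods.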

Deep Attention Recurrent Q-Network

5vision/DARQN 5 Dec 2015

A deep learning approach to reinforcement learning led to a general learner able to train on visual input to play a variety of arcade games at human and superhuman levels.

Dual Attention Networks for Few-Shot Fine-Grained Recognition

msfuxian/DualAttentionNet Proceedings of the AAAI Conference on Artificial Intelligence 2022

Specifically, by producing attention guidance from deep activations of input images, our hard attention keeps only a few useful deep descriptors and forms them into a bag of instances for multi-instance learning.
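
A rough sketch (assumptions, not the paper's exact rule) of that selection step: treat each spatial position of a CNN feature map as a C-dimensional deep descriptor, score descriptors by activation magnitude, and keep only the top-k as the bag of instances. The L2-norm scoring and `k` are illustrative stand-ins for the paper's attention guidance.

```python
import numpy as np

def descriptor_bag(feature_map, k=3):
    """feature_map: (H, W, C) deep activations -> (k, C) bag of kept descriptors."""
    h, w, c = feature_map.shape
    descriptors = feature_map.reshape(h * w, c)
    scores = np.linalg.norm(descriptors, axis=1)   # proxy attention guidance
    keep = np.argsort(scores)[-k:]                 # hard selection: top-k only
    return descriptors[keep]

fmap = np.random.default_rng(0).normal(size=(7, 7, 16))
bag = descriptor_bag(fmap, k=3)
```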

Overcoming catastrophic forgetting with hard attention to the task

joansj/hat ICML 2018

In this paper, we propose a task-based hard attention mechanism that preserves previous tasks' information without affecting the current task's learning.
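
A minimal sketch of the HAT-style gating idea (toy values and the scaling constant `s` are assumptions): each task owns a learnable embedding per layer, and a sharply scaled sigmoid turns it into a near-binary mask that gates the layer's units, so units important to earlier tasks can be protected from later updates.

```python
import numpy as np

def task_mask(task_embedding, s=50.0):
    """Near-binary attention mask a = sigmoid(s * e); large s -> hard gate."""
    return 1.0 / (1.0 + np.exp(-s * task_embedding))

def gated_forward(h, task_embedding, s=50.0):
    """Gate a layer's activations h with the task's hard-attention mask."""
    return h * task_mask(task_embedding, s)

h = np.ones(4)                          # a layer's activations
e = np.array([3.0, -3.0, 2.0, -2.0])   # per-task embedding (toy values)
out = gated_forward(h, e)
```

During training the paper anneals the scaling so the mask is soft (and differentiable) early on and hardens toward binary as training proceeds.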

Hard Non-Monotonic Attention for Character-Level Transduction

shijie-wu/neural-transducer EMNLP 2018

We compare soft and hard non-monotonic attention experimentally and find that the exact algorithm significantly improves performance over the stochastic approximation and outperforms soft attention.

Exact Hard Monotonic Attention for Character-Level Transduction

shijie-wu/neural-transducer ACL 2019

Our models achieve state-of-the-art performance on morphological inflection.

Saccader: Improving Accuracy of Hard Attention Models for Vision

google-research/google-research NeurIPS 2019

Although deep convolutional neural networks achieve state-of-the-art performance across nearly all image classification tasks, their decisions are difficult to interpret.

Progressive Attention Networks for Visual Attribute Prediction

hworang77/PAN 8 Jun 2016

We propose a novel attention model that accurately attends to target objects of various scales and shapes in images.

Morphological Inflection Generation with Hard Monotonic Attention

roeeaharoni/morphological-reinflection ACL 2017

We present a neural model for morphological inflection generation which employs a hard attention mechanism, inspired by the nearly-monotonic alignment commonly found between the characters in a word and the characters in its inflection.
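
An illustrative sketch (not the paper's neural model) of hard monotonic attention for character transduction: the decoder attends to exactly one source position at a time and can only move that pointer forward, mirroring the near-monotonic character alignment between a word and its inflection. The hand-written action sequence stands in for a learned policy.

```python
def transduce(source, actions):
    """Apply (action, arg) ops; attention is a single monotonic source index."""
    i, out = 0, []
    for action, arg in actions:
        if action == "copy":                 # write the attended source character
            out.append(source[i])
        elif action == "write":              # emit a new character `arg`
            out.append(arg)
        elif action == "step":               # advance hard attention (monotonic)
            i = min(i + 1, len(source) - 1)
    return "".join(out)

# toy inflection: "walk" -> "walked" by copying each character, then a suffix
acts = [("copy", None), ("step", None)] * 4 + [("write", "e"), ("write", "d")]
result = transduce("walk", acts)
```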

Reinforced Self-Attention Network: a Hybrid of Hard and Soft Attention for Sequence Modeling

taoshen58/DiSAN 31 Jan 2018

In this paper, we integrate both soft and hard attention into one context fusion model, "reinforced self-attention (ReSA)", for the mutual benefit of each other.
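
A rough sketch of the hybrid idea (illustrative only, not ReSA itself): a hard selector first keeps a discrete subset of tokens, then ordinary soft attention runs over just the kept tokens. The top-k score-based selector below is a stand-in for ReSA's reinforcement-learned hard attention.

```python
import numpy as np

def soft_attention(query, keys, values):
    """Standard softmax attention over the given keys/values."""
    scores = keys @ query
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ values

def hybrid_attention(query, keys, values, k=2):
    """Hard top-k token selection, then soft attention over the survivors."""
    importance = keys @ query              # proxy saliency score
    keep = np.argsort(importance)[-k:]     # hard (discrete) selection
    return soft_attention(query, keys[keep], values[keep])

query = np.array([1.0, 0.0])
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.0]])
values = np.array([[10.0], [20.0], [30.0]])
out = hybrid_attention(query, keys, values, k=2)
```

The hard stage cuts the soft stage's cost and lets it focus on the most relevant tokens, which is the "mutual benefit" the snippet refers to.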