
Sequential Image Classification

6 papers with code · Computer Vision

Sequential image classification is the task of classifying a sequence of images. A common benchmark is sequential MNIST, in which the pixels of each image are presented to the model one at a time as a long sequence.


Greatest papers with code

An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling

4 Mar 2018 · locuslab/TCN

Our results indicate that a simple convolutional architecture outperforms canonical recurrent networks such as LSTMs across a diverse range of tasks and datasets, while demonstrating longer effective memory. We conclude that the common association between sequence modeling and recurrent networks should be reconsidered, and convolutional networks should be regarded as a natural starting point for sequence modeling tasks.

LANGUAGE MODELLING · MACHINE TRANSLATION · MUSIC MODELING · SEQUENTIAL IMAGE CLASSIFICATION
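
The building block the paper evaluates is a stack of dilated, causal 1-D convolutions with residual connections. Below is a minimal sketch of such a block in PyTorch; it illustrates the idea and is not the authors' locuslab/TCN implementation, which adds further details such as weight normalization and dropout.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConvBlock(nn.Module):
    """One residual block of dilated, causal 1-D convolution (illustrative sketch)."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        # Amount of left padding needed so no output depends on future time steps.
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                      # x: (batch, channels, time)
        out = F.pad(x, (self.pad, 0))          # pad on the left only (causality)
        out = torch.relu(self.conv(out))
        return out + x                         # residual connection

# Stacking blocks with exponentially growing dilation gives a large receptive field.
net = nn.Sequential(*[CausalConvBlock(32, dilation=2 ** i) for i in range(4)])
y = net(torch.randn(8, 32, 100))               # -> (8, 32, 100)
```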

Dilated Recurrent Neural Networks

NeurIPS 2017 · code-terminator/DilatedRNN

Learning with recurrent neural networks (RNNs) on long sequences faces three major challenges: 1) complex dependencies, 2) vanishing and exploding gradients, and 3) efficient parallelization. We introduce the DilatedRNN, whose multi-resolution dilated recurrent skip connections tackle all three, and, to provide a theory-based quantification of the architecture's advantages, a memory capacity measure, the mean recurrent length, which is more suitable for RNNs with long skip connections than existing measures.

SEQUENTIAL IMAGE CLASSIFICATION
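
A dilated recurrent skip connection can be read as: the hidden state at step t is updated from the state at step t - d rather than t - 1, which shortens the gradient path through time by a factor of d. Here is a minimal sketch under that reading, using a standard GRU cell; the cell choice and layer interface are assumptions, not taken from code-terminator/DilatedRNN.

```python
import torch
import torch.nn as nn

class DilatedRNNLayer(nn.Module):
    """Recurrent layer with a dilated skip connection: state at t depends on t - d."""
    def __init__(self, input_size, hidden_size, dilation):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.dilation = dilation
        self.hidden_size = hidden_size

    def forward(self, x):                      # x: (time, batch, input_size)
        T, B, _ = x.shape
        # One independent state per phase of the dilation.
        states = [x.new_zeros(B, self.hidden_size) for _ in range(self.dilation)]
        outputs = []
        for t in range(T):
            phase = t % self.dilation
            states[phase] = self.cell(x[t], states[phase])
            outputs.append(states[phase])
        return torch.stack(outputs)            # (time, batch, hidden_size)

layer = DilatedRNNLayer(16, 32, dilation=4)
h = layer(torch.randn(50, 8, 16))              # -> (50, 8, 32)
```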

Recurrent Batch Normalization

30 Mar 2016 · cooijmanstim/recurrent-batch-normalization

We propose a reparameterization of LSTM that brings the benefits of batch normalization to recurrent neural networks. Whereas previous works only apply batch normalization to the input-to-hidden transformation of RNNs, we demonstrate that it is both possible and beneficial to batch-normalize the hidden-to-hidden transition, thereby reducing internal covariate shift between time steps.

LANGUAGE MODELLING · QUESTION ANSWERING · READING COMPREHENSION · SEQUENTIAL IMAGE CLASSIFICATION
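
Concretely, the idea is to apply batch normalization to both the input-to-hidden and the hidden-to-hidden terms of the LSTM gate pre-activations, and optionally to the cell state before the output nonlinearity. The sketch below shows one simplified step; the paper additionally keeps separate normalization statistics per time step and recommends a small initial gamma, both of which are omitted here.

```python
import torch
import torch.nn as nn

class BNLSTMCell(nn.Module):
    """LSTM step with batch norm on both the input-to-hidden and hidden-to-hidden terms."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.wx = nn.Linear(input_size, 4 * hidden_size, bias=False)
        self.wh = nn.Linear(hidden_size, 4 * hidden_size, bias=False)
        self.bn_x = nn.BatchNorm1d(4 * hidden_size)   # input-to-hidden BN
        self.bn_h = nn.BatchNorm1d(4 * hidden_size)   # hidden-to-hidden BN
        self.bn_c = nn.BatchNorm1d(hidden_size)       # BN on the cell state

    def forward(self, x, state):
        h, c = state
        gates = self.bn_x(self.wx(x)) + self.bn_h(self.wh(h))
        i, f, o, g = gates.chunk(4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(self.bn_c(c))
        return h, c

cell = BNLSTMCell(16, 32)
h = c = torch.zeros(8, 32)
h, c = cell(torch.randn(8, 16), (h, c))
```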

Full-Capacity Unitary Recurrent Neural Networks

NeurIPS 2016 · stwisdom/urnn

Unitary recurrent neural networks (uRNNs), which use unitary recurrence matrices, have recently been proposed as a means to avoid the vanishing and exploding gradients that plague standard RNNs. Previous uRNNs restricted the recurrence matrix to a product of parameterized unitary matrices, leaving open the question of when this restriction limits representational power. To address this question, we propose full-capacity uRNNs that optimize their recurrence matrix over all unitary matrices, leading to significantly improved performance over uRNNs that use a restricted-capacity recurrence matrix.

SEQUENTIAL IMAGE CLASSIFICATION
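
The paper keeps the recurrence matrix on the full unitary manifold throughout training by taking gradient steps along the manifold with a Cayley-transform-style retraction. The sketch below illustrates the constraint with a simpler, assumed stand-in: an ordinary gradient step followed by an SVD-based projection back to the nearest unitary matrix. It is not the stwisdom/urnn update rule.

```python
import torch

n = 64
W = torch.linalg.qr(torch.randn(n, n, dtype=torch.complex64))[0]  # start from a unitary matrix
W.requires_grad_(True)

def project_to_unitary(M):
    # Nearest unitary matrix in the Frobenius sense: U @ Vh from the SVD of M.
    U, _, Vh = torch.linalg.svd(M)
    return U @ Vh

# Toy objective: push W @ h towards a target vector.
h = torch.randn(n, dtype=torch.complex64)
target = torch.randn(n, dtype=torch.complex64)
loss = (W @ h - target).abs().pow(2).sum()
loss.backward()

with torch.no_grad():
    W -= 1e-2 * W.grad                 # unconstrained gradient step...
    W.copy_(project_to_unitary(W))     # ...then retract back onto the unitary group
    W.grad.zero_()

# W is again unitary up to numerical error:
print((W.conj().T @ W - torch.eye(n, dtype=torch.complex64)).abs().max())
```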

Unitary Evolution Recurrent Neural Networks

20 Nov 2015 · Avmb/lowrank-gru

When the eigenvalues of the hidden-to-hidden weight matrix deviate from absolute value 1, optimization becomes difficult due to the well-studied issue of vanishing and exploding gradients, especially when trying to learn long-term dependencies. The challenge we address is that of parametrizing unitary matrices in a way that does not require expensive computations (such as eigendecomposition) after each weight update.

SEQUENTIAL IMAGE CLASSIFICATION
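
The inexpensive parametrization composes the recurrence matrix from structured unitary factors, such as diagonal phase matrices, unitary Fourier transforms, a fixed permutation, and Householder-style reflections, each of which remains unitary under ordinary gradient updates on its parameters. Below is a minimal sketch of applying one such composition to a hidden state; the exact ordering of factors is an assumption, not copied from the paper or Avmb/lowrank-gru.

```python
import torch

n = 64
# Learnable parameters: phase angles and a (complex) reflection vector.
theta1 = torch.randn(n, requires_grad=True)
theta2 = torch.randn(n, requires_grad=True)
v = torch.randn(n, dtype=torch.complex64, requires_grad=True)
perm = torch.randperm(n)                        # fixed random permutation

def reflect(h, v):
    # Householder-style reflection: h - 2 v (v^H h) / (v^H v), unitary for any v != 0.
    return h - 2 * v * torch.sum(v.conj() * h) / torch.sum(v.conj() * v)

def apply_unitary(h):
    h = torch.polar(torch.ones(n), theta1) * h  # diagonal phase matrix
    h = torch.fft.fft(h, norm="ortho")          # unitary Fourier transform
    h = h[perm]                                 # permutation
    h = reflect(h, v)                           # reflection
    h = torch.fft.ifft(h, norm="ortho")         # unitary inverse Fourier transform
    h = torch.polar(torch.ones(n), theta2) * h  # diagonal phase matrix
    return h

h = torch.randn(n, dtype=torch.complex64)
out = apply_unitary(h)
# The norm is preserved, so gradients can neither vanish nor explode through this map:
print(torch.linalg.vector_norm(h), torch.linalg.vector_norm(out))
```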

A Simple Way to Initialize Recurrent Networks of Rectified Linear Units

3 Apr 2015 · trevor-richardson/rnn_zoo

Learning long-term dependencies in recurrent networks is difficult due to vanishing and exploding gradients, and researchers have developed sophisticated optimization techniques and network architectures to overcome this difficulty. We propose a simpler solution: recurrent networks of rectified linear units, whose recurrent weight matrix is initialized to the identity matrix (or a scaled version of it) and whose biases are initialized to zero.

LANGUAGE MODELLING · SEQUENTIAL IMAGE CLASSIFICATION · SPEECH RECOGNITION
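
The "simple way" of the title is to use a plain RNN with rectified linear units, initialize the recurrent weight matrix to the identity, and set the biases to zero, so the hidden state is copied forward unchanged until training says otherwise. A minimal sketch with PyTorch's built-in ReLU RNN follows; the small scale for the input weights is the paper's recipe, but the exact value used here is an assumption.

```python
import torch
import torch.nn as nn

hidden_size = 100
rnn = nn.RNN(input_size=1, hidden_size=hidden_size, nonlinearity="relu")

with torch.no_grad():
    rnn.weight_hh_l0.copy_(torch.eye(hidden_size))   # identity recurrent matrix
    rnn.bias_hh_l0.zero_()
    rnn.bias_ih_l0.zero_()
    rnn.weight_ih_l0.normal_(0.0, 0.001)              # small random input weights

# E.g. pixel-by-pixel (sequential) MNIST: 784 time steps, batch of 8, 1 feature per step.
x = torch.randn(784, 8, 1)
output, h_n = rnn(x)
```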