Sequential Image Classification
37 papers with code • 3 benchmarks • 3 datasets
Sequential image classification is the task of classifying images presented as a sequence, for example reading a single image pixel-by-pixel or row-by-row, as in the Sequential MNIST (sMNIST) and permuted Sequential MNIST (psMNIST) benchmarks.
(Image credit: TensorFlow-101)
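To make the task concrete, here is a minimal sketch of sequential classification of a single image: a 28x28 image is read as a sequence of 28 row-vectors and fed step by step to a plain tanh RNN, with class scores read out from the final hidden state. All weights are random and untrained; the sizes and names are illustrative, not from any specific paper.

```python
import numpy as np

rng = np.random.default_rng(0)
H, D, C = 64, 28, 10              # hidden size, row width, number of classes
Wx = rng.normal(0, 0.1, (H, D))   # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))   # hidden-to-hidden weights
Wo = rng.normal(0, 0.1, (C, H))   # hidden-to-class readout

def classify_sequentially(image):
    """Run the RNN over the image rows and return class scores."""
    h = np.zeros(H)
    for row in image:             # one time step per image row
        h = np.tanh(Wx @ row + Wh @ h)
    return Wo @ h                 # logits over C classes

image = rng.random((28, 28))      # stand-in for an MNIST digit
logits = classify_sequentially(image)
print(logits.shape)
```

The pixel-by-pixel variants (sMNIST/psMNIST) work the same way, only with 784 one-dimensional steps instead of 28 row steps, which is what makes them hard long-dependency benchmarks.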
Latest papers
UnICORNN: A recurrent model for learning very long time dependencies
The design of recurrent neural networks (RNNs) to accurately process sequential inputs with long-time dependencies is very challenging on account of the exploding and vanishing gradient problem.
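The exploding and vanishing gradient problem can be seen in a few lines: a gradient backpropagated through T recurrent steps is scaled by a product of T Jacobians, so its norm behaves roughly like rho**T, where rho is the spectral radius of the recurrent matrix. The sketch below uses a scaled orthogonal matrix so the effect is exact; the matrices are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_norm_after(T, scale):
    """Norm of a unit gradient pushed back through T linear recurrent steps."""
    # Orthogonal matrix times a scale: every step multiplies the norm by `scale`.
    W = scale * np.linalg.qr(rng.normal(size=(32, 32)))[0]
    g = np.ones(32) / np.sqrt(32)     # unit-norm gradient at the last step
    for _ in range(T):
        g = W.T @ g                   # one step of backpropagation through time
    return np.linalg.norm(g)

vanish = grad_norm_after(100, 0.9)    # spectral radius < 1: gradient vanishes
explode = grad_norm_after(100, 1.1)   # spectral radius > 1: gradient explodes
print(vanish, explode)
```

Architectures like UnICORNN, coRNN, and the LMU below are designed precisely to keep this product of Jacobians well-behaved over long horizons.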
Sequential Place Learning: Heuristic-Free High-Performance Long-Term Place Recognition
Sequential matching using hand-crafted heuristics has been standard practice in route-based place recognition for enhancing pairwise similarity results for nearly a decade.
Parallelizing Legendre Memory Unit Training
The LMU sets a new state-of-the-art result on psMNIST, and uses half the parameters while outperforming DistilBERT and LSTM models on IMDB sentiment analysis.
CKConv: Continuous Kernel Convolution For Sequential Data
Convolutional networks are unable to handle sequences of unknown size and their memory horizon must be defined a priori.
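The core idea, sketched below, is to parameterize the convolution kernel as a small network psi(t) over continuous relative positions instead of a fixed-size weight array, so a single parameterization can be sampled at as many positions as the input requires. This is a hedged illustration in the spirit of CKConv, not the paper's implementation; all weights are random.

```python
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(0, 1.0, (16, 1)), rng.normal(0, 1.0, 16)
W2 = rng.normal(0, 0.3, (1, 16))

def kernel(positions):
    """MLP psi: relative position in [0, 1] -> kernel weight."""
    h = np.tanh(W1 @ positions[None, :] + b1[:, None])
    return (W2 @ h).ravel()

def ck_conv(x):
    """Causal convolution of x with a kernel sampled at len(x) positions."""
    L = len(x)
    k = kernel(np.linspace(0.0, 1.0, L))    # one weight per relative offset
    return np.array([np.dot(k[: t + 1][::-1], x[: t + 1]) for t in range(L)])

short = ck_conv(rng.random(10))   # the same kernel network handles any length,
long = ck_conv(rng.random(50))    # so the memory horizon is not fixed a priori
print(short.shape, long.shape)
```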
DeepSeqSLAM: A Trainable CNN+RNN for Joint Global Description and Sequence-based Place Recognition
Sequence-based place recognition methods for all-weather navigation are well-known for producing state-of-the-art results under challenging day-night or summer-winter transitions.
Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies
Circuits of biological neurons, such as those in functional parts of the brain, can be modeled as networks of coupled oscillators.
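A coupled-oscillator recurrent update of this kind can be sketched as a discretized second-order ODE: each hidden unit has a position y and a velocity z, driven by the input through a bounded nonlinearity and damped by terms gamma and eps. The step below follows the general form described in the coRNN paper, but the sizes, weights, and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
H, D = 32, 8
W = rng.normal(0, 0.3, (H, H))    # position coupling
Wz = rng.normal(0, 0.3, (H, H))   # velocity coupling
V = rng.normal(0, 0.3, (H, D))    # input weights
b = np.zeros(H)
dt, gamma, eps = 0.05, 1.0, 1.0   # step size and damping (illustrative)

def cornn_run(inputs):
    """Integrate the oscillator network over an input sequence."""
    y = np.zeros(H)               # oscillator positions (hidden state)
    z = np.zeros(H)               # oscillator velocities
    for u in inputs:
        z = z + dt * (np.tanh(W @ y + Wz @ z + V @ u + b) - gamma * y - eps * z)
        y = y + dt * z
    return y

y_final = cornn_run(rng.random((20, D)))
print(y_final.shape)
```

Because the forcing term is bounded (tanh) and the dynamics are damped, the state stays well-behaved over long sequences, which is the source of the gradient stability in the title.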
HiPPO: Recurrent Memory with Optimal Polynomial Projections
A central problem in learning from sequential data is representing cumulative history in an incremental fashion as more data is processed.
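HiPPO's answer is to summarize the entire history seen so far as the coefficients of its best polynomial approximation, updated online as each new value arrives; the paper derives closed-form linear recurrences for this. The naive sketch below does the same job with running moments and a least-squares solve, to show the incremental-summary idea rather than HiPPO's actual operators; the time scale and degree are arbitrary choices.

```python
import numpy as np

class PolyMemory:
    """Incrementally maintained polynomial summary of an input history."""

    def __init__(self, degree=2):
        self.d = degree
        self.M = np.zeros((degree + 1, degree + 1))  # running sum of phi phi^T
        self.v = np.zeros(degree + 1)                # running sum of phi * x
        self.n = 0

    def update(self, x):
        """Incorporate one new observation into the running summary."""
        t = self.n / 100.0                           # illustrative time scale
        phi = np.array([t**k for k in range(self.d + 1)])
        self.M += np.outer(phi, phi)
        self.v += phi * x
        self.n += 1

    def coeffs(self):
        """Coefficients of the best-fit polynomial over the whole history."""
        return np.linalg.lstsq(self.M, self.v, rcond=None)[0]

mem = PolyMemory(degree=2)
ts = np.arange(50) / 100.0
for x in 1.0 + 2.0 * ts:          # feed a linear signal x(t) = 1 + 2t
    mem.update(x)
c = mem.coeffs()                  # recovers roughly [1, 2, 0]
print(c)
```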
Learning to Combine Top-Down and Bottom-Up Signals in Recurrent Neural Networks with Attention over Modules
To effectively utilize the wealth of potential top-down information available, and to prevent the cacophony of intermixed signals in a bidirectional architecture, mechanisms are needed to restrict information flow.
Lipschitz Recurrent Neural Networks
Viewing recurrent neural networks (RNNs) as continuous-time dynamical systems, we propose a recurrent unit that describes the hidden state's evolution with two parts: a well-understood linear component plus a Lipschitz nonlinearity.
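The two-part dynamics can be sketched with a forward-Euler step: the hidden state drifts under a linear map A plus a Lipschitz nonlinearity (here tanh). This is an illustration of the structure described above under assumed sizes and randomly drawn, mildly contractive matrices, not the paper's parameterization of A.

```python
import numpy as np

rng = np.random.default_rng(4)
H, D, dt = 32, 8, 0.1
A = -np.eye(H) + 0.1 * rng.normal(size=(H, H))   # stable-ish linear part
W = 0.1 * rng.normal(size=(H, H))
U = 0.1 * rng.normal(size=(H, D))
b = np.zeros(H)

def lipschitz_rnn(inputs):
    """Euler-integrate h' = A h + tanh(W h + U x + b) over the inputs."""
    h = np.zeros(H)
    for x in inputs:
        h = h + dt * (A @ h + np.tanh(W @ h + U @ x + b))
    return h

h_final = lipschitz_rnn(rng.random((25, D)))
print(h_final.shape)
```

Keeping the linear part well-understood (e.g. with eigenvalues in the stable half-plane) and the nonlinearity Lipschitz is what makes the unit's long-horizon behavior analyzable.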
Learning Long-Term Dependencies in Irregularly-Sampled Time Series
Recurrent models with continuous-time hidden states are a natural fit for irregularly-sampled data; these models, however, face difficulties when the input data possess long-term dependencies.