Sequential Image Classification

37 papers with code • 3 benchmarks • 3 datasets

Sequential image classification is the task of classifying an image that is presented to the model as a sequence (for example, one pixel at a time), as in benchmarks such as Sequential MNIST.

(Image credit: TensorFlow-101)
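
To make the setup concrete, the sketch below flattens an image into a pixel sequence and classifies it with a plain PyTorch GRU, as in the Sequential MNIST benchmark. The model, hyperparameters, and random stand-in data are illustrative assumptions, not taken from any paper listed here.

```python
import torch
import torch.nn as nn

class PixelGRUClassifier(nn.Module):
    """Reads an image one pixel at a time and classifies it from the final hidden state."""
    def __init__(self, hidden_size=128, num_classes=10):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, images):                 # images: (batch, 1, 28, 28)
        pixels = images.flatten(start_dim=1)   # (batch, 784)
        seq = pixels.unsqueeze(-1)             # (batch, 784, 1): one pixel per time step
        _, h_last = self.rnn(seq)              # h_last: (1, batch, hidden_size)
        return self.head(h_last.squeeze(0))    # (batch, num_classes) logits

model = PixelGRUClassifier()
dummy_batch = torch.rand(8, 1, 28, 28)         # stands in for a batch of MNIST images
logits = model(dummy_batch)                    # shape (8, 10)
```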

Traveling Waves Encode the Recent Past and Enhance Sequence Learning

akandykeller/wave_rnns 3 Sep 2023

Traveling waves of neural activity have been observed throughout the brain at a diversity of regions and scales; however, their precise computational role is still debated.

Sequence Modeling with Multiresolution Convolutional Memory

thjashin/multires-conv 2 May 2023

Popular approaches in this space trade off among the memory burden of brute-force enumeration and comparison, as in transformers; the computational burden of complicated sequential dependencies, as in recurrent neural networks; and the parameter burden of convolutional networks with many or large filters.

SMPConv: Self-moving Point Representations for Continuous Convolution

sangnekim/smpconv CVPR 2023

This paper suggests an alternative approach to building a continuous convolution without neural networks, resulting in improved performance at lower computational cost.
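
As a rough illustration of a continuous kernel built without a neural network, the sketch below interpolates a 1-D depthwise kernel from a small set of learnable points (positions, per-channel weights, and a shared radius), in contrast with MLP-based kernel generators. The class name, the triangular interpolation, and all hyperparameters are assumptions for illustration; SMPConv's actual formulation differs in its details.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PointKernelConv1d(nn.Module):
    """Depthwise 1-D convolution whose kernel is interpolated from a few learnable points."""
    def __init__(self, channels, n_points=8, kernel_size=33):
        super().__init__()
        self.register_buffer("grid", torch.linspace(-1.0, 1.0, kernel_size))  # sampling positions
        self.centers = nn.Parameter(torch.linspace(-1.0, 1.0, n_points))      # movable point positions
        self.weights = nn.Parameter(0.1 * torch.randn(channels, n_points))    # per-channel point weights
        self.log_radius = nn.Parameter(torch.zeros(1))                        # shared interpolation radius

    def forward(self, x):                                       # x: (batch, channels, length)
        radius = torch.exp(self.log_radius)
        # triangular (linear) interpolation weight from each point to each grid position
        interp = torch.clamp(1 - (self.grid[None, :] - self.centers[:, None]).abs() / radius, min=0)
        kernel = (self.weights @ interp).unsqueeze(1)           # (channels, 1, kernel_size)
        return F.conv1d(x, kernel, padding="same", groups=x.shape[1])

layer = PointKernelConv1d(channels=16)
out = layer(torch.randn(2, 16, 784))                            # e.g. a flattened-image sequence
```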

Resurrecting Recurrent Neural Networks for Long Sequences

bojone/rnn 11 Mar 2023

Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are hard to optimize and slow to train.
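
The paper revisits deep linear recurrences (the Linear Recurrent Unit). Below is a simplified sketch of a complex diagonal linear recurrence in that spirit, h_t = lambda * h_{t-1} + B u_t with readout y_t = Re(C h_t); the stable exponential parameterization, normalization, and parallel scan of the actual model are omitted, and the loop-based scan here is for clarity only.

```python
import math
import torch
import torch.nn as nn

class DiagonalLinearRecurrence(nn.Module):
    """h_t = lambda * h_{t-1} + B u_t (element-wise), read out as y_t = Re(C h_t)."""
    def __init__(self, d_input, d_state=64):
        super().__init__()
        # complex diagonal transition initialized just inside the unit circle for stability
        magnitude = 0.9 + 0.09 * torch.rand(d_state)
        phase = 2 * math.pi * torch.rand(d_state)
        self.lam = nn.Parameter(torch.polar(magnitude, phase))
        self.B = nn.Parameter(torch.randn(d_state, d_input, dtype=torch.cfloat) / d_input ** 0.5)
        self.C = nn.Parameter(torch.randn(d_input, d_state, dtype=torch.cfloat) / d_state ** 0.5)

    def forward(self, u):                                   # u: (batch, length, d_input), real-valued
        h = torch.zeros(u.shape[0], self.lam.shape[0], dtype=torch.cfloat, device=u.device)
        outputs = []
        for t in range(u.shape[1]):                         # plain loop; a parallel scan is used in practice
            h = self.lam * h + u[:, t].to(torch.cfloat) @ self.B.T
            outputs.append((h @ self.C.T).real)
        return torch.stack(outputs, dim=1)                  # (batch, length, d_input)

layer = DiagonalLinearRecurrence(d_input=1)                 # e.g. one pixel value per step
y = layer(torch.rand(4, 784, 1))
```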

Efficient recurrent architectures through activity sparsity and sparse back-propagation through time

khaleelkhan/evnn 13 Jun 2022

However, a gap remains between the efficiency and performance that RNNs can deliver and the requirements of real-world applications.

Efficiently Modeling Long Sequences with Structured State Spaces

hazyresearch/state-spaces ICLR 2022

A central goal of sequence modeling is designing a single principled model that can address sequence data across a range of modalities and tasks, particularly on long-range dependencies.

Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers

hazyresearch/state-spaces NeurIPS 2021

Recurrent neural networks (RNNs), temporal convolutions, and neural differential equations (NDEs) are popular families of deep learning models for time-series data, each with unique strengths and tradeoffs in modeling power and computational efficiency.
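
The common thread in this line of work is the linear state-space model, which can be stepped through time like an RNN or unrolled into a convolution like a CNN. The toy example below checks that the two views agree for random placeholder matrices; the actual layers differ in how A, B, and C are structured, initialized, and discretized, none of which is shown here.

```python
import torch

torch.manual_seed(0)
N, L = 4, 16                                             # state size, sequence length
A = 0.9 * torch.eye(N) + 0.05 * torch.randn(N, N)        # placeholder system matrices
B = torch.randn(N, 1)
C = torch.randn(1, N)
u = torch.randn(L)                                       # scalar input sequence

# Recurrent view: step the state x_k = A x_{k-1} + B u_k through time.
x = torch.zeros(N, 1)
y_recurrent = []
for k in range(L):
    x = A @ x + B * u[k]
    y_recurrent.append((C @ x).item())

# Convolutional view: unroll the same system into the kernel K_j = C A^j B
# and apply it as a causal convolution.
K = torch.stack([(C @ torch.linalg.matrix_power(A, j) @ B).squeeze() for j in range(L)])
y_convolutional = [sum(K[j] * u[k - j] for j in range(k + 1)).item() for k in range(L)]

print(torch.allclose(torch.tensor(y_recurrent), torch.tensor(y_convolutional), atol=1e-3))  # expected: True
```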

FlexConv: Continuous Kernel Convolutions with Differentiable Kernel Sizes

rjbruin/flexconv ICLR 2022

In this work, we propose FlexConv, a novel convolutional operation with which high-bandwidth convolutional kernels of learnable kernel size can be learned at a fixed parameter cost.
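
A rough sketch of that idea: kernel values are produced by a small network evaluated at continuous relative positions, and a Gaussian mask with a learnable width controls the effective kernel size, so the parameter count is independent of that size. The kernel network, mask, and hyperparameters below are illustrative assumptions; FlexConv's actual kernel generator (MAGNet) and its masking and anti-aliasing details are more involved.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContinuousKernelConv1d(nn.Module):
    """Depthwise conv whose kernel comes from a small network over continuous positions."""
    def __init__(self, channels, max_kernel_size=33, hidden=32):
        super().__init__()
        positions = torch.linspace(-1.0, 1.0, max_kernel_size).unsqueeze(-1)
        self.register_buffer("positions", positions)            # relative positions in [-1, 1]
        self.kernel_net = nn.Sequential(                        # position -> per-channel kernel value
            nn.Linear(1, hidden), nn.GELU(), nn.Linear(hidden, channels))
        self.log_sigma = nn.Parameter(torch.zeros(1))           # learnable Gaussian mask width

    def forward(self, x):                                       # x: (batch, channels, length)
        kernel = self.kernel_net(self.positions)                # (max_kernel_size, channels)
        mask = torch.exp(-self.positions ** 2 / (2 * torch.exp(self.log_sigma) ** 2))
        kernel = (kernel * mask).T.contiguous().unsqueeze(1)    # (channels, 1, max_kernel_size)
        return F.conv1d(x, kernel, padding="same", groups=x.shape[1])  # depthwise convolution

layer = ContinuousKernelConv1d(channels=16)
out = layer(torch.randn(2, 16, 784))                            # same parameter count for any kernel size
```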

Long Expressive Memory for Sequence Modeling

tk-rusch/lem ICLR 2022

We propose a novel method called Long Expressive Memory (LEM) for learning long-term sequential dependencies.

RNNs of RNNs: Recursive Construction of Stable Assemblies of Recurrent Neural Networks

ennisthemennis/sparse-combo-net 16 Jun 2021

Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of local neural activity.
