Sequential Image Classification
37 papers with code • 3 benchmarks • 3 datasets
Sequential image classification is the task of classifying a sequence of images; in common benchmarks such as sequential MNIST, each image is unrolled into a sequence of pixels and classified one element at a time.
(Image credit: TensorFlow-101)
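The sequential view of an image can be sketched with a toy example: a 28×28 image is unrolled into 784 scalar pixels and fed one step at a time to an untrained vanilla RNN, which emits a class prediction after the last pixel. This is a minimal illustration only; all names, shapes, and the random weights are assumptions for the sketch, not taken from any of the papers below.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_classify(image, W_xh, W_hh, W_hy, b_h, b_y):
    """Classify one image by feeding its pixels to a vanilla RNN, one per step."""
    h = np.zeros(W_hh.shape[0])
    for x in image.reshape(-1):              # unroll the 2-D image into a pixel sequence
        h = np.tanh(W_xh * x + W_hh @ h + b_h)  # input is a scalar pixel, so W_xh has shape (H,)
    logits = W_hy @ h + b_y                  # read out class scores from the final hidden state
    return int(np.argmax(logits))

H, C = 32, 10                                # hidden size and number of classes (illustrative)
W_xh = rng.normal(0, 0.1, H)
W_hh = rng.normal(0, 0.1, (H, H))
W_hy = rng.normal(0, 0.1, (C, H))
b_h = np.zeros(H)
b_y = np.zeros(C)

image = rng.random((28, 28))                 # stand-in for one grayscale digit
pred = rnn_classify(image, W_xh, W_hh, W_hy, b_h, b_y)
```

The 784-step unroll is exactly what makes this task a long-range-dependency stress test, which is why the papers below revisit RNN, convolutional, and state-space designs for it.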
Latest papers
Traveling Waves Encode the Recent Past and Enhance Sequence Learning
Traveling waves of neural activity have been observed throughout the brain at a diversity of regions and scales; however, their precise computational role is still debated.
Sequence Modeling with Multiresolution Convolutional Memory
Popular approaches in this space trade off among the memory burden of brute-force enumeration and comparison, as in transformers; the computational burden of complicated sequential dependencies, as in recurrent neural networks; and the parameter burden of convolutional networks with many or large filters.
SMPConv: Self-moving Point Representations for Continuous Convolution
This paper suggests an alternative approach to building a continuous convolution without neural networks, resulting in greater computational efficiency and improved performance.
Resurrecting Recurrent Neural Networks for Long Sequences
Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are hard to optimize and slow to train.
Efficient recurrent architectures through activity sparsity and sparse back-propagation through time
However, a gap remains between the efficiency and performance RNNs can achieve and the requirements of real-world applications.
Efficiently Modeling Long Sequences with Structured State Spaces
A central goal of sequence modeling is designing a single principled model that can address sequence data across a range of modalities and tasks, particularly on long-range dependencies.
Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers
Recurrent neural networks (RNNs), temporal convolutions, and neural differential equations (NDEs) are popular families of deep learning models for time-series data, each with unique strengths and tradeoffs in modeling power and computational efficiency.
FlexConv: Continuous Kernel Convolutions with Differentiable Kernel Sizes
In this work, we propose FlexConv, a novel convolutional operation with which high bandwidth convolutional kernels of learnable kernel size can be learned at a fixed parameter cost.
Long Expressive Memory for Sequence Modeling
We propose a novel method called Long Expressive Memory (LEM) for learning long-term sequential dependencies.
RNNs of RNNs: Recursive Construction of Stable Assemblies of Recurrent Neural Networks
Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of local neural activity.