Search Results for author: Chuhan Zhang

Found 6 papers, 1 paper with code

NiSNN-A: Non-iterative Spiking Neural Networks with Attention with Application to Motor Imagery EEG Classification

no code implementations · 9 Dec 2023 · Chuhan Zhang, Wei Pan, Cosimo Della Santina

Motor imagery, an important category of electroencephalogram (EEG) research, often intersects with scenarios that demand low energy consumption, such as portable medical devices and operation in isolated environments.

EEG · Motor Imagery

Helping Hands: An Object-Aware Ego-Centric Video Recognition Model

1 code implementation · ICCV 2023 · Chuhan Zhang, Ankush Gupta, Andrew Zisserman

We demonstrate the performance of the object-aware representations learnt by our model by: (i) evaluating it for strong transfer, i.e. through zero-shot testing, on a number of downstream video-text retrieval and classification benchmarks; and (ii) using the learned representations as input for long-term video understanding tasks (e.g. Episodic Memory in Ego4D).

Object · Text Retrieval +3

Is an Object-Centric Video Representation Beneficial for Transfer?

no code implementations · 20 Jul 2022 · Chuhan Zhang, Ankush Gupta, Andrew Zisserman

The model learns a set of object-centric summary vectors for the video, and uses these vectors to fuse the visual and spatio-temporal trajectory 'modalities' of the video clip.
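To make the mechanism concrete, here is a minimal sketch (not the paper's released code) of how a fixed set of learned summary vectors can cross-attend over concatenated visual and trajectory tokens; the module name, dimensions, and token counts are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ObjectSummaryFusion(nn.Module):
    """Hypothetical sketch: learned object-centric summary vectors that
    fuse visual and trajectory tokens via cross-attention."""

    def __init__(self, dim=256, num_summaries=8, num_heads=4):
        super().__init__()
        # Learned object-centric summary vectors act as attention queries
        self.summaries = nn.Parameter(torch.randn(num_summaries, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, visual_tokens, trajectory_tokens):
        # visual_tokens: (B, Nv, dim); trajectory_tokens: (B, Nt, dim)
        # Concatenate both 'modalities' into one token sequence to attend over
        tokens = torch.cat([visual_tokens, trajectory_tokens], dim=1)
        queries = self.summaries.unsqueeze(0).expand(tokens.size(0), -1, -1)
        fused, _ = self.attn(queries, tokens, tokens)
        return fused  # (B, num_summaries, dim)

# Usage with made-up token counts
model = ObjectSummaryFusion()
out = model(torch.randn(2, 50, 256), torch.randn(2, 10, 256))
print(out.shape)  # torch.Size([2, 8, 256])
```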

Action Classification · Object +1

Temporal Query Networks for Fine-grained Video Understanding

no code implementations · CVPR 2021 · Chuhan Zhang, Ankush Gupta, Andrew Zisserman

It attends to relevant segments for each query with a temporal attention mechanism, and can be trained using only the labels for each query.
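A rough sketch of such a query-based temporal attention head is below (assumptions: PyTorch, invented dimensions and class counts; `TemporalQueryHead` is a hypothetical name, not the paper's implementation). Each learned query attends over per-segment features, and training needs only one label per query.

```python
import torch
import torch.nn as nn

class TemporalQueryHead(nn.Module):
    """Hypothetical sketch: learned query vectors attend to relevant
    temporal segments and are supervised with per-query labels."""

    def __init__(self, dim=256, num_queries=4, num_heads=4, num_classes=10):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(num_queries, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(dim, num_classes)

    def forward(self, segment_feats):
        # segment_feats: (B, T, dim) per-segment video features
        q = self.queries.unsqueeze(0).expand(segment_feats.size(0), -1, -1)
        # Each query attends to the segments relevant to it
        attended, _ = self.attn(q, segment_feats, segment_feats)
        return self.classifier(attended)  # (B, num_queries, num_classes)

# Training signal: one label per query, e.g. per-query cross-entropy
head = TemporalQueryHead()
logits = head(torch.randn(2, 32, 256))
labels = torch.randint(0, 10, (2, 4))
loss = nn.functional.cross_entropy(logits.flatten(0, 1), labels.flatten())
```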

Action Classification · Action Recognition +1

Adaptive Text Recognition through Visual Matching

no code implementations · ECCV 2020 · Chuhan Zhang, Ankush Gupta, Andrew Zisserman

In this work, our objective is to address the problems of generalization and flexibility for text recognition in documents.

Representation Learning
