Self-attention for raw optical Satellite Time Series Classification

23 Oct 2019 · Marc Rußwurm, Marco Körner

The amount of available Earth observation data has increased dramatically in recent years. Efficiently making use of this entire body of information is a current challenge in remote sensing and demands light-weight, problem-agnostic models that do not require region- or problem-specific expert knowledge. End-to-end trained deep learning models can make use of raw sensory data by learning feature extraction and classification in one step, solely from data. Still, many methods proposed in remote sensing research require implicit feature extraction through data preprocessing or explicit design of features. In this work, we compare recent deep learning models on crop type classification using raw and preprocessed Sentinel 2 data. We concentrate on the common neural network architectures for time series, i.e., 1D-convolutions and recurrence, include a shallow random forest baseline, and focus on the novel self-attention architecture. Our central findings are that data preprocessing still increased the overall classification performance for all models, while the choice of model was less crucial. Self-attention and recurrent neural networks, by their architecture, outperformed convolutional neural networks on raw satellite time series. We explore this with a feature importance analysis based on gradient back-propagation that exploits the differentiable nature of deep learning models. Further, we qualitatively show how self-attention scores focus selectively on few classification-relevant observations.
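The two analysis tools the abstract mentions, self-attention scores over the observations of a time series and gradient back-propagation feature importance, can be sketched together in a toy PyTorch model. This is a minimal illustration under assumed dimensions (13 Sentinel 2 bands, 45 observations, a single attention layer), not the authors' actual architecture; all names and sizes are hypothetical.

```python
import torch
import torch.nn as nn

class AttentionClassifier(nn.Module):
    """Toy self-attention classifier over a multispectral time series."""
    def __init__(self, n_bands=13, d_model=64, n_classes=10):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)   # per-observation embedding
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                          # x: (batch, T, n_bands)
        h = self.embed(x)
        h, weights = self.attn(h, h, h)            # self-attention across time steps
        logits = self.head(h.mean(dim=1))          # average-pool over time
        return logits, weights                     # weights: (batch, T, T)

model = AttentionClassifier()
x = torch.randn(2, 45, 13, requires_grad=True)     # 45 observations, 13 bands

logits, weights = model(x)

# Gradient back-propagation feature importance: back-propagate the top
# class score to the raw inputs and read off per-observation saliency.
logits.max(dim=1).values.sum().backward()
saliency = x.grad.abs()                            # (batch, T, n_bands)
per_step = saliency.sum(dim=2)                     # importance per observation
```

The attention weights give a per-observation relevance view directly from the model, while the input gradients work for any differentiable model, which is what allows the paper's cross-architecture comparison.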



