While LSTMs show increasingly promising results for forecasting Financial
Time Series (FTS), this paper seeks to assess whether attention mechanisms can
further improve performance. The hypothesis is that attention can help mitigate
the difficulty LSTM models have in capturing long-term dependencies.
To test this hypothesis, the
main contribution of this paper is an implementation of an LSTM with
attention. The benchmark LSTM and the LSTM with attention were compared,
and both achieved reasonable performance of up to 60% on five stocks from
Kaggle's Two Sigma dataset. This comparative analysis demonstrates that an LSTM
with attention can indeed outperform a standalone LSTM, but further investigation
is required, as issues do arise with such model architectures.
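As a concrete illustration of the kind of architecture compared here, the following is a minimal sketch of an LSTM whose hidden states are pooled by an additive attention layer before a final prediction head. This is not the paper's exact implementation; the PyTorch framework, layer sizes, and single-output prediction head are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's implementation) of an LSTM with an
# additive attention layer pooling the hidden states over time.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Additive attention: score each time step's hidden state.
        self.attn_score = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.Tanh(),
            nn.Linear(hidden_size, 1),
        )
        self.head = nn.Linear(hidden_size, 1)  # e.g. a next-step forecast

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features)
        outputs, _ = self.lstm(x)                 # (batch, time, hidden)
        scores = self.attn_score(outputs)         # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)    # attention weights over time
        context = (weights * outputs).sum(dim=1)  # weighted sum of hidden states
        return self.head(context)                 # (batch, 1)

# Example usage with assumed dimensions: 5 features over a 30-step window.
model = AttentionLSTM(n_features=5)
prediction = model(torch.randn(8, 30, 5))  # shape (8, 1)
```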