4 Apr 2022 • Carmina Fjellström, Kaj Nyström
Stochastic gradient descent (SGD) is widely used in deep learning due to its computational efficiency, but a complete understanding of why SGD performs so well remains a major challenge.
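To make the object of study concrete, here is a minimal sketch of vanilla SGD on a simple least-squares problem. The data, model, learning rate, and loss are illustrative assumptions chosen for self-containment, not the setup analyzed in the paper.

```python
import numpy as np

# Synthetic linear-regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

# Vanilla SGD: one randomly ordered sample per gradient step.
w = np.zeros(3)
lr = 0.05  # assumed step size
for epoch in range(50):
    for i in rng.permutation(len(y)):
        grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5 * (x·w - y)^2
        w -= lr * grad

print(np.round(w, 2))
```

The noisy, sample-by-sample updates are exactly what makes SGD cheap per step yet hard to analyze in full generality.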
20 Jan 2022 • Carmina Fjellström
Using a straightforward trading strategy, the LSTM-ensemble portfolio is compared with a randomly chosen portfolio and with a portfolio containing all the stocks in the index; the LSTM-ensemble portfolio delivers better average daily returns and higher cumulative returns over time.
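The comparison above rests on two standard metrics: average daily return and compounded cumulative return. Here is a hedged sketch of how such a comparison is computed; the return series are synthetic placeholders, not the paper's data, and the portfolio names are hypothetical.

```python
import numpy as np

def cumulative_return(daily):
    """Compound a series of daily returns into one cumulative return."""
    return np.prod(1.0 + np.asarray(daily)) - 1.0

# Hypothetical daily-return series for one trading year (placeholders).
rng = np.random.default_rng(1)
lstm_daily = rng.normal(0.0008, 0.01, size=250)   # assumed LSTM-ensemble portfolio
index_daily = rng.normal(0.0003, 0.01, size=250)  # assumed full-index portfolio

print(f"LSTM ensemble: avg daily {lstm_daily.mean():.4%}, "
      f"cumulative {cumulative_return(lstm_daily):.2%}")
print(f"Full index:    avg daily {index_daily.mean():.4%}, "
      f"cumulative {cumulative_return(index_daily):.2%}")
```

Compounding (the product of `1 + r_t`) rather than summing is what makes the cumulative figure sensitive to the ordering-free geometric growth of the portfolio.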