Use of recurrent infomax to improve the memory capability of input-driven recurrent neural networks

14 Feb 2018  ·  Hisashi Iwade, Kohei Nakajima, Takuma Tanaka, Toshio Aoyagi

The inherent transient dynamics of recurrent neural networks (RNNs) have been exploited as a computational resource in input-driven RNNs. However, the information processing capability varies from network to network, depending on its properties. Many authors have investigated the dynamics of RNNs and their relevance to information processing capability. In this study, we present a detailed analysis of the information processing capability of an RNN optimized by recurrent infomax (RI), an unsupervised learning scheme that maximizes the mutual information of an RNN by adjusting its connection strengths. We observe that a delay-line structure emerges from RI and that the optimized network possesses superior short-term memory, that is, the ability to store the temporal information of the input stream in its transient dynamics.
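As context for the short-term memory property discussed above, the sketch below shows how memory capacity is commonly quantified for an input-driven RNN: linear readouts are trained to reconstruct delayed copies of the input, and the squared correlations are summed over delays. This is not the paper's RI optimization, only an illustration of the evaluation measure; all network sizes and parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: estimate the short-term memory capacity of a random
# input-driven RNN (echo-state-network style). This is NOT the recurrent
# infomax (RI) learning rule from the paper; it only illustrates the
# memory-capacity measure used to evaluate such networks.
# All sizes and scales below are illustrative assumptions.

rng = np.random.default_rng(0)

N = 100            # number of recurrent units (assumption)
T = 5000           # length of the input stream (assumption)
washout = 500      # initial transient discarded before fitting
max_delay = 40     # longest input delay probed

# Random recurrent and input weights; spectral radius scaled below 1.
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, N)

# Drive the network with an i.i.d. uniform input stream and record states.
u = rng.uniform(-1, 1, T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# For each delay k, fit a linear readout that reconstructs u(t - k)
# from the current state and accumulate the squared correlation
# between the target and its reconstruction.
capacity = 0.0
for k in range(1, max_delay + 1):
    X = states[washout:, :]
    y = u[washout - k:T - k]
    w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ w_out
    r = np.corrcoef(y, y_hat)[0, 1]
    capacity += r ** 2

print(f"Estimated short-term memory capacity: {capacity:.2f}")
```

A network with better short-term memory yields a larger capacity value; the paper's claim is that RI-optimized networks score higher on this kind of measure than unoptimized ones.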
