no code implementations • 20 Feb 2023 • Marc W. Howard, Zahra G. Esfahani, Bao Le, Per B. Sederberg
Spiking across populations of neurons in many regions of the mammalian brain maintains a robust temporal memory, a neural timeline of the recent past.
no code implementations • 5 Jan 2022 • Marc W. Howard
This chapter traces this line of thought from statistical learning theory in the 1950s, through distributed memory models in the late 20th and early 21st centuries, to modern models based on a scale-invariant temporal history.
1 code implementation • 9 Jul 2021 • Brandon G. Jacques, Zoran Tiganj, Aakash Sarkar, Marc W. Howard, Per B. Sederberg
This property, inspired by findings from contemporary neuroscience and consistent with findings from cognitive psychology, may enable networks that learn with fewer training examples and fewer weights, and that generalize more robustly to out-of-sample data.
1 code implementation • NeurIPS 2021 • Brandon Jacques, Zoran Tiganj, Marc W. Howard, Per B. Sederberg
SITH modules respond to their inputs with a geometrically-spaced set of time constants, enabling the DeepSITH network to learn problems along a continuum of time-scales.
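A geometrically-spaced set of time constants can be sketched with a bank of leaky integrators. This is a minimal illustrative sketch, not the DeepSITH implementation; the function names and the choice of a simple Euler-stepped leaky integrator are assumptions for illustration.

```python
import numpy as np

def geometric_time_constants(tau_min, tau_max, n_taus):
    """Return n_taus time constants spaced geometrically in [tau_min, tau_max]."""
    return tau_min * (tau_max / tau_min) ** (np.arange(n_taus) / (n_taus - 1))

def leaky_integrator_bank(signal, taus, dt=1.0):
    """Filter a 1-D signal with one leaky integrator per time constant.
    Euler step of dF/dt = -F/tau + f(t); slower taus retain input longer."""
    F = np.zeros(len(taus))
    history = []
    for f_t in signal:
        F = F + dt * (-F / taus + f_t)
        history.append(F.copy())
    return np.array(history)  # shape: (time, n_taus)

# Impulse input: units with larger tau decay more slowly,
# so the bank spans a continuum of time-scales.
taus = geometric_time_constants(1.0, 100.0, 8)
response = leaky_integrator_bank(np.r_[1.0, np.zeros(49)], taus)
```

Geometric spacing means each successive time constant is a fixed multiple of the previous one, so a fixed number of units covers several orders of magnitude of time-scales.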
no code implementations • 26 Jan 2021 • Wei Zhong Goh, Varun Ursekar, Marc W. Howard
In recent years it has become clear that the brain maintains a temporal memory of recent events stretching far into the past.
no code implementations • 18 Feb 2018 • Zoran Tiganj, Samuel J. Gershman, Per B. Sederberg, Marc W. Howard
Widely used reinforcement learning algorithms discretize continuous time and estimate either transition functions from one step to the next (model-based algorithms) or a scalar value of exponentially-discounted future reward using the Bellman equation (model-free algorithms).
no code implementations • 19 Dec 2017 • Tyler A. Spears, Brandon G. Jacques, Marc W. Howard, Per B. Sederberg
In both the human brain and any general artificial intelligence (AI), a representation of the past is necessary to predict the future.
no code implementations • 22 Nov 2012 • Karthik H. Shankar, Marc W. Howard
If the signal has a characteristic timescale relevant to future prediction, the memory can be a simple shift register---a moving window extending into the past, requiring storage resources that grow linearly with the timescale to be represented.
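The shift-register memory described above can be sketched as a fixed-length moving window; the class name and interface here are hypothetical, chosen only to make the linear storage cost explicit.

```python
from collections import deque

class ShiftRegister:
    """Moving window over the last n samples: O(n) storage, so the
    resource cost grows linearly with the timescale represented."""

    def __init__(self, n):
        self.buf = deque([0.0] * n, maxlen=n)  # n slots, oldest falls off

    def push(self, x):
        self.buf.appendleft(x)  # newest sample at index 0

    def window(self):
        return list(self.buf)   # past n samples, most recent first

reg = ShiftRegister(3)
for x in [1.0, 2.0, 3.0, 4.0]:
    reg.push(x)
print(reg.window())  # [4.0, 3.0, 2.0]
```

Doubling the timescale the window must cover doubles the buffer length, which is the linear scaling the abstract contrasts with more resource-efficient, scale-invariant representations.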