no code implementations • 24 Jan 2024 • Matteo Alleman, Jack W Lindsey, Stefano Fusi
By studying the learning dynamics of networks with one hidden layer, we discovered that the network's activation function has an unexpectedly strong impact on the representational geometry: Tanh networks tend to learn representations that reflect the structure of the target outputs, while ReLU networks retain more information about the structure of the raw inputs.
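The comparison described above can be sketched numerically. The following is a minimal NumPy illustration, not the paper's code: the task, network size, and training details are all invented here for demonstration. It trains a one-hidden-layer network with either tanh or ReLU activations and measures how well the hidden-layer Gram matrix aligns with the geometry of the raw inputs versus that of the targets.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(X, W1, W2, activation):
    """One-hidden-layer forward pass; returns hidden activity and output."""
    pre = X @ W1
    h = np.tanh(pre) if activation == "tanh" else np.maximum(pre, 0.0)
    return h, h @ W2

def train_net(X, Y, activation, hidden=64, lr=0.05, steps=500):
    """Full-batch gradient descent on squared error (illustrative settings)."""
    n = len(X)
    W1 = rng.normal(0, 1 / np.sqrt(X.shape[1]), (X.shape[1], hidden))
    W2 = rng.normal(0, 1 / np.sqrt(hidden), (hidden, Y.shape[1]))
    for _ in range(steps):
        h, out = forward(X, W1, W2, activation)
        err = out - Y
        dh = 1.0 - h**2 if activation == "tanh" else (h > 0).astype(float)
        gW2 = (h.T @ err) / n
        gW1 = (X.T @ ((err @ W2.T) * dh)) / n
        W2 -= lr * gW2
        W1 -= lr * gW1
    h, out = forward(X, W1, W2, activation)
    return h, float(np.mean((out - Y) ** 2))

def kernel_alignment(A, B):
    """Correlation between off-diagonal entries of the two Gram matrices."""
    Ka, Kb = A @ A.T, B @ B.T
    mask = ~np.eye(len(Ka), dtype=bool)
    return float(np.corrcoef(Ka[mask], Kb[mask])[0, 1])

# Toy task: targets are an XOR-like function of two input coordinates,
# so the target geometry differs from the raw-input geometry.
X = rng.normal(size=(40, 10))
Y = np.sign(X[:, :1]) * np.sign(X[:, 1:2])

results = {}
for act in ("tanh", "relu"):
    h, mse = train_net(X, Y, act)
    results[act] = (kernel_alignment(h, X), kernel_alignment(h, Y), mse)
    print(act, results[act])
```

Comparing `kernel_alignment(h, X)` with `kernel_alignment(h, Y)` for each activation gives one crude probe of whether the hidden representation tracks input structure or target structure; the exact numbers depend heavily on the toy task and hyperparameters chosen here.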
no code implementations • 31 Aug 2023 • Srdjan Ostojic, Stefano Fusi
One major challenge in neuroscience is finding interesting structure in seemingly disorganized neural activity.
no code implementations • 17 Aug 2021 • Stefano Fusi
The memory capacity depends on the complexity of the synapses, the sparseness of the representations, the spatial and temporal correlations between memories and the specific way memories are retrieved.
no code implementations • 1 Jul 2015 • Daniel Martí, Mattia Rigotti, Mingoo Seok, Stefano Fusi
We also show that the energy consumption of the IBM chip is typically 2 or more orders of magnitude lower than that of conventional digital machines when implementing classifiers with comparable performance.