no code implementations • 21 Dec 2022 • Juergen Schmidhuber
Machine learning is the science of credit assignment: finding patterns in observations that predict the consequences of actions and help to improve future performance.
no code implementations • 12 May 2020 • Juergen Schmidhuber
In 2020-2021, we celebrated that many of the basic ideas behind the deep learning revolution were published three decades ago within fewer than 12 months in our "Annus Mirabilis" or "Miraculous Year" 1990-1991 at TU Munich.
3 code implementations • 5 Dec 2019 • Juergen Schmidhuber
UDRL generalizes to achieve high rewards or other goals, through input commands such as: get lots of reward within at most so much time!
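For illustration, a minimal sketch of the core trick, assuming a toy setting with vector observations and discrete actions; the command encoding and hindsight relabeling below are a simplification for exposition, not the paper's exact architecture:

```python
# Hypothetical sketch of Upside-Down RL's central idea: turn past episodes into
# supervised training data mapping (observation, command) -> action, where the
# command means "achieve this much return within this many steps".
import numpy as np

def episodes_to_training_data(episodes):
    """episodes: list of trajectories, each a list of (obs, action, reward)."""
    inputs, targets = [], []
    for ep in episodes:
        rewards = np.array([r for (_, _, r) in ep])
        for t, (obs, action, _) in enumerate(ep):
            desired_return = rewards[t:].sum()   # return actually obtained from t on ...
            desired_horizon = len(ep) - t        # ... within this many remaining steps
            command = np.array([desired_return, desired_horizon], dtype=float)
            inputs.append(np.concatenate([obs, command]))
            targets.append(action)
    return np.stack(inputs), np.array(targets)

# Any supervised classifier can then imitate "the action taken when this command
# was actually fulfilled"; at test time one feeds the command "get lots of reward
# within at most so much time" and lets the trained mapping pick the actions.
```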
no code implementations • 11 Jun 2019 • Juergen Schmidhuber
I review unsupervised or self-supervised neural networks playing minimax games in game-theoretic settings: (i) Artificial Curiosity (AC, 1990) is based on two such networks.
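As a rough sketch of the 1990-style setup, assuming a toy linear world model and low-dimensional vector observations (both placeholders): one network predicts the next observation, the other is paid the remaining prediction error as intrinsic reward, so the two effectively play a minimax game:

```python
# Toy numpy sketch of artificial curiosity via two adversarial networks: the
# predictor (world model) minimizes its prediction error, while the controller
# is rewarded for visiting inputs the predictor still gets wrong.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1          # linear predictor of the next observation

def intrinsic_reward(obs, next_obs, lr=1e-2):
    global W
    pred = W @ obs                          # predictor's guess (actions ignored for brevity)
    err = next_obs - pred
    W += lr * np.outer(err, obs)            # predictor side: gradient step reducing its error
    return float(err @ err)                 # controller side: squared error paid as reward
```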
no code implementations • ICLR 2019 • Robert Csordas, Juergen Schmidhuber
The Differentiable Neural Computer (DNC) can learn algorithmic and question answering tasks.
no code implementations • 30 Apr 2018 • Michael Wand, Ngoc Thang Vu, Juergen Schmidhuber
Audiovisual speech recognition (AVSR) is a method to alleviate the adverse effect of noise in the acoustic signal.
no code implementations • 24 Feb 2018 • Juergen Schmidhuber
Then ONE is retrained in PowerPlay style (2011) on stored input/output traces of (a) ONE's copy executing the new skill and (b) previous instances of ONE whose skills are still considered worth memorizing.
1 code implementation • ICLR 2019 • Paulo Rauber, Avinash Ummadisingu, Filipe Mutz, Juergen Schmidhuber
A reinforcement learning agent that needs to pursue different goals across episodes requires a goal-conditional policy.
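A minimal sketch of what "goal-conditional" means in practice, assuming a small feed-forward policy over discrete actions; the shapes and the softmax head are illustrative assumptions, not the paper's model:

```python
# The goal is simply concatenated to the state before the policy network sees
# it, so a single network can pursue different goals across episodes.
import numpy as np

def goal_conditional_policy(params, state, goal):
    x = np.concatenate([state, goal])              # condition the policy on the goal
    h = np.tanh(params["W1"] @ x + params["b1"])
    logits = params["W2"] @ h + params["b2"]
    p = np.exp(logits - logits.max())
    return p / p.sum()                             # distribution over discrete actions

# Example with a 4-dim state, 2-dim goal and 3 actions:
rng = np.random.default_rng(0)
params = {"W1": rng.normal(size=(16, 6)), "b1": np.zeros(16),
          "W2": rng.normal(size=(3, 16)), "b2": np.zeros(3)}
probs = goal_conditional_policy(params, np.zeros(4), np.array([1.0, -1.0]))
```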
no code implementations • 4 Aug 2017 • Michael Wand, Juergen Schmidhuber
We present a Lipreading system, i.e. a speech recognition system using only visual features, which uses domain-adversarial training for speaker independence.
no code implementations • 30 Nov 2015 • Juergen Schmidhuber
The basic ideas of this report can be applied to many other cases where one RNN-like system exploits the algorithmic information content of another.
no code implementations • NeurIPS 2015 • Marijn F. Stollenga, Wonmin Byeon, Marcus Liwicki, Juergen Schmidhuber
In contrast, Multi-Dimensional Recurrent NNs (MD-RNNs) can perceive the entire spatio-temporal context of each pixel in a few sweeps through all pixels, especially when the RNN is a Long Short-Term Memory (LSTM).
no code implementations • NeurIPS 2014 • Marijn Stollenga, Jonathan Masci, Faustino Gomez, Juergen Schmidhuber
It harnesses the power of sequential processing to improve classification performance, by allowing the network to iteratively focus its internal attention on some of its convolutional filters.
Ranked #183 on Image Classification on CIFAR-10
1 code implementation • 30 Apr 2014 • Juergen Schmidhuber
In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning.
no code implementations • 2 May 2013 • Somayeh Danafar, Paola M. V. Rancoita, Tobias Glasmachers, Kevin Whittingstall, Juergen Schmidhuber
Do two data samples come from different distributions?
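This line of work builds on kernel two-sample statistics; below is a hedged sketch of the plain (unregularized) MMD² estimate with a Gaussian kernel, where the bandwidth choice and the biased estimator are simplifications rather than the paper's regularized method:

```python
# Kernel two-sample statistic: large MMD^2 suggests X and Y come from
# different distributions; in practice the threshold is calibrated, e.g. by
# permuting the pooled data.
import numpy as np

def mmd2(X, Y, bandwidth=1.0):
    def k(A, B):                                   # Gaussian kernel matrix
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

X = np.random.default_rng(0).normal(0.0, 1.0, (50, 2))
Y = np.random.default_rng(1).normal(0.5, 1.0, (50, 2))
print(mmd2(X, Y))
```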
1 code implementation • 13 Feb 2012 • Dan Cireşan, Ueli Meier, Juergen Schmidhuber
Traditional methods of computer vision and machine learning cannot match human performance on tasks such as the recognition of handwritten digits or traffic signs.
Ranked #7 on Image Classification on MNIST
1 code implementation • 1 Mar 2010 • Dan Claudiu Ciresan, Ueli Meier, Luca Maria Gambardella, Juergen Schmidhuber
Good old on-line back-propagation for plain multi-layer perceptrons yields a very low 0.35% error rate on the famous MNIST handwritten digits benchmark.
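A toy version of that recipe, assuming a single hidden layer and omitting the paper's deep architectures, GPU training, and elastic image deformations:

```python
# Plain on-line back-propagation for a fully connected net on flattened
# 28x28 digit images; layer sizes and learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(0)
sizes = [784, 800, 10]                                    # one hidden layer for brevity
W = [rng.normal(0, 0.01, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(m) for m in sizes[1:]]

def train_step(x, y_onehot, lr=0.01):
    """x: flattened image (784,), y_onehot: label as one-hot vector (10,)."""
    # forward pass
    h = np.tanh(W[0] @ x + b[0])
    logits = W[1] @ h + b[1]
    p = np.exp(logits - logits.max())
    p /= p.sum()
    # backward pass for softmax + cross-entropy loss
    d_logits = p - y_onehot
    d_h = (W[1].T @ d_logits) * (1 - h ** 2)
    W[1] -= lr * np.outer(d_logits, h); b[1] -= lr * d_logits
    W[0] -= lr * np.outer(d_h, x);      b[0] -= lr * d_h
```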
2 code implementations • 23 Dec 2008 • Juergen Schmidhuber
I argue that data becomes temporarily interesting by itself to some self-improving, but computationally limited, subjective observer once he learns to predict or compress the data in a better way, thus making it subjectively simpler and more beautiful.
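A hedged sketch of the underlying "compression progress" measure, using an order-0 frequency model as a stand-in for the observer's learnable compressor; any adaptive predictor or compressor would play the same role:

```python
# Data is temporarily interesting to the degree the observer's compressor
# improves on it: interestingness = bits saved by learning from the data.
import numpy as np

class FrequencyCompressor:
    def __init__(self, alphabet=256):
        self.counts = np.ones(alphabet)              # Laplace-smoothed symbol counts
    def code_length(self, data):                     # bits needed under the current model
        p = self.counts / self.counts.sum()
        return float(-np.log2(p[np.asarray(data)]).sum())
    def improve(self, data):                         # "learning" = updating the counts
        np.add.at(self.counts, np.asarray(data), 1)

def interestingness(compressor, data):
    before = compressor.code_length(data)
    compressor.improve(data)
    after = compressor.code_length(data)
    return before - after                            # compression progress as reward

print(interestingness(FrequencyCompressor(), list(b"abracadabra")))
```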
4 code implementations • 14 May 2007 • Alex Graves, Santiago Fernandez, Juergen Schmidhuber
Recurrent neural networks (RNNs) have proved effective at one-dimensional sequence learning tasks, such as speech and online handwriting recognition.
no code implementations • 19 Jun 2006 • Juergen Schmidhuber
Artificial Intelligence (AI) has recently become a real formal science: the new millennium brought the first mathematically sound, asymptotically optimal, universal problem solvers, providing a new, rigorous foundation for the previously largely heuristic field of General AI and embedded agents.
no code implementations • 30 Nov 2000 • Juergen Schmidhuber
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE.
no code implementations • 13 Apr 1999 • Juergen Schmidhuber
Is the universe computable?
Quantum Physics • Computational Complexity • Computers and Society • Computational Physics • Popular Physics