Tomorrow's robots will need to distinguish useful information from noise when performing different tasks.
The receptive field (RF), which determines the region of a time series that is ``seen'' and used by a model, is critical to improving performance in time series classification (TSC).
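As an illustration (not taken from the paper), the receptive field of a stack of dilated 1D convolutions with stride 1 can be computed from the kernel sizes and dilations alone; the layer configuration below is hypothetical:

```python
def receptive_field(layers):
    """Receptive field (in time steps) of stacked 1D convolutions,
    each given as (kernel_size, dilation), assuming stride 1."""
    rf = 1
    for kernel_size, dilation in layers:
        rf += (kernel_size - 1) * dilation
    return rf

# Three layers with kernel size 3 and exponentially growing dilations
# cover 15 consecutive time steps.
print(receptive_field([(3, 1), (3, 2), (3, 4)]))  # 15
```

Exponentially growing dilations are a common way to enlarge the RF without adding parameters, which is why the choice of RF matters for TSC accuracy.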
Our framework, FreeTickets, is defined as the ensemble of these relatively cheap sparse subnetworks.
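A minimal sketch of ensembling such subnetworks, assuming each produces class probabilities (the function name and array shapes are my own, not from FreeTickets):

```python
import numpy as np

def ensemble_predict(subnetwork_probs):
    """Average the class-probability outputs of several subnetworks
    and return the predicted class per sample."""
    mean_probs = np.mean(subnetwork_probs, axis=0)
    return mean_probs.argmax(axis=1)

# Hypothetical outputs of three subnetworks on two samples, three classes.
probs = np.array([
    [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]],
    [[0.5, 0.4, 0.1], [0.1, 0.7, 0.2]],
    [[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]],
])
print(ensemble_predict(probs))  # [0 1]
```

Averaging probabilities is only one way to combine ensemble members; the point is that each member is a cheap sparse subnetwork rather than a full dense model.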
In this paper, we introduce for the first time a dynamic sparse training approach for deep reinforcement learning to accelerate the training process.
A fundamental task for artificial intelligence is learning.
This method, named QuickSelection, introduces neuron strength in sparse neural networks as a criterion for measuring feature importance.
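A minimal sketch of this criterion, under the assumption that a neuron's strength is the sum of the absolute weights of its remaining connections (the dense matrix representation and names below are illustrative; a sparse layer would store only the surviving connections):

```python
import numpy as np

def neuron_strength(weights):
    """Strength of each input neuron in a sparse layer: the sum of the
    absolute weights of its connections. Larger strength is taken as a
    proxy for higher feature importance."""
    return np.abs(weights).sum(axis=1)

rng = np.random.default_rng(0)
# Hypothetical sparse weight matrix: 5 input features x 8 hidden neurons,
# with roughly 80% of the connections pruned away (zeroed out).
W = rng.normal(size=(5, 8)) * (rng.random((5, 8)) < 0.2)
importance = neuron_strength(W)
top_features = np.argsort(importance)[::-1]  # features ranked by strength
```

Features whose neurons retain many strong connections after sparse training rank highest, which is what makes the criterion cheap to evaluate.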
To address this problem, the one-shot learning paradigm, which makes use of just one labeled sample per class together with prior knowledge, has become increasingly important.
Unprecedentedly high volumes of data are becoming available with the growth of the advanced metering infrastructure.
Owing to the success of deep learning in various domains, artificial neural networks are currently among the most widely used artificial intelligence methods.
Energy is a limited resource which has to be managed wisely, taking into account both supply-demand matching and capacity constraints in the distribution grid.
Thirdly, we show that, for a fixed number of weights, our proposed sparse models (which by design have more hidden neurons) achieve better generative capabilities than standard fully connected RBMs and GRBMs (which by design have fewer hidden neurons), at no additional computational cost.