no code implementations • 30 Oct 2023 • Matthew J. Felicetti, Dianhui Wang
Stochastic Configuration Machines (SCMs) extend this idea with a focus on reducing memory requirements, limiting the randomized weights to a binary value with one scalar per node, and using a mechanism model to improve learning performance and the interpretability of results.
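The memory-saving idea above can be illustrated with a minimal numpy sketch. This is a hypothetical reading of the abstract, not the authors' implementation: each connection stores only a binary weight in {-1, +1} (one bit), and a single floating-point scalar per node restores the scale; the function name and activation choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def scm_hidden_layer(X, n_nodes):
    # Binary random weights: each connection needs only one bit of storage,
    # versus a full float in a conventional randomized network.
    d = X.shape[1]
    W_bin = rng.choice([-1.0, 1.0], size=(d, n_nodes))
    scales = rng.uniform(0.1, 1.0, size=n_nodes)   # one float scalar per node
    biases = rng.uniform(-1.0, 1.0, size=n_nodes)
    # Sigmoid of the scaled binary projection gives the hidden output.
    return 1.0 / (1.0 + np.exp(-(scales * (X @ W_bin) + biases)))

X = rng.standard_normal((5, 3))
H = scm_hidden_layer(X, 4)
print(H.shape)  # → (5, 4)
```

For a layer with d inputs and n nodes, this stores d·n bits plus 2n floats, instead of d·n floats.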
no code implementations • 25 Aug 2023 • Dianhui Wang, Matthew J. Felicetti
Real-time predictive modelling with the desired accuracy is in strong demand in industrial artificial intelligence (IAI), where neural networks play a key role.
no code implementations • 6 Sep 2018 • Ming Li, Dianhui Wang
Stochastic configuration networks (SCNs), as a class of randomized learner models, have been successfully employed in data analytics due to their universal approximation capability and fast modelling property.
no code implementations • 7 Aug 2018 • Mahardhika Pratama, Dianhui Wang
The concept of SCN offers a fast framework with a universal approximation guarantee for lifelong learning from non-stationary data streams.
no code implementations • 2 Jul 2017 • Dianhui Wang, Caihao Cui
Based on the grouped heterogeneous features, the block Jacobi and Gauss-Seidel methods are employed to iteratively evaluate the output weights, and a convergence analysis is given, with a demonstration of the uniqueness of these iterative solutions.
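The block-iterative evaluation of output weights can be sketched in numpy. This is a generic block Gauss-Seidel sweep over the normal equations, assuming one hidden-output matrix per feature group; the function name, block sizes, and small ridge term are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def block_gauss_seidel(H_blocks, T, n_iter=300, ridge=1e-8):
    # One output-weight block per feature group; each sweep re-solves
    # block k's normal equations against the residual left by the others.
    betas = [np.zeros((Hb.shape[1], T.shape[1])) for Hb in H_blocks]
    for _ in range(n_iter):
        for k, Hk in enumerate(H_blocks):
            R = T - sum(Hb @ b for j, (Hb, b) in
                        enumerate(zip(H_blocks, betas)) if j != k)
            A = Hk.T @ Hk + ridge * np.eye(Hk.shape[1])  # ridge keeps A SPD
            betas[k] = np.linalg.solve(A, Hk.T @ R)
    return betas

# Two heterogeneous feature groups feeding one output layer
H1, H2 = rng.standard_normal((20, 3)), rng.standard_normal((20, 4))
T = rng.standard_normal((20, 2))
b1, b2 = block_gauss_seidel([H1, H2], T)
pred = H1 @ b1 + H2 @ b2
```

On this toy problem the iterates converge to the same prediction as a direct least-squares solve on the concatenated feature blocks, which is the uniqueness property the abstract refers to.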
no code implementations • 18 Feb 2017 • Dianhui Wang, Ming Li
This paper develops a randomized approach for incrementally building deep neural networks, in which a supervisory mechanism is proposed to constrain the random assignment of the weights and biases, and all hidden layers have direct links to the output layer.
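The constructive loop with a supervisory mechanism can be sketched for a single hidden layer. This is a simplified stand-in, assuming the supervisory condition screens random candidate nodes by their correlation with the current residual; the function name, candidate counts, and weight ranges are illustrative, and the paper's deep (multi-layer) structure and exact inequality constraint are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def build_scn(X, T, max_nodes=30, candidates=50, tol=1e-2):
    H = np.ones((X.shape[0], 1))              # bias column
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)
    for _ in range(max_nodes):
        E = T - H @ beta                      # current residual
        if np.linalg.norm(E) < tol:
            break
        # Simplified supervisory screening: among random candidates, keep
        # the node most correlated with the residual.
        best_g, best_score = None, 0.0
        for _ in range(candidates):
            w = rng.uniform(-5, 5, X.shape[1])
            b = rng.uniform(-5, 5)
            g = np.tanh(X @ w + b)
            score = np.sum((g @ E) ** 2) / (g @ g)
            if score > best_score:
                best_g, best_score = g, score
        H = np.column_stack([H, best_g])
        # All accepted nodes link directly to the output layer, so the
        # output weights are refit jointly by least squares.
        beta, *_ = np.linalg.lstsq(H, T, rcond=None)
    return H, beta

X = rng.uniform(-1, 1, (100, 2))
T = np.sin(np.pi * X[:, :1]) * X[:, 1:]       # toy target
H, beta = build_scn(X, T)
err = np.linalg.norm(T - H @ beta)
```

Because the output weights are refit over the whole (growing) hidden basis, the training residual is non-increasing as nodes are added.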
no code implementations • 15 Feb 2017 • Dianhui Wang, Ming Li
The kernel density estimation (KDE) method is employed to set a penalty weight for each training sample, so that the negative impact of noisy data or outliers on the resulting learner model can be reduced.
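A minimal sketch of KDE-based penalty weighting, under the assumption (not stated verbatim in the abstract) that the density is estimated over the training residuals: samples whose residuals fall in low-density regions are likely outliers and receive small weights. The function name and Silverman bandwidth rule are illustrative choices.

```python
import numpy as np

def kde_penalty_weights(residuals, bandwidth=None):
    # Gaussian-kernel density estimate evaluated at each residual;
    # normalization constants cancel because we rescale by the maximum.
    r = np.asarray(residuals, dtype=float)
    n = r.size
    if bandwidth is None:
        bandwidth = 1.06 * r.std() * n ** (-0.2)   # Silverman's rule of thumb
    diff = (r[:, None] - r[None, :]) / bandwidth
    dens = np.exp(-0.5 * diff ** 2).sum(axis=1)
    return dens / dens.max()                       # penalty weights in (0, 1]

# A clean linear relation with one injected outlier
rng = np.random.default_rng(3)
x = np.linspace(0, 1, 50)
y = 2 * x + 0.01 * rng.standard_normal(50)
y[10] += 5.0                                       # outlier
res = y - 2 * x
w = kde_penalty_weights(res)
```

The resulting weights can then be used in a weighted least-squares fit, e.g. by scaling both the hidden-output matrix and the targets by `np.sqrt(w)` before solving, so the outlier contributes little to the output weights.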
no code implementations • 10 Feb 2017 • Dianhui Wang, Ming Li
This paper contributes to the development of randomized methods for neural networks.