Search Results for author: Dianhui Wang

Found 8 papers, 0 papers with code

Stochastic Configuration Machines: FPGA Implementation

no code implementations • 30 Oct 2023 • Matthew J. Felicetti, Dianhui Wang

Stochastic Configuration Machines (SCMs) extend stochastic configuration networks (SCNs) with a focus on reducing memory requirements: the randomized weights are limited to binary values with a single scalar for each node, and a mechanism model is used to improve learning performance and result interpretability.
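
A minimal sketch of the idea in Python/NumPy (illustrative names such as binary_hidden_layer are ours, not from the paper): each hidden node's randomized weights are restricted to {-1, +1} and rescaled by one per-node scalar, so only sign bits plus one float per node need to be stored.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_hidden_layer(X, n_nodes):
    """Hidden features from binary random weights with one scalar per node."""
    d = X.shape[1]
    B = rng.choice([-1.0, 1.0], size=(d, n_nodes))  # binary random weights
    scales = rng.uniform(0.1, 1.0, size=n_nodes)    # one scalar per node
    biases = rng.uniform(-1.0, 1.0, size=n_nodes)
    # Only the sign bits of B and the per-node scalars/biases need storing,
    # which is the memory saving the abstract refers to.
    return np.tanh(X @ (B * scales) + biases)

X = rng.standard_normal((5, 3))
H = binary_hidden_layer(X, n_nodes=4)  # hidden features, shape (5, 4)
```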

Stochastic Configuration Machines for Industrial Artificial Intelligence

no code implementations • 25 Aug 2023 • Dianhui Wang, Matthew J. Felicetti

Real-time predictive modelling with the desired accuracy is in high demand in industrial artificial intelligence (IAI), where neural networks play a key role.

Two Dimensional Stochastic Configuration Networks for Image Data Analytics

no code implementations • 6 Sep 2018 • Ming Li, Dianhui Wang

Stochastic configuration networks (SCNs), as a class of randomized learner models, have been successfully employed in data analytics due to their universal approximation capability and fast modelling property.

Face Recognition
Vocal Bursts Valence Prediction

Deep Stacked Stochastic Configuration Networks for Lifelong Learning of Non-Stationary Data Streams

no code implementations • 7 Aug 2018 • Mahardhika Pratama, Dianhui Wang

The concept of SCNs offers a fast framework with a universal approximation guarantee for lifelong learning of non-stationary data streams.

Continual Learning

Stochastic Configuration Networks Ensemble for Large-Scale Data Analytics

no code implementations • 2 Jul 2017 • Dianhui Wang, Caihao Cui

Based on the group of heterogeneous features, the block Jacobi and Gauss-Seidel methods are employed to iteratively evaluate the output weights, and a convergence analysis is given together with a demonstration of the uniqueness of these iterative solutions.
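
As a hedged sketch of the blockwise solve (our reading, not the authors' code): partition the hidden features into blocks H_1, ..., H_k and sweep over the blocks, re-solving each block's output weights against the residual left by the others. Updating blocks in place within a sweep gives Gauss-Seidel; computing all residuals from the previous sweep instead gives the parallelisable block Jacobi variant.

```python
import numpy as np

def block_gauss_seidel(H_blocks, T, n_sweeps=50, ridge=1e-8):
    """Block Gauss-Seidel for the output weights of concatenated feature blocks."""
    k = len(H_blocks)
    betas = [np.zeros((Hb.shape[1], T.shape[1])) for Hb in H_blocks]
    for _ in range(n_sweeps):
        for i in range(k):
            # Residual target after removing every other block's contribution;
            # blocks j < i already hold this sweep's updates (Gauss-Seidel).
            R = T - sum(H_blocks[j] @ betas[j] for j in range(k) if j != i)
            Hi = H_blocks[i]
            G = Hi.T @ Hi + ridge * np.eye(Hi.shape[1])  # regularised Gram matrix
            betas[i] = np.linalg.solve(G, Hi.T @ R)
    return betas
```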

Ensemble Learning

Deep Stochastic Configuration Networks with Universal Approximation Property

no code implementations • 18 Feb 2017 • Dianhui Wang, Ming Li

This paper develops a randomized approach for incrementally building deep neural networks, in which a supervisory mechanism constrains the random assignment of the weights and biases, and all the hidden layers have direct links to the output layer.
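
A hedged sketch of the constructive loop for the shallow, single-output case (simplified inequality and illustrative constants, not the paper's exact algorithm): random candidate nodes are screened by a supervisory condition on the current residual, and the output weights are refreshed by least squares after each accepted node.

```python
import numpy as np

rng = np.random.default_rng(1)

def build_scn(X, y, max_nodes=25, n_candidates=100, r=0.999, tol=1e-3):
    H = np.empty((X.shape[0], 0))
    beta = np.zeros(0)
    e = y.copy()  # current residual
    for _ in range(max_nodes):
        best_h, best_xi = None, -np.inf
        for _ in range(n_candidates):
            w = rng.uniform(-1.0, 1.0, X.shape[1])
            b = rng.uniform(-1.0, 1.0)
            h = np.tanh(X @ w + b)
            # Supervisory condition (simplified): a candidate is admissible
            # only if sufficiently aligned with the current residual.
            xi = (e @ h) ** 2 / (h @ h) - (1.0 - r) * (e @ e)
            if xi > best_xi:
                best_h, best_xi = h, xi
        if best_xi < 0:  # no admissible candidate found; stop growing
            break
        H = np.column_stack([H, best_h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refresh output weights
        e = y - H @ beta
        if np.linalg.norm(e) < tol:
            break
    return H, beta
```

In the deep variant the abstract describes, each hidden layer's features would additionally feed the output layer directly, so the least-squares step would span the features of all layers.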

Robust Stochastic Configuration Networks with Kernel Density Estimation

no code implementations • 15 Feb 2017 • Dianhui Wang, Ming Li

The kernel density estimation (KDE) method is employed to set the penalty weight for each training sample, so that the negative impact of noisy data or outliers on the resulting learner model is reduced.
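
One plausible reading in Python/NumPy/SciPy (a sketch, not the authors' exact weighting scheme): fit an unweighted model first, estimate the density of its residuals with Gaussian KDE, and down-weight the low-density samples, which are the likely outliers, in a weighted least-squares re-fit of the output weights.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_weighted_output_weights(H, y):
    beta0, *_ = np.linalg.lstsq(H, y, rcond=None)  # initial unweighted fit
    residuals = y - H @ beta0
    # Samples whose residuals sit in low-density regions get small weights.
    weights = gaussian_kde(residuals)(residuals)
    weights /= weights.max()  # normalise to (0, 1]
    # Weighted least squares: minimise sum_i w_i * (y_i - h_i @ beta)^2.
    w_sqrt = np.sqrt(weights)
    beta, *_ = np.linalg.lstsq(w_sqrt[:, None] * H, w_sqrt * y, rcond=None)
    return beta, weights
```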

Density Estimation

Stochastic Configuration Networks: Fundamentals and Algorithms

no code implementations • 10 Feb 2017 • Dianhui Wang, Ming Li

This paper contributes to the development of randomized methods for neural networks.

Regression
