Random vector functional link network: recent developments, applications, and future directions

13 Feb 2022  ·  A. K. Malik, Ruobin Gao, M. A. Ganaie, M. Tanveer, P. N. Suganthan ·

Neural networks have been successfully employed in various domains such as classification, regression, and clustering. Generally, back-propagation (BP) based iterative approaches are used to train neural networks; however, they suffer from local minima, sensitivity to the learning rate, and slow convergence. To overcome these issues, randomization-based neural networks such as the random vector functional link (RVFL) network have been proposed. The RVFL model has several characteristics, such as fast training speed, direct links, a simple architecture, and universal approximation capability, that make it a viable randomized neural network. This article presents the first comprehensive review of the evolution of the RVFL model and can serve as an extensive summary for beginners as well as practitioners. We discuss shallow RVFLs, ensemble RVFLs, deep RVFLs, and ensemble deep RVFL models. The variations, improvements, and applications of RVFL models are discussed in detail. Moreover, we discuss the different hyperparameter optimization techniques followed in the literature to improve the generalization performance of the RVFL model. Finally, we give potential future research directions and opportunities that can inspire researchers to further improve the RVFL's architecture and learning algorithm.
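To make the core idea concrete, the sketch below illustrates a minimal shallow RVFL of the kind the abstract describes: the hidden-layer weights are drawn at random and kept fixed, the original inputs are concatenated with the hidden activations via direct links, and only the output weights are learned, in closed form through ridge-regularized least squares instead of BP. This is an illustrative assumption-laden sketch, not the authors' implementation; all names (train_rvfl, predict_rvfl) and the chosen activation, weight range, and regularization value are hypothetical.

import numpy as np

def train_rvfl(X, y, n_hidden=100, reg=1e-3, seed=None):
    # Hidden weights W and biases b are drawn randomly and never trained.
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)          # random nonlinear hidden features
    D = np.hstack([X, H])           # direct links: raw inputs concatenated with hidden features
    # Output weights beta solve a ridge-regularized least-squares problem in closed form.
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    D = np.hstack([X, np.tanh(X @ W + b)])
    return D @ beta

# Toy regression example (illustrative only)
X = np.random.randn(200, 5)
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(200)
W, b, beta = train_rvfl(X, y, n_hidden=50, reg=1e-2, seed=0)
print(predict_rvfl(X[:5], W, b, beta))

Because only the linear output layer is fitted, training reduces to a single matrix solve, which is what gives RVFL its fast training speed relative to BP-trained networks.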
