no code implementations • 7 May 2022 • Wan-Ping Nicole Chen, Yuan-Chin Ivan Chang
In this paper, we propose a probability-based nonparametric multiple-class classification method and integrate it with the ability to identify high-impact variables for each individual class, so that we gain more information about the classification rule and the character of each class.
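The abstract does not spell out the estimator, so as a rough illustration of a probability-based nonparametric multi-class classifier, a k-nearest-neighbour class-probability estimate can be sketched as follows (the paper's own estimator and its variable-impact measure may differ):

```python
import math
from collections import Counter

def knn_class_probs(X, y, x_new, k=3):
    """Estimate class probabilities for x_new from the relative class
    frequencies among its k nearest training points (a generic
    nonparametric estimator, used here only for illustration)."""
    # Euclidean distance from x_new to every labelled training point
    dists = sorted((math.dist(xi, x_new), yi) for xi, yi in zip(X, y))
    # Relative class frequencies among the k nearest neighbours
    counts = Counter(label for _, label in dists[:k])
    return {c: n / k for c, n in counts.items()}

X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1), (1.2, 0.8)]
y = ["a", "a", "b", "b", "b"]
print(knn_class_probs(X, y, (1.0, 0.9), k=3))
```

Classifying to the class with the largest estimated probability then gives a multi-class rule without any parametric model assumption.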
no code implementations • 29 Jan 2019 • Zhanfeng Wang, Yumi Kwon, Yuan-Chin Ivan Chang
For a classification problem, this means that the essential label information may not be readily available in the data set at hand, and an extra labeling procedure is required so that we have enough label information to construct a classification model.
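The abstract does not describe how the points to label are chosen; one common way to drive such an extra labeling procedure is uncertainty sampling, sketched here under that assumption (the paper's actual selection rule may differ):

```python
import math

def most_uncertain(unlabeled, prob_of_class_1):
    """Pick the unlabeled point whose predicted class-1 probability is
    closest to 0.5, i.e. the point the current model is least sure
    about -- the usual uncertainty-sampling query rule."""
    return min(unlabeled, key=lambda x: abs(prob_of_class_1(x) - 0.5))

# Toy scorer: a fixed one-dimensional logistic model (illustrative only)
prob = lambda x: 1 / (1 + math.exp(-(x - 2.0)))
pool = [0.0, 1.5, 2.1, 4.0]
print(most_uncertain(pool, prob))
```

The selected point (the one nearest the decision boundary) would then be sent for labeling, the model refit, and the loop repeated until enough label information has been gathered.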
no code implementations • 4 Jan 2019 • Wan-Ping Nicole Chen, Yuan-Chin Ivan Chang
Variable selection is a common way to improve model interpretability and is commonly used with parametric classification models.
no code implementations • 22 Dec 2018 • Zhanfeng Wang, Yuan-Chin Ivan Chang
To analyse a very large data set with a large number of variables, we adopt a sequential estimation idea and propose a parallel divide-and-conquer method.
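The sequential estimator itself is not reproduced here; as a minimal sketch of the divide-and-conquer idea, assuming ordinary least squares on each chunk and simple averaging of the per-chunk estimates:

```python
import random
import statistics

def ols_1d(chunk):
    """Least-squares slope and intercept for one chunk of (x, y) pairs."""
    xs = [x for x, _ in chunk]
    mx = statistics.fmean(xs)
    my = statistics.fmean(y for _, y in chunk)
    slope = (sum((x - mx) * (y - my) for x, y in chunk)
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def divide_and_conquer_ols(data, n_chunks=10):
    """Fit each chunk independently (embarrassingly parallel), then
    average the per-chunk estimates into one combined estimate."""
    size = len(data) // n_chunks
    fits = [ols_1d(data[i * size:(i + 1) * size]) for i in range(n_chunks)]
    slope = statistics.fmean(s for s, _ in fits)
    intercept = statistics.fmean(b for _, b in fits)
    return slope, intercept

random.seed(0)
data = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.1))
        for x in (random.uniform(0, 10) for _ in range(10_000))]
slope, intercept = divide_and_conquer_ols(data)
print(slope, intercept)
```

Because each chunk is processed independently, the per-chunk fits can be farmed out to separate workers, which is what makes the scheme attractive for very large data sets.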
no code implementations • 1 Feb 2018 • Hsiang-Ling Hsu, Yuan-Chin Ivan Chang, Ray-Bing Chen
Our numerical results show that the proposed procedure achieves competitive performance, with a smaller training set and a more compact model, compared with a classifier trained on all variables and the full data set.