Search Results for author: Magzhan Gabidolla

Found 5 papers, 0 papers with code

Softmax Tree: An Accurate, Fast Classifier When the Number of Classes Is Large

no code implementations EMNLP 2021 Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan

Classification problems with thousands of classes or more arise naturally in NLP, for example in language modeling or document classification.

Document Classification
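The snippet above motivates tree-structured classifiers: a flat softmax over K classes costs O(K) per prediction, while a tree can route an input through a few decision nodes to a small leaf softmax over a subset of the classes. A minimal sketch of that routing idea (random, untrained weights purely for illustration; not the paper's model or training method):

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 8, 6  # input dimension, number of classes

# Root decision node: a single oblique (hyperplane) split.
w, b = rng.normal(size=D), 0.0

# Each leaf holds a small softmax over a disjoint subset of the classes.
leaf_classes = [np.array([0, 1, 2]), np.array([3, 4, 5])]
leaf_weights = [rng.normal(size=(3, D)) for _ in range(2)]

def predict(x):
    """Route x down the tree, then evaluate only that leaf's softmax."""
    leaf = int(w @ x + b > 0)          # one dot product instead of K
    logits = leaf_weights[leaf] @ x    # 3 dot products instead of K
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return int(leaf_classes[leaf][int(np.argmax(p))])

x = rng.normal(size=D)
print(predict(x))  # a class index in 0..K-1
```

With a balanced tree of depth log2(K) and constant-size leaves, the per-prediction cost drops from O(K) score evaluations to O(log K) decisions.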

Towards Better Decision Forests: Forest Alternating Optimization

no code implementations CVPR 2023 Miguel Á. Carreira-Perpiñán, Magzhan Gabidolla, Arman Zharmagambetov

However, unlike most other models, such as neural networks, forests and trees are not easy to optimize, because they define a non-differentiable function.
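The non-differentiability is easy to see even for a single split: a tree's training loss is piecewise constant in its split parameters, so its gradient is zero almost everywhere and plain gradient descent cannot move the splits. A tiny illustration with a one-feature decision stump (toy data, not from the paper):

```python
import numpy as np

# A decision stump: predict 1 if x > t else 0. Its 0-1 training loss as a
# function of the threshold t is a step function, so dLoss/dt = 0 almost
# everywhere and gradient-based optimization of t makes no progress.
x = np.array([0.1, 0.4, 0.6, 0.9])
y = np.array([0,   0,   1,   1  ])

def loss(t):
    return np.mean((x > t).astype(int) != y)

ts = np.linspace(0.0, 1.0, 101)
losses = np.array([loss(t) for t in ts])
print(sorted(set(losses)))  # only a few distinct values: a step function
```

Sweeping t over 101 values yields only three distinct loss values here, which is why specialized schemes (greedy induction, or alternating optimization such as TAO) are used instead of gradient descent.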

Pushing the Envelope of Gradient Boosting Forests via Globally-Optimized Oblique Trees

no code implementations CVPR 2022 Magzhan Gabidolla, Miguel Á. Carreira-Perpiñán

Ensemble methods based on decision trees, such as Random Forests or boosted forests, have long been established as some of the most powerful, off-the-shelf machine learning models, and have been widely used in computer vision and other areas.
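The boosting idea referenced above is stagewise: each new tree is fit to the residuals of the current ensemble, and predictions are summed. A toy sketch for squared loss with axis-aligned stumps (the paper's contribution uses globally-optimized oblique trees; this only illustrates the boosting loop itself):

```python
import numpy as np

# Toy gradient boosting for squared loss: each stage fits a stump to the
# current residuals and the ensemble sums the (shrunken) stage predictions.
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x)

def fit_stump(x, y):
    """Best single-threshold split minimizing squared error (brute force)."""
    best = None
    for t in x:
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda q: np.where(q <= t, lv, rv)

pred = np.zeros_like(y)
lr = 0.5  # shrinkage (learning rate)
for _ in range(20):
    stump = fit_stump(x, y - pred)  # fit the residuals
    pred = pred + lr * stump(x)

print(np.mean((y - pred) ** 2))  # training MSE shrinks with each stage
```

Replacing the axis-aligned stump with an oblique (hyperplane-split) tree at each stage is the change the paper's title points to.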

Faster Neural Net Inference via Forests of Sparse Oblique Decision Trees

no code implementations29 Sep 2021 Yerlan Idelbayev, Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan

We show that neural nets can be further compressed by replacing some of their layers with a special type of decision forest.

Quantization
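The inference speedup suggested above comes from conditional computation: a dense layer evaluates every unit on every input, while each tree evaluates only one root-to-leaf path, and sparse oblique nodes touch only a few inputs each. A back-of-envelope multiply count with illustrative sizes (all numbers hypothetical, not figures from the paper):

```python
# Illustrative cost comparison in multiply counts; all sizes are made up.
D, H = 512, 512          # dense layer: input and output widths
dense_cost = D * H       # every output unit touches every input

depth, nnz = 8, 32       # tree depth; nonzero weights per sparse oblique node
T = 16                   # number of trees in the forest
forest_cost = T * depth * nnz  # only one root-to-leaf path per tree runs

print(dense_cost, forest_cost, dense_cost / forest_cost)
```

Under these (assumed) sizes the forest does a small fraction of the dense layer's multiplies, which is the basic accounting behind the claimed speedups.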

An Experimental Comparison of Old and New Decision Tree Algorithms

no code implementations8 Nov 2019 Arman Zharmagambetov, Suryabhan Singh Hada, Miguel Á. Carreira-Perpiñán, Magzhan Gabidolla

This paper presents a detailed comparison of a recently proposed algorithm for optimizing decision trees, tree alternating optimization (TAO), with other popular, established algorithms.

Regression
