no code implementations • EMNLP 2021 • Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan
Classification problems with thousands of classes or more arise naturally in NLP, for example in language modeling or document classification.
no code implementations • ICML 2020 • Arman Zharmagambetov, Miguel Carreira-Perpinan
We show that using TAO with the bagging approach produces much better forests than random forests, AdaBoost, or gradient boosting on every dataset we have tried, across a wide range of input and output dimensionalities and sample sizes.
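TAO itself is not reproduced here; as a minimal sketch of the bagging idea the paper builds on (assuming one-dimensional regression stumps as the base learner, a simplification for illustration), each tree is fit on a bootstrap resample of the data and the forest averages their predictions:

```python
import random

def fit_stump(X, y):
    # Exhaustive search for the split threshold minimizing squared error
    # of a one-feature regression stump; returns (threshold, left_mean, right_mean).
    best = None
    for t in sorted(set(X)):
        left = [yi for xi, yi in zip(X, y) if xi < t] or [0.0]
        right = [yi for xi, yi in zip(X, y) if xi >= t] or [0.0]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((yi - (lm if xi < t else rm)) ** 2 for xi, yi in zip(X, y))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    return best[1:]

def bagged_forest(X, y, n_trees=25, seed=0):
    # Bagging: fit each tree on a bootstrap resample (sampling with replacement).
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        trees.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return trees

def predict(trees, x):
    # The forest prediction is the average over the individual trees.
    return sum(lm if x < t else rm for t, lm, rm in trees) / len(trees)

X = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
y = [1.0, 1.0, 1.0, 2.0, 2.0, 2.0]
forest = bagged_forest(X, y)
print(predict(forest, 0.15), predict(forest, 0.85))
```

The paper's point is that replacing the greedy base-learner training with TAO's alternating optimization, while keeping this same bagging wrapper, yields stronger forests than the boosting-style baselines named above.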
no code implementations • 3 Oct 2023 • Aaron Ferber, Arman Zharmagambetov, Taoan Huang, Bistra Dilkina, Yuandong Tian
Generating diverse objects (e.g., images) with generative models such as GANs or VAEs has achieved impressive results in recent years, helping solve many design problems traditionally handled by humans.
no code implementations • CVPR 2023 • Miguel Á. Carreira-Perpiñán, Magzhan Gabidolla, Arman Zharmagambetov
However, unlike most other models, such as neural networks, forests and trees are not easy to optimize because they define non-differentiable functions.
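The non-differentiability is easy to see: a decision tree's output is piecewise constant in both its input and its split parameters, so gradients are zero almost everywhere and undefined at the splits. A tiny sketch with a hand-rolled depth-1 tree (a stump; the names and values here are illustrative, not from the paper):

```python
def stump(x, threshold=0.5, left=1.0, right=2.0):
    # A depth-1 decision tree: piecewise-constant in x and in threshold.
    return left if x < threshold else right

# Finite-difference "gradient" of the prediction w.r.t. the split threshold:
eps = 1e-6
x = 0.3
g = (stump(x, threshold=0.5 + eps) - stump(x, threshold=0.5 - eps)) / (2 * eps)
print(g)  # 0.0 -- no gradient signal away from the discontinuity
```

Because gradient descent gets no signal from such a function, tree optimization needs non-gradient machinery, which is the motivation for methods like TAO.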
no code implementations • 29 Sep 2021 • Yerlan Idelbayev, Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan
We show that neural nets can be further compressed by replacing some of their layers with a special type of decision forest.
no code implementations • 7 Apr 2021 • Suryabhan Singh Hada, Miguel Á. Carreira-Perpiñán, Arman Zharmagambetov
The widespread deployment of deep nets in practical applications has led to a growing desire to understand how and why such black-box methods perform prediction.
no code implementations • 8 Nov 2019 • Arman Zharmagambetov, Suryabhan Singh Hada, Miguel Á. Carreira-Perpiñán, Magzhan Gabidolla
This paper presents a detailed comparison of a recently proposed algorithm for optimizing decision trees, tree alternating optimization (TAO), with other popular, established algorithms.