
Deep neural network initialization with decision trees

In this work, a novel, automated process for constructing and initializing deep feed-forward neural networks based on decision trees is presented. The proposed algorithm maps a collection of decision trees trained on the data into a collection of initialized neural networks, with the structures of the networks determined by the structures of the trees. The tree-informed initialization acts as a warm start to the neural network training process, resulting in efficiently trained, accurate networks. These models, referred to as "deep jointly-informed neural networks" (DJINN), demonstrate high predictive performance for a variety of regression and classification datasets, and display comparable performance to Bayesian hyper-parameter optimization at a lower computational cost. By combining the user-friendly features of decision tree models with the flexibility and scalability of deep neural networks, DJINN is an attractive algorithm for training predictive models on a wide range of complex datasets.
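As a rough illustration of the structural part of this idea, and not the paper's full algorithm (which also sets the initial weight values from the tree's split structure), the hedged Python sketch below derives one hidden layer per tree depth, with widths taken from the number of internal tree nodes at each depth, and then trains an ordinary feed-forward network of that shape. The helper name tree_to_layer_sizes and the choice of scikit-learn estimators are assumptions made for illustration only.

# A minimal sketch of the general idea, NOT the exact DJINN algorithm:
# derive the depth/width of a feed-forward network from the structure of a
# decision tree trained on the same data, then fit the network as usual.
# The paper's method additionally initializes individual weights from the tree.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor

def tree_to_layer_sizes(tree):
    """Map a fitted decision tree to hidden-layer widths:
    one hidden layer per tree depth, with width equal to the number of
    internal (non-leaf) nodes at that depth (at least one neuron per layer)."""
    t = tree.tree_
    depths = np.zeros(t.node_count, dtype=int)
    for node in range(t.node_count):
        left, right = t.children_left[node], t.children_right[node]
        if left != -1:  # internal node: children sit one level deeper
            depths[left] = depths[node] + 1
            depths[right] = depths[node] + 1
    is_internal = t.children_left != -1
    sizes = []
    for d in range(depths[is_internal].max() + 1):
        sizes.append(max(1, int(np.sum(is_internal & (depths == d)))))
    return tuple(sizes)

# Toy regression problem: fit a tree, read off a network architecture, train it.
X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X, y)
layer_sizes = tree_to_layer_sizes(tree)
net = MLPRegressor(hidden_layer_sizes=layer_sizes, max_iter=2000,
                   random_state=0).fit(X, y)
print("hidden layers:", layer_sizes, " train R^2:", round(net.score(X, y), 3))

In this sketch only the architecture is tree-informed; the network weights are still randomly initialized, whereas DJINN's warm start also places the tree's feature splits into the initial weight matrices before standard gradient-based training.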
