A Simple, Effective Way To Improve Neural Net Classification: Ensembling Unit Activations With A Sparse Oblique Decision Tree
Arman Zharmagambetov, Miguel Á. Carreira-Perpiñán
We propose a new type of ensemble method that is specially designed for neural nets and that produces surprising improvements in accuracy at a very small cost, without requiring the training of a new neural net. The idea is to concatenate the output activations of internal layers of the neural net into an ``ensemble feature vector'', and to train on this a decision tree that predicts the class labels while also performing feature selection. For this to succeed we rely on a recently proposed algorithm to train decision trees, Tree Alternating Optimization (TAO). This simple procedure consistently improves over ensembling the nets in the traditional way, achieving relative error decreases of well over 10% with respect to the original nets on well-known image classification benchmarks. As a byproduct, we also obtain an architecture consisting of a neural net feature extractor followed by a tree classifier that is faster and more compact than the original net.
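To make the pipeline concrete, here is a minimal sketch of the ensemble-feature idea, under several assumptions not in the abstract: a PyTorch ResNet-18 is used as the trained net, activations are captured with forward hooks on a hand-picked set of layers, and, since TAO has no standard public implementation, scikit-learn's axis-aligned DecisionTreeClassifier stands in for the sparse oblique TAO tree. The variables x_train, y_train, and x_test are hypothetical placeholders for the user's labeled data.

```python
import torch
import torchvision.models as models
from sklearn.tree import DecisionTreeClassifier

# A previously trained net; pretrained ImageNet weights serve as a stand-in.
net = models.resnet18(weights="IMAGENET1K_V1").eval()

# Capture the output activations of selected internal layers via forward hooks.
activations = []
def hook(module, inputs, output):
    activations.append(torch.flatten(output, start_dim=1))

for layer in (net.layer3, net.layer4, net.fc):  # assumed layer choice
    layer.register_forward_hook(hook)

def ensemble_features(x):
    """Concatenate internal-layer activations into one ensemble feature vector."""
    activations.clear()
    with torch.no_grad():
        net(x)
    return torch.cat(activations, dim=1).numpy()

# Train the tree classifier on the concatenated activations; the paper's
# sparse oblique TAO tree would replace the scikit-learn tree used here.
# X = ensemble_features(x_train)
# tree = DecisionTreeClassifier(max_depth=8).fit(X, y_train)
# y_pred = tree.predict(ensemble_features(x_test))
```

In this setup the tree's built-in pruning of unused features plays the role of the feature selection described above: only a subset of the concatenated activations ends up being used, which is what yields the faster, more compact net-plus-tree classifier.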