21 Sep 2021

We propose a new type of ensemble method, specially designed for neural nets, that produces surprising improvements in accuracy at very small cost and without requiring the training of a new neural net. The idea is to concatenate the output activations of the net's internal layers into an "ensemble feature vector" and to train a decision tree on it to predict the class labels while also performing feature selection. For this to succeed we rely on a recently proposed algorithm for training decision trees, Tree Alternating Optimization (TAO). This simple procedure consistently improves over ensembling the nets in the traditional way, achieving relative error decreases of well over 10% with respect to the original nets on well-known image classification benchmarks. As a by-product, we also obtain an architecture consisting of neural net feature extraction followed by a tree classifier, which is faster and more compact than the original net.
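To make the construction concrete, here is a minimal sketch of the pipeline the abstract describes: internal-layer activations are concatenated into an ensemble feature vector, and a tree classifier is trained on that vector. The network architecture, data shapes, and the `ensemble_features` helper are illustrative assumptions, and a scikit-learn CART tree stands in for the TAO tree training used in the paper, which is not available in standard libraries.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.tree import DecisionTreeClassifier

# Hypothetical pretrained net; any feedforward torch.nn.Module would do.
net = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

def ensemble_features(model, x):
    """Concatenate the activations of each layer into one feature vector."""
    feats = []
    h = x
    for layer in model:
        h = layer(h)
        feats.append(h.flatten(start_dim=1))
    return torch.cat(feats, dim=1)

# Toy data standing in for an image benchmark (e.g. MNIST-sized inputs).
X = torch.randn(512, 1, 28, 28)
y = np.random.randint(0, 10, size=512)

with torch.no_grad():
    F = ensemble_features(net, X).numpy()

# The paper trains this classifier with Tree Alternating Optimization (TAO),
# which also performs feature selection over the concatenated activations;
# a CART tree is used here only as a readily available stand-in.
tree = DecisionTreeClassifier(max_depth=8).fit(F, y)
print("train accuracy:", tree.score(F, y))
```

In the resulting architecture, only the layers feeding the features selected by the tree need to be evaluated at test time, which is what makes the final net-plus-tree classifier faster and more compact than the original net.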
