AdaTree: Boosting a Weak Classifier into a Decision Tree
Top 10% of 2005 papers
Abstract
We present a boosting method that produces a decision tree rather than a fixed linear sequence of classifiers. Equivalently, we present a tree-growing method whose performance can be analysed in the framework of Adaboost. We argue that Adaboost can be improved by presenting the input to a sequence of weak classifiers, each tuned to the conditional probability determined by the outputs of the previous weak classifiers. As a result, the final classifier has a tree structure rather than a linear one, hence the name "AdaTree". One consequence of the tree structure is that different inputs may require different processing times. Early experiments show a reduced computation cost with respect to Adaboost. One of our intended applications is real-time detection, where cascades of boosted detectors have recently become successful. The reduced computation cost of the proposed method shows some potential for direct use in detection problems, without the need for a cascade.
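The abstract does not give the paper's exact training procedure, but the core idea — each weak classifier trained on the conditional distribution implied by its parent's output — can be sketched as follows. This is a minimal illustration, assuming decision stumps as weak classifiers and AdaBoost-style exponential reweighting applied per branch; all names (`fit_stump`, `build_adatree`, the dataset) are hypothetical and not taken from the paper.

```python
import math

def fit_stump(X, y, w):
    """Best single-feature threshold stump under sample weights w (labels in {-1,+1})."""
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            for sign in (1, -1):
                # weighted error of the stump x -> sign * (+1 if x[f] >= t else -1)
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if sign * (1 if xi[f] >= t else -1) != yi)
                if best is None or err < best[0]:
                    best = (err, f, t, sign)
    err, f, t, sign = best
    err = max(err, 1e-10)
    alpha = 0.5 * math.log((1 - err) / err)  # AdaBoost confidence weight
    return f, t, sign, alpha

def stump_predict(stump, x):
    f, t, sign, _ = stump
    return sign * (1 if x[f] >= t else -1)

def build_adatree(X, y, w=None, depth=0, max_depth=3):
    """Grow a tree of weak classifiers; each child sees only the samples
    (reweighted AdaBoost-style) that produced its parent's output."""
    if w is None:
        w = [1.0 / len(X)] * len(X)
    vote = sum(wi * yi for wi, yi in zip(w, y))
    if depth == max_depth or len(set(y)) == 1:
        return ('leaf', 1 if vote >= 0 else -1)
    stump = fit_stump(X, y, w)
    alpha = stump[3]
    branches = {}
    for out in (+1, -1):
        idx = [i for i in range(len(X)) if stump_predict(stump, X[i]) == out]
        if not idx:
            branches[out] = ('leaf', 1 if vote >= 0 else -1)
            continue
        # reweight the branch's samples toward the parent's mistakes,
        # so the child is tuned to the conditional problem
        sw = [w[i] * math.exp(-alpha * y[i] * out) for i in idx]
        z = sum(sw)
        sw = [s / z for s in sw]
        branches[out] = build_adatree([X[i] for i in idx], [y[i] for i in idx],
                                      sw, depth + 1, max_depth)
    return ('node', stump, branches)

def predict(tree, x):
    # computation cost depends on the path taken, so it varies per input
    while tree[0] == 'node':
        _, stump, branches = tree
        tree = branches[stump_predict(stump, x)]
    return tree[1]

# toy 1-D problem a single stump cannot solve: +1 inside [0.4, 0.6], -1 outside
X = [[0.1], [0.2], [0.4], [0.5], [0.6], [0.8], [0.9]]
y = [-1, -1, 1, 1, 1, -1, -1]
tree = build_adatree(X, y)
```

Note how, unlike a flat AdaBoost ensemble, an input that reaches a shallow leaf is classified after evaluating only the stumps on its path, which is the source of the variable (and on average reduced) processing time the abstract mentions.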