JBoost: Free AdaBoost implementation

October 1, 2007

If you are interested in machine learning and looking for a solid implementation of boosting algorithms, including the basic AdaBoost and variants such as LogitBoost, BrownBoost, BoosTexter, and (soon) NormalBoost, you should check out the JBoost software from the University of California, San Diego. Beyond implementing a large number of boosting algorithms, the software builds its classifiers as Alternating Decision Trees (ADTrees), which significantly reduces the number of nodes in the learned decision trees.
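As a quick refresher on what the basic AdaBoost loop looks like (independent of JBoost itself, which is Java software and uses ADTrees rather than plain stumps), here is a minimal sketch in Python with single-feature threshold stumps as the weak learners. Everything here, including the function names and the toy data, is illustrative rather than taken from JBoost.

```python
# Minimal AdaBoost sketch with decision stumps (illustrative only; JBoost's
# own implementation is far more featureful and uses ADTrees as its model).
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Train AdaBoost on labels y in {-1, +1} using one-feature threshold stumps."""
    n_samples, n_features = X.shape
    w = np.full(n_samples, 1.0 / n_samples)  # example weights, start uniform
    stumps = []                              # (feature, threshold, polarity, alpha)

    for _ in range(n_rounds):
        # Find the stump (feature, threshold, polarity) with lowest weighted error.
        best, best_err = None, np.inf
        for f in range(n_features):
            for thr in np.unique(X[:, f]):
                for polarity in (1, -1):
                    pred = polarity * np.where(X[:, f] >= thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (f, thr, polarity)

        # Stump weight: larger when the weighted error is smaller.
        err = max(best_err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        f, thr, polarity = best
        pred = polarity * np.where(X[:, f] >= thr, 1, -1)

        # Re-weight examples: misclassified points get more weight next round.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((f, thr, polarity, alpha))
    return stumps

def predict(stumps, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(X.shape[0])
    for f, thr, polarity, alpha in stumps:
        score += alpha * polarity * np.where(X[:, f] >= thr, 1, -1)
    return np.sign(score)

if __name__ == "__main__":
    # Toy data: label is +1 exactly when the first feature exceeds 0.5.
    rng = np.random.default_rng(0)
    X = rng.random((200, 3))
    y = np.where(X[:, 0] > 0.5, 1, -1)
    model = train_adaboost(X, y, n_rounds=10)
    print("training accuracy:", np.mean(predict(model, X) == y))
```

The key idea, shared by all the variants JBoost offers, is the re-weighting step: examples the current ensemble gets wrong receive more weight, so the next weak learner focuses on them. The variants mainly differ in the loss that drives the weight update.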