pith. machine review for the scientific record.

arxiv: 1009.3702 · v1 · submitted 2010-09-20 · 💻 cs.LG


Totally Corrective Multiclass Boosting with Binary Weak Learners

classification 💻 cs.LG
keywords: boosting, multiclass, algorithms, adaboost, binary, corrective, dual, lagrange
original abstract

In this work, we propose a new optimization framework for multiclass boosting. In the literature, AdaBoost.MO and AdaBoost.ECC are two successful multiclass boosting algorithms that can use binary weak learners. We explicitly derive the Lagrange dual problems of these two algorithms from their regularized loss functions, and show that the dual formulations enable us to design totally corrective multiclass algorithms via a primal-dual optimization technique. Experiments on benchmark data sets suggest that our multiclass boosting achieves generalization comparable to the state of the art, while converging much faster than stage-wise gradient-descent boosting. In other words, the new totally corrective algorithms maximize the margin more aggressively.
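To make the "totally corrective" idea concrete: a stage-wise booster like AdaBoost fixes past weak-learner weights and only tunes the newest one, whereas a totally corrective booster re-optimizes *all* ensemble weights at every round. The paper does this via the Lagrange dual; the toy sketch below instead uses a simple projected gradient descent on the exponential loss over binary decision stumps, which is an illustrative stand-in, not the paper's algorithm. The dataset, stump thresholds, and function names are all hypothetical.

```python
# A minimal sketch of a totally-corrective update (illustrative assumption:
# projected gradient descent on exponential loss stands in for the paper's
# primal-dual method). Toy data and stumps are made up for this example.
import math

X = [-2.0, -1.0, 0.5, 1.0, 2.0]   # 1-D toy inputs
y = [-1, -1, 1, 1, 1]             # binary labels

# Binary weak learners: decision stumps h_t(x) = sign(x - t).
thresholds = [-1.5, 0.0, 1.5]

def stump(t, x):
    return 1 if x > t else -1

def exp_loss(w):
    """Exponential loss of the ensemble sum_j w[j] * stump(thresholds[j], x)."""
    total = 0.0
    for xi, yi in zip(X, y):
        margin = yi * sum(wj * stump(tj, xi) for wj, tj in zip(w, thresholds))
        total += math.exp(-margin)
    return total

def totally_corrective_step(w, lr=0.1, iters=200):
    """Jointly re-optimize EVERY coordinate of w (not just the newest one),
    projecting onto w >= 0 after each gradient step."""
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            margin = yi * sum(wj * stump(tj, xi) for wj, tj in zip(w, thresholds))
            e = math.exp(-margin)
            for j, tj in enumerate(thresholds):
                grad[j] += -yi * stump(tj, xi) * e
        w = [max(0.0, wj - lr * g) for wj, g in zip(w, grad)]
    return w

w = totally_corrective_step([0.0, 0.0, 0.0])
```

Because the whole weight vector is refit each round, the ensemble's margins grow faster per iteration than under a stage-wise update, which is the convergence behavior the abstract claims for the dual-based algorithms.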

