Explaining AdaBoost

Wednesday, May 07, 2014
3:30 pm - 4:30 pm
Rob Schapire, Princeton University
Machine Learning Seminar

FINAL ML TALK FOR THE SPRING SEMESTER
Lunch 12:30pm, Learner Lecture 1-2pm, talk 3:30pm, with reception to follow.

Boosting is an approach to machine learning based on the idea of creating a highly accurate prediction rule by combining many relatively weak and inaccurate rules. AdaBoost, the first practical boosting algorithm, has enjoyed empirical success in a number of fields, and a remarkably rich theory has evolved to try to understand how and why it works, and under what conditions. At various times in its history, AdaBoost has been the subject of controversy for the mystery and paradox it seems to present with regard to this question. This talk will give a high-level review and comparison of the varied attempts that have been made to understand and "explain" AdaBoost as a learning method. These approaches (time permitting) will include: direct application of the classic theory of Vapnik and Chervonenkis; the margins theory; AdaBoost as a loss-minimization algorithm (possibly implicitly regularized); and AdaBoost as a universally consistent method. Both strengths and weaknesses of each of these will be discussed.
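To make the idea in the abstract concrete, here is a minimal from-scratch sketch of AdaBoost with decision stumps on a toy 1-D dataset. All names, the dataset, and the stump search are illustrative assumptions, not from the talk; the reweighting and the vote weights alpha_t = (1/2) ln((1 - err_t)/err_t) follow the standard AdaBoost recipe.

```python
import numpy as np

def train_adaboost(X, y, n_rounds):
    """Minimal AdaBoost with decision stumps on 1-D data.

    X: (n,) feature values; y: (n,) labels in {-1, +1}.
    Returns a list of (threshold, polarity, alpha) weak rules.
    """
    n = len(X)
    w = np.full(n, 1.0 / n)  # start with uniform example weights
    # Candidate thresholds: midpoints between sorted values, plus one
    # below the data range so a constant +1/-1 predictor is available.
    xs = np.sort(np.unique(X))
    thresholds = np.concatenate(([xs[0] - 1.0], (xs[:-1] + xs[1:]) / 2.0))
    ensemble = []
    for _ in range(n_rounds):
        # Weak learner: exhaustively pick the stump with the
        # smallest weighted training error on the current weights.
        best = None
        for theta in thresholds:
            for polarity in (+1, -1):
                pred = polarity * np.where(X > theta, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, theta, polarity, pred)
        err, theta, polarity, pred = best
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
        ensemble.append((theta, polarity, alpha))
        # Upweight mistakes, downweight correct points, renormalize,
        # so the next stump focuses on the hard examples.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    # Final rule: sign of the alpha-weighted vote of the weak rules.
    F = sum(alpha * polarity * np.where(X > theta, 1, -1)
            for theta, polarity, alpha in ensemble)
    return np.where(F >= 0, 1, -1)

# Toy labels no single stump can fit, but a weighted vote of stumps can.
X = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([1, 1, -1, -1, 1, 1])
model = train_adaboost(X, y, n_rounds=40)
print((predict(model, X) == y).mean())  # training accuracy
```

Each weak rule here errs on a third of the examples, yet the combined vote fits the training set exactly, which is the "weak rules into a strong rule" phenomenon the talk's explanations address.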

Type: LECTURE/TALK
Contact: Kathy Peterson