Gradient Boosting Machines: Structural Insights and Improved Algorithms

Monday, January 28, 2019
12:00 pm - 1:00 pm
Haihao Lu (MIT)
Applied Math and Analysis Seminar

The gradient boosting machine (GBM) is one of the most successful supervised learning algorithms, and it has been the dominant method in many data science competitions, including Kaggle and the KDD Cup. Despite this practical success, a large gap remains between GBM's performance in practice and its theoretical understanding. In this line of research, we show that GBM can be interpreted as a greedy coordinate descent method in the coefficient space and as a mirror descent method in the "pseudo-residual" space. Armed with this structural insight, we develop two new algorithms for classification in the context of GBM: (i) the Random-then-Greedy Gradient Boosting Machine (RtGBM), which lowers the cost per iteration and achieves improved performance in both theory and practice; and (ii) the Accelerated Gradient Boosting Machine (AGBM), which attains the faster convergence rates of accelerated first-order methods, again in both theory and practice. These two algorithms are currently being incorporated by Google into their TensorFlow Boosted Trees software.
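
To make the random-then-greedy idea concrete, here is a minimal sketch, not the speaker's implementation: at each boosting round it samples a random subset of candidate split features, then greedily fits the best decision stump on that subset to the current pseudo-residuals. It uses squared loss for simplicity (the talk treats classification), and all names and hyperparameters (rtgbm_fit, subset_size, lr) are illustrative assumptions.

import numpy as np

def rtgbm_fit(X, y, n_rounds=100, subset_size=5, lr=0.1, seed=0):
    """Random-then-greedy boosting sketch with depth-1 stumps and
    squared loss. Hyperparameter names are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pred = np.zeros(n)
    ensemble = []
    for _ in range(n_rounds):
        residual = y - pred  # pseudo-residuals of squared loss
        # Random step: draw a small subset of candidate features.
        feats = rng.choice(d, size=min(subset_size, d), replace=False)
        best = None
        # Greedy step: best stump over the random subset only.
        for j in feats:
            for thr in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
                mask = X[:, j] <= thr
                if mask.all() or not mask.any():
                    continue
                left, right = residual[mask].mean(), residual[~mask].mean()
                sse = np.sum((residual - np.where(mask, left, right)) ** 2)
                if best is None or sse < best[0]:
                    best = (sse, j, thr, left, right)
        if best is None:
            continue
        _, j, thr, left, right = best
        pred += lr * np.where(X[:, j] <= thr, left, right)
        ensemble.append((j, thr, lr * left, lr * right))
    return ensemble

def rtgbm_predict(ensemble, X):
    pred = np.zeros(X.shape[0])
    for j, thr, left, right in ensemble:
        pred += np.where(X[:, j] <= thr, left, right)
    return pred

# Toy usage: recover a sparse linear signal from noisy data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)
model = rtgbm_fit(X, y, n_rounds=200, subset_size=3)
print("train MSE:", np.mean((y - rtgbm_predict(model, X)) ** 2))

Searching only a random subset each round is what lowers the per-iteration cost relative to standard greedy GBM, which would scan all d features at every round.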