Unpublished conference/Abstract (Scientific congresses and symposiums)
Gradient Boosted Regression Trees in Scikit-Learn
Prettenhofer, Peter; Louppe, Gilles
PyData 2014 (2014)
 

Files


Full Text : slides.pdf (Publisher postprint, 5.29 MB)

Details



Keywords :
machine learning; boosting; python
Abstract :
[en] This talk describes Gradient Boosted Regression Trees (GBRT), a powerful statistical learning technique with applications in a variety of areas, ranging from web page ranking to environmental niche modeling. GBRT is a key ingredient of many winning solutions in data-mining competitions such as the Netflix Prize, the GE Flight Quest, or the Heritage Health Prize. We give a brief introduction to the GBRT model and regression trees -- focusing on intuition rather than mathematical formulas. The majority of the talk is dedicated to an in-depth discussion of how to apply GBRT in practice using scikit-learn. We cover important topics such as regularization, model tuning, and model interpretation that should significantly improve your score on Kaggle.
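(The slides themselves are not reproduced here. As a minimal, illustrative sketch of the scikit-learn API the abstract refers to, the snippet below fits a GradientBoostingRegressor on synthetic data and exposes the regularization knobs the talk covers: shrinkage via learning_rate, tree depth, and subsampling. The dataset and parameter values are illustrative assumptions, not taken from the talk.)

# Minimal GBRT sketch with scikit-learn (illustrative, not from the slides)
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression data stands in for a real problem
X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingRegressor(
    n_estimators=500,    # number of boosting stages (regression trees)
    learning_rate=0.05,  # shrinkage: smaller values need more trees but regularize
    max_depth=3,         # depth of each individual tree
    subsample=0.8,       # stochastic gradient boosting: each tree sees 80% of the data
    random_state=0,
)
model.fit(X_train, y_train)

print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
print("feature importances:", model.feature_importances_)  # simple model interpretation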
Disciplines :
Computer science
Author, co-author :
Prettenhofer, Peter
Louppe, Gilles  ;  Université de Liège - ULiège > Dép. d'électric., électron. et informat. (Inst.Montefiore) > Systèmes et modélisation
Language :
English
Title :
Gradient Boosted Regression Trees in Scikit-Learn
Publication date :
23 February 2014
Event name :
PyData 2014
Event place :
London, United Kingdom
Event date :
21-23 February 2014
By request :
Yes
Audience :
International
Available on ORBi :
since 24 February 2014

Statistics


Number of views :
5171 (22 by ULiège)
Number of downloads :
13002 (18 by ULiège)
