Home
Tianqi Chen edited this page May 17, 2014
XGBoost (short for eXtreme Gradient Boosting) is an efficient general-purpose gradient boosting library. Via easy configuration, we can use different boosting models and objective functions to fit real-world data. To get started, read the Binary Classification example.
- Binary Classification (read this for quick start)
- Regression
- Ranking
- See more examples in the demo folder
- Detailed parameter settings are provided in Parameters
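To illustrate the configuration style, here is a sketch of a binary classification config in the spirit of the demo folder. The parameter names follow common XGBoost conventions, but exact names, values, and file paths here are illustrative and should be checked against the Parameters page.

```
# illustrative config sketch; see the Parameters page for exact names
booster = gbtree                 # tree booster
objective = binary:logistic      # binary classification with logistic loss
eta = 0.3                        # shrinkage step size
max_depth = 6                    # maximum tree depth
num_round = 10                   # number of boosting rounds
data = "train.txt"               # training data (hypothetical path)
eval[test] = "test.txt"          # evaluation set (hypothetical path)
```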
The project has three layers; the files listed below are commented headers that are useful for working with each layer.
- Booster core interface: booster/xgboost.h, booster/xgboost_data.h.
- Provides an interface to a single gradient booster; all implementations hide behind this interface
- Use this interface to add new booster implementations, or create boosters to do specific tasks
- Booster ensemble base class: booster/xgboost_gbmbase.h
- Provides a base class with useful code for booster ensembles, including the buffering scheme.
- Use this class to create a customized learner with a self-defined loss function: take class GBMBase, use Predict to get predictions, calculate the gradient and second-order gradient, then pass the statistics back to DoBoost to update the model.
- Booster task wrappers: regression, rank (beta)
  - Provide direct wrappers for specific learning tasks