This learner provides fitting procedures for xgboost models via the xgboost package's xgb.train function. Such models are classification and regression trees fit with extreme gradient boosting. For details on the fitting procedure, consult the documentation of the xgboost package and Chen and Guestrin (2016).
A learner object inheriting from Lrnr_base with methods for training and prediction. For a full list of learner functionality, see the complete documentation of Lrnr_base.
nrounds = 20: Number of fitting iterations.
...: Other parameters passed to xgb.train.
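Any arguments beyond nrounds are forwarded to xgb.train, so standard xgboost tuning parameters can be set when the learner is constructed. A minimal sketch with illustrative values (max_depth and eta are xgboost parameters, not defaults of this learner):

# illustrative values; max_depth and eta are passed through ... to xgb.train
xgb_lrnr_tuned <- Lrnr_xgboost$new(nrounds = 50, max_depth = 4, eta = 0.1)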
Chen T, Guestrin C (2016). "XGBoost: A Scalable Tree Boosting System." In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 785-794.
Lrnr_gbm for standard gradient boosting models (via the gbm package) and Lrnr_lightgbm for the faster and more efficient gradient boosted trees from the LightGBM framework (via the lightgbm package).
Other Learners:
Custom_chain, Lrnr_HarmonicReg, Lrnr_arima, Lrnr_bartMachine, Lrnr_base, Lrnr_bayesglm, Lrnr_bilstm, Lrnr_caret, Lrnr_cv_selector, Lrnr_cv, Lrnr_dbarts, Lrnr_define_interactions, Lrnr_density_discretize, Lrnr_density_hse, Lrnr_density_semiparametric, Lrnr_earth, Lrnr_expSmooth, Lrnr_gam, Lrnr_ga, Lrnr_gbm, Lrnr_glm_fast, Lrnr_glm_semiparametric, Lrnr_glmnet, Lrnr_glmtree, Lrnr_glm, Lrnr_grfcate, Lrnr_grf, Lrnr_gru_keras, Lrnr_gts, Lrnr_h2o_grid, Lrnr_hal9001, Lrnr_haldensify, Lrnr_hts, Lrnr_independent_binomial, Lrnr_lightgbm, Lrnr_lstm_keras, Lrnr_mean, Lrnr_multiple_ts, Lrnr_multivariate, Lrnr_nnet, Lrnr_nnls, Lrnr_optim, Lrnr_pca, Lrnr_pkg_SuperLearner, Lrnr_polspline, Lrnr_pooled_hazards, Lrnr_randomForest, Lrnr_ranger, Lrnr_revere_task, Lrnr_rpart, Lrnr_rugarch, Lrnr_screener_augment, Lrnr_screener_coefs, Lrnr_screener_correlation, Lrnr_screener_importance, Lrnr_sl, Lrnr_solnp_density, Lrnr_solnp, Lrnr_stratified, Lrnr_subset_covariates, Lrnr_svm, Lrnr_tsDyn, Lrnr_ts_weights, Pipeline, Stack, define_h2o_X(), undocumented_learner

library(sl3)

data(mtcars)
mtcars_task <- sl3_Task$new(
  data = mtcars,
  covariates = c(
    "cyl", "disp", "hp", "drat", "wt", "qsec", "vs", "am",
    "gear", "carb"
  ),
  outcome = "mpg"
)
# initialization, training, and prediction with the defaults
xgb_lrnr <- Lrnr_xgboost$new()
xgb_fit <- xgb_lrnr$train(mtcars_task)
xgb_preds <- xgb_fit$predict()
# get feature importance from fitted model
xgb_varimp <- xgb_fit$importance()
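As a further illustration, the learner can be combined with other learners before training. The following is a minimal sketch, assuming the Stack and Lrnr_glm interfaces documented elsewhere in sl3 and using illustrative hyperparameter values:

# sketch: combine the xgboost learner with a GLM in a Stack (illustrative values)
lrnr_stack <- Stack$new(Lrnr_glm$new(), Lrnr_xgboost$new(nrounds = 50))
stack_fit <- lrnr_stack$train(mtcars_task)
# predictions from the fitted stack, one column per learner
stack_preds <- stack_fit$predict()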