Boosted trees with lightgbm
train_lightgbm() is a wrapper for lightgbm tree-based models where all of the model arguments are in the main function.
Usage

train_lightgbm(
  x,
  y,
  max_depth = -1,
  num_iterations = 100,
  learning_rate = 0.1,
  feature_fraction_bynode = 1,
  min_data_in_leaf = 20,
  min_gain_to_split = 0,
  bagging_fraction = 1,
  early_stopping_round = NULL,
  validation = 0,
  counts = TRUE,
  quiet = FALSE,
  ...
)
Arguments

x: A data frame or matrix of predictors.

y: A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth: An integer for the maximum depth of the tree.

num_iterations: An integer for the number of boosting iterations.

learning_rate: A numeric value between zero and one to control the learning rate.

feature_fraction_bynode: Fraction of predictors that will be randomly sampled at each split.

min_data_in_leaf: A numeric value for the minimum sum of instances needed in a child to continue to split.

min_gain_to_split: A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.
bagging_fraction: Subsampling proportion of rows. Setting this argument to a non-default value will also set bagging_freq = 1. See the Bagging section in ?details_boost_tree_lightgbm for more details.
early_stopping_round: Number of iterations without an improvement in the objective function that must occur before training is halted.

validation: The proportion of the training data that is used for performance assessment and potential early stopping.
counts: A logical; should feature_fraction_bynode be interpreted as the number of predictors that will be randomly sampled at each split? TRUE indicates that mtry will be interpreted in its sense as a count, FALSE indicates that the argument will be interpreted in its sense as a proportion.
quiet: A logical; should logging by lightgbm::lgb.train() be muted?
...: Other options to pass to lightgbm::lgb.train(). Arguments will be correctly routed to the param argument, or as a main argument, depending on their name.
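As a sketch of how this wrapper might be called directly (assuming the bonsai and lightgbm packages are installed; the dataset and hyperparameter values below are illustrative, and in typical tidymodels workflows the function is reached through a parsnip boost_tree() specification with the "lightgbm" engine rather than called by hand):

```r
library(bonsai)

# Predict mpg from the remaining columns of mtcars.
# quiet = TRUE mutes lightgbm's training log.
fit <- train_lightgbm(
  x = mtcars[, -1],
  y = mtcars$mpg,
  num_iterations = 50,
  learning_rate = 0.1,
  min_data_in_leaf = 5,
  quiet = TRUE
)
```

Extra engine arguments (for example, lambda_l2) could be passed through ... and would be routed to lgb.train()'s param list by name.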