# Changelog

Source: `NEWS.md`
## bonsai 0.3.1

CRAN release: 2024-07-23
- Fixed a bug where `"aorsf"` models would not successfully fit in socket cluster workers (i.e. with `plan(multisession)`) unless another engine requiring bonsai had been fitted in the worker (#85).
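  A minimal sketch of the affected setting, assuming a data frame `dat` with a factor outcome `class` and that the tune package dispatches resampled fits to multisession workers via the future package:

  ```r
  library(bonsai)
  library(future)
  library(rsample)
  library(tune)

  plan(multisession)

  spec <- rand_forest(mode = "classification") %>%
    set_engine("aorsf")

  # Resampled "aorsf" fits now succeed in the socket cluster workers even if
  # no other bonsai engine has been fitted there first.
  fit_resamples(spec, class ~ ., resamples = vfold_cv(dat, v = 5))
  ```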
## bonsai 0.3.0

CRAN release: 2024-06-23
- Introduced support for accelerated oblique random forests for the `"classification"` and `"regression"` modes using the new `"aorsf"` engine (#78 by @bcjaeger).
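  A minimal sketch of fitting with the new engine, assuming a data frame `dat` with a factor outcome `class`:

  ```r
  library(bonsai)

  rand_forest(mode = "classification") %>%
    set_engine("aorsf") %>%
    fit(class ~ ., data = dat)
  ```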
"lightgbm"
engine. To pass an argument that would be usually passed as an element to theparam
argument inlightgbm::lgb.Dataset()
, pass the argument directly through the ellipses inset_engine()
, e.g.boost_tree() %>% set_engine("lightgbm", linear_tree = TRUE)
(#77).Enabled case weights with the
"lightgbm"
engine (#72 by@p-schaefer
).Fixed issues in metadata for the
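  A minimal sketch, assuming a data frame `dat` with a factor outcome `class`, predictors `x1` and `x2`, and an integer column `n` giving how many times each row should count:

  ```r
  library(bonsai)

  # Frequency weights constructed with hardhat, passed to fit() as case weights.
  wts <- hardhat::frequency_weights(dat$n)

  boost_tree(mode = "classification") %>%
    set_engine("lightgbm") %>%
    fit(class ~ x1 + x2, data = dat, case_weights = wts)
  ```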
"partykit"
engine forrand_forest()
where some engine arguments were mistakenly protected (#74).Addressed type check error when fitting lightgbm model specifications with arguments mistakenly left as
- Addressed a type check error when fitting lightgbm model specifications with arguments mistakenly left as `tune()` (#79).
## bonsai 0.2.1

CRAN release: 2022-11-29
- The most recent dials and parsnip releases introduced tuning integration for the lightgbm `num_leaves` engine argument! The `num_leaves` parameter sets the maximum number of nodes per tree, and is an important tuning parameter for lightgbm (tidymodels/dials#256, tidymodels/parsnip#838). With the newest version of each of dials, parsnip, and bonsai installed, tune this argument by marking the `num_leaves` engine argument for tuning when defining your model specification:

  ```r
  boost_tree() %>% set_engine("lightgbm", num_leaves = tune())
  ```
- Fixed a bug where lightgbm’s parallelism argument `num_threads` was overridden when passed via `params` rather than as a main argument. By default, then, lightgbm will fit sequentially rather than with `num_threads = foreach::getDoParWorkers()`. The user can still set `num_threads` via engine arguments with `engine = "lightgbm"`:

  ```r
  boost_tree() %>% set_engine("lightgbm", num_threads = x)
  ```

  Note that, when tuning hyperparameters with the tune package, detection of a parallel backend will still work as usual.
- The `boost_tree()` argument `stop_iter` now maps to the `lightgbm::lgb.train()` argument `early_stopping_round` rather than its alias `early_stopping_rounds`. This does not affect parsnip’s interface to lightgbm (i.e. via `boost_tree() %>% set_engine("lightgbm")`), though it will introduce errors for code that uses the `train_lightgbm()` wrapper directly and sets the `lightgbm::lgb.train()` argument `early_stopping_round` by its alias `early_stopping_rounds` via `train_lightgbm()`’s `...`.
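  A minimal sketch of setting `stop_iter` through parsnip, which bonsai now passes to `lgb.train()` as `early_stopping_round`; the `validation` engine argument shown here is assumed to hold out a proportion of the training data to monitor for early stopping:

  ```r
  boost_tree(trees = 500, stop_iter = 10) %>%
    set_engine("lightgbm", validation = 0.2) %>%
    set_mode("regression")
  ```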
set_engine("lightgbm", ...)
via aliases. That is, if a main argument is marked for tuning and a lightgbm alias is supplied as an engine argument, bonsai will now error, rather than supplying both to lightgbm and allowing the package to handle aliases. Users can still interface with non-mainboost_tree()
arguments via their lightgbm aliases (#53).
## bonsai 0.2.0

CRAN release: 2022-08-31
- Enabled bagging with lightgbm via the `sample_size` argument to `boost_tree()` (#32 and tidymodels/parsnip#768). The following docs, now available in `?details_boost_tree_lightgbm`, describe the interface in detail:

  > The `sample_size` argument is translated to the `bagging_fraction` parameter in the `params` argument of `lgb.train()`. The argument is interpreted by lightgbm as a proportion rather than a count, so bonsai internally reparameterizes the `sample_size` argument with `dials::sample_prop()` during tuning.
  >
  > To effectively enable bagging, the user would also need to set the `bagging_freq` argument to lightgbm. `bagging_freq` defaults to 0, which means bagging is disabled, and a `bagging_freq` argument of `k` means that the booster will perform bagging at every `k`th boosting iteration. Thus, by default, the `sample_size` argument would be ignored without setting this argument manually. Other boosting libraries, like xgboost, do not have an analogous argument to `bagging_freq` and use `k = 1` when the analogue to `bagging_fraction` is in use. bonsai will thus automatically set `bagging_freq = 1` in `set_engine("lightgbm", ...)` if `sample_size` (i.e. `bagging_fraction`) is not equal to 1 and no `bagging_freq` value is supplied. This default can be overridden by setting the `bagging_freq` argument to `set_engine()` manually.
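  A minimal sketch: with `sample_size` not equal to 1 and no `bagging_freq` supplied, bonsai sets `bagging_freq = 1`; passing `bagging_freq` to `set_engine()` overrides that default.

  ```r
  # Bag 80% of rows at every boosting iteration (bagging_freq = 1 is implied).
  boost_tree(sample_size = 0.8) %>%
    set_engine("lightgbm") %>%
    set_mode("regression")

  # Override the default: bag only at every 5th iteration.
  boost_tree(sample_size = 0.8) %>%
    set_engine("lightgbm", bagging_freq = 5) %>%
    set_mode("regression")
  ```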
- Corrected the mapping of the `mtry` argument in `boost_tree()` with the lightgbm engine. `mtry` previously mapped to the `feature_fraction` argument to `lgb.train()` but was documented as mapping to an argument more closely resembling `feature_fraction_bynode`. `mtry` now maps to `feature_fraction_bynode`. This means that code that set `feature_fraction_bynode` as an argument to `set_engine()` will now error, and the user can now pass `feature_fraction` to `set_engine()` without raising an error.
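  A minimal sketch of the corrected interface: `mtry` maps to lightgbm’s `feature_fraction_bynode`, while `feature_fraction` can now be supplied directly as an engine argument.

  ```r
  boost_tree(mtry = 5, mode = "regression") %>%
    set_engine("lightgbm", feature_fraction = 0.8)
  ```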
objective = "tweedie"
and response values less than 1.A number of documentation improvements, increases in testing coverage, and changes to internals in anticipation of the 4.0.0 release of the lightgbm package. Thank you to
- A number of documentation improvements, increases in testing coverage, and changes to internals in anticipation of the 4.0.0 release of the lightgbm package. Thank you to @jameslamb for the effort and expertise!