A minimalist implementation of model stacking by Wolpert (1992) <doi:10.1016/S0893-6080(05)80023-1> for boosted tree models. A classic, two-layer stacking model is implemented: the first layer generates features using gradient boosting trees, and the second layer employs a logistic regression model that uses these features as inputs. Utilities for training the base models and tuning their parameters are provided, allowing users to experiment with different ensemble configurations easily. The package aims to provide a simple and efficient way to combine multiple gradient boosting models to improve predictive performance and robustness.
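
To make the two-layer scheme concrete, here is a minimal, self-contained sketch in R of Wolpert-style stacking. It deliberately does not use the stackgbm API itself: the first layer is an xgboost model (one of the suggested back-ends) producing out-of-fold predicted probabilities, and the second layer is a base R logistic regression fitted on those predictions. The simulated data, fold count, and hyperparameters are purely illustrative.

# Illustrative two-layer stacking sketch (not the stackgbm API):
# layer one: gradient boosting produces out-of-fold predicted probabilities,
# layer two: logistic regression uses those predictions as its input feature.
library(xgboost)

set.seed(42)
n <- 500
x <- matrix(rnorm(n * 10), ncol = 10)
y <- rbinom(n, 1, plogis(x[, 1] - x[, 2]))

k <- 5
folds <- sample(rep(seq_len(k), length.out = n))
z <- numeric(n)  # out-of-fold predictions, i.e. the layer-one feature

for (i in seq_len(k)) {
  holdout <- folds == i
  fit <- xgb.train(
    params = list(objective = "binary:logistic", max_depth = 3, eta = 0.1),
    data = xgb.DMatrix(x[!holdout, ], label = y[!holdout]),
    nrounds = 50
  )
  z[holdout] <- predict(fit, xgb.DMatrix(x[holdout, ]))
}

# Layer two: a logistic regression meta-learner on the stacked feature
meta <- glm(y ~ z, family = binomial())
head(predict(meta, type = "response"))

Fitting the second layer on out-of-fold predictions, rather than on in-sample fits, is what keeps the meta-learner from simply rewarding the base model that overfits the most.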
Version: 0.1.0
Depends: R (≥ 3.5.0)
Imports: pROC, progress, rlang
Suggests: knitr, lightgbm, msaenet, rmarkdown, xgboost
Published: 2024-04-30
DOI: 10.32614/CRAN.package.stackgbm
Author: Nan Xiao [aut, cre, cph]
Maintainer: Nan Xiao <me at nanx.me>
BugReports: https://github.com/nanxstats/stackgbm/issues
License: MIT + file LICENSE
URL: https://nanx.me/stackgbm/, https://github.com/nanxstats/stackgbm
NeedsCompilation: no
Materials: README NEWS
CRAN checks: stackgbm results
Reference manual: stackgbm.pdf
Vignettes: Model stacking for boosted trees
Package source: stackgbm_0.1.0.tar.gz
Windows binaries: r-devel: stackgbm_0.1.0.zip, r-release: stackgbm_0.1.0.zip, r-oldrel: stackgbm_0.1.0.zip
macOS binaries: r-release (arm64): stackgbm_0.1.0.tgz, r-oldrel (arm64): stackgbm_0.1.0.tgz, r-release (x86_64): stackgbm_0.1.0.tgz, r-oldrel (x86_64): stackgbm_0.1.0.tgz
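
The released version listed above can be installed from CRAN in the usual way; the GitHub remote shown below simply mirrors the URL field and assumes the remotes package is available.

# Install the released version from CRAN
install.packages("stackgbm")

# Or the development version from GitHub (assumes the remotes package is installed)
# remotes::install_github("nanxstats/stackgbm")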
Please use the canonical form https://CRAN.R-project.org/package=stackgbm to link to this page.