An implementation of hyperparameter optimization for Gradient Boosted Trees on binary classification and regression problems. The current version supports two optimization methods: Bayesian optimization and random search. Instead of returning the single best model, the final output is an ensemble of Gradient Boosted Trees constructed via the method of ensemble selection.
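The ensemble selection step works greedily: starting from an empty ensemble, the candidate model whose addition most improves validation performance is added (with replacement), and the process stops when no candidate helps. The sketch below illustrates that idea on synthetic validation predictions; it is not the package's internal code, and the function name greedy_ensemble_selection and the toy data are purely illustrative.

# Illustrative sketch of greedy ensemble selection; NOT the gbts internals.
# Each column of candidate_preds holds one candidate model's predictions on a
# validation set; models are added with replacement until the validation MSE
# stops improving.
greedy_ensemble_selection <- function(candidate_preds, y_valid, max_steps = 20) {
  counts <- rep(0, ncol(candidate_preds))    # times each candidate was selected
  ens_pred <- rep(0, nrow(candidate_preds))  # running sum of selected predictions
  best_err <- Inf
  for (step in seq_len(max_steps)) {
    # validation error of the ensemble if candidate j were added at this step
    errs <- sapply(seq_len(ncol(candidate_preds)), function(j) {
      mean((y_valid - (ens_pred + candidate_preds[, j]) / step)^2)
    })
    if (min(errs) >= best_err) break         # stop when no candidate improves
    j_best <- which.min(errs)
    best_err <- min(errs)
    counts[j_best] <- counts[j_best] + 1
    ens_pred <- ens_pred + candidate_preds[, j_best]
  }
  counts / sum(counts)                       # ensemble weights
}

# Toy usage on synthetic candidates
set.seed(1)
y_valid <- rnorm(100)
candidate_preds <- sapply(1:5, function(j) y_valid + rnorm(100, sd = j / 2))
greedy_ensemble_selection(candidate_preds, y_valid)

The examples below use the package's actual interface on the two bundled datasets.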
library(gbts)

# Load German credit data
data(german_credit)
train <- german_credit$train
test <- german_credit$test
target_idx <- german_credit$target_idx
pred_idx <- german_credit$pred_idx
# Train a GBT model with optimization on AUC
model <- gbts(train[, pred_idx], train[, target_idx], nitr = 200, pfmc = "auc")
# Predict on test data
yhat_test <- predict(model, test[, pred_idx])
# Compute AUC on test data
comperf(test[, target_idx], yhat_test, pfmc = "auc")
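For a binary target coded 0/1, the AUC reported above can be cross-checked with a standalone rank-based (Mann-Whitney) computation. The function below is an independent sketch, not the comperf() implementation.

# Rank-based AUC as a cross-check; assumes the target is coded 0/1
auc_check <- function(y, yhat) {
  r <- rank(yhat)                                  # average ranks handle ties
  n_pos <- sum(y == 1)
  n_neg <- sum(y == 0)
  (sum(r[y == 1]) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
}
auc_check(test[, target_idx], yhat_test)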
# Load Boston housing data
data(boston_housing)
train <- boston_housing$train
test <- boston_housing$test
target_idx <- boston_housing$target_idx
pred_idx <- boston_housing$pred_idx
# Train a GBT model with optimization on MSE
model <- gbts(train[, pred_idx], train[, target_idx], nitr = 200, pfmc = "mse")
# Predict on test data
yhat_test <- predict(model, test[, pred_idx])
# Compute MSE on test data
comperf(test[, target_idx], yhat_test, pfmc = "mse")
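As a quick sanity check, the value returned by comperf() with pfmc = "mse" should agree with the mean squared difference computed directly:

# Direct MSE computation as a cross-check of comperf()
mean((test[, target_idx] - yhat_test)^2)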
To get the current released version from CRAN:
install.packages("gbts")
To see a list of functions and datasets provided by gbts:
help(package = "gbts")