{kindling} bridges the gap between {torch}
and {tidymodels}, providing a streamlined interface for
building, training, and tuning deep learning models. This vignette will
guide you through the basic usage.
You can install {kindling} from CRAN:
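install.packages("kindling")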
Or install the development version from GitHub:
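# install.packages("remotes")
# substitute the package's actual GitHub account for <owner>
remotes::install_github("<owner>/kindling")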
Before starting, you need to install LibTorch, the backend of PyTorch, which is also the backend of the {torch} R package:
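# the {torch} package ships install_torch(), which downloads LibTorch for you
install.packages("torch")
torch::install_torch()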
{kindling} offers flexibility through four levels of abstraction:

- generated torch::nn_module code
- one-call model training
- parsnip, recipes, and workflows
- tune and dials

Generate PyTorch-style module code:
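The generated code is standard torch::nn_module code. The hand-written sketch below (not actual {kindling} output) shows a module of comparable shape, with hidden layers of 10 and 7 units:

library(torch)

# a small multilayer perceptron written directly with {torch}
mlp_net <- nn_module(
  "mlp_net",
  initialize = function(in_features, out_features) {
    self$fc1 <- nn_linear(in_features, 10)
    self$fc2 <- nn_linear(10, 7)
    self$output <- nn_linear(7, out_features)
  },
  forward = function(x) {
    x |>
      self$fc1() |>
      nnf_relu() |>
      self$fc2() |>
      nnf_relu() |>
      self$output()
  }
)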
Train a model with one function call:
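A minimal sketch of this level; the function name mlp_fit() and its arguments are hypothetical and may differ from the actual {kindling} API:

# hypothetical one-call trainer (name and arguments are assumptions)
fitted_mlp <- mlp_fit(
  Species ~ .,
  data = iris,
  hidden_neurons = c(10, 7),
  epochs = 100
)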
Work with neural networks like any other parsnip
model:
box::use(
  parsnip[fit, augment],
  yardstick[metrics]
)

# specify a multilayer perceptron with two hidden layers of 10 and 7 units
nn_spec = mlp_kindling(
  mode = "classification",
  hidden_neurons = c(10, 7),
  activations = act_funs(relu, softshrink = args(lambd = 0.5)),
  epochs = 100
)

# fit through the standard parsnip interface
nn_fit = fit(nn_spec, Species ~ ., data = iris)

# add predictions and compute classification metrics
augment(nn_fit, new_data = iris) |>
  metrics(truth = Species, estimate = .pred_class)