
Getting Started with kindling

Introduction

{kindling} bridges the gap between {torch} and {tidymodels}, providing a streamlined interface for building, training, and tuning deep learning models. This vignette walks you through basic usage.

Installation

You can install {kindling} from CRAN:

install.packages('kindling')

Or install the development version from GitHub:

# install.packages("pak")
pak::pak("joshuamarie/kindling")
# or: devtools::install_github("joshuamarie/kindling")
library(kindling)
#> 
#> Attaching package: 'kindling'
#> The following object is masked from 'package:base':
#> 
#>     args

Before using {kindling}

Before starting, you need to install LibTorch, the library that powers PyTorch and also serves as the backend of the {torch} R package:

torch::install_torch()
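
If you are not sure whether LibTorch is already set up, you can check before training anything:

# returns TRUE once the LibTorch binaries are in place
torch::torch_is_installed()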

Four Levels of Interaction

{kindling} offers flexibility through four levels of abstraction:

  1. Code Generation - Generate raw torch::nn_module code
  2. Direct Training - Train models with simple function calls
  3. tidymodels Integration - Use with parsnip, recipes, and workflows
  4. Hyperparameter Tuning - Optimize models with tune and dials

Level 1: Code Generation

Generate raw torch::nn_module code (PyTorch-style module definitions):

ffnn_generator(
    nn_name = "MyNetwork",
    hd_neurons = c(64, 32),
    no_x = 10,
    no_y = 1,
    activations = 'relu'
)
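
To give a rough idea of what this produces: the output is a torch::nn_module definition that you can copy into a script and edit by hand. The sketch below is illustrative only and written by hand for the call above; the exact code emitted by ffnn_generator() may differ.

library(torch)

# hypothetical shape of the generated module: two hidden layers (64, 32)
# between 10 inputs and 1 output, with ReLU activations
MyNetwork = nn_module(
    "MyNetwork",
    initialize = function() {
        self$fc1 = nn_linear(10, 64)
        self$fc2 = nn_linear(64, 32)
        self$output = nn_linear(32, 1)
    },
    forward = function(x) {
        x = nnf_relu(self$fc1(x))
        x = nnf_relu(self$fc2(x))
        self$output(x)
    }
)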

Level 2: Direct Training

Train a model with one function call:

model = ffnn(
    Species ~ .,
    data = iris,
    hidden_neurons = c(10, 15, 7),
    activations = act_funs(relu, elu), # equivalent to c("relu", "elu")
    loss = "cross_entropy",
    epochs = 100
)

predictions = predict(model, newdata = iris)
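
For a quick sanity check, cross-tabulate predictions against the observed classes. This assumes predict() returns a factor of predicted class labels; if it returns a data frame instead, use its prediction column.

# confusion matrix on the training data (optimistic, but a useful first look)
table(predicted = predictions, observed = iris$Species)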

Level 3: tidymodels Integration

Work with neural networks like any other parsnip model:

box::use(
    parsnip[fit, augment],
    yardstick[metrics]
)

nn_spec = mlp_kindling(
    mode = "classification",
    hidden_neurons = c(10, 7),
    activations = act_funs(relu, softshrink = args(lambd = 0.5)),
    epochs = 100
)

nn_fit = fit(nn_spec, Species ~ ., data = iris)
augment(nn_fit, new_data = iris) |> 
    metrics(truth = Species, estimate = .pred_class)
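
The same spec also drops into a workflow with a preprocessing recipe. A minimal sketch, reusing the nn_spec defined above; the recipe step is illustrative:

box::use(
    recipes[recipe, step_normalize, all_numeric_predictors],
    workflows[workflow, add_recipe, add_model]
)

iris_rec = recipe(Species ~ ., data = iris) |>
    step_normalize(all_numeric_predictors())

nn_wf = workflow() |>
    add_recipe(iris_rec) |>
    add_model(nn_spec)

nn_wf_fit = fit(nn_wf, data = iris)

Level 4: Hyperparameter Tuning

The fourth level builds on the same objects. The sketch below assumes mlp_kindling() accepts tune() placeholders and registers epochs as a tunable parameter; check the package documentation for which arguments are actually tunable.

box::use(
    tune[tune, tune_grid, select_best],
    rsample[vfold_cv],
    dials[grid_regular, epochs]
)

tune_spec = mlp_kindling(
    mode = "classification",
    hidden_neurons = c(10, 7),
    epochs = tune() # placeholder; assumes epochs is registered as tunable
)

folds = vfold_cv(iris, v = 5, strata = Species)
grid = grid_regular(epochs(range = c(10, 100)), levels = 4)

tuned = tune_grid(tune_spec, Species ~ ., resamples = folds, grid = grid)
select_best(tuned, metric = "accuracy")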

Learn More
