onnx/models is a repository for storing pre-trained ONNX models. Every ONNX backend should support running these models out of the box. After downloading and extracting the tarball of each model, there should be:
- A protobuf file (model.onnx) containing the serialized ONNX model.
- Several sets of sample inputs and outputs (test_data_*.npz); these are numpy serialized archives.
In this tutorial, you’ll learn how to use a backend to load and run an ONNX model.
First, install the ONNX TensorFlow backend by following the instructions in the onnx-tensorflow repository.
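If you prefer to stay in R, one possible route is to install it through reticulate; this is a sketch that assumes the backend is published on PyPI under the name onnx-tf (check the project’s README for the authoritative steps):
# Install the Python packages into reticulate's default Python environment
reticulate::py_install(c("onnx", "onnx-tf"), pip = TRUE)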
Then download and extract the tarball of ResNet-50.
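This step can also be scripted from R. The sketch below uses a placeholder URL (resnet50_url is hypothetical; take the real link from the onnx/models page):
# Hypothetical URL; substitute the actual ResNet-50 tarball link from onnx/models
resnet50_url <- "https://example.com/resnet50.tar.gz"
tarball <- path.expand("~/Downloads/resnet50.tar.gz")
download.file(resnet50_url, tarball)
untar(tarball, exdir = path.expand("~/Downloads"))  # yields ~/Downloads/resnet50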
Next, we load the necessary R and Python libraries (via reticulate):
library(onnx)
library(reticulate)
<- import("numpy", convert = FALSE)
np <- import("onnx_tf.backend") backend
We can then use the loaded numpy Python library to define a helper function that loads a testing sample from a numpy serialized archive.
load_npz_samples <- function(npz_path) {
  # Read the .npz archive; since numpy was imported with convert = FALSE,
  # this stays a Python object
  sample <- np$load(normalizePath(npz_path), encoding = 'bytes')
  # Extract the sample inputs and expected outputs from the archive's items
  list(
    inputs = sample$items()[[0]][[1]][[0]],
    outputs = sample$items()[[1]][[1]]
  )
}
Finally, we can load the ONNX model and the testing samples, and then run the model using the ONNX TensorFlow backend:
# Specify paths to ONNX model and testing samples
<- "~/Downloads/resnet50"
onnx_model_dir <- file.path(onnx_model_dir, "model.onnx")
model_pb_path <- file.path(onnx_model_dir, "test_data_0.npz")
npz_path
# Load ONNX model
model <- load_from_file(model_pb_path)
# Load testing sample from numpy serialized archive
samples <- load_npz_samples(npz_path)
inputs <- samples$inputs
expected_outputs <- samples$outputs
# Run the model with an onnx backend
actual_outputs <- backend$run_model(model, inputs)
We can also use numpy to verify the result:
np$testing$assert_almost_equal(expected_outputs, actual_outputs, decimal = 6)
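One thing to keep in mind: because numpy was imported with convert = FALSE, objects such as expected_outputs are references to Python objects rather than R arrays. A small sketch of bringing a result back into R, should you need it:
# Convert the Python-side array back to a native R array
expected_outputs_r <- reticulate::py_to_r(expected_outputs)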
That’s it! Isn’t it easy? Next you can go ahead and try out different ONNX models as well as different ONNX backends, e.g. PyTorch, MXNet, Caffe2, CNTK, Chainer, etc.