TensorFlow Hub is a way to share pretrained model components. See the TensorFlow Module Hub for a searchable listing of pre-trained models. This tutorial demonstrates how to use TensorFlow Hub with Keras, how to do image classification with a pre-trained model, and how to do simple transfer learning.
library(keras)
library(tfhub)
Use layer_hub
to load a mobilenet and transform it into a Keras layer. Any TensorFlow 2 compatible image classifier URL from tfhub.dev will work here.
<- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/2"
classifier_url <- layer_hub(handle = classifier_url) mobilenet_layer
We can then create our Keras model:
input <- layer_input(shape = c(224, 224, 3))
output <- input %>%
  mobilenet_layer()
model <- keras_model(input, output)
Download a single image to try the model on.
tmp <- tempfile(fileext = ".jpg")
download.file(
  'https://storage.googleapis.com/download.tensorflow.org/example_images/grace_hopper.jpg',
  tmp
)
img <- image_load(tmp, target_size = c(224, 224)) %>%
  image_to_array() %>%
  abind::abind(along = 0)
img[] <- img/255

result <- predict(model, img)
mobilenet_decode_predictions(result[, -1, drop = FALSE])
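If you want to see more than the default number of guesses, mobilenet_decode_predictions returns a list with one data frame of labels and scores per image; the snippet below is an optional sketch, assuming the top argument works as in the other decode_predictions helpers in keras:

# Optional: inspect the three highest-scoring ImageNet labels for the image
decoded <- mobilenet_decode_predictions(result[, -1, drop = FALSE], top = 3)
decoded[[1]]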
Using TF Hub it is simple to retrain the top layer of the model to recognize the classes in our dataset.
For this example you will use the TensorFlow flowers dataset:
if(!dir.exists("flower_photos")) {
  url <- "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
  tgz <- tempfile(fileext = ".tgz")
  download.file(url, destfile = tgz)
  utils::untar(tgz, exdir = ".")
}
data_root <- "flower_photos"
The simplest way to load this data into our model is using image_data_generator.
All of TensorFlow Hub’s image modules expect float inputs in the [0, 1] range. Use the image_data_generator’s rescale parameter to achieve this.
image_generator <- image_data_generator(rescale = 1/255, validation_split = 0.2)

training_data <- flow_images_from_directory(
  directory = data_root,
  generator = image_generator,
  target_size = c(224, 224),
  subset = "training"
)

validation_data <- flow_images_from_directory(
  directory = data_root,
  generator = image_generator,
  target_size = c(224, 224),
  subset = "validation"
)
The resulting object is an iterator that returns image_batch, label_batch pairs.
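To see exactly what the iterator yields, you can pull a single batch with generator_next() from keras. This is only a quick sanity check, not part of the original tutorial:

# Fetch one (image_batch, label_batch) pair and look at its structure
batch <- generator_next(training_data)
str(batch)
# batch[[1]]: images with shape (batch_size, 224, 224, 3), values rescaled to [0, 1]
# batch[[2]]: one-hot labels with shape (batch_size, number of classes)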
TensorFlow Hub also distributes models without the top classification layer. These can be used to easily do transfer learning.
Any TensorFlow 2 compatible image feature vector URL from tfhub.dev will work here.
<- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/2"
feature_extractor_url <- layer_hub(handle = feature_extractor_url) feature_extractor_layer
Now we can create our classification model by attaching a classification head on top of the feature extractor layer. We define the following model:
input <- layer_input(shape = c(224, 224, 3))
output <- input %>%
  feature_extractor_layer() %>%
  layer_dense(units = training_data$num_classes, activation = "softmax")
model <- keras_model(input, output)
summary(model)
We can now train our model in the same way we would train any other Keras model. We first use compile
to configure the training process:
model %>%
  compile(
    loss = "categorical_crossentropy",
    optimizer = "adam",
    metrics = "acc"
  )
We can then use the fit_generator function to fit our model.
model %>%
  fit_generator(
    training_data,
    steps_per_epoch = training_data$n/training_data$batch_size,
    validation_data = validation_data
  )
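fit_generator returns a Keras history object, so you can keep the metrics and plot them. The variant below is only a sketch; the epochs value is an arbitrary choice, not something specified by the original tutorial:

# Capture the training history; epochs = 5 is an illustrative choice
history <- model %>%
  fit_generator(
    training_data,
    steps_per_epoch = training_data$n/training_data$batch_size,
    epochs = 5,
    validation_data = validation_data
  )
plot(history)  # loss and accuracy per epoch, for training and validation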
You can then export your model with:
save_model_tf(model, "model")
You can also reload the model with the load_model_tf function. Note that you may need to pass custom_objects with the definition of the KerasLayer, since it's not a default Keras layer.
reloaded_model <- load_model_tf("model")
We can verify that the predictions of both the trained model and the reloaded model are equal:
steps <- as.integer(validation_data$n/validation_data$batch_size)
all.equal(
  predict_generator(model, validation_data, steps = steps),
  predict_generator(reloaded_model, validation_data, steps = steps)
)
The saved model can also be loaded for inference later or be converted to TFLite or TFjs.
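As a small illustration of the inference path (not part of the original tutorial), the reloaded model can score any batch of 224x224 images, for example the img array created earlier for the Grace Hopper photo; class_indices shows which output column corresponds to which flower class:

# Predict flower-class probabilities with the reloaded model
preds <- predict(reloaded_model, img)
preds
# class_indices maps each class name to its (zero-based) index in the output
validation_data$class_indices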