The first step towards deploying a trained model is to export it from TensorFlow. This guide demonstrates how to export SavedModels from TensorFlow, tfestimators, and Keras using export_savedmodel().
Exporting a SavedModel using the core TensorFlow API requires the input and output tensors to be passed to export_savedmodel().
For instance, one can train MNIST as described in MNIST For ML Beginners, track the model's input and output tensors x and y from that tutorial, and call export_savedmodel() as follows:
export_savedmodel(
  sess,
  "tensorflow-mnist",
  inputs = list(images = x),
  outputs = list(scores = y)
)
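For context, a minimal sketch of how sess, x, and y might be defined before the call above, assuming the softmax regression graph from the MNIST For ML Beginners tutorial (the training loop is elided):

library(tensorflow)

# Input placeholder and softmax regression graph, as in the tutorial
x <- tf$placeholder(tf$float32, shape(NULL, 784L))
W <- tf$Variable(tf$zeros(shape(784L, 10L)))
b <- tf$Variable(tf$zeros(shape(10L)))
y <- tf$nn$softmax(tf$matmul(x, W) + b)

sess <- tf$Session()
sess$run(tf$global_variables_initializer())
# ... run training steps against sess here ...

Note that export_savedmodel() is called with named lists, so the names images and scores become the signature's input and output keys in the exported SavedModel.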
For convenience, a sample training script is included in tfdeploy
to train and export this MNIST model as follows:
tfruns::training_run(
  system.file("models/tensorflow-mnist.R", package = "tfdeploy")
)
A sample model using the mtcars data frame can be trained with tfestimators; the resulting model is what gets saved to disk. To train, follow the TensorFlow Estimators Quick Start and then run:
export_savedmodel(
  model,
  "tfestimators-mtcars"
)
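As a rough sketch of what model might look like before exporting, here is a linear regressor over mtcars trained with tfestimators. The choice of mpg as the response and disp and cyl as features is illustrative, not prescribed by the sample script:

library(tfestimators)

# Illustrative: a linear regressor predicting mpg from displacement and cylinders
model <- linear_regressor(
  feature_columns = feature_columns(
    column_numeric("disp"),
    column_numeric("cyl")
  )
)

mtcars_input_fn <- input_fn(
  mtcars,
  features = c("disp", "cyl"),
  response = "mpg"
)

train(model, input_fn = mtcars_input_fn)

Unlike the core TensorFlow case, an estimator carries its own session and signature, so export_savedmodel() only needs the model object and a destination path.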
For convenience, a sample training script is included in tfdeploy
to train and export this mtcars
model as follows:
tfruns::training_run(
  system.file("models/tfestimators-mtcars.R", package = "tfdeploy")
)
To export from Keras, first train a keras model as described in R interface to Keras, calling backend()$set_learning_phase(TRUE) before training, and then export the model as follows:
export_savedmodel(
  model,
  "keras-mnist"
)
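A minimal sketch of the Keras workflow, assuming a small dense network on MNIST (the architecture and single training epoch are illustrative; the key point is setting the learning phase before building the model):

library(keras)

# Set the learning phase before constructing the model,
# so the graph can be exported for inference afterwards
backend()$set_learning_phase(TRUE)

mnist <- dataset_mnist()
x_train <- array_reshape(mnist$train$x, c(nrow(mnist$train$x), 784)) / 255
y_train <- to_categorical(mnist$train$y, 10)

# Illustrative architecture: one hidden layer plus a softmax output
model <- keras_model_sequential() %>%
  layer_dense(units = 128, activation = "relu", input_shape = 784) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(
  optimizer = "rmsprop",
  loss = "categorical_crossentropy",
  metrics = "accuracy"
)

model %>% fit(x_train, y_train, epochs = 1)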
For convenience, a sample training script is included in tfdeploy
to train and export this MNIST model as follows:
tfruns::training_run(
  system.file("models/keras-mnist.R", package = "tfdeploy")
)