Runs a prediction over a saved model file, local service, or CloudML model.
predict_savedmodel(instances, model, ...)
Argument | Description
---|---
instances | A list of prediction instances to be passed as input tensors to the service. Even for single predictions, a list with one entry is expected.
model | The model as a local path, a REST URL, a CloudML model name, or a graph object. A local path can be exported using export_savedmodel(), a REST URL can be created using serve_savedmodel(), and a graph object can be loaded using load_savedmodel(). Note that predicting over a CloudML model requires a version parameter to identify the model. Sketches of the CloudML, REST, and graph cases appear below.
... | See the Implementations section of the package documentation.
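For the CloudML case specifically, the following is a minimal sketch, assuming a model has already been deployed to CloudML; the model name "mnist" and version label "v1" are placeholders.

    # minimal sketch: predict against a deployed CloudML model
    # "mnist" and "v1" are placeholder values for an assumed deployment
    tfdeploy::predict_savedmodel(
      list(rep(9, 784)),
      model = "mnist",
      version = "v1"
    )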
See also: export_savedmodel(), serve_savedmodel(), load_savedmodel()
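For the REST case, the sketch below assumes the bundled example model is being served locally with serve_savedmodel() in a separate R session; the URL shown is a placeholder, and the actual address (host, port, and endpoint path) is whatever serve_savedmodel() reports when it starts.

    # in a separate R session, serve the bundled example model:
    #   tfdeploy::serve_savedmodel(
    #     system.file("models/tensorflow-mnist", package = "tfdeploy")
    #   )

    # then predict against the reported address (placeholder URL below)
    tfdeploy::predict_savedmodel(
      list(rep(9, 784)),
      "http://127.0.0.1:8089/serving_default/predict"
    )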
    ## Not run:
    # perform prediction based on an existing model
    tfdeploy::predict_savedmodel(
      list(rep(9, 784)),
      system.file("models/tensorflow-mnist", package = "tfdeploy")
    )
    ## End(Not run)
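A similar sketch for a preloaded graph object, assuming a TensorFlow 1.x session is available through the tensorflow package; the session handling and the argument order passed to load_savedmodel() are assumptions about that setup rather than text from this page.

    ## Not run:
    # assumed setup: create a TensorFlow session (TensorFlow 1.x style)
    sess <- tensorflow::tf$Session()

    # preload the bundled example model into the session as a graph object
    graph <- tfdeploy::load_savedmodel(
      sess,
      system.file("models/tensorflow-mnist", package = "tfdeploy")
    )

    # run one or more predictions over the preloaded graph
    tfdeploy::predict_savedmodel(list(rep(9, 784)), graph)

    sess$close()
    ## End(Not run)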