gcloud ml-engine local predict(1)
NAME
gcloud ml-engine local predict - run prediction locally
SYNOPSIS
gcloud ml-engine local predict --model-dir=MODEL_DIR (--json-instances=JSON_INSTANCES | --text-instances=TEXT_INSTANCES) [--framework=FRAMEWORK] [--signature-name=SIGNATURE_NAME] [GCLOUD_WIDE_FLAG ...]
DESCRIPTION
gcloud ml-engine local predict performs prediction locally with the given instances.
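For example, a typical invocation might look like the following; the model directory and instances file below are illustrative placeholders, not values documented by the command:
$ gcloud ml-engine local predict \
    --model-dir=./my_model \
    --json-instances=instances.json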
REQUIRED FLAGS
--model-dir=MODEL_DIR
Path to the model.
Exactly one of these must be specified:
--json-instances=JSON_INSTANCES
Path to a local file from which instances are read. Instances are in JSON
format; newline delimited.
An example of the JSON instances file:
- {"images": [0.0, ..., 0.1], "key": 3} {"images": [0.0, ..., 0.1], "key": 2}
This flag accepts "-" for stdin.
--text-instances=TEXT_INSTANCES
Path to a local file from which instances are read. Instances are in UTF-8
encoded text format; newline delimited.
An example of the text instances file:
107,4.9,2.5,4.5,1.7
100,5.7,2.8,4.1,1.3
This flag accepts "-" for stdin.
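Since both flags accept "-" for stdin, instances can also be piped directly into the command; a minimal sketch, assuming a hypothetical model directory ./my_model and a single JSON instance:
$ echo '{"images": [0.0, 0.1], "key": 1}' | \
    gcloud ml-engine local predict \
      --model-dir=./my_model \
      --json-instances=-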
OPTIONAL FLAGS
--framework=FRAMEWORK
The ML framework used to train this version of the model. If not specified,
defaults to tensorflow. FRAMEWORK must be one of:
scikit-learn, tensorflow, xgboost.
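For example, to run local prediction against a scikit-learn model (the model directory and instances file below are illustrative placeholders):
$ gcloud ml-engine local predict \
    --model-dir=./sklearn_model \
    --json-instances=instances.json \
    --framework=scikit-learn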
--signature-name=SIGNATURE_NAME
The name of the signature defined in the SavedModel to use for this job.
Defaults to DEFAULT_SERVING_SIGNATURE_DEF_KEY (see
www.tensorflow.org/api_docs/python/tf/saved_model/signature_constants),
which is "serving_default". Only applies to TensorFlow models.
GCLOUD WIDE FLAGS
These flags are available to all commands: --account, --configuration, --flags-file, --flatten, --format, --help, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity. Run $ gcloud help for details.
NOTES
These variants are also available:
$ gcloud alpha ml-engine local predict
$ gcloud beta ml-engine local predict