GCLOUD_ALPHA_ML_VISION_DETECT-SAFE-SEARCH(1)
NAME
    gcloud alpha ml vision detect-safe-search - detect explicit content in an image
SYNOPSIS
    gcloud alpha ml vision detect-safe-search IMAGE_PATH [--model-version=MODEL_VERSION; default="builtin/stable"] [GCLOUD_WIDE_FLAG ...]
DESCRIPTION
    (ALPHA) Safe Search Detection detects explicit content within an image, such as adult content or violent content.
POSITIONAL ARGUMENTS
    IMAGE_PATH
        The path to the image to be analyzed. This can be either a local path or a URL. If you provide a local file, the contents will be sent directly to Google Cloud Vision. If you provide a URL, it must be in Google Cloud Storage format (gs://bucket/object) or an HTTP URL (http://... or https://...).
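    For example, the command can be pointed at either a local file or a Cloud Storage object; the file and bucket names below are placeholders, so substitute your own image path:

        $ gcloud alpha ml vision detect-safe-search ./my-image.jpg
        $ gcloud alpha ml vision detect-safe-search gs://my-bucket/my-image.jpg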
FLAGS
    --model-version=MODEL_VERSION; default="builtin/stable"
        The model version to use for the feature. Supported values include "builtin/stable" and "builtin/latest".
GCLOUD WIDE FLAGS
    These flags are available to all commands: --account, --configuration, --flags-file, --flatten, --format, --help, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity. Run $ gcloud help for details.
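    For example, the gcloud-wide --format flag can be combined with this command to print the raw API response as JSON (the image path below is a placeholder):

        $ gcloud alpha ml vision detect-safe-search gs://my-bucket/my-image.jpg --format=json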
API REFERENCE
    This command uses the vision/v1 API. The full documentation for this API can be found at: https://cloud.google.com/vision
NOTES
    This command is currently in ALPHA and may change without notice. If this command fails with API permission errors despite specifying the right project, you will have to apply for early access and have your projects registered on the API whitelist to use it. To do so, contact Support at https://cloud.google.com/support/. These variants are also available:

        $ gcloud ml vision detect-safe-search
        $ gcloud beta ml vision detect-safe-search