Google Cloud AutoML Operators

Google Cloud AutoML makes the power of machine learning available to you even if you have limited knowledge of machine learning. You can use AutoML to build on Google's machine learning capabilities to create your own custom machine learning models that are tailored to your business needs, and then integrate those models into your applications and websites.

Prerequisite Tasks

To use these operators, you must set up a Google Cloud project with billing enabled, enable the Cloud AutoML API, and install the Google provider package for Airflow.

Creating Datasets

To create a Google AutoML dataset you can use AutoMLCreateDatasetOperator. The operator returns the dataset id in XCom under the dataset_id key.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

create_dataset = AutoMLCreateDatasetOperator(
    task_id="create_dataset",
    dataset=DATASET,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)
dataset_id = create_dataset.output["dataset_id"]
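
The DATASET constant is defined elsewhere in the example file. As an illustration only, a minimal AutoML Tables dataset definition could look roughly like this (the display name is a placeholder):

# Illustrative AutoML Tables dataset definition; the target column spec
# is filled in later by the update step shown below.
DATASET = {
    "display_name": "test_tables_dataset",
    "tables_dataset_metadata": {"target_column_spec_id": ""},
}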

After creating a dataset you can import data into it using AutoMLImportDataOperator.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

import_dataset = AutoMLImportDataOperator(
    task_id="import_dataset",
    dataset_id=dataset_id,
    location=GCP_AUTOML_LOCATION,
    input_config=IMPORT_INPUT_CONFIG,
)
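
IMPORT_INPUT_CONFIG is also defined in the example file; for data stored in Google Cloud Storage it points at one or more CSV files. A hedged sketch (the bucket and object names are placeholders):

# Illustrative input config: import CSV training data from a GCS bucket.
IMPORT_INPUT_CONFIG = {
    "gcs_source": {"input_uris": ["gs://your-bucket/automl/tabular-dataset.csv"]}
}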

To update a dataset you can use AutoMLTablesUpdateDatasetOperator.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

update = deepcopy(DATASET)
update["name"] = '{{ task_instance.xcom_pull("create_dataset")["name"] }}'
update["tables_dataset_metadata"][  # type: ignore
    "target_column_spec_id"
] = "{{ get_target_column_spec(task_instance.xcom_pull('list_columns_spec_task'), target) }}"

update_dataset = AutoMLTablesUpdateDatasetOperator(
    task_id="update_dataset",
    dataset=update,
    location=GCP_AUTOML_LOCATION,
)
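
Note that the Jinja expressions above call get_target_column_spec and extract_object_id, which are not built-in macros but helpers defined in the example DAG and registered via user_defined_macros. A minimal sketch of such helpers, assuming column specs are passed around as dicts with name and display_name fields; the dag id, scheduling arguments and target column name below are placeholders:

from __future__ import annotations

from datetime import datetime

from airflow import DAG


def extract_object_id(obj: dict) -> str:
    """Return the trailing id of a resource name, e.g. '.../tableSpecs/<id>' -> '<id>'."""
    return obj["name"].rpartition("/")[-1]


def get_target_column_spec(columns_specs: list[dict], column_name: str) -> str:
    """Return the id of the column spec whose display_name matches column_name."""
    for column in columns_specs:
        if column["display_name"] == column_name:
            return extract_object_id(column)
    raise ValueError(f"Unknown target column: {column_name}")


with DAG(
    dag_id="example_automl_dataset",
    start_date=datetime(2021, 1, 1),
    schedule="@once",
    catchup=False,
    # Registering the helpers as macros makes them callable from the Jinja
    # templates used in the operator arguments above.
    user_defined_macros={
        "get_target_column_spec": get_target_column_spec,
        "extract_object_id": extract_object_id,
        "target": "Class",  # placeholder: display name of the target column
    },
) as dag:
    ...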

Listing Table And Column Specs

To list table specs you can use AutoMLTablesListTableSpecsOperator.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

list_tables_spec = AutoMLTablesListTableSpecsOperator(
    task_id="list_tables_spec",
    dataset_id=dataset_id,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

To list column specs you can use AutoMLTablesListColumnSpecsOperator.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

list_columns_spec = AutoMLTablesListColumnSpecsOperator(
    task_id="list_columns_spec",
    dataset_id=dataset_id,
    table_spec_id="{{ extract_object_id(task_instance.xcom_pull('list_tables_spec')[0]) }}",
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

Operations On Models

To create a Google AutoML model you can use AutoMLTrainModelOperator. The operator will wait for the operation to complete. Additionally, the operator returns the id of the model in XCom under the model_id key.

This operator is deprecated for text, video and vision prediction and will soon be removed. All the functionality of the legacy AutoML Natural Language, Vision and Video Intelligence products, as well as new features, is available on the Vertex AI platform. Please use CreateAutoMLTextTrainingJobOperator, CreateAutoMLImageTrainingJobOperator or CreateAutoMLVideoTrainingJobOperator instead.

You can find an example of how to use Vertex AI operators for AutoML Natural Language classification here:

tests/system/providers/google/cloud/automl/example_automl_nl_text_classification.py[source]

create_clss_training_job = CreateAutoMLTextTrainingJobOperator(
    task_id="create_clss_training_job",
    display_name=TEXT_CLSS_DISPLAY_NAME,
    prediction_type="classification",
    multi_label=False,
    dataset_id=clss_dataset_id,
    model_display_name=MODEL_NAME,
    training_fraction_split=0.7,
    validation_fraction_split=0.2,
    test_fraction_split=0.1,
    sync=True,
    region=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

Additionally, you can find an example of how to use Vertex AI operators for AutoML Vision classification here:

tests/system/providers/google/cloud/automl/example_automl_vision_classification.py[source]

create_auto_ml_image_training_job = CreateAutoMLImageTrainingJobOperator(
    task_id="auto_ml_image_task",
    display_name=IMAGE_DISPLAY_NAME,
    dataset_id=image_dataset_id,
    prediction_type="classification",
    multi_label=False,
    model_type="CLOUD",
    training_fraction_split=0.6,
    validation_fraction_split=0.2,
    test_fraction_split=0.2,
    budget_milli_node_hours=8000,
    model_display_name=MODEL_DISPLAY_NAME,
    disable_early_stopping=False,
    region=REGION,
    project_id=PROJECT_ID,
)

An example of how to use Vertex AI operators for AutoML Video Intelligence classification can be found here:

tests/system/providers/google/cloud/automl/example_automl_video_classification.py[source]

create_auto_ml_video_training_job = CreateAutoMLVideoTrainingJobOperator(
    task_id="auto_ml_video_task",
    display_name=VIDEO_DISPLAY_NAME,
    prediction_type="classification",
    model_type="CLOUD",
    dataset_id=video_dataset_id,
    model_display_name=MODEL_DISPLAY_NAME,
    region=REGION,
    project_id=PROJECT_ID,
)

When running the Vertex AI operators for training, please ensure that your data is correctly stored in Vertex AI datasets. To create a dataset and import data into it, please use CreateDatasetOperator and ImportDataOperator.
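
A minimal sketch of how such a Vertex AI text dataset could be created and populated with these operators, assuming the same GCP_AUTOML_LOCATION and GCP_PROJECT_ID constants as above; the display name, GCS path and schema choice are illustrative:

from google.cloud.aiplatform import schema
from google.protobuf.struct_pb2 import Value

from airflow.providers.google.cloud.operators.vertex_ai.dataset import (
    CreateDatasetOperator,
    ImportDataOperator,
)

# Illustrative Vertex AI text dataset definition.
TEXT_CLSS_DATASET = {
    "display_name": "automl-text-clss-dataset",
    "metadata_schema_uri": schema.dataset.metadata.text,
    "metadata": Value(string_value="clss-dataset"),
}

create_clss_dataset = CreateDatasetOperator(
    task_id="create_clss_dataset",
    dataset=TEXT_CLSS_DATASET,
    region=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)
clss_dataset_id = create_clss_dataset.output["dataset_id"]

import_clss_dataset = ImportDataOperator(
    task_id="import_clss_dataset",
    dataset_id=clss_dataset_id,
    region=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
    # Placeholder GCS path; the import schema must match the prediction task.
    import_configs=[
        {
            "gcs_source": {"uris": ["gs://your-bucket/automl/text-classification.csv"]},
            "import_schema_uri": schema.dataset.ioformat.text.single_label_classification,
        }
    ],
)

The legacy AutoML Tables flow below, by contrast, continues to use the AutoML dataset created earlier in this guide.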

tests/system/providers/google/cloud/automl/example_automl_model.py[source]

create_model = AutoMLTrainModelOperator(
    task_id="create_model",
    model=MODEL,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)
model_id = create_model.output["model_id"]
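
MODEL is defined in the example file as well; for AutoML Tables a model definition could look roughly like this (the display name and training budget are illustrative, and dataset_id is the XCom value produced by the dataset-creation task):

# Illustrative AutoML Tables model definition.
MODEL = {
    "display_name": "adhoc_automl_model",
    "dataset_id": dataset_id,
    "tables_model_metadata": {"train_budget_milli_node_hours": 1000},
}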

To get an existing model you can use AutoMLGetModelOperator.

tests/system/providers/google/cloud/automl/example_automl_model.py[source]

get_model = AutoMLGetModelOperator(
    task_id="get_model",
    model_id=model_id,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

Once a model is created it can be deployed using AutoMLDeployModelOperator.

tests/system/providers/google/cloud/automl/example_automl_model.py[source]

deploy_model = AutoMLDeployModelOperator(
    task_id="deploy_model",
    model_id=model_id,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

If you wish to delete a model you can use AutoMLDeleteModelOperator.

tests/system/providers/google/cloud/automl/example_automl_model.py[source]

delete_model = AutoMLDeleteModelOperator(
    task_id="delete_model",
    model_id=model_id,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

Making Predictions

To obtain predictions from a Google Cloud AutoML model you can use AutoMLPredictOperator or AutoMLBatchPredictOperator. In the first case the model must be deployed.

tests/system/providers/google/cloud/automl/example_automl_model.py[source]

predict_task = AutoMLPredictOperator(
    task_id="predict_task",
    model_id=model_id,
    payload={
        "row": {
            "values": PREDICT_VALUES,
        }
    },
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)
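
PREDICT_VALUES supplies one value per feature column of the trained model, in the order of the table's column specs. In the example file it is a list of protobuf Value objects, roughly like the placeholder below:

from google.protobuf.struct_pb2 import Value

# Placeholder prediction row: one Value per feature column, in the order
# expected by the trained AutoML Tables model.
PREDICT_VALUES = [
    Value(string_value="blue-collar"),
    Value(string_value="married"),
    Value(number_value=34.0),
]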

tests/system/providers/google/cloud/automl/example_automl_model.py[source]

batch_predict_task = AutoMLBatchPredictOperator(
    task_id="batch_predict_task",
    model_id=model_id,
    input_config=IMPORT_INPUT_CONFIG,
    output_config=IMPORT_OUTPUT_CONFIG,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)
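
The batch prediction operator takes an input config pointing at the data to score and an output config telling AutoML where to write the results. A hedged sketch of a GCS output config (the bucket prefix is a placeholder):

# Illustrative batch-prediction output config: result files are written
# under the given GCS prefix.
IMPORT_OUTPUT_CONFIG = {
    "gcs_destination": {"output_uri_prefix": "gs://your-bucket/automl/predictions"}
}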

Listing And Deleting Datasets

You can get a list of AutoML datasets using AutoMLListDatasetOperator. The operator returns a list of dataset ids in XCom under the dataset_id_list key.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

list_datasets = AutoMLListDatasetOperator(
    task_id="list_datasets",
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)

To delete a dataset you can use AutoMLDeleteDatasetOperator. The delete operator also accepts a list or a comma-separated string of dataset ids to be deleted.

tests/system/providers/google/cloud/automl/example_automl_dataset.py[source]

delete_dataset = AutoMLDeleteDatasetOperator(
    task_id="delete_dataset",
    dataset_id=dataset_id,
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)
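
For instance, to delete every dataset returned by the listing task above, a template can pull the dataset_id_list XCom value and pass it on (a hedged sketch reusing the list_datasets task from the previous snippet):

# Illustrative: delete all datasets returned by the "list_datasets" task,
# using the dataset_id_list XCom key described above.
delete_listed_datasets = AutoMLDeleteDatasetOperator(
    task_id="delete_listed_datasets",
    dataset_id="{{ task_instance.xcom_pull('list_datasets', key='dataset_id_list') | list }}",
    location=GCP_AUTOML_LOCATION,
    project_id=GCP_PROJECT_ID,
)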

Reference

For further information, take a look at the Google Cloud AutoML client library documentation and the Google Cloud AutoML product documentation.
