airflow.providers.google.cloud.triggers.bigquery
Module Contents
Classes
BigQueryInsertJobTrigger – runs on the trigger worker to perform an insert operation
BigQueryCheckTrigger – runs on the trigger worker
BigQueryGetDataTrigger – runs on the trigger worker; inherits from BigQueryInsertJobTrigger
BigQueryIntervalCheckTrigger – runs on the trigger worker; inherits from BigQueryInsertJobTrigger
BigQueryValueCheckTrigger – runs on the trigger worker; inherits from BigQueryInsertJobTrigger
BigQueryTableExistenceTrigger – runs on the trigger worker to check for the existence of a table
- class airflow.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger(conn_id, job_id, project_id, dataset_id=None, table_id=None, poll_interval=4.0)
Bases: airflow.triggers.base.BaseTrigger
BigQueryInsertJobTrigger runs on the trigger worker to perform an insert operation.
- Parameters
conn_id (str) – Reference to the Google Cloud connection ID
job_id (str | None) – The ID of the job. It will be suffixed with a hash of the job configuration
project_id (str | None) – Google Cloud project where the job is running
dataset_id (str | None) – The dataset ID of the requested table. (templated)
table_id (str | None) – The table ID of the requested table. (templated)
poll_interval (float) – Polling period in seconds to check for the job status
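For illustration, a minimal sketch of how a deferrable operator might hand off to this trigger and resume when it fires; the operator class, connection ID, job ID, project, and the event payload keys shown are assumptions for the example, not part of this module:

    from airflow.models.baseoperator import BaseOperator
    from airflow.providers.google.cloud.triggers.bigquery import BigQueryInsertJobTrigger

    class ExampleDeferrableBigQueryOperator(BaseOperator):  # hypothetical operator
        def execute(self, context):
            # Submit the BigQuery job via a hook first (omitted here), then defer;
            # the worker slot is released while the trigger polls the job.
            self.defer(
                trigger=BigQueryInsertJobTrigger(
                    conn_id="google_cloud_default",  # hypothetical connection ID
                    job_id="example-job-id",         # hypothetical job ID
                    project_id="example-project",    # hypothetical project
                    poll_interval=4.0,
                ),
                method_name="execute_complete",
            )

        def execute_complete(self, context, event):
            # Called once the trigger fires; the payload is assumed to carry
            # "status" and "message" keys describing the terminal job state.
            if event["status"] != "success":
                raise RuntimeError(event["message"])
            return event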
- class airflow.providers.google.cloud.triggers.bigquery.BigQueryCheckTrigger(conn_id, job_id, project_id, dataset_id=None, table_id=None, poll_interval=4.0)
Bases: BigQueryInsertJobTrigger
BigQueryCheckTrigger runs on the trigger worker.
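As a sketch of the trigger-side contract, any of these triggers can be driven directly as an async generator, which is roughly what the triggerer process does. The connection ID, job ID, and project below are hypothetical, and the call needs valid Google Cloud credentials to actually complete:

    import asyncio

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryCheckTrigger

    async def wait_for_check():
        trigger = BigQueryCheckTrigger(
            conn_id="google_cloud_default",  # hypothetical connection ID
            job_id="example-job-id",         # hypothetical job ID
            project_id="example-project",    # hypothetical project
        )
        # run() is an async generator; these triggers yield a single
        # terminal TriggerEvent once the job reaches a final state.
        async for event in trigger.run():
            return event.payload

    payload = asyncio.run(wait_for_check())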
- class airflow.providers.google.cloud.triggers.bigquery.BigQueryGetDataTrigger(conn_id, job_id, project_id, dataset_id=None, table_id=None, poll_interval=4.0)
Bases: BigQueryInsertJobTrigger
BigQueryGetDataTrigger runs on the trigger worker; it inherits from BigQueryInsertJobTrigger.
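Every trigger must be serializable so the triggerer can persist it and rehydrate it across restarts. A quick sketch of the round-trip, with placeholder values:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryGetDataTrigger

    trigger = BigQueryGetDataTrigger(
        conn_id="google_cloud_default",  # hypothetical connection ID
        job_id="example-job-id",         # hypothetical job ID
        project_id="example-project",    # hypothetical project
        dataset_id="example_dataset",
        table_id="example_table",
    )
    classpath, kwargs = trigger.serialize()
    # classpath is the dotted path used to re-import the trigger class;
    # kwargs holds the constructor arguments needed to recreate it.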
- class airflow.providers.google.cloud.triggers.bigquery.BigQueryIntervalCheckTrigger(conn_id, first_job_id, second_job_id, project_id, table, metrics_thresholds, date_filter_column='ds', days_back=-7, ratio_formula='max_over_min', ignore_zero=True, dataset_id=None, table_id=None, poll_interval=4.0)
Bases: BigQueryInsertJobTrigger
BigQueryIntervalCheckTrigger runs on the trigger worker; it inherits from BigQueryInsertJobTrigger.
- Parameters
conn_id (str) – Reference to the Google Cloud connection ID
first_job_id (str) – The ID of the first job performed
second_job_id (str) – The ID of the second job performed
project_id (str | None) – Google Cloud project where the job is running
dataset_id (str | None) – The dataset ID of the requested table. (templated)
table (str) – The name of the table to check
metrics_thresholds (dict[str, int]) – Dictionary of ratios indexed by metrics
date_filter_column (str | None) – The name of the date column used to filter the data
days_back (SupportsAbs[int]) – Number of days between the current ds and the ds to check against
ratio_formula (str) – The formula used to compute the ratio between the two metrics (the default is max_over_min)
ignore_zero (bool) – Whether to ignore zero values
table_id (str | None) – The table ID of the requested table. (templated)
poll_interval (float) – Polling period in seconds to check for the job status
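For illustration, a sketch of how the interval-check parameters fit together, using hypothetical connection, job, and table names:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryIntervalCheckTrigger

    trigger = BigQueryIntervalCheckTrigger(
        conn_id="google_cloud_default",  # hypothetical connection ID
        first_job_id="job-today",        # hypothetical: today's check query
        second_job_id="job-7-days-ago",  # hypothetical: the comparison query
        project_id="example-project",    # hypothetical project
        table="example_dataset.example_table",
        metrics_thresholds={"COUNT(*)": 1.5},  # fail if the row-count ratio exceeds 1.5
        date_filter_column="ds",
        days_back=-7,
        ratio_formula="max_over_min",
        ignore_zero=True,
    )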
- class airflow.providers.google.cloud.triggers.bigquery.BigQueryValueCheckTrigger(conn_id, sql, pass_value, job_id, project_id, tolerance=None, dataset_id=None, table_id=None, poll_interval=4.0)
Bases: BigQueryInsertJobTrigger
BigQueryValueCheckTrigger runs on the trigger worker; it inherits from BigQueryInsertJobTrigger.
- Parameters
conn_id (str) – Reference to the Google Cloud connection ID
sql (str) – The SQL query to be executed
pass_value (Any) – The expected value against which the query result is checked
job_id (str | None) – The ID of the job
project_id (str | None) – Google Cloud project where the job is running
tolerance (Any) – The allowed tolerance when comparing the query result to pass_value
dataset_id (str | None) – The dataset ID of the requested table. (templated)
table_id (str | None) – The table ID of the requested table. (templated)
poll_interval (float) – Polling period in seconds to check for the job status
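A sketch of the value-check parameters with placeholder SQL and IDs; the trigger polls the job and compares the result against pass_value, optionally within tolerance:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryValueCheckTrigger

    trigger = BigQueryValueCheckTrigger(
        conn_id="google_cloud_default",  # hypothetical connection ID
        sql="SELECT COUNT(*) FROM example_dataset.example_table",  # hypothetical query
        pass_value=1000,
        job_id="example-job-id",       # hypothetical job ID
        project_id="example-project",  # hypothetical project
        tolerance=0.1,  # assumed here to mean: accept results within 10% of pass_value
    )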
- class airflow.providers.google.cloud.triggers.bigquery.BigQueryTableExistenceTrigger(project_id, dataset_id, table_id, gcp_conn_id, hook_params, poll_interval=4.0)
Bases: airflow.triggers.base.BaseTrigger
BigQueryTableExistenceTrigger runs on the trigger worker to check for the existence of a table.
- Parameters
project_id (str) – Google Cloud project where the job is running
dataset_id (str) – The dataset ID of the requested table.
table_id (str) – The table ID of the requested table.
gcp_conn_id (str) – Reference to the Google Cloud connection ID
hook_params (dict[str, Any]) – Parameters passed through to the hook
poll_interval (float) – Polling period in seconds to check for the status
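A minimal sketch of instantiating this trigger with placeholder names; hook_params is assumed here to be a dict of extra keyword arguments forwarded to the hook:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryTableExistenceTrigger

    trigger = BigQueryTableExistenceTrigger(
        project_id="example-project",   # hypothetical project
        dataset_id="example_dataset",   # hypothetical dataset
        table_id="example_table",       # hypothetical table
        gcp_conn_id="google_cloud_default",
        hook_params={},  # extra arguments forwarded to the hook (assumption)
        poll_interval=4.0,
    )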