airflow.providers.google.cloud.transfers.bigquery_to_gcs

This module contains the Google BigQuery to Google Cloud Storage operator.

Module Contents

Classes

BigQueryToGCSOperator

Transfers a BigQuery table to a Google Cloud Storage bucket.

class airflow.providers.google.cloud.transfers.bigquery_to_gcs.BigQueryToGCSOperator(*, source_project_dataset_table, destination_cloud_storage_uris, project_id=PROVIDE_PROJECT_ID, compression='NONE', export_format='CSV', field_delimiter=',', print_header=True, gcp_conn_id='google_cloud_default', labels=None, location=None, impersonation_chain=None, result_retry=DEFAULT_RETRY, result_timeout=None, job_id=None, force_rerun=False, reattach_states=None, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)

Bases: airflow.models.BaseOperator

Transfers a BigQuery table to a Google Cloud Storage bucket.

See also

For more information on how to use this operator, take a look at the guide: Operator

See also

For more details about these parameters: https://cloud.google.com/bigquery/docs/reference/v2/jobs

Parameters
  • source_project_dataset_table (str) – The dotted (<project>.|<project>:)<dataset>.<table> BigQuery table to use as the source data. If <project> is not included, project will be the project defined in the connection json. (templated)

  • destination_cloud_storage_uris (list[str]) – The destination Google Cloud Storage URIs (e.g. gs://some-bucket/some-file.txt). (templated) Follows the convention defined here: https://cloud.google.com/bigquery/exporting-data-from-bigquery#exportingmultiple

  • project_id (str) – The Google Cloud project where the job runs.

  • compression (str) – Type of compression to use.

  • export_format (str) – File format to export.

  • field_delimiter (str) – The delimiter to use when extracting to a CSV.

  • print_header (bool) – Whether to print a header for a CSV file extract.

  • gcp_conn_id (str) – (Optional) The connection ID used to connect to Google Cloud.

  • labels (dict | None) – A dictionary containing labels for the job/query, passed to BigQuery.

  • location (str | None) – The location used for the operation.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • result_retry (google.api_core.retry.Retry) – How to retry the result call that retrieves rows.

  • result_timeout (float | None) – The number of seconds to wait for the result method before using result_retry.

  • job_id (str | None) – The ID of the job. It will be suffixed with a hash of the job configuration unless force_rerun is True. The ID must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), or dashes (-). The maximum length is 1,024 characters. If not provided, a UUID will be generated.

  • force_rerun (bool) – If True, the operator will use a hash of a UUID as the job ID suffix.

  • reattach_states (set[str] | None) – Set of BigQuery job states for which the operator should reattach to an existing job. These should be non-final states.

  • deferrable (bool) – Run the operator in deferrable mode. A usage sketch follows this list.
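A minimal usage sketch, assuming hypothetical project, dataset, table, and bucket names. It exports a table to GCS as gzipped CSV; the * wildcard in the destination URI lets BigQuery shard a large export across multiple files:

import pendulum

from airflow import DAG
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import (
    BigQueryToGCSOperator,
)

with DAG(
    dag_id="example_bigquery_to_gcs",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:
    # All resource names below are placeholders for illustration.
    export_table = BigQueryToGCSOperator(
        task_id="export_table",
        source_project_dataset_table="my-project.my_dataset.my_table",
        destination_cloud_storage_uris=[
            "gs://my-bucket/exports/my_table-*.csv.gz"
        ],
        export_format="CSV",
        compression="GZIP",
        field_delimiter=",",
        print_header=True,
    )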

template_fields: collections.abc.Sequence[str] = ('source_project_dataset_table', 'destination_cloud_storage_uris', 'export_format', 'labels',...
template_ext: collections.abc.Sequence[str] = ()
ui_color = '#e4e6f0'
execute(context)

This is the main method to derive when creating an operator.

Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
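Because source_project_dataset_table and destination_cloud_storage_uris are templated fields, they accept Jinja expressions rendered from this context. A sketch with hypothetical names, placed inside the same DAG as above, exporting one ingestion-time partition per run (the $ decorator addresses a single BigQuery partition, and ds_nodash is Airflow's built-in run-date macro):

    export_partition = BigQueryToGCSOperator(
        task_id="export_partition",
        # "$" selects one ingestion-time partition; {{ ds_nodash }} renders
        # the logical date as YYYYMMDD when the task instance runs.
        source_project_dataset_table="my-project.my_dataset.events${{ ds_nodash }}",
        destination_cloud_storage_uris=[
            "gs://my-bucket/events/{{ ds_nodash }}/part-*.csv"
        ],
    )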

execute_complete(context, event)

Callback for the trigger; returns immediately once the trigger fires a success event.

Relies on the trigger to throw an exception on failure; otherwise execution is assumed to have been successful.
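A sketch of deferrable use, again with hypothetical names: when deferrable=True, execute submits the extract job and defers to a trigger, freeing the worker slot; execute_complete then runs once the trigger reports the job's final state:

    export_async = BigQueryToGCSOperator(
        task_id="export_async",
        source_project_dataset_table="my-project.my_dataset.my_table",
        destination_cloud_storage_uris=["gs://my-bucket/exports/part-*.avro"],
        export_format="AVRO",
        deferrable=True,  # the worker is released while the extract job runs
    )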

get_openlineage_facets_on_complete(task_instance)

Implemented on_complete so that the final BigQuery job ID can be included in the OpenLineage facets.
