airflow.providers.apache.beam.triggers.beam

Module Contents

Classes

BeamPipelineBaseTrigger

Base class for Beam Pipeline Triggers.

BeamPythonPipelineTrigger

Trigger that checks the status of a Python pipeline until it reaches a terminal state.

BeamJavaPipelineTrigger

Trigger that checks the status of a Java pipeline until it reaches a terminal state.

BeamPipelineTrigger

Trigger that checks the status of a Python pipeline until it reaches a terminal state (deprecated; use BeamPythonPipelineTrigger).

class airflow.providers.apache.beam.triggers.beam.BeamPipelineBaseTrigger(**kwargs)[source]

Bases: airflow.triggers.base.BaseTrigger

Base class for Beam Pipeline Triggers.

class airflow.providers.apache.beam.triggers.beam.BeamPythonPipelineTrigger(variables, py_file, py_options=None, py_interpreter='python3', py_requirements=None, py_system_site_packages=False, runner='DirectRunner')[source]

Bases: BeamPipelineBaseTrigger

Trigger that checks the status of a Python pipeline until it reaches a terminal state.

Parameters
  • variables (dict) – Variables passed to the pipeline.

  • py_file (str) – Path to the python file to execute.

  • py_options (list[str] | None) – Additional options.

  • py_interpreter (str) – Python version of the Apache Beam pipeline. If not set, this defaults to python3. To track the Python versions supported by Beam and related issues, see: https://issues.apache.org/jira/browse/BEAM-1251

  • py_requirements (list[str] | None) – Additional Python package(s) to install. If a value is passed to this parameter, a new virtual environment will be created with the additional packages installed. You can also use this to install the apache-beam package if it is not installed on your system, or if you want to use a different version.

  • py_system_site_packages (bool) – Whether to include system_site_packages in your virtualenv. See the virtualenv documentation for more information. This option is only relevant if the py_requirements parameter is not None.

  • runner (str) – Runner on which the pipeline will run. Defaults to “DirectRunner”. Other possible options: DataflowRunner, SparkRunner, FlinkRunner, PortableRunner. See BeamRunnerType and the capability matrix: https://beam.apache.org/documentation/runners/capability-matrix/
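
The following is a minimal construction sketch for this trigger; the pipeline path, variables, and requirements are illustrative placeholders, not values defined by this module.

    from airflow.providers.apache.beam.triggers.beam import BeamPythonPipelineTrigger

    # Hypothetical pipeline file and options, for illustration only.
    trigger = BeamPythonPipelineTrigger(
        variables={"output": "/tmp/beam_output"},
        py_file="/opt/pipelines/wordcount.py",
        py_options=[],
        py_interpreter="python3",
        py_requirements=["apache-beam"],
        py_system_site_packages=False,
        runner="DirectRunner",
    )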

serialize()[source]

Serialize BeamPythonPipelineTrigger arguments and classpath.
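
Airflow triggers serialize to a (classpath, kwargs) tuple so the triggerer process can re-instantiate them. A sketch of the expected shape, reusing the trigger constructed above; the kwargs keys are assumed to mirror the constructor arguments:

    # classpath identifies the trigger class; kwargs are its constructor
    # arguments (assumed shape, shown for illustration).
    classpath, kwargs = trigger.serialize()
    assert classpath == "airflow.providers.apache.beam.triggers.beam.BeamPythonPipelineTrigger"
    assert kwargs["py_file"] == "/opt/pipelines/wordcount.py"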

async run()[source]

Get the current pipeline status and yield a TriggerEvent.
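
In practice this trigger is handed to the triggerer by a deferrable operator rather than awaited directly. A minimal sketch, assuming a hypothetical custom operator; the event payload keys ("status", "message") are an assumption here:

    from airflow.exceptions import AirflowException
    from airflow.models import BaseOperator
    from airflow.providers.apache.beam.triggers.beam import BeamPythonPipelineTrigger


    class ExampleDeferrableBeamOperator(BaseOperator):  # hypothetical operator
        def execute(self, context):
            # Hand the wait over to the triggerer; run() polls the pipeline
            # until it reaches a terminal state.
            self.defer(
                trigger=BeamPythonPipelineTrigger(
                    variables={},
                    py_file="/opt/pipelines/wordcount.py",
                ),
                method_name="execute_complete",
            )

        def execute_complete(self, context, event=None):
            # Called when the TriggerEvent fires; payload keys are assumed.
            if event is None or event.get("status") != "success":
                raise AirflowException("Pipeline did not finish successfully")
            self.log.info("Pipeline finished: %s", event.get("message"))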

class airflow.providers.apache.beam.triggers.beam.BeamJavaPipelineTrigger(variables, jar, job_class=None, runner='DirectRunner', check_if_running=False, project_id=None, location=None, job_name=None, gcp_conn_id=None, impersonation_chain=None, poll_sleep=10, cancel_timeout=None)[source]

Bases: BeamPipelineBaseTrigger

Trigger that checks the status of a Java pipeline until it reaches a terminal state.

Parameters
  • variables (dict) – Variables passed to the job.

  • jar (str) – Name of the jar for the pipeline.

  • job_class (str | None) – Optional. Name of the java class for the pipeline.

  • runner (str) – Runner on which the pipeline will run. Defaults to “DirectRunner”. Other possible options: DataflowRunner, SparkRunner, FlinkRunner, PortableRunner. See BeamRunnerType and the capability matrix: https://beam.apache.org/documentation/runners/capability-matrix/

  • check_if_running (bool) – Optional. Before running the job, validate that a previous run is not in progress.

  • project_id (str | None) – Optional. The Google Cloud project ID in which to start a job.

  • location (str | None) – Optional. Job location.

  • job_name (str | None) – Optional. The ‘jobName’ to use when executing the Dataflow job.

  • gcp_conn_id (str | None) – Optional. The connection ID to use when connecting to Google Cloud.

  • impersonation_chain (str | Sequence[str] | None) – Optional. GCP service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).

  • poll_sleep (int) – Optional. The time in seconds to sleep between polling Google Cloud for the Dataflow job status. Defaults to 10 seconds.

  • cancel_timeout (int | None) – Optional. How long (in seconds) the operator should wait for the pipeline to be successfully cancelled when the task is killed. Defaults to 300 seconds.
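
A minimal construction sketch; the jar path, project, and job settings are illustrative placeholders:

    from airflow.providers.apache.beam.triggers.beam import BeamJavaPipelineTrigger

    # Hypothetical jar and Dataflow settings, for illustration only.
    trigger = BeamJavaPipelineTrigger(
        variables={"tempLocation": "gs://example-bucket/tmp"},
        jar="/opt/pipelines/wordcount.jar",
        job_class="org.example.WordCount",
        runner="DataflowRunner",
        check_if_running=True,
        project_id="example-gcp-project",
        location="us-central1",
        job_name="wordcount-example",
        gcp_conn_id="google_cloud_default",
        poll_sleep=10,
        cancel_timeout=300,
    )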

serialize()[source]

Serialize BeamJavaPipelineTrigger arguments and classpath.

async run()[source]

Get the current Java pipeline status and yield a TriggerEvent.
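
Since run() is an async generator, it can be consumed with asyncio for local experimentation; note that doing so actually starts the pipeline and waits for a terminal state. This is a testing sketch only; in production the triggerer consumes the generator:

    import asyncio

    async def wait_for_event(trigger):
        # run() yields a single TriggerEvent once the pipeline terminates.
        async for event in trigger.run():
            return event

    event = asyncio.run(wait_for_event(trigger))
    print(event.payload)  # e.g. a dict with status/message keys (assumed shape)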

class airflow.providers.apache.beam.triggers.beam.BeamPipelineTrigger(*args, **kwargs)[source]

Bases: BeamPythonPipelineTrigger

Trigger that checks the status of a Python pipeline until it reaches a terminal state.

This class is deprecated. Please use airflow.providers.apache.beam.triggers.beam.BeamPythonPipelineTrigger instead.
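
Migrating is a straight rename; the constructor arguments are unchanged:

    # Before (deprecated):
    # from airflow.providers.apache.beam.triggers.beam import BeamPipelineTrigger
    # trigger = BeamPipelineTrigger(variables={}, py_file="pipeline.py")

    # After:
    from airflow.providers.apache.beam.triggers.beam import BeamPythonPipelineTrigger
    trigger = BeamPythonPipelineTrigger(variables={}, py_file="pipeline.py")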
