airflow.providers.amazon.aws.operators.appflow

Module Contents

Classes

AppflowBaseOperator

Amazon AppFlow Base Operator class (not supposed to be used directly in DAGs).

AppflowRunOperator

Execute an AppFlow run as is.

AppflowRunFullOperator

Execute an AppFlow full run removing any filter.

AppflowRunBeforeOperator

Execute an AppFlow run after updating the filters to select only previous data.

AppflowRunAfterOperator

Execute an AppFlow run after updating the filters to select only future data.

AppflowRunDailyOperator

Execute an AppFlow run after updating the filters to select only a single day.

AppflowRecordsShortCircuitOperator

Short-circuit in case of an empty AppFlow run.

Attributes

SUPPORTED_SOURCES

MANDATORY_FILTER_DATE_MSG

NOT_SUPPORTED_SOURCE_MSG

airflow.providers.amazon.aws.operators.appflow.SUPPORTED_SOURCES[source]
airflow.providers.amazon.aws.operators.appflow.MANDATORY_FILTER_DATE_MSG = 'The filter_date argument is mandatory for {entity}!'[source]
airflow.providers.amazon.aws.operators.appflow.NOT_SUPPORTED_SOURCE_MSG = 'Source {source} is not supported for {entity}!'[source]
class airflow.providers.amazon.aws.operators.appflow.AppflowBaseOperator(flow_name, flow_update, source=None, source_field=None, filter_date=None, poll_interval=20, max_attempts=60, wait_for_completion=True, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.appflow.AppflowHook]

Amazon AppFlow Base Operator class (not supposed to be used directly in DAGs).

Parameters
  • source (str | None) – The source name (Supported: salesforce, zendesk)

  • flow_name (str) – The flow name

  • flow_update (bool) – A boolean to enable/disable a flow update before the run

  • source_field (str | None) – The field name to apply filters

  • filter_date (str | None) – The date value (or template) to be used in filters.

  • poll_interval (int) – how often in seconds to check the query status

  • max_attempts (int) – how many times to check for status before timing out

  • wait_for_completion (bool) – whether to wait for the run to end to return

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

aws_hook_class[source]
ui_color = '#2bccbd'[source]
template_fields[source]
UPDATE_PROPAGATION_TIME: int = 15[source]
execute(context)[source]

This is the main method to derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.amazon.aws.operators.appflow.AppflowRunOperator(flow_name, source=None, poll_interval=20, wait_for_completion=True, **kwargs)[source]

Bases: AppflowBaseOperator

Execute an AppFlow run as is.

See also

For more information on how to use this operator, take a look at the guide: Run Flow As Is

Parameters
  • source (str | None) – Obsolete; unnecessary for this operator

  • flow_name (str) – The flow name

  • poll_interval (int) – how often in seconds to check the query status

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region – AWS region to use

  • wait_for_completion (bool) – whether to wait for the run to end to return
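
A minimal usage sketch; the DAG id, schedule, and flow name below are illustrative placeholders, not values defined by this module:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.appflow import AppflowRunOperator

    with DAG("appflow_run_as_is", start_date=datetime(2024, 1, 1), schedule=None, catchup=False):
        # Trigger the existing flow without changing its configured filters.
        run_flow = AppflowRunOperator(
            task_id="run_flow",
            flow_name="salesforce-campaign",  # placeholder flow name
        )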

class airflow.providers.amazon.aws.operators.appflow.AppflowRunFullOperator(source, flow_name, poll_interval=20, wait_for_completion=True, **kwargs)[source]

Bases: AppflowBaseOperator

Execute an AppFlow full run removing any filter.

See also

For more information on how to use this operator, take a look at the guide: Run Flow Full

Parameters
  • source (str) – The source name (Supported: salesforce, zendesk)

  • flow_name (str) – The flow name

  • poll_interval (int) – how often in seconds to check the query status

  • wait_for_completion (bool) – whether to wait for the run to end to return
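
A minimal sketch of a full run; the DAG id and flow name are placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.appflow import AppflowRunFullOperator

    with DAG("appflow_run_full", start_date=datetime(2024, 1, 1), schedule=None, catchup=False):
        # Remove any task filter from the flow, then run it over the full dataset.
        run_full = AppflowRunFullOperator(
            task_id="run_full",
            source="salesforce",
            flow_name="salesforce-campaign",  # placeholder flow name
        )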

class airflow.providers.amazon.aws.operators.appflow.AppflowRunBeforeOperator(source, flow_name, source_field, filter_date, poll_interval=20, wait_for_completion=True, **kwargs)[source]

Bases: AppflowBaseOperator

Execute an AppFlow run after updating the filters to select only previous data.

See also

For more information on how to use this operator, take a look at the guide: Run Flow Before

Parameters
  • source (str) – The source name (Supported: salesforce)

  • flow_name (str) – The flow name

  • source_field (str) – The field name to apply filters

  • filter_date (str) – The date value (or template) to be used in filters.

  • poll_interval (int) – how often in seconds to check the query status

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region – AWS region to use

  • wait_for_completion (bool) – whether to wait for the run to end to return
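
A minimal sketch; the flow name and source field are placeholders, and filter_date is templated from the DAG run:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.appflow import AppflowRunBeforeOperator

    with DAG("appflow_run_before", start_date=datetime(2024, 1, 1), schedule=None, catchup=False):
        # Pull only records whose LastModifiedDate is before the logical date.
        run_before = AppflowRunBeforeOperator(
            task_id="run_before",
            source="salesforce",
            flow_name="salesforce-campaign",   # placeholder flow name
            source_field="LastModifiedDate",   # placeholder source field
            filter_date="{{ ds }}",
        )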

class airflow.providers.amazon.aws.operators.appflow.AppflowRunAfterOperator(source, flow_name, source_field, filter_date, poll_interval=20, wait_for_completion=True, **kwargs)[source]

Bases: AppflowBaseOperator

Execute an AppFlow run after updating the filters to select only future data.

See also

For more information on how to use this operator, take a look at the guide: Run Flow After

Parameters
  • source (str) – The source name (Supported: salesforce, zendesk)

  • flow_name (str) – The flow name

  • source_field (str) – The field name to apply filters

  • filter_date (str) – The date value (or template) to be used in filters.

  • poll_interval (int) – how often in seconds to check the query status

  • wait_for_completion (bool) – whether to wait for the run to end to return
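
A minimal sketch; the flow name and source field are placeholders for a real flow configuration:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.appflow import AppflowRunAfterOperator

    with DAG("appflow_run_after", start_date=datetime(2024, 1, 1), schedule=None, catchup=False):
        # Pull only records whose timestamp field is after the given date.
        run_after = AppflowRunAfterOperator(
            task_id="run_after",
            source="salesforce",
            flow_name="salesforce-campaign",   # placeholder flow name
            source_field="LastModifiedDate",   # placeholder source field
            filter_date="{{ ds }}",
        )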

class airflow.providers.amazon.aws.operators.appflow.AppflowRunDailyOperator(source, flow_name, source_field, filter_date, poll_interval=20, wait_for_completion=True, **kwargs)[source]

Bases: AppflowBaseOperator

Execute an AppFlow run after updating the filters to select only a single day.

See also

For more information on how to use this operator, take a look at the guide: Run Flow Daily

Parameters
  • source (str) – The source name (Supported: salesforce)

  • flow_name (str) – The flow name

  • source_field (str) – The field name to apply filters

  • filter_date (str) – The date value (or template) to be used in filters.

  • poll_interval (int) – how often in seconds to check the query status

  • wait_for_completion (bool) – whether to wait for the run to end to return
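
A minimal sketch of a daily-windowed run; names are placeholders and filter_date is templated to the logical date:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.appflow import AppflowRunDailyOperator

    with DAG("appflow_run_daily", start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False):
        # Restrict the flow to records whose timestamp falls on the logical date.
        run_daily = AppflowRunDailyOperator(
            task_id="run_daily",
            source="salesforce",
            flow_name="salesforce-campaign",   # placeholder flow name
            source_field="LastModifiedDate",   # placeholder source field
            filter_date="{{ ds }}",
        )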

class airflow.providers.amazon.aws.operators.appflow.AppflowRecordsShortCircuitOperator(*, flow_name, appflow_run_task_id, ignore_downstream_trigger_rules=True, aws_conn_id='aws_default', region_name=None, verify=None, botocore_config=None, **kwargs)[source]

Bases: airflow.operators.python.ShortCircuitOperator, airflow.providers.amazon.aws.utils.mixins.AwsBaseHookMixin[airflow.providers.amazon.aws.hooks.appflow.AppflowHook]

Short-circuit in case of an empty AppFlow run.

See also

For more information on how to use this operator, take a look at the guide: Skipping Tasks For Empty Runs

Parameters
  • flow_name (str) – The flow name

  • appflow_run_task_id (str) – Run task ID from where this operator should extract the execution ID

  • ignore_downstream_trigger_rules (bool) – Ignore downstream trigger rules

  • aws_conn_id (str | None) – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name (str | None) – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify (bool | str | None) – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config (dict | None) – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

aws_hook_class[source]
template_fields[source]
ui_color = '#33ffec'[source]
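
A minimal sketch chaining the short-circuit after a run operator; DAG id, flow name, and field names are placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator
    from airflow.providers.amazon.aws.operators.appflow import (
        AppflowRecordsShortCircuitOperator,
        AppflowRunAfterOperator,
    )

    with DAG("appflow_short_circuit", start_date=datetime(2024, 1, 1), schedule=None, catchup=False):
        run_after = AppflowRunAfterOperator(
            task_id="run_after",
            source="salesforce",
            flow_name="salesforce-campaign",   # placeholder flow name
            source_field="LastModifiedDate",   # placeholder source field
            filter_date="{{ ds }}",
        )
        # Skip downstream tasks when the run referenced by appflow_run_task_id pulled zero records.
        has_records = AppflowRecordsShortCircuitOperator(
            task_id="has_records",
            flow_name="salesforce-campaign",
            appflow_run_task_id="run_after",
        )
        process = EmptyOperator(task_id="process_records")

        run_after >> has_records >> process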
