airflow.providers.teradata.transfers.s3_to_teradata

Module Contents

Classes

S3ToTeradataOperator

Loads CSV, JSON and Parquet format data from Amazon S3 to Teradata.

class airflow.providers.teradata.transfers.s3_to_teradata.S3ToTeradataOperator(*, s3_source_key, public_bucket=False, teradata_table, aws_conn_id='aws_default', teradata_conn_id='teradata_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

Loads CSV, JSON and Parquet format data from Amazon S3 to Teradata.

See also

For more information on how to use this operator, take a look at the guide: S3ToTeradataOperator

Parameters
  • s3_source_key (str) – The URI specifying the location of the S3 bucket. (templated) The URI format is /s3/YOUR-BUCKET.s3.amazonaws.com/YOUR-BUCKET-NAME. Refer to https://docs.teradata.com/search/documents?query=native+object+store&sort=last_update&virtual-field=title_only&content-lang=en-US

  • public_bucket (bool) – Specifies whether the provided S3 bucket is public. If the bucket is public, anyone can access its objects via a URL without authentication. If the bucket is private and no authentication is provided, the operator raises an exception.

  • teradata_table (str) – The name of the Teradata table to which the data is transferred. (templated)

  • aws_conn_id (str) – The Airflow AWS connection used for AWS credentials.

  • teradata_conn_id (str) – The connection ID used to connect to Teradata.

Note that s3_source_key and teradata_table are templated, so you can use variables in them if you wish.

template_fields: Sequence[str] = ('s3_source_key', 'teradata_table')[source]
ui_color = '#e07c24'[source]
execute(context)[source]

This is the main method to derive when creating an operator.

Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
