airflow.providers.mysql.transfers.s3_to_mysql

Module Contents

Classes

S3ToMySqlOperator

Loads a file from S3 into a MySQL table.

class airflow.providers.mysql.transfers.s3_to_mysql.S3ToMySqlOperator(*, s3_source_key, mysql_table, mysql_duplicate_key_handling='IGNORE', mysql_extra_options=None, aws_conn_id='aws_default', mysql_conn_id='mysql_default', mysql_local_infile=False, **kwargs)[source]

Bases: airflow.models.BaseOperator

Loads a file from S3 into a MySQL table.

Parameters
  • s3_source_key (str) – The path to the file (S3 key) that will be loaded into MySQL.

  • mysql_table (str) – The MySQL table into which the data will be loaded.

  • mysql_duplicate_key_handling (str) –

    Specify what should happen when duplicate keys are encountered. Choose either IGNORE or REPLACE.

  • mysql_extra_options (str | None) – MySQL options to specify exactly how to load the data.

  • aws_conn_id (str | None) – Reference to the AWS connection that holds the credentials for the S3 bucket.

  • mysql_conn_id (str) – Reference to the MySQL connection id.

  • mysql_local_infile (bool) – Flag to enable the local_infile option on the MySqlHook. This loads the data into MySQL directly using the LOAD DATA LOCAL INFILE command. Defaults to False.
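
A minimal usage sketch of the operator inside a DAG. The DAG id, task id, S3 key, table name, and extra options below are illustrative assumptions, not values taken from this page; the connection ids are the defaults documented above.

```python
# Illustrative DAG: load a CSV export from S3 into a MySQL table.
# All resource names here (bucket, key, table) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.mysql.transfers.s3_to_mysql import S3ToMySqlOperator

with DAG(
    dag_id="s3_to_mysql_example",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    load_orders = S3ToMySqlOperator(
        task_id="load_orders",
        s3_source_key="exports/orders.csv",      # hypothetical S3 key
        mysql_table="orders",                    # hypothetical table
        mysql_duplicate_key_handling="REPLACE",  # or the default "IGNORE"
        # Extra clauses appended to the LOAD DATA statement (assumed format):
        mysql_extra_options="FIELDS TERMINATED BY ',' IGNORE 1 LINES",
        aws_conn_id="aws_default",
        mysql_conn_id="mysql_default",
        mysql_local_infile=True,  # use LOAD DATA LOCAL INFILE
    )
```

Because `s3_source_key` and `mysql_table` are template fields, both values may also be rendered from Jinja templates at runtime.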

template_fields: Sequence[str] = ('s3_source_key', 'mysql_table')[source]
template_ext: Sequence[str] = ()[source]
ui_color = '#f4a460'[source]
execute(context)[source]

Execute the transfer operation from S3 to MySQL.

Parameters

context (airflow.utils.context.Context) – The task execution context provided by Airflow.
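
The `mysql_duplicate_key_handling` value maps to the IGNORE/REPLACE keyword of MySQL's LOAD DATA statement. A rough, hypothetical sketch of how such a statement could be composed (the function name and signature are illustrative, not the operator's actual internals):

```python
def build_load_sql(table: str, tmp_file: str,
                   duplicate_key_handling: str = "IGNORE",
                   extra_options: str = "",
                   local_infile: bool = False) -> str:
    """Compose a MySQL LOAD DATA statement.

    Illustrative only: shows how the documented parameters plug into
    the statement, not how the operator builds it internally.
    """
    local = "LOCAL " if local_infile else ""
    return (
        f"LOAD DATA {local}INFILE '{tmp_file}' "
        f"{duplicate_key_handling} INTO TABLE {table} {extra_options}"
    ).strip()

# Example: replace rows that collide on a unique key.
print(build_load_sql("orders", "/tmp/part-0001.csv",
                     duplicate_key_handling="REPLACE",
                     local_infile=True))
# LOAD DATA LOCAL INFILE '/tmp/part-0001.csv' REPLACE INTO TABLE orders
```

In the real operator, the file referenced by `s3_source_key` is first downloaded from S3 to a local temporary file before a statement of this shape runs against MySQL.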
