Local Filesystem to Amazon S3¶
Use the LocalFilesystemToS3Operator transfer to copy data from the Airflow local filesystem to a file in Amazon Simple Storage Service (S3).
Prerequisite Tasks¶
To use these operators, you must do a few things:
Create necessary resources using AWS Console or AWS CLI (one possible Python approach is sketched after these steps).
Install API libraries via pip.
pip install 'apache-airflow[amazon]'
Detailed information is available in Installation of Airflow®
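The bucket the operator writes to must exist before your DAG runs. As a minimal sketch of the "create necessary resources" step, the snippet below creates a bucket with boto3 instead of the AWS Console or CLI; the bucket name is a placeholder and working AWS credentials (with a configured region) are assumed to be available in the environment.

import boto3

# Placeholder bucket name -- S3 bucket names must be globally unique.
BUCKET_NAME = "my-airflow-transfer-bucket"

session = boto3.session.Session()
s3 = session.client("s3")

# us-east-1 is the default location and must not be passed as a LocationConstraint.
if session.region_name and session.region_name != "us-east-1":
    s3.create_bucket(
        Bucket=BUCKET_NAME,
        CreateBucketConfiguration={"LocationConstraint": session.region_name},
    )
else:
    s3.create_bucket(Bucket=BUCKET_NAME)

When you are done experimenting, removing the bucket (for example with s3.delete_bucket(Bucket=BUCKET_NAME) after emptying it) avoids leaving test resources behind.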
Operators¶
Local to Amazon S3 transfer operator¶
This operator copies data from the local filesystem to a file in an Amazon S3 bucket.
To get more information about this operator, visit:
LocalFilesystemToS3Operator
Example usage:
tests/system/amazon/aws/example_local_to_s3.py
from airflow.providers.amazon.aws.transfers.local_to_s3 import LocalFilesystemToS3Operator

# TEMP_FILE_PATH, s3_key, and s3_bucket_name are defined earlier in the example DAG.
create_local_to_s3_job = LocalFilesystemToS3Operator(
    task_id="create_local_to_s3_job",
    filename=TEMP_FILE_PATH,       # path of the local file to upload
    dest_key=s3_key,               # S3 key to write the file to
    dest_bucket=s3_bucket_name,    # destination S3 bucket name
    replace=True,                  # overwrite the key if it already exists
)
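For context beyond the excerpt above, here is a minimal, self-contained sketch of how the operator might be wired into a DAG of your own. It assumes Airflow 2.4+ (for the schedule argument) with the amazon provider installed; the DAG id, local path, S3 key, bucket name, and connection id are placeholder assumptions rather than values from the example file.

from __future__ import annotations

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.local_to_s3 import LocalFilesystemToS3Operator

with DAG(
    dag_id="local_to_s3_sketch",         # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,                       # trigger manually
    catchup=False,
) as dag:
    upload_file = LocalFilesystemToS3Operator(
        task_id="upload_file",
        filename="/tmp/report.csv",      # placeholder local path
        dest_key="reports/report.csv",   # placeholder S3 key
        dest_bucket="my-example-bucket", # placeholder bucket name
        aws_conn_id="aws_default",       # Airflow connection holding AWS credentials
        replace=True,
    )

Because replace=True, re-running the task overwrites the existing key; set it to False if the upload should fail when the key already exists.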