airflow.providers.amazon.aws.log.s3_task_handler¶
Module Contents¶
Classes¶
S3TaskHandler is a Python log handler that handles and reads task instance logs.
- class airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler(base_log_folder, s3_log_folder, **kwargs)[source]¶
Bases: airflow.utils.log.file_task_handler.FileTaskHandler, airflow.utils.log.logging_mixin.LoggingMixin
S3TaskHandler is a Python log handler that handles and reads task instance logs.
It extends the Airflow FileTaskHandler and uploads to and reads from S3 remote storage.
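In a deployment this handler is typically enabled through Airflow's remote logging configuration rather than constructed by hand, but a minimal sketch of direct construction, with an illustrative bucket name and paths, looks like:

    # Minimal sketch; the local folder and S3 bucket below are illustrative placeholders.
    from airflow.providers.amazon.aws.log.s3_task_handler import S3TaskHandler

    handler = S3TaskHandler(
        base_log_folder="/opt/airflow/logs",        # local folder used by FileTaskHandler
        s3_log_folder="s3://my-airflow-logs/logs",  # remote S3 folder to upload to and read from
    )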
- set_context(ti, *, identifier=None)[source]¶
Provide task_instance context to airflow task handler.
Generally speaking, this returns None. But if the attribute maintain_propagate has been set to propagate, then the sentinel MAINTAIN_PROPAGATE is returned. This has the effect of overriding the default behavior of setting propagate to False whenever set_context is called. At the time of writing, this functionality is only used in unit testing.
- Parameters
ti (airflow.models.taskinstance.TaskInstance) – task instance object
identifier (str | None) – if set, adds suffix to log file. For use when relaying exceptional messages to task logs from a context other than task or trigger run
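The handler is normally given its context by Airflow's logging machinery; a sketch of what that call looks like, assuming ti is a TaskInstance obtained elsewhere:

    # Sketch only: `ti` is assumed to be an airflow.models.taskinstance.TaskInstance.
    handler.set_context(ti)                         # ordinary task-run context
    handler.set_context(ti, identifier="callback")  # suffix the log file for out-of-band messages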
- s3_read(remote_log_location, return_error=False)[source]¶
Return the log found at the remote_log_location, or ‘’ if no logs are found or there is an error.
- Parameters
remote_log_location (str) – the log’s location in remote storage
return_error (bool) – if True, return a string error message when an error occurs, instead of ‘’
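For illustration, using the handler constructed above and a placeholder object key:

    # Sketch: the remote location is an illustrative placeholder.
    log_text = handler.s3_read(
        "s3://my-airflow-logs/logs/my_dag/my_task/attempt=1.log",
        return_error=True,  # return the error message instead of '' on failure
    )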
- s3_write(log, remote_log_location, append=True, max_retry=1)[source]¶
Write the log to the remote_log_location; return True on success, or fail silently and return False.
- Parameters
log (str) – the log to write to the remote_log_location
remote_log_location (str) – the log’s location in remote storage
append (bool) – if False, any existing log file is overwritten. If True, the new log is appended to any existing logs.
max_retry (int) – Maximum number of times to retry on upload failure
- Returns
whether the log was successfully written to the remote location or not.
- Return type
bool
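For illustration, again with a placeholder remote location:

    # Sketch: the remote location is an illustrative placeholder.
    ok = handler.s3_write(
        log="task finished\n",
        remote_log_location="s3://my-airflow-logs/logs/my_dag/my_task/attempt=1.log",
        append=True,   # keep any log content already at that location
        max_retry=2,   # retry the upload up to 2 times on failure
    )
    if not ok:
        print("log upload failed; check the AWS connection and bucket permissions")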