Logging for Tasks¶
Airflow writes logs for tasks in a way that allows you to see the logs for each task separately in the Airflow UI. Core Airflow implements writing and serving logs locally. However, you can also write logs to remote services via community providers, or write your own loggers.
Below we describe local task logging. The Apache Airflow Community also releases providers for many services (Provider packages), and some of them provide handlers that extend the logging capability of Apache Airflow. You can see all of these providers in Writing logs.
Writing logs locally¶
You can specify the directory in which to place log files with the base_log_folder option. By default, logs are placed in the AIRFLOW_HOME/logs directory.
For more information on setting the configuration, see Setting Configuration Options
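For example, a minimal airflow.cfg snippet pointing task logs at a custom directory (the path shown is illustrative):
[logging]
base_log_folder = /var/log/airflow/tasks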
Log files for tasks are named according to the following default patterns (the patterns shown below reflect the defaults in recent Airflow releases):
For normal tasks:
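dag_id={dag_id}/run_id={run_id}/task_id={task_id}/attempt={try_number}.log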
For dynamically mapped tasks:
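dag_id={dag_id}/run_id={run_id}/task_id={task_id}/map_index={map_index}/attempt={try_number}.log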
These patterns can be adjusted with the log_filename_template option.
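For example, a sketch of overriding the template in airflow.cfg; this particular template reproduces the flat layout used by older Airflow releases, and ti, ts, and try_number are among the Jinja variables available in the template context:
[logging]
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log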
In addition, you can supply a remote location to store current logs and backups.
In the Airflow UI, remote logs take precedence over local logs when remote logging is enabled. If remote logs cannot be found or accessed, local logs will be displayed. Note that logs are only sent to remote storage once a task is complete (including failure). In other words, remote logs for running tasks are unavailable (but local logs are available).
If you want to check which task handler is currently set, you can use the airflow info command as in the example below.
$ airflow info
...
airflow on PATH: [True]
Executor: [SequentialExecutor]
Task Logging Handlers: [StackdriverTaskHandler]
SQL Alchemy Conn: [sqlite://///root/airflow/airflow.db]
DAGs Folder: [/root/airflow/dags]
Plugins Folder: [/root/airflow/plugins]
Base Log Folder: [/root/airflow/logs]
You can also run airflow config list to check that the logging configuration options have valid values.
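For example, to inspect just the logging options (the --section flag is available in recent Airflow versions; output abbreviated, and the values depend on your installation):
$ airflow config list --section logging
[logging]
base_log_folder = /root/airflow/logs
remote_logging = False
...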
Not all configuration options are available from the airflow.cfg file. Some configuration options require that the logging config class be overwritten. This can be done via the logging_config_class option in the airflow.cfg file. This option should specify the import path to a configuration compatible with logging.config.dictConfig(). If your config file is not in a standard import location, you should set the PYTHONPATH environment variable so that Python can find it.
Follow the steps below to enable a custom logging config class:
Start by setting the PYTHONPATH environment variable to a known directory, e.g. ~/airflow/:
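export PYTHONPATH=~/airflow/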
Create a directory to store the config file, e.g. ~/airflow/config:
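mkdir -p ~/airflow/config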
Create a file called ~/airflow/config/log_config.py with the following contents:
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
At the end of the file, add code to modify the default dictionary configuration.
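For example, a minimal sketch; the keys shown follow the structure of DEFAULT_LOGGING_CONFIG, but verify them against that dictionary for your Airflow version before relying on them:
# Illustrative tweaks to the copied configuration. Verify these keys
# against DEFAULT_LOGGING_CONFIG for your Airflow version.
LOGGING_CONFIG["handlers"]["task"]["base_log_folder"] = "/var/log/airflow/tasks"
LOGGING_CONFIG["loggers"]["airflow.task"]["level"] = "DEBUG"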
Enable the new class by setting logging_config_class (together with any related options) in airflow.cfg:
[logging]
remote_logging = True
logging_config_class = log_config.LOGGING_CONFIG
Restart the application.
See Modules Management for details on how Python and Airflow manage modules.
External Links¶
When using remote logging, you can configure Airflow to show a link to an external UI within the Airflow Web UI. Clicking the link redirects you to the external UI.
Some external systems require specific configuration in Airflow for the redirection to work, but others do not.
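For example, the Elasticsearch provider builds the link from the frontend option, into which the handler interpolates the log_id; the hostname below is purely illustrative:
[elasticsearch]
frontend = https://kibana.example.com/app/logs?log_id={log_id}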
Serving logs from workers¶
Most task handlers send logs upon completion of a task. In order to view logs in real time, Airflow automatically starts an HTTP server to serve the logs in the following cases:
When LocalExecutor is used, logs are served while airflow scheduler is running.
When CeleryExecutor is used, logs are served while airflow worker is running.
The server runs on the port specified by the worker_log_server_port option in the [logging] section. By default, it is 8793.
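For example, to set it explicitly in airflow.cfg (8793 is the default value):
[logging]
worker_log_server_port = 8793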
Communication between the webserver and the worker is signed with the key specified by the secret_key option in the [webserver] section. You must ensure that the key is identical on the webserver and the workers so that communication can take place without problems.
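For example, you can generate a random key once and set the same value in the airflow.cfg of every instance (the command below is just one convenient way to produce one, and the value shown is a placeholder):
$ python -c "import secrets; print(secrets.token_hex(16))"
[webserver]
secret_key = 0123456789abcdef0123456789abcdef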
We use Gunicorn as the WSGI server for serving logs. Its configuration options can be overridden with the GUNICORN_CMD_ARGS env variable. For details, see Gunicorn settings.
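For example (the flags shown are illustrative; any valid Gunicorn settings can be passed this way):
export GUNICORN_CMD_ARGS="--workers=2 --timeout 120"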