The airflow.cfg file is the central command center for Apache Airflow, a Python-based platform used to author, schedule, and monitor complex workflows. This configuration file dictates everything from how many tasks can run simultaneously to where your DAGs (Directed Acyclic Graphs) are stored.

Core Purpose and Location

When you first run Airflow, it automatically generates airflow.cfg in your $AIRFLOW_HOME directory, which is typically ~/airflow by default. This file is organized into sections (e.g., [core], [webserver], [scheduler]) that govern different components of the Airflow ecosystem.

Key Configuration Sections

[core]: Contains foundational settings like dags_folder (where Airflow scans for workflows) and executor (the mechanism that actually runs your tasks, such as SequentialExecutor or CeleryExecutor).
[scheduler]: Controls how the scheduler identifies and kicks off tasks. Important settings include dag_dir_list_interval, which determines how often the scheduler scans for new DAG files.
[webserver]: Manages the Airflow UI settings, such as the port it runs on and security parameters like secret keys.
[database]: Configures the connection to the metadata database where Airflow stores state information about tasks and DAG runs.

Precedence and Overrides

Airflow resolves each setting in the following order of precedence:

Environment variables: You can override any config setting using the format AIRFLOW__{SECTION}__{KEY} (e.g., AIRFLOW__CORE__PARALLELISM).
airflow.cfg: The settings defined in the physical file.
Defaults: The hardcoded defaults within the Airflow source code if no other value is provided.
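The sections described above might look like this in a minimal airflow.cfg. The values shown are illustrative, not Airflow's shipped defaults, and exact key placement can vary by Airflow version (e.g., older releases kept the database connection under [core]):

```ini
[core]
# Directory the scheduler scans for DAG definition files
dags_folder = /home/user/airflow/dags
# Task execution mechanism; SequentialExecutor is the single-process default
executor = SequentialExecutor

[database]
# Connection string for the metadata database (SQLite shown for local use)
sql_alchemy_conn = sqlite:////home/user/airflow/airflow.db

[scheduler]
# Seconds between scans of dags_folder for new DAG files
dag_dir_list_interval = 300

[webserver]
# Port the Airflow UI listens on
web_server_port = 8080
```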
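The precedence chain can be sketched with a toy resolver. Note that airflow_env_var and resolve_setting below are hypothetical helpers written for illustration, not Airflow APIs; they only mimic the naming convention and lookup order described above:

```python
import os


def airflow_env_var(section: str, key: str) -> str:
    """Build the environment-variable name Airflow checks for a setting:
    AIRFLOW__{SECTION}__{KEY}, with section and key upper-cased and
    joined by double underscores."""
    return f"AIRFLOW__{section.upper()}__{key.upper()}"


def resolve_setting(section: str, key: str, file_value=None, default=None):
    """Mimic Airflow's precedence: environment variable beats the
    airflow.cfg value, which beats the hardcoded default."""
    env_value = os.environ.get(airflow_env_var(section, key))
    if env_value is not None:
        return env_value
    if file_value is not None:
        return file_value
    return default


# Example: override [core] parallelism via the environment.
os.environ["AIRFLOW__CORE__PARALLELISM"] = "64"
print(airflow_env_var("core", "parallelism"))        # AIRFLOW__CORE__PARALLELISM
print(resolve_setting("core", "parallelism", "32"))  # 64 (env var wins over file value)
```

Because the environment is checked first, this scheme lets deployments (e.g., containers) tune settings without editing the file on disk.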