airflow-config

Apache Airflow utilities for configuration of DAGs and DAG environments

Overview

This library allows for YAML-driven configuration of Airflow, including DAG and Operator arguments as well as fully declaratively defined DAGs. It is built with Pydantic, Hydra, and OmegaConf.

Consider the following basic DAG:

from airflow import DAG
from airflow.providers.standard.operators.bash import BashOperator
from datetime import datetime, timedelta

with DAG(
    dag_id="test-dag",
    default_args={
        "depends_on_past": False,
        "email": ["my.email@myemail.com"],
        "email_on_failure": False,
        "email_on_retry": False,
        "retries": 0,
    },
    description="test that dag is working properly",
    schedule=timedelta(minutes=1),
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["utility", "test"],
):
    BashOperator(
        task_id="test-task",
        bash_command="echo 'test'",
    )

We can already see many options that we might want to drive centrally via config, perhaps based on some notion of environment (e.g. dev, prod, etc.):

  • "email": ["my.email@myemail.com"]

  • "email_on_failure": False

  • "email_on_retry": False

  • "retries": 0

  • schedule=timedelta(minutes=1)

  • tags=["utility", "test"]

If we want to change any of these, we need to modify code. Now imagine we have hundreds of DAGs: this quickly gets out of hand, especially since Airflow DAGs are Python code, where a stray trailing comma, syntax error, or other common mistake can break the whole file.

Now consider the alternative, config-driven approach:

config/dev.yaml

# @package _global_
_target_: airflow_config.Configuration
default_task_args:
  _target_: airflow_config.TaskArgs
  owner: test
  email: [myemail@myemail.com]
  email_on_failure: false
  email_on_retry: false
  retries: 0
  depends_on_past: false
default_dag_args:
  _target_: airflow_config.DagArgs
  schedule: "01:00"
  start_date: "2024-01-01"
  catchup: false
  tags: ["utility", "test"]

from datetime import timedelta

from airflow.providers.standard.operators.bash import BashOperator
from airflow_config import DAG, load_config

config = load_config(config_name="dev")

with DAG(
    dag_id="test-dag",
    description="test that dag is working properly",
    schedule=timedelta(minutes=1),
    config=config
):
    BashOperator(
        task_id="test-task",
        bash_command="echo 'test'",
    )

This has a number of benefits:

  • Make changes without code changes, with static type validation (see the sketch after this list)

  • Make changes across any number of DAGs without having to copy-paste

  • Organize collections of DAGs into groups, e.g. via environment like dev, prod, etc.
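
For example, because the configuration is validated with Pydantic, a bad value fails fast when the config is loaded. A minimal sketch, assuming a config/dev.yaml with a mistyped value (e.g. retries: "three") and that load_config surfaces Pydantic's ValidationError:

from airflow_config import load_config
from pydantic import ValidationError

try:
    # Hypothetical broken config: config/dev.yaml containing `retries: "three"`
    config = load_config(config_name="dev")
except ValidationError as error:
    # Fails at load time with a clear message, instead of a DAG that breaks later
    print(error)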

Features

  • Configure DAGs from a central config file or…

  • from multiple env-specific config files (e.g. dev, uat, prod), as sketched below

  • Specialize DAGs by dag_id from a single file (e.g. set each DAG’s schedule from a single shared file)

  • Generate entire DAGs declaratively, like astronomer/dag-factory

  • Configure other extensions, like those in the airflow-laminar ecosystem described below
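
As a minimal sketch of the env-specific pattern, a deployment might select the config by environment name at load time (the AIRFLOW_ENV variable name here is an illustrative choice, not something airflow-config requires):

import os

from airflow_config import load_config

# Select e.g. config/dev.yaml, config/uat.yaml, or config/prod.yaml
env = os.environ.get("AIRFLOW_ENV", "dev")
config = load_config(config_name=env)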

The airflow-laminar Ecosystem

airflow-config is the configuration backbone of the airflow-laminar ecosystem:

  • airflow-pydantic: Foundation library - Pydantic models for all Airflow constructs (DAGs, Operators, Sensors). Powers the _target_ declarations in YAML configs

  • airflow-balancer: Host and port management - track infrastructure, create pools automatically, select hosts by queue/OS/tags

  • airflow-supervisor: Long-running jobs - run always-on processes with supervisord, integrated with airflow-balancer for host selection

  • airflow-common: Common operators - Skip/Fail/Pass operators, topology helpers, library management tasks

  • airflow-priority: Priority tracking - track and manage DAG priorities across your Airflow deployment

Configuration

from typing import Dict, Optional

from pydantic import BaseModel

# TaskArgs, DagArgs, Dag, and Task below are Pydantic models provided via airflow-pydantic

class Configuration(BaseModel):
    # default task args
    # https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/models/baseoperator/index.html#airflow.models.baseoperator.BaseOperator
    default_task_args: TaskArgs

    # default dag args
    # https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/models/dag/index.html#airflow.models.dag.DAG
    default_dag_args: DagArgs

    # string (dag id) to Dag mapping
    dags: Optional[Dict[str, Dag]]

    # string (dag id) to Task mapping
    tasks: Optional[Dict[str, Task]]

    # used for extensions to inject arbitrary configuration.
    # See e.g.: https://github.com/airflow-laminar/airflow-supervisor?tab=readme-ov-file#example-dag-airflow-config
    extensions: Optional[Dict[str, BaseModel]]
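
The same model can also be constructed directly in Python instead of YAML. A minimal sketch using the field names from the class above (values illustrative):

from datetime import datetime

from airflow_config import Configuration, DagArgs, TaskArgs

config = Configuration(
    default_task_args=TaskArgs(owner="test", retries=0),
    default_dag_args=DagArgs(start_date=datetime(2024, 1, 1), catchup=False),
)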

Here is an example configuration defined via YAML:

# @package _global_
_target_: airflow_config.Configuration
default_task_args:
  _target_: airflow_config.TaskArgs
  owner: blerg
  email: []
  email_on_failure: false
  email_on_retry: false
  retries: 0
  depends_on_past: false

default_dag_args:
  _target_: airflow_config.DagArgs
  start_date: ["2025-01-01", "America/New_York"]
  catchup: false
  max_active_runs: 1

dags:
  reboot:
    tags: ["reboot", "utility"]
    description: "Reboot machines"
    schedule: "0 0 * * *"
    max_active_tasks: 1
  clean-logs:
    tags: ["celery", "utility"]
    description: "Clean worker logs"
    schedule: "0 4 * * *"

Comparison with dag-factory

airflow-config is similar to dag-factory, but with additional benefits:

  • Code Generation: Generate Python files containing DAG code, not just runtime DAG objects

  • Hydra/OmegaConf: Powerful composition and interpolation of YAML configurations

  • Pydantic Validation: Type-safe validation via airflow-pydantic

  • Ecosystem Integration: Seamless integration with airflow-balancer, airflow-supervisor, and more

License

This software is licensed under the Apache 2.0 license. See the LICENSE file for details.

Note

This library was generated using copier from the Base Python Project Template repository.