Introduction

This repo provides an Airflow plugin for priority-driven DAG failure alerting. In layman's terms, you need only add one of the tags P1, P2, P3, P4, or P5 to your DAG, where P1 corresponds to the highest priority and P5 to the lowest, and that DAG will send a notification to a backend integration.

Integrations

| Integration | Metric / Tag | Docs |
| --- | --- | --- |
| Datadog | airflow.priority.p{1,2,3,4,5}.{failed,succeeded,running} | Link |
| Discord | N/A | Link |
| Logfire | airflow.priority.p{1,2,3,4,5}.{failed,succeeded,running} | Link |
| PagerDuty | N/A | Link |
| New Relic | airflow.priority.p{1,2,3,4,5}.{failed,succeeded,running} | Link |
| Opsgenie | N/A | Link |
| Slack | N/A | Link |
| Symphony | N/A | Link |
| AWS CloudWatch | airflow.priority.p{1,2,3,4,5}.{failed,succeeded,running} | Link |

Installation

You can install from PyPI via pip:

pip install airflow-priority

Or via conda:

conda install airflow-priority -c conda-forge

Configuration

The primary mechanism for configuration is the standard airflow.cfg file. All configuration lives under the priority section. Here is an example configuration from our tests.

[priority]
threshold = 3  # default threshold for all alerts: notify on P1, P2, or P3 (ignore P4, P5)

[priority.datadog]
api_key = 1234567890abcdefg
host = us1.datadoghq.com
tags = environment:test,mycustom:tag
metric = custom.metric
threshold = 2  # Datadog will only send a metric for P1, P2
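The threshold semantics above can be sketched as a simple gate (illustrative only, not the plugin's actual code):

```python
def should_alert(priority: int, threshold: int = 3) -> bool:
    """Alert only for priorities at or above the threshold.

    P1 is the highest priority, so numerically lower values are more
    urgent: alert when priority <= threshold.
    """
    return priority <= threshold

# With the default threshold of 3, P1-P3 alert while P4 and P5 do not.
print([p for p in range(1, 6) if should_alert(p)])  # → [1, 2, 3]
```

A per-integration threshold, like the Datadog one above, simply applies a stricter gate for that backend while the global default governs the rest.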

All scoped configuration options can also be set directly under the top-level priority section by prefixing them with the scope name.

For example this:

[priority.aws]
region = "us-east-1"

Can be provided as:

[priority]
aws_region = "us-east-1"

Note

AWS MWAA and Terraform may require you to use the alternative configuration syntax.

Configuration with airflow-config

Configuration options can also be specified via airflow-config. Place a YAML file called config.yaml in a folder called config inside your AIRFLOW_HOME directory.
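A sketch of such a config.yaml, assuming airflow-config mirrors the [priority] sections of airflow.cfg shown above (the exact key layout may differ; consult the airflow-config docs):

```yaml
# $AIRFLOW_HOME/config/config.yaml (hypothetical layout)
priority:
  threshold: 3
  datadog:
    api_key: "1234567890abcdefg"
    host: "us1.datadoghq.com"
    tags: "environment:test,mycustom:tag"
    metric: "custom.metric"
    threshold: 2
```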