airflow_pydantic.TaskArgs
- pydantic model airflow_pydantic.TaskArgs
Bases: BaseModel
JSON schema:
{ "title": "TaskArgs", "type": "object", "properties": { "owner": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "description": "the owner of the task. Using a meaningful description (e.g. user/person/team/role name) to clarify ownership is recommended.", "title": "Owner" }, "email": { "anyOf": [ { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "description": "the 'to' email address(es) used in email alerts", "title": "Email" }, "email_on_failure": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "description": "Indicates whether email alerts should be sent when a task failed", "title": "Email On Failure" }, "email_on_retry": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "description": "Indicates whether email alerts should be sent when a task is retried", "title": "Email On Retry" }, "retries": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "description": "the number of retries that should be performed before failing the task", "title": "Retries" }, "retry_delay": { "anyOf": [ { "format": "duration", "type": "string" }, { "type": "null" } ], "default": null, "description": "delay between retries", "title": "Retry Delay" }, "start_date": { "anyOf": [ { "format": "date-time", "type": "string" }, { "maxItems": 2, "minItems": 2, "prefixItems": [ { "format": "date-time", "type": "string" }, { "type": "string" } ], "type": "array" }, { "type": "null" } ], "default": null, "description": "The start_date for the task, determines the execution_date for the first task instance. The best practice is to have the start_date rounded to your DAG\u2019s schedule_interval. Daily jobs have their start_date some day at 00:00:00, hourly jobs have their start_date at 00:00 of a specific hour. Note that Airflow simply looks at the latest execution_date and adds the schedule_interval to determine the next execution_date. It is also very important to note that different tasks\u2019 dependencies need to line up in time. If task A depends on task B and their start_date are offset in a way that their execution_date don\u2019t line up, A\u2019s dependencies will never be met. If you are looking to delay a task, for example running a daily task at 2AM, look into the TimeSensor and TimeDeltaSensor. We advise against using dynamic start_date and recommend using fixed ones. Read the FAQ entry about start_date for more information.", "title": "Start Date" }, "end_date": { "anyOf": [ { "format": "date-time", "type": "string" }, { "maxItems": 2, "minItems": 2, "prefixItems": [ { "format": "date-time", "type": "string" }, { "type": "string" } ], "type": "array" }, { "type": "null" } ], "default": null, "description": "if specified, the scheduler won\u2019t go beyond this date", "title": "End Date" }, "depends_on_past": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "description": "when set to true, task instances will run sequentially and only if the previous instance has succeeded or has been skipped. The task instance for the start_date is allowed to run.", "title": "Depends On Past" }, "queue": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "description": "which queue to target when running this job. Not all executors implement queue management, the CeleryExecutor does support targeting specific queues.", "title": "Queue" }, "pool": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "description": "the slot pool this task should run in, slot pools are a way to limit concurrency for certain tasks", "title": "Pool" }, "pool_slots": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "description": "the number of pool slots this task should use (>= 1) Values less than 1 are not allowed", "title": "Pool Slots" }, "do_xcom_push": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "description": "if True, an XCom is pushed containing the Operator\u2019s result", "title": "Do Xcom Push" }, "task_display_name": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "description": "The display name of the task which appears on the UI.", "title": "Task Display Name" } } }
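The schema encodes every field as optional: each property is an `anyOf` union with `"null"` and defaults to `null`. A minimal stdlib sketch, using a trimmed excerpt of the schema above (the excerpt and helper function are illustrative, not part of the library), shows how to read those unions programmatically:

```python
import json

# Trimmed excerpt of the TaskArgs JSON schema above (two properties only).
SCHEMA_EXCERPT = """
{
  "title": "TaskArgs",
  "type": "object",
  "properties": {
    "retries": {
      "anyOf": [{"type": "integer"}, {"type": "null"}],
      "default": null,
      "title": "Retries"
    },
    "pool": {
      "anyOf": [{"type": "string"}, {"type": "null"}],
      "default": null,
      "title": "Pool"
    }
  }
}
"""

schema = json.loads(SCHEMA_EXCERPT)

def allowed_types(prop: dict) -> set:
    """Collect the JSON types a property accepts from its anyOf union."""
    return {alt["type"] for alt in prop.get("anyOf", [])}

retries = schema["properties"]["retries"]
assert allowed_types(retries) == {"integer", "null"}  # i.e. Optional[int]
assert retries["default"] is None                     # unset unless provided
```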
- Fields:
depends_on_past (bool | None)
do_xcom_push (bool | None)
email (List[str] | None)
email_on_failure (bool | None)
email_on_retry (bool | None)
end_date (datetime.datetime | Tuple[datetime.datetime, str] | None)
owner (str | None)
pool (str | None)
pool_slots (int | None)
queue (str | None)
retries (int | None)
retry_delay (datetime.timedelta | None)
start_date (datetime.datetime | Tuple[datetime.datetime, str] | None)
task_display_name (str | None)
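For readers without the library installed, the field set above can be mirrored with a plain stdlib sketch. This is only an illustration of the model's shape; the real class is a pydantic BaseModel with validators, and the `TaskArgsSketch` name and its `pool_slots` check are assumptions drawn from the field descriptions below:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional, Tuple, Union

# Hypothetical stdlib stand-in mirroring TaskArgs' fields; the real model
# is a pydantic BaseModel and performs stricter validation.
@dataclass
class TaskArgsSketch:
    owner: Optional[str] = None
    email: Optional[List[str]] = None
    email_on_failure: Optional[bool] = None
    email_on_retry: Optional[bool] = None
    retries: Optional[int] = None
    retry_delay: Optional[timedelta] = None
    start_date: Optional[Union[datetime, Tuple[datetime, str]]] = None
    end_date: Optional[Union[datetime, Tuple[datetime, str]]] = None
    depends_on_past: Optional[bool] = None
    queue: Optional[str] = None
    pool: Optional[str] = None
    pool_slots: Optional[int] = None
    do_xcom_push: Optional[bool] = None
    task_display_name: Optional[str] = None

    def __post_init__(self) -> None:
        # Mirror the documented constraint: pool_slots must be >= 1.
        if self.pool_slots is not None and self.pool_slots < 1:
            raise ValueError("pool_slots must be >= 1")

args = TaskArgsSketch(retries=3, retry_delay=timedelta(minutes=5), pool_slots=1)
assert args.owner is None  # every field defaults to None
```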
- field owner: str | None = None
The owner of the task. Using a meaningful description (e.g. a user, person, team, or role name) to clarify ownership is recommended.
- field email: List[str] | None = None
The ‘to’ email address(es) used in email alerts.
- field email_on_failure: bool | None = None
Indicates whether email alerts should be sent when a task fails.
- field email_on_retry: bool | None = None
Indicates whether email alerts should be sent when a task is retried.
- field retries: int | None = None
The number of retries that should be performed before failing the task.
- field retry_delay: timedelta | None = None
Delay between retries.
- field start_date: Annotated[datetime | Tuple[datetime, str], AfterValidator(func=_datetime_or_datetime_and_timezone)] | None = None
The start_date for the task determines the execution_date for the first task instance. The best practice is to have the start_date rounded to your DAG’s schedule_interval. Daily jobs have their start_date some day at 00:00:00; hourly jobs have their start_date at 00:00 of a specific hour. Note that Airflow simply looks at the latest execution_date and adds the schedule_interval to determine the next execution_date.

It is also very important to note that different tasks’ dependencies need to line up in time. If task A depends on task B and their start_dates are offset in a way that their execution_dates don’t line up, A’s dependencies will never be met. If you are looking to delay a task, for example running a daily task at 2AM, look into the TimeSensor and TimeDeltaSensor. We advise against using dynamic start_dates and recommend fixed ones. Read the FAQ entry about start_date for more information.
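Per the annotation above, start_date accepts either a bare datetime or a (datetime, timezone-name) pair. The private validator’s behavior can be approximated with a stdlib sketch (the `check_start_date` name and exact semantics are assumptions, not the library’s code):

```python
from datetime import datetime
from typing import Tuple, Union

StartDate = Union[datetime, Tuple[datetime, str]]

def check_start_date(value: StartDate) -> StartDate:
    """Approximation of the documented shape: a datetime, or a
    (datetime, timezone-name) 2-tuple as in the JSON schema."""
    if isinstance(value, datetime):
        return value
    if (isinstance(value, tuple) and len(value) == 2
            and isinstance(value[0], datetime) and isinstance(value[1], str)):
        return value
    raise TypeError("expected datetime or (datetime, timezone-name) tuple")

# Fixed (not dynamic) start_date, rounded to a daily schedule at 00:00:00:
check_start_date(datetime(2024, 1, 1))
check_start_date((datetime(2024, 1, 1), "America/New_York"))
```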
- field end_date: Annotated[datetime | Tuple[datetime, str], AfterValidator(func=_datetime_or_datetime_and_timezone)] | None = None
If specified, the scheduler won’t go beyond this date.
- field depends_on_past: bool | None = None
When set to True, task instances will run sequentially and only if the previous instance has succeeded or been skipped. The task instance for the start_date is allowed to run.
- field queue: str | None = None
Which queue to target when running this job. Not all executors implement queue management; the CeleryExecutor does support targeting specific queues.
- field pool: str | None = None
The slot pool this task should run in; slot pools are a way to limit concurrency for certain tasks.
- field pool_slots: int | None = None
The number of pool slots this task should use (>= 1); values less than 1 are not allowed.
- field do_xcom_push: bool | None = None
If True, an XCom is pushed containing the Operator’s result.
- field task_display_name: str | None = None
The display name of the task, which appears in the UI.