Airflow Concepts
External
Internal
Workflow
DAG
A DAG is made of tasks among which there are relations of dependency. The DAG is not concerned with what happens inside the tasks; it is only concerned with how to run them: order, retries, timeouts, etc. The edges can be labeled in the UI.
SubDAG
Declaring a DAG
- Via a context manager.
- With the DAG() constructor.
- With the @dag decorator. TO PARSE: https://airflow.apache.org/docs/apache-airflow/stable/concepts/dags.html#the-dag-decorator.
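A minimal sketch of the three declaration styles, assuming Airflow 2.x; the DAG ids and the EmptyOperator placeholder tasks are illustrative only:
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.decorators import dag
from airflow.operators.empty import EmptyOperator

# 1. Via a context manager: tasks created inside the block are added to the DAG.
with DAG(dag_id="declared_with_context_manager",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval="@daily"):
    EmptyOperator(task_id="noop")

# 2. With the DAG() constructor: tasks must reference the DAG explicitly.
my_dag = DAG(dag_id="declared_with_constructor",
             start_date=datetime.datetime(2022, 1, 1),
             schedule_interval="@daily")
EmptyOperator(task_id="noop", dag=my_dag)

# 3. With the @dag decorator: calling the decorated function registers the DAG.
@dag(start_date=datetime.datetime(2022, 1, 1), schedule_interval="@daily")
def declared_with_decorator():
    EmptyOperator(task_id="noop")

declared_with_decorator()
</syntaxhighlight>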
DAG Run
A DAG is instantiated at runtime into a DAG Run.
Control Flow
Dynamic DAG or Dynamic Task Mapping
The DAGs can be purely declarative, or they can be declared in Python code, by adding tasks dynamically.
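A minimal sketch of both flavors, assuming Airflow 2.3+ (the release that introduced dynamic task mapping); the DAG and task ids are placeholders:
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.decorators import task
from airflow.operators.empty import EmptyOperator

with DAG(dag_id="dynamic_example",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval=None):

    # Dynamic DAG: tasks added in a plain Python loop when the DAG file is parsed.
    start = EmptyOperator(task_id="start")
    for i in range(3):
        start >> EmptyOperator(task_id=f"generated_{i}")

    # Dynamic Task Mapping: one task definition expanded at runtime into
    # several task instances, one per input value.
    @task
    def add_one(x):
        return x + 1

    add_one.expand(x=[1, 2, 3])
</syntaxhighlight>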
Task
A Task is the basic unit of execution in Airflow. Every task must be assigned to a DAG to run. Tasks have dependencies on each other. There can be upstream dependencies (if B depends on A, A → B, then A is an upstream dependency of B). To be scheduled, a task must have all its dependencies met.
Task Relationships
The task relationships are a key part of using Tasks. There are two types of relationships: the upstream/downstream dependency between different tasks, and the previous/next relationship between instances of the same task across DAG Runs.
Upstream and Downstream Dependency
If a task B has a dependency on task A (A → B), it is said that A is upstream of B and B is downstream of A. The dependencies are the directed edges of the directed acyclic graph.
The term upstream has strict semantics: an upstream task is a task that directly precedes the other task. It does not describe tasks that are merely higher in the task hierarchy but are not a direct parent of the task. The same constraint applies to a downstream task, which needs to be a direct child of the other task.
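For illustration, a minimal sketch of declaring an A → B dependency (the DAG and task ids are placeholders):
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(dag_id="dependency_example",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval=None):
    a = EmptyOperator(task_id="a")
    b = EmptyOperator(task_id="b")

    # A -> B: a is upstream of b, b is downstream of a.
    a >> b
    # Equivalent forms: b << a, a.set_downstream(b), b.set_upstream(a)
</syntaxhighlight>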
Previous and Next
There may also be instances of the same task, but for different data intervals - from other runs of the same DAG. These are previous and next.
Task Types
Airflow has three types of tasks: the Operator, the Sensor (which is a subclass of Operator) and the TaskFlow-decorated Task. All of these are subclasses of Airflow's BaseOperator. Operators and Sensors are templates: when one is called in a DAG, it is made into a Task.
Operator
An Operator is a predefined task template.
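For example, the BashOperator template can be instantiated into a concrete task (the DAG id, task id and command below are placeholders):
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(dag_id="operator_example",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval=None):
    # Instantiating the operator template inside the DAG turns it into a task.
    BashOperator(task_id="say_hello", bash_command="echo hello")
</syntaxhighlight>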
Sensor
A Sensor is a subclass of Operator that waits for an external event to happen.
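A sketch using the core FileSensor, which polls until a file appears (the file path and polling interval are placeholders; the sensor relies on the default fs_default connection):
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.sensors.filesystem import FileSensor

with DAG(dag_id="sensor_example",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval=None):
    # Re-checks ("pokes") every 60 seconds until the file shows up, then succeeds.
    FileSensor(task_id="wait_for_file",
               filepath="/tmp/input.csv",
               poke_interval=60)
</syntaxhighlight>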
TaskFlow-decorated Task
A custom Python function decorated with @task and packaged up as a Task.
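A minimal TaskFlow sketch, assuming Airflow 2.x (the function names are placeholders); the return value of one task is passed to the next via XCom:
<syntaxhighlight lang='python'>
import datetime

from airflow.decorators import dag, task

@dag(start_date=datetime.datetime(2022, 1, 1), schedule_interval=None)
def taskflow_example():

    @task
    def extract():
        # The return value is passed downstream via XCom.
        return {"a": 1, "b": 2}

    @task
    def transform(data: dict):
        print(sum(data.values()))

    # Calling the decorated functions wires the dependency extract -> transform.
    transform(extract())

taskflow_example()
</syntaxhighlight>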
Task Assignment to DAG
Task Instance
In the same way a DAG is instantiated at runtime into a DAG Run, the tasks under a DAG are instantiated into Task Instances.
Task States
- none: The task has not yet been queued for execution because its dependencies are not yet met.
- scheduled: The task's dependencies have been met, and the scheduler has determined that the task should run.
- queued: The task has been assigned to an executor and is awaiting a worker.
- running: The task is running on a worker or on a local/synchronous executor.
- success: The task finished running without errors.
- shutdown: The task was externally requested to shut down while it was running.
- restarting: The task was externally requested to restart while it was running.
- failed: The task had an error during execution and failed to run.
- skipped: The task was skipped due to branching, LatestOnly or similar.
- upstream_failed: An upstream task failed and the Trigger Rule says we needed it.
- up_for_retry: The task failed, but has retry attempts left and will be rescheduled.
- up_for_reschedule: The task is a sensor that is in reschedule mode.
- sensing: The task is a Smart Sensor.
- deferred: The task has been deferred to a trigger.
- removed: The task has vanished from the DAG since the run started.
Task Lifecycle
The normal lifecycle of a task instance is none → scheduled → queued → running → success.
Passing Data between Tasks
Tasks pass data among each other using:
- XComs, when the amount of metadata to be exchanged is small.
- Uploading and downloading large files from a storage service.
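A sketch combining the two patterns, assuming Airflow 2.x and workers that share a filesystem (the /tmp path stands in for a real storage service); only the small reference, a file path, travels through XCom:
<syntaxhighlight lang='python'>
import datetime

from airflow.decorators import dag, task

@dag(start_date=datetime.datetime(2022, 1, 1), schedule_interval=None)
def data_passing_example():

    @task
    def produce() -> str:
        # The large payload goes to shared storage; only its location is returned.
        path = "/tmp/large_payload.bin"
        with open(path, "wb") as f:
            f.write(b"\x00" * 1024 * 1024)
        return path  # the returned path is exchanged via XCom

    @task
    def consume(path: str):
        with open(path, "rb") as f:
            print(len(f.read()))

    consume(produce())

data_passing_example()
</syntaxhighlight>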
TaskGroup
A TaskGroup is a pure UI concept: it is used to organize tasks into hierarchical groups in the Graph view.
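A minimal sketch, assuming Airflow 2.x (the group and task ids are placeholders); the group only changes how the tasks are drawn in the Graph view:
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.utils.task_group import TaskGroup

with DAG(dag_id="task_group_example",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval=None):
    start = EmptyOperator(task_id="start")

    # Tasks declared inside the group are rendered as a collapsible box in the UI.
    with TaskGroup(group_id="processing") as processing:
        EmptyOperator(task_id="step_1") >> EmptyOperator(task_id="step_2")

    start >> processing
</syntaxhighlight>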
Task Timeout
Task SLA
An SLA, or a Service Level Agreement, is an expectation for the maximum time a Task should take.
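A sketch of attaching an SLA to a task, assuming Airflow 2.x; the 30-minute value and the ids are placeholders:
<syntaxhighlight lang='python'>
import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(dag_id="sla_example",
         start_date=datetime.datetime(2022, 1, 1),
         schedule_interval="@daily"):
    # If the task has not completed within 30 minutes of the DAG Run's start,
    # an SLA miss is recorded.
    BashOperator(task_id="report",
                 bash_command="sleep 5",
                 sla=datetime.timedelta(minutes=30))
</syntaxhighlight>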
Zombie/Undead Tasks
Per-Task Executor Configuration
XComs
Short for "cross-communications", XComs are a mechanism that lets tasks exchange small amounts of data with each other.