Similar dbt & Airflow concepts

dbt is an open-source tool for data transformations and analysis using SQL, while Airflow is a platform for developing, scheduling, and monitoring batch-oriented workflows using Python. Although the two tools differ in many ways, they share several similar concepts.

This page lists some of these shared concepts to help those who are new to Airflow or dbt and are considering using Cosmos.

| Airflow naming | dbt naming | Description | Differences |
|---|---|---|---|
| DAG | Workflow | Pipeline (Directed Acyclic Graph) that contains a group of steps | Airflow expects upstream tasks to have succeeded before running downstream tasks. dbt can run a subset of nodes, assuming their upstream nodes were already run. |
| Task | Node | Step within a pipeline (DAG or workflow) | In dbt, these are usually transformations that run on a remote database. In Airflow, steps can be anything, running locally in Airflow or remotely. |
| Language | Language | Programming or declarative language used to define pipelines and steps | In dbt, users write SQL, YAML, and Python to define the steps of a pipeline. Airflow expects steps and pipelines to be written in Python. |
| Variables | Variables | Key-value configuration that can be used in steps and avoids hard-coded values | |
| Templating | Macros | Jinja templating used to access variables, configuration, and to reference steps | dbt encourages Jinja templating for control structures (if and for). Jinja is native to Airflow/Python and is used to define variables, macros, and filters. |
| Connection | Profile | Configuration to connect to databases or other services | |
| Providers | Adapter | Additional Python libraries that support specific databases or services | |
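To make the DAG and Task rows above concrete, here is a minimal sketch of an Airflow pipeline, assuming Airflow 2.4+. The `dag_id`, task ids, and bash commands are illustrative placeholders, not part of Cosmos or dbt.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_pipeline",          # the DAG is the pipeline (dbt: workflow)
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Each operator instance is a task (dbt: node). Here they only echo text;
    # in dbt they would usually be SQL transformations run on the warehouse.
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    transform = BashOperator(task_id="transform", bash_command="echo transform")

    # Airflow only runs `transform` after `extract` succeeds, whereas dbt can
    # run a subset of nodes and assume their upstream models already exist.
    extract >> transform
```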
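The Variables and Templating rows can be illustrated with Airflow's Jinja templating, the rough counterpart of dbt vars and macros. This is a sketch assuming Airflow 2.4+; the Variable name `target_schema` and the DAG are made up for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="templating_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    # "{{ var.value.target_schema }}" reads the Airflow Variable named
    # "target_schema" and "{{ ds }}" is the run's logical date; both are
    # resolved by Jinja at run time, much as dbt macros expose vars and run
    # metadata inside SQL models.
    templated_step = BashOperator(
        task_id="templated_step",
        bash_command="echo 'schema={{ var.value.target_schema }} date={{ ds }}'",
    )
```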
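For the Connection and Profile rows, the sketch below shows how an Airflow Connection carries the same information a dbt `profiles.yml` target does; Cosmos profile mappings automate this translation. The connection id `warehouse_db` and the URI are assumptions for illustration.

```python
# The connection could be defined in the Airflow UI or through an environment
# variable such as:
#   AIRFLOW_CONN_WAREHOUSE_DB="postgresql://user:password@host:5432/dbname"
from airflow.hooks.base import BaseHook

conn = BaseHook.get_connection("warehouse_db")

# The same fields a dbt profiles.yml target would declare as host, schema,
# user, and port.
print(conn.host, conn.schema, conn.login, conn.port)
```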