Scheduling

  • Released in version: 0.23.0

DAG-Factory offers flexible scheduling options so your DAGs can run on a time-based schedule or in response to data availability. Whether you trigger DAGs on a cron expression, at a fixed time delta, or when an upstream asset becomes available, you can configure it easily with the schedule field in your YAML.

Below are the supported scheduling types, each with consistent structure and examples to help you get started.

How to Use

  • A schedule can be given as a shorthand (a plain cron string such as '@daily', or a list of dataset URIs) or as a structured block.
  • A structured schedule block must specify a type, such as cron, timedelta, relativedelta, timetable, or assets.
  • The actual configuration goes under the value key.
  • Only one schedule type should be defined per DAG.
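
Putting these pieces together, the schedule block sits alongside the rest of a DAG's configuration. Below is a minimal sketch; the DAG name, default_args, and task are hypothetical placeholders, and the operator path may differ by Airflow version:

Example DAG Config
example_dag:
  default_args:
    owner: airflow
    start_date: '2024-01-01'
  # One schedule type per DAG; the cron form is shown here.
  schedule:
    type: cron
    value: "0 0 * * *"
  tasks:
    task_1:
      # Placeholder operator; swap in your own.
      operator: airflow.operators.empty.EmptyOperator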

Example Overview

| Type | Description | Use Case Example |
| --- | --- | --- |
| cron | Run based on a cron string | Every day at midnight |
| timedelta | Fixed intervals between runs | Every 6 hours |
| relativedelta | Calendar-aware schedule (e.g. months) | Every 1st of the month |
| timetable | Advanced Airflow timetables | Custom trigger logic |
| assets | Trigger based on asset readiness | When data X and Y are available |
| datasets | Trigger based on dataset readiness | When data X and Y are available |

Schema Options

1. Cron-Based Schedule

Cron Schedule
schedule: '@daily'

Or, using the explicit type/value form:

Cron Schedule
schedule:
  type: cron
  value: "0 0 * * *"

2. Timedelta Schedule

Timedelta Schedule
schedule:
  type: timedelta
  value:
    seconds: 30
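
The keys under value are assumed to be passed straight through as datetime.timedelta keyword arguments, so other units (and combinations of them) should work the same way. A sketch matching the six-hour cadence from the overview table:

Timedelta Schedule
schedule:
  type: timedelta
  value:
    hours: 6  # assumed to map to datetime.timedelta(hours=6)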

3. RelativeDelta Schedule

Relativedelta Schedule
schedule:
  type: relativedelta
  value:
    month: 1
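
Note that dateutil's relativedelta treats plural keys (months, days) as relative offsets and singular keys (month, day) as absolute values. Assuming DAG-Factory passes these keys straight through to dateutil.relativedelta.relativedelta, the "every 1st of the month" case from the overview table could be sketched as:

Relativedelta Schedule
schedule:
  type: relativedelta
  value:
    months: 1  # advance one month per run (relative)
    day: 1     # pin to the first day of the month (absolute)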

4. Timetable (Advanced Scheduling)

Timetable Schedule
schedule:
  type: timetable
  value:
    callable: airflow.timetables.trigger.CronTriggerTimetable
    params:
      cron: "* * * * *"
      timezone: UTC
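
Any importable timetable class should work the same way. As a further sketch, assuming entries under params are passed as keyword arguments to the timetable's constructor, Airflow's built-in CronDataIntervalTimetable could be configured as:

Timetable Schedule
schedule:
  type: timetable
  value:
    callable: airflow.timetables.interval.CronDataIntervalTimetable
    params:
      cron: "0 */6 * * *"
      timezone: UTC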

5. Asset-Based Triggering

OR (the default when a plain list is provided)

OR Condition
schedule:
  type: assets
  value:
    - uri: s3://dag1/output_1.txt
      extra:
        hi: bye
    - uri: s3://dag2/output_1.txt
      extra:
        hi: bye

Or, with the or key made explicit:

Explicit OR Condition
schedule:
  type: assets
  value:
    or:
      - uri: s3://dag1/output_1.txt
        extra:
          hi: bye
      - uri: s3://dag2/output_1.txt
        extra:
          hi: bye

AND (explicit composition)

AND Condition
schedule:
  type: assets
  value:
    and:
      - uri: s3://dag1/output_1.txt
        extra:
          hi: bye
      - uri: s3://dag2/output_1.txt
        extra:
          hi: bye

Nested AND/OR Condition

Nested AND/OR Condition
schedule:
  type: assets
  value:
    or:
      - and:
          - uri: s3://dag1/output_1.txt
            extra:
              hi: bye
          - uri: s3://dag2/output_1.txt
            extra:
              hi: bye
      - uri: s3://dag3/output_3.txt
        extra:
          hi: bye

With Watchers

Asset with Watcher
schedule:
  type: assets
  value:
    - uri: s3://dag1/output_1.txt
      extra:
        hi: bye
      watchers:
        - callable: airflow.sdk.AssetWatcher
          name: test_asset_watcher
          trigger:
            callable: airflow.providers.standard.triggers.file.FileDeleteTrigger
            params:
              filepath: "/temp/file.txt"

6. Datasets-Based Triggering

Datasets Schedule
schedule: [ 's3://bucket_example/raw/dataset1.json', 's3://bucket_example/raw/dataset2.json' ]
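
The datasets type predates the asset terminology (Airflow 3 renamed Datasets to Assets). Assuming it also accepts the same explicit type/value structure as assets, the shorthand list above could be sketched as:

Datasets Schedule
schedule:
  type: datasets
  value:
    # Sketch only; assumes the datasets schema mirrors assets.
    - uri: s3://bucket_example/raw/dataset1.json
    - uri: s3://bucket_example/raw/dataset2.json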