Scheduling

  • Released in version: 0.23.0

DAG-Factory offers flexible scheduling options so your DAGs can run on time or based on data availability. Whether you're triggering DAGs on a cron schedule, at a fixed time delta, or when an upstream asset is ready, you can configure it with the schedule field in your YAML.

Below are the supported scheduling types, each with consistent structure and examples to help you get started.

How to Use

  • A plain string schedule is passed through as a cron expression or preset (for example @daily).
  • Structured schedules declare the Python object to build with the __type__ key; the object's constructor arguments are given as sibling keys (see the sketch after this list).
  • Only one schedule type should be defined per DAG.
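
For orientation, here is a minimal sketch of where schedule sits in a dag-factory YAML file. The DAG name, default_args, and task definition are illustrative assumptions only; the schedule block follows the schema documented below.

Full DAG Example
example_dag:
  default_args:
    owner: airflow
    start_date: 2024-01-01
  # Runs every 6 hours; any schedule form documented below can be substituted here.
  schedule:
    __type__: datetime.timedelta
    hours: 6
  tasks:
    hello_task:
      operator: airflow.providers.standard.operators.bash.BashOperator
      bash_command: "echo hello"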

Example Overview

| Type          | Description                           | Example Use Case                |
|---------------|---------------------------------------|---------------------------------|
| cron          | Run based on a cron string            | Every day at midnight           |
| timedelta     | Fixed intervals between runs          | Every 6 hours                   |
| relativedelta | Calendar-aware schedule (e.g. months) | Every 1st of the month          |
| timetable     | Advanced Airflow timetables           | Custom trigger logic            |
| assets        | Trigger based on asset readiness      | When data X and Y are available |
| datasets      | Trigger based on dataset readiness    | When data X and Y are available |

Schema Options

1. Cron-Based Schedule

Cron Schedule
schedule: '@daily'
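
The preset above is shorthand for a standard cron expression; @daily is equivalent to running at midnight every day:

Cron Schedule (explicit string)
schedule: '0 0 * * *'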

2. Timedelta Schedule

Timedelta Schedule
schedule:
  __type__: datetime.timedelta
  seconds: 30
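
Any datetime.timedelta constructor argument works here. The "every 6 hours" use case from the overview table, for example:

Timedelta Schedule (every 6 hours)
schedule:
  __type__: datetime.timedelta
  hours: 6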

3. RelativeDelta Schedule

Relativedelta Schedule
schedule:
  __type__: dateutil.relativedelta.relativedelta
  hour: 18
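
relativedelta distinguishes absolute arguments (singular, such as hour or day) from relative ones (plural, such as months). As a sketch of the "every 1st of the month" use case from the table, using the same __type__ convention:

Relativedelta Schedule (1st of the month)
schedule:
  __type__: dateutil.relativedelta.relativedelta
  day: 1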

4. Timetable (Advanced Scheduling)

Timetable Schedule
schedule:
  __type__: airflow.timetables.trigger.CronTriggerTimetable
  cron: "* * * * *"
  timezone: UTC
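
Any importable timetable class can be referenced this way, with its constructor arguments as sibling keys. As a second sketch (assuming the same __type__ convention), Airflow's built-in CronDataIntervalTimetable:

Timetable Schedule (data-interval variant)
schedule:
  __type__: airflow.timetables.interval.CronDataIntervalTimetable
  cron: "0 */2 * * *"
  timezone: UTC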

5. Asset-Based Triggering

OR (default when a list is provided)

A bare list of assets gets OR semantics: the DAG is triggered when any one of the listed assets is updated. The list can be given directly via builtins.list or with an explicit or key; the two forms below are equivalent.

OR Condition (list form)
schedule:
  __type__: builtins.list
  items:
    - __type__: airflow.sdk.Asset
      uri: s3://dag1/output_1.txt
      extra:
        hi: bye
    - __type__: airflow.sdk.Asset
      uri: s3://dag2/output_1.txt
      extra:
        hi: bye

OR Condition (explicit or key)
schedule:
  or:
    - __type__: airflow.sdk.Asset
      uri: s3://dag1/output_1.txt
      extra:
        hi: bye
    - __type__: airflow.sdk.Asset
      uri: s3://dag2/output_1.txt
      extra:
        hi: bye

AND (explicit composition)

With and, the DAG is triggered only after every listed asset has been updated.

AND Condition
schedule:
  and:
    - __type__: airflow.sdk.Asset
      uri: s3://dag1/output_1.txt
      extra:
        hi: bye
    - __type__: airflow.sdk.Asset
      uri: s3://dag2/output_1.txt
      extra:
        hi: bye

Nested AND/OR Condition

Conditions nest arbitrarily. In the example below, the DAG is triggered either when both the dag1 and dag2 outputs have been updated, or when the dag3 output is updated on its own.

Nested AND/OR Condition
schedule:
  or:
    - and:
        - __type__: airflow.sdk.Asset
          uri: s3://dag1/output_1.txt
          extra:
            hi: bye
        - __type__: airflow.sdk.Asset
          uri: s3://dag2/output_1.txt
          extra:
            hi: bye
    - __type__: airflow.sdk.Asset
      uri: s3://dag3/output_3.txt
      extra:
        hi: bye

With Watchers

An AssetWatcher attaches a trigger to the asset so that an external event (here, deletion of a file) emits an asset event that can trigger the DAG.

Asset with Watcher
schedule:
  __type__: airflow.sdk.Asset
  uri: s3://dag1/output_1.txt
  extra:
    hi: bye
  watchers:
    - __type__: airflow.sdk.AssetWatcher
      name: test_asset_watcher
      trigger:
        __type__: airflow.providers.standard.triggers.file.FileDeleteTrigger
        filepath: "/temp/file.txt"

6. Datasets-Based Triggering

Datasets Schedule
schedule: [ 's3://bucket_example/raw/dataset1.json', 's3://bucket_example/raw/dataset2.json' ]
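
Because this is plain YAML, the same schedule can also be written in block-list form, which reads more like the asset examples above:

Datasets Schedule (block list)
schedule:
  - 's3://bucket_example/raw/dataset1.json'
  - 's3://bucket_example/raw/dataset2.json'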