# KubernetesPodOperator

In DAG Factory, you can use the Airflow Kubernetes provider's `KubernetesPodOperator` to create and run Pods on a Kubernetes cluster.

## Example DAG

First, define your DAG configuration in a YAML file, saved here as `kpo.yml` to match the loader script below:

```yaml
kubernetes_pod_dag:
  start_date: 2025-01-01
  schedule_interval: "@daily"
  description: "A DAG that runs a simple KubernetesPodOperator task"
  catchup: false
  tasks:
    - task_id: hello-world-pod
      operator: airflow.providers.cncf.kubernetes.operators.pod.KubernetesPodOperator
      config_file: "path/to/kube/config"
      image: "python:3.12-slim"
      cmds: ["python", "-c"]
      arguments: ["print('Hello from KubernetesPodOperator!')"]
      name: "example-pod-task"
      namespace: "default"
      get_logs: true
      container_resources:
        __type__: kubernetes.client.models.V1ResourceRequirements
        limits:
          cpu: "1"
          memory: "1024Mi"
        requests:
          cpu: "0.5"
          memory: "512Mi"
```
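
Here the `__type__` key tells DAG Factory to build an instance of the named class, `kubernetes.client.models.V1ResourceRequirements`, from the nested values, so the operator receives a real Kubernetes resources object rather than a plain dictionary.

For comparison, this is a minimal sketch of the equivalent task defined directly in Python, assuming the `apache-airflow-providers-cncf-kubernetes` provider is installed (in a real DAG file the operator would be created inside a `DAG` context). Each YAML key above maps one-to-one onto an operator argument:

```python
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

# Equivalent of the hello-world-pod task from the YAML configuration above.
hello_world_pod = KubernetesPodOperator(
    task_id="hello-world-pod",
    config_file="path/to/kube/config",  # kubeconfig used to reach the cluster
    image="python:3.12-slim",
    cmds=["python", "-c"],
    arguments=["print('Hello from KubernetesPodOperator!')"],
    name="example-pod-task",  # name of the Pod created on the cluster
    namespace="default",
    get_logs=True,  # stream the Pod's stdout into the Airflow task log
    container_resources=k8s.V1ResourceRequirements(
        limits={"cpu": "1", "memory": "1024Mi"},
        requests={"cpu": "0.5", "memory": "512Mi"},
    ),
)
```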

Then, load the YAML configuration dynamically with `load_yaml_dags`:

```python
import os
from pathlib import Path

# The commented import below keeps the strings "airflow" and "DAG" in this
# file so the Airflow scheduler's DAG-file heuristic parses it.
# from airflow import DAG
from dagfactory import load_yaml_dags

# Root directory holding the YAML DAG configurations; override via the
# CONFIG_ROOT_DIR environment variable.
DEFAULT_CONFIG_ROOT_DIR = "/usr/local/airflow/dags/"
CONFIG_ROOT_DIR = Path(os.getenv("CONFIG_ROOT_DIR", DEFAULT_CONFIG_ROOT_DIR))

config_file = str(CONFIG_ROOT_DIR / "kpo.yml")

# Build the DAG from the YAML configuration and register it in this
# module's globals so Airflow can discover it.
load_yaml_dags(
    globals_dict=globals(),
    config_filepath=config_file,
)
```
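
Once the scheduler parses this file, the `kubernetes_pod_dag` DAG appears in the Airflow UI (and in the output of `airflow dags list`); triggering it runs `example-pod-task` as a Pod in the `default` namespace.

If you keep several YAML configurations side by side, `load_yaml_dags` can also scan a whole directory instead of a single file. A minimal sketch, assuming your installed DAG Factory version supports the `dags_folder` and `suffix` parameters:

```python
from dagfactory import load_yaml_dags

# Build a DAG from every *.yml configuration found under the folder
# (illustrative path; adjust to your deployment).
load_yaml_dags(
    globals_dict=globals(),
    dags_folder="/usr/local/airflow/dags/",
    suffix=[".yml"],
)
```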