Object Storage
In DAG-Factory, you can instantiate a custom Python object, such as Airflow's `airflow.io.path.ObjectStoragePath`, directly in your YAML configuration to represent object storage (e.g., S3, GCS, Azure Blob Storage):
```yaml
object_storage_ops:
  default_args:
    owner: "custom_owner"
    start_date: 2 days
  description: "Example of ObjectStorage powered DAG"
  schedule: "0 3 * * *"
  tasks:
    object_storage_ops:
      operator: airflow.operators.python.PythonOperator
      python_callable: sample.object_storage_ops
      op_kwargs:
        my_obj_storage:
          __type__: airflow.io.path.ObjectStoragePath
          __args__:
            - file://$CONFIG_ROOT_DIR/data/object_storage_ops.csv
```
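The `__type__`/`__args__` keys tell DAG-Factory to build an `ObjectStoragePath` instance and pass it to the callable as the `my_obj_storage` keyword argument. Below is a minimal sketch of what such a callable could look like; the function body is an assumption for illustration, and only the callable name and the `my_obj_storage` argument come from the configuration above.

```python
# sample.py -- illustrative sketch only; the body is an assumption, not the
# project's actual callable.
from airflow.io.path import ObjectStoragePath


def object_storage_ops(my_obj_storage: ObjectStoragePath, **kwargs):
    """Read the file the ObjectStoragePath points to and log a simple line count."""
    # ObjectStoragePath exposes a file-like open() regardless of the backend
    # (file://, s3://, gs://, ...), so the callable stays the same per store.
    with my_obj_storage.open("r") as f:
        rows = f.readlines()
    print(f"{my_obj_storage} contains {len(rows)} line(s)")
```

Because the storage location is injected via `op_kwargs`, switching from the local `file://` path to a remote bucket only requires changing the YAML, not the Python code.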