Cosmos Contributing Guide#

All contributions, bug reports, bug fixes, documentation improvements, and enhancements are welcome.

As contributors and maintainers of this project, you are expected to abide by the Contributor Code of Conduct.

Learn more about contributor roles in Contributor roles.

Overview#

To contribute to the cosmos project:

  1. Please create a GitHub Issue describing your contribution

  2. Open a feature branch off of the main branch and create a Pull Request from your feature branch into the main branch (see the example commands after this list)

  3. Link your issue to the pull request

  4. Once development is complete on your feature branch, request a review; the PR will be merged once approved.
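
For step 2, a typical sequence of commands might look like the following (the branch name here is hypothetical):

git checkout -b my-feature
# ...make and commit your changes...
git push -u origin my-feature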

Set up local development on the host machine#

This guide will set up astronomer-cosmos development on the host machine. First, clone the astronomer-cosmos repo and enter the repo directory:

git clone https://github.com/astronomer/astronomer-cosmos.git
cd astronomer-cosmos/

Then, install Apache Airflow and astronomer-cosmos in a Python virtual environment:

python3 -m venv env && source env/bin/activate
pip3 install "apache-airflow[cncf.kubernetes,openlineage]"
pip3 install -e ".[dbt-postgres,dbt-databricks]"

Set the Airflow home to the dev/ directory and disable loading example DAGs:

export AIRFLOW_HOME=$(pwd)/dev/
export AIRFLOW__CORE__LOAD_EXAMPLES=false

Then, run Airflow in standalone mode, which will create a new user (if one does not exist) and start the necessary Airflow components (webserver, scheduler, and triggerer).

By default, Airflow will use SQLite as its database. You can override this by setting the AIRFLOW__DATABASE__SQL_ALCHEMY_CONN variable to a SQL connection string.
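
For example, to point Airflow at a local Postgres instance instead (the credentials and database name below are hypothetical):

export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@localhost:5432/airflow

With the environment configured, start Airflow: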

airflow standalone

Once Airflow is up, you can access the Airflow UI at http://localhost:8080.

Note: whenever you want to start the development server, you need to activate the virtual environment and set the environment variables again, for example:
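
source env/bin/activate
export AIRFLOW_HOME=$(pwd)/dev/
export AIRFLOW__CORE__LOAD_EXAMPLES=false
airflow standalone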

Using Docker Compose for local development#

It is also possible to build the development environment using Docker Compose.

To launch a local sandbox with docker compose, first clone the astronomer-cosmos repo and enter the repo directory:

git clone https://github.com/astronomer/astronomer-cosmos.git
cd astronomer-cosmos/

To prevent permission errors on Linux, you must create the dags, logs, and plugins folders and change their owner to the astro user with user ID 50000. To do this, run the following commands:

mkdir -p dev/dags dev/logs dev/plugins
sudo chown -R 50000:50000 dev/dags dev/logs dev/plugins

Then, run the docker compose command:

docker compose -f dev/docker-compose.yaml up -d --build

Once the sandbox is up, you can access the Airflow UI at http://localhost:8080.
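
To stop and remove the sandbox when you are done, run:

docker compose -f dev/docker-compose.yaml down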

Testing the application with hatch#

We currently use hatch for building and distributing astronomer-cosmos.

The tool can also be used for local development. The pyproject.toml file currently defines a matrix of supported Python and Airflow versions against which the tests can be run.

For instance, to run the tests using Python 3.10 and Apache Airflow® 2.5, use the following:

hatch run tests.py3.10-2.5:test-cov

It is also possible to run the tests for all the matrix combinations by using:

hatch run tests:test-cov

The integration tests rely on Postgres. You can host Postgres using Docker, for example:

docker run --name postgres -p 5432:5432 -p 5433:5433 -e POSTGRES_PASSWORD=postgres postgres

To run the integration tests for the first time, use:

export AIRFLOW_HOME=$(pwd)
export AIRFLOW_CONN_AIRFLOW_DB=postgres://postgres:postgres@0.0.0.0:5432/postgres
export DATABRICKS_HOST=''
export DATABRICKS_TOKEN=''
export DATABRICKS_WAREHOUSE_ID=''
export DATABRICKS_CLUSTER_ID=''
export POSTGRES_PORT=5432
export POSTGRES_SCHEMA=public
export POSTGRES_DB=postgres
export POSTGRES_PASSWORD=postgres
export POSTGRES_USER=postgres
export POSTGRES_HOST=localhost
hatch run tests.py3.8-2.5:test-integration-setup
hatch run tests.py3.8-2.5:test-integration

When testing against the same Airflow and Python versions, subsequent runs of the integration tests can skip the setup step:

hatch run tests.py3.8-2.5:test-integration

Pre-Commit#

We use pre-commit to run a number of checks on the code before committing. To install the pre-commit hooks in your local repository, run:

pre-commit install
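
If the pre-commit command itself is not available, install the tool first, for example with pip:

pip install pre-commit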

To run the checks manually, run:

pre-commit run --all-files

Writing Docs#

You can build and serve the docs locally by running the following:

hatch run docs:serve

This will run the docs server in a virtual environment with the right dependencies. Note that the first run may take longer while the virtual environment is set up, but subsequent runs will be quick.

Building#

We use `hatch` to build the project. To build it, run:

hatch build
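
By default, `hatch` writes the build artifacts (an sdist and a wheel) to the dist/ directory, where you can inspect them before publishing:

ls dist/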

Releasing#

We use GitHub Actions to create and deploy new releases. To create a new release, first bump the version using:

hatch version minor

`hatch` will automatically update the version for you. Then, create a new release on GitHub with the new version; the release will be automatically deployed to PyPI.

Note: you can update the version in a few different ways. Check out the hatch docs to learn more.
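
For instance, assuming hatch's standard versioning CLI, you can bump a specific segment or set an explicit version (the version number below is illustrative):

hatch version patch
hatch version 1.2.3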