Available profile mappings:
- AthenaAccessKey
- ClickhouseUserPassword
- GoogleCloudServiceAccountFile
- GoogleCloudServiceAccountDict
- GoogleCloudOauth
- DatabricksToken
- DatabricksOauth
- OracleUserPassword
- PostgresUserPassword
- RedshiftUserPassword
- SnowflakeUserPassword
- SnowflakeEncryptedPrivateKeyFilePem
- SnowflakeEncryptedPrivateKeyPem
- SnowflakePrivateKeyPem
- SparkThrift
- ExasolUserPassword
- TeradataUserPassword
- TrinoLDAP
- TrinoCertificate
- TrinoJWT
- VerticaUserPassword
Profiles Overview
Cosmos supports two methods of authenticating with your database:

- using your own dbt profiles.yml file
- using Airflow connections via Cosmos’ profile mappings
If you’re already interacting with your database from Airflow and have a connection set up, it’s recommended to use a profile mapping to translate that Airflow connection to a dbt profile. This is because it’s easier to maintain a single connection object in Airflow than it is to maintain a connection object in Airflow and a dbt profile in your dbt project.
If you don’t already have an Airflow connection, or if there’s no readily-available profile mapping for your database, you can use your own dbt profiles.yml file.
Regardless of which method you use, you’ll need to tell Cosmos which profile and target name it should use. Profile config is set in the cosmos.config.ProfileConfig object, like so:
from cosmos import DbtDag
from cosmos.config import ProfileConfig

profile_config = ProfileConfig(
    profile_name="my_profile_name",
    target_name="my_target_name",
    # choose one of the following
    profile_mapping=...,
    profiles_yml_filepath=...,
)

dag = DbtDag(profile_config=profile_config, ...)
Using a profile mapping
Profile mappings are utilities provided by Cosmos that translate Airflow connections into dbt profiles, which means you can authenticate with your database in dbt using the same connection objects you already use in Airflow. Each supported translation from an Airflow connection to a dbt profile is implemented as its own class in Cosmos.
You can find the available profile mappings in the list at the top of this page. Each profile mapping is imported from cosmos.profiles and takes two arguments:

- conn_id: the Airflow connection ID to use.
- profile_args: a dictionary of additional arguments to pass to the dbt profile. This is useful for specifying values that are not in the Airflow connection, and it also overrides any values that are in the connection.
Below is an example of using the Snowflake profile mapping, where we take most arguments from the Airflow connection but override the database and schema values:
from cosmos import DbtDag
from cosmos.config import ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

profile_config = ProfileConfig(
    profile_name="my_profile_name",
    target_name="my_target_name",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="my_snowflake_conn_id",
        profile_args={
            "database": "my_snowflake_database",
            "schema": "my_snowflake_schema",
        },
    ),
)

dag = DbtDag(profile_config=profile_config, ...)
Note that when using a profile mapping, the profiles.yml file gets generated with the profile name and target name you specify in ProfileConfig.
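For illustration, the generated file is a standard dbt profiles.yml. A minimal sketch of its shape for the Snowflake example above, assuming the connection supplies the account and user (field names follow dbt’s documented Snowflake profile; the exact output and the environment-variable name used for the password are illustrative, not Cosmos’ literal output):

my_profile_name:
  target: my_target_name
  outputs:
    my_target_name:
      type: snowflake
      account: my_account  # taken from the Airflow connection
      user: my_user  # taken from the Airflow connection
      # sensitive fields are referenced via environment variables rather than
      # written to disk; the variable name here is an assumption for illustration
      password: "{{ env_var('COSMOS_CONN_SNOWFLAKE_PASSWORD') }}"
      database: my_snowflake_database  # overridden via profile_args
      schema: my_snowflake_schema  # overridden via profile_args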
Disabling dbt event tracking
Added in version 1.3.
By default, dbt tracks events by sending anonymous usage data when dbt commands are invoked. Users can opt out of event tracking by updating their profiles.yml file. If you’d like to disable this behavior in the Cosmos-generated profile, pass disable_event_tracking=True to the profile mapping, as in the example below:
from cosmos import DbtDag
from cosmos.config import ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

profile_config = ProfileConfig(
    profile_name="my_profile_name",
    target_name="my_target_name",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="my_snowflake_conn_id",
        profile_args={
            "database": "my_snowflake_database",
            "schema": "my_snowflake_schema",
        },
        disable_event_tracking=True,
    ),
)

dag = DbtDag(profile_config=profile_config, ...)
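For context, dbt reads this setting from the config block of profiles.yml (documented in the dbt docs as send_anonymous_usage_stats), so passing disable_event_tracking=True amounts to the generated profile gaining an entry like this sketch:

config:
  # dbt's documented switch for anonymous event tracking
  send_anonymous_usage_stats: false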
dbt profile config variables
Added in version 1.4.0.
Some settings in profiles.yml aren’t specific to a particular data platform (see the dbt docs on profile configuration). You can set these in the Cosmos-generated profile by passing a DbtProfileConfigVars object to the profile mapping’s dbt_config_vars argument:
from cosmos import DbtDag
from cosmos.config import ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping, DbtProfileConfigVars

profile_config = ProfileConfig(
    profile_name="my_profile_name",
    target_name="my_target_name",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="my_snowflake_conn_id",
        profile_args={
            "database": "my_snowflake_database",
            "schema": "my_snowflake_schema",
        },
        dbt_config_vars=DbtProfileConfigVars(
            send_anonymous_usage_stats=False,
            partial_parse=True,
            use_experimental_parse=True,
            static_parser=True,
            printer_width=120,
            write_json=True,
            warn_error=True,
            warn_error_options={"include": "all"},
            log_format="text",
            debug=True,
            version_check=True,
        ),
    ),
)

dag = DbtDag(profile_config=profile_config, ...)
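These variables correspond to the same platform-independent config block of profiles.yml mentioned above. As a hedged sketch, a few of the keys from the example would land in the generated file roughly like this (showing only keys whose dbt names I’m confident match one-to-one):

config:
  partial_parse: true
  printer_width: 120
  log_format: text
  debug: true
  version_check: true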
Using your own profiles.yml file
If you don’t want to use Airflow connections, or if there’s no readily-available profile mapping for your database, you can use your own dbt profiles.yml file. To do so, pass the path to your profiles.yml file to the profiles_yml_filepath argument in ProfileConfig.
For example, the code snippet below points Cosmos at a profiles.yml file and instructs Cosmos to use the my_snowflake_profile profile and dev target:
from cosmos import DbtDag
from cosmos.config import ProfileConfig

profile_config = ProfileConfig(
    profile_name="my_snowflake_profile",
    target_name="dev",
    profiles_yml_filepath="/path/to/profiles.yml",
)

dag = DbtDag(profile_config=profile_config, ...)
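For reference, the file at that path is a plain dbt profiles.yml, and it must contain the profile and target names you pass to ProfileConfig. A minimal Snowflake-flavored sketch (all connection values are placeholders; see the dbt docs for the full set of fields per adapter):

my_snowflake_profile:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: my_account
      user: my_user
      # keep secrets out of the file, e.g. via dbt's env_var function
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: my_role
      database: my_database
      warehouse: my_warehouse
      schema: my_schema
      threads: 4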