.. This file is autogenerated by `docs/scripts/generate_mappings.py`. Do not edit by hand.

VerticaUserPassword
===================

Maps Airflow Vertica connections that use username + password authentication to dbt profiles.

.. note::

    Use the Airflow connection ``schema`` field for the Vertica ``database`` to keep it
    consistent with other connection types and profiles. The Vertica Airflow provider hook
    assumes this. This also seems to be a common approach for Postgres, Redshift, and
    Exasol, since there is no ``database`` field in the Airflow connection and ``schema``
    is not required for the database connection.

.. seealso::

    https://docs.getdbt.com/reference/warehouse-setups/vertica-setup
    https://airflow.apache.org/docs/apache-airflow-providers-vertica/stable/connections/vertica.html

This profile mapping translates Airflow connections with the type ``vertica`` into dbt profiles.
To use this profile, import it from ``cosmos.profiles``:

.. code-block:: python

    from cosmos.profiles import VerticaUserPasswordProfileMapping

    profile = VerticaUserPasswordProfileMapping(
        conn_id="my_vertica_connection",
        profile_args={...},
    )

While the profile mapping pulls fields from Airflow connections, you may need to supplement it
with additional ``profile_args``. The table below shows which fields are required, along with
those that are not required but are pulled from the Airflow connection if present. You can also
add extra fields to the ``profile_args`` dict.

.. list-table::
   :header-rows: 1

   * - dbt Field Name
     - Required
     - Airflow Field Name
   * - ``host``
     - True
     - ``host``
   * - ``username``
     - True
     - ``login``
   * - ``password``
     - True
     - ``password``
   * - ``port``
     - False
     - ``port``
   * - ``database``
     - True
     - ``schema``
   * - ``autocommit``
     - False
     - ``extra.autocommit``
   * - ``backup_server_node``
     - False
     - ``extra.backup_server_node``
   * - ``binary_transfer``
     - False
     - ``extra.binary_transfer``
   * - ``connection_load_balance``
     - False
     - ``extra.connection_load_balance``
   * - ``connection_timeout``
     - False
     - ``extra.connection_timeout``
   * - ``disable_copy_local``
     - False
     - ``extra.disable_copy_local``
   * - ``kerberos_host_name``
     - False
     - ``extra.kerberos_host_name``
   * - ``kerberos_service_name``
     - False
     - ``extra.kerberos_service_name``
   * - ``log_level``
     - False
     - ``extra.log_level``
   * - ``log_path``
     - False
     - ``extra.log_path``
   * - ``oauth_access_token``
     - False
     - ``extra.oauth_access_token``
   * - ``request_complex_types``
     - False
     - ``extra.request_complex_types``
   * - ``session_label``
     - False
     - ``extra.session_label``
   * - ``ssl``
     - False
     - ``extra.ssl``
   * - ``unicode_error``
     - False
     - ``extra.unicode_error``
   * - ``use_prepared_statements``
     - False
     - ``extra.use_prepared_statements``
   * - ``workload``
     - False
     - ``extra.workload``
   * - ``schema``
     - True
     -

Some notes about the table above:

- This table doesn't necessarily show the full list of fields you *can* pass to the dbt
  profile. To see the full list of fields, see the link to the dbt docs at the top of
  this page.
- If the Airflow field name starts with ``extra.``, the field is nested under the
  ``extra`` field in the Airflow connection. For example, an Airflow field name of
  ``extra.token`` means the field is named ``token`` and lives under ``extra`` in the
  Airflow connection.
- If there are multiple Airflow field names, the profile mapping looks at those fields
  in order. For example, if the Airflow field name is ``['password', 'extra.token']``,
  the profile mapping will first look for a field named ``password``. If that field is
  not present, it will look for ``extra.token``.
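To make these field-resolution rules concrete, the following is a minimal, self-contained sketch of how a mapping like the one in the table could resolve dbt profile values from an Airflow-style connection. This is **not** Cosmos's actual implementation; the ``resolve_field`` helper and the ``conn`` dict are hypothetical stand-ins for illustration only.

.. code-block:: python

    import json


    def resolve_field(conn: dict, airflow_names: list) -> object:
        """Return the first non-missing value among the given Airflow field
        names. Names prefixed with 'extra.' are looked up inside the JSON
        'extra' blob, mirroring the table above."""
        for name in airflow_names:
            if name.startswith("extra."):
                extra = json.loads(conn.get("extra") or "{}")
                value = extra.get(name[len("extra."):])
            else:
                value = conn.get(name)
            if value is not None:
                return value
        return None


    # Hypothetical dict standing in for an Airflow Connection.
    conn = {
        "host": "vertica.example.com",
        "login": "dbt_user",
        "password": "s3cret",
        "schema": "analytics",
        "extra": json.dumps({"ssl": True, "connection_timeout": 10}),
    }

    profile = {
        "type": "vertica",
        "host": resolve_field(conn, ["host"]),
        "username": resolve_field(conn, ["login"]),
        "password": resolve_field(conn, ["password"]),
        # Airflow ``schema`` maps to dbt ``database``, per the note above.
        "database": resolve_field(conn, ["schema"]),
        "ssl": resolve_field(conn, ["extra.ssl"]),
    }

Note how the ordered lookup works: ``resolve_field(conn, ["password", "extra.token"])`` returns the top-level ``password`` value because it is present, and would only fall back to ``extra.token`` if it were not.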