# troubleshooting
I use both `target-postgres` and `target-snowflake` in the same pipeline, where I transform data loaded by Meltano with dbt. Snowflake stores table/column names upper-cased; PostgreSQL and all other DBs store them lower-cased. In the dbt transformation I handle a column named `user`, which is a SQL keyword, so I have to enclose it in double quotes: `"user"`. But that does not work with Snowflake, where I would have to use `"USER"`. Does anyone know a workaround for this use case? Is it possible to force Meltano to create tables with upper-cased/lower-cased table/column names?
OK, I implemented the following solution in dbt:
```sql
{% macro get_db_entity_name(entity_name, node) -%}
    {%- if target.type == "snowflake" -%}
        {{ entity_name.upper() }}
    {%- else -%}
        {{ entity_name }}
    {%- endif -%}
{%- endmacro %}
```
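For anyone hitting the same issue, a hypothetical call site in a model might look like this (the macro and the `user` column come from the messages above; `my_source` is a made-up ref):

```sql
-- Sketch: wrap every quoted identifier in the macro so the same model
-- compiles on both Snowflake ("USER") and PostgreSQL ("user").
select
    "{{ get_db_entity_name('user', this) }}" as user_name
from {{ ref('my_source') }}
```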
I would argue it's a bug in the target, but I like the idea of using a dbt macro to work around it. The only other alternative I can think of is to fork the target... 😕