michael_cooper  08/28/2020, 2:08 PM
I'm using target-snowflake and noticed an error when not defining SF_SCHEMA in my .env file. If undefined, I get this critical error:
target-snowflake | ERROR Exception in schema_apply() while processing:
target-snowflake | {"type": "SCHEMA", "stream": "issue_labels", "schema": {"type": ["null", "object"], "properties": {"_sdc_repository": {"type": ["null", "string"]}, "id": {"type": ["null", "number"]}, "node_id": {"type": ["null", "string"]}, "url": {"type": ["null", "string"]}, "name": {"type": ["null", "string"]}, "description": {"type": ["null", "string"]}, "color": {"type": ["null", "string"]}, "default": {"type": ["null", "boolean"]}}}, "key_properties": ["id"]}
target-snowflake |
target-snowflake | CRITICAL (snowflake.connector.errors.ProgrammingError) 002003 (02000): SQL compilation error:
target-snowflake | CRITICAL Schema 'COOPER_DB."tap_github"' does not exist or not authorized.
target-snowflake | CRITICAL [SQL: GRANT USAGE ON SCHEMA "COOPER_DB"."tap_github" TO ROLE SYSADMIN;]
target-snowflake | CRITICAL (Background on this error at: <http://sqlalche.me/e/13/f405>)
When I look at my database, it does create a schema called TAP_GITHUB, but it still fails. When I define SF_SCHEMA as TAP_GITHUB, the error does not occur.
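For reference, the workaround described above amounts to pinning the schema to the upcased name Snowflake actually creates: Snowflake upcases unquoted identifiers, so the schema is created as TAP_GITHUB, while the quoted "tap_github" in the failing GRANT statement stays lowercase and presumably doesn't match. In the .env file that is just:

```
# .env
SF_SCHEMA=TAP_GITHUB
```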
douwe_maan  08/28/2020, 3:06 PM
Should this be fixed in target-snowflake itself, or would it make more sense to implement a new value_processor: upcase_string and add it to target-snowflake's schema setting?
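A minimal sketch of what such an upcase_string value processor could do; the function below is illustrative, not Meltano's actual implementation:

```python
# Illustrative sketch of an "upcase_string" value processor: it would take a
# setting value (here, the schema name) and return the uppercase form.
def upcase_string(value):
    return value.upper() if isinstance(value, str) else value


# Example: the default schema derived from the extractor's namespace.
print(upcase_string("tap_github"))  # TAP_GITHUB
```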
douwe_maan  08/28/2020, 3:11 PM
nevin_morgan  08/28/2020, 3:16 PM
nevin_morgan  08/28/2020, 3:17 PM
douwe_maan  08/28/2020, 3:18 PM
Since dbt reads the loader's schema to set its own source_schema (https://gitlab.com/meltano/meltano/blob/master/src/meltano/core/bundle/discovery.yml#L760), it would read the processed, upcased version
douwe_maan  08/28/2020, 3:18 PM
The loader would also receive and create that upcased schema, so that it will actually exist when dbt looks for it as well
nevin_morgan  08/28/2020, 3:20 PM
douwe_maan  08/28/2020, 3:21 PM
If we decide not to upcase schemas at all, we would of course need to document that.
douwe_maan  08/28/2020, 3:23 PM
michael_cooper  08/28/2020, 3:23 PM
Shouldn't it be the responsibility of the target-snowflake implementation to ensure it changes to uppercase?
michael_cooper  08/28/2020, 3:23 PM
douwe_maan  08/28/2020, 3:24 PM
douwe_maan  08/28/2020, 3:25 PM
> Have target-snowflake always upcase the provided schema before using it
This is not really an option, because our dbt integration reads the loader's schema setting to set its own source_schema (see https://meltano.com/docs/integration.html#pipeline-environment-variables and https://gitlab.com/meltano/meltano/blob/master/src/meltano/core/bundle/discovery.yml#L760), which wouldn't match the actually created schema in this case. Options 1 and 2 don't have that issue.
douwe_maan  08/28/2020, 3:26 PM
> Shouldn't it be the responsibility of the target-snowflake implementation to ensure it changes to uppercase?
That would make sense, but within the context of Meltano that has a downside, since the schema value listed by meltano config target-snowflake would not match the actually created schema, which is a feature that our dbt integration depends on 😞
douwe_maan  08/28/2020, 3:27 PM
Meltano would then pass a different default schema to target-snowflake: not just the extractor's namespace, but an uppercase version of that namespace
douwe_maan  08/28/2020, 3:28 PM
nevin_morgan  08/28/2020, 3:28 PM
nevin_morgan  08/28/2020, 4:42 PM
douwe_maan  08/28/2020, 4:43 PM
douwe_maan  08/28/2020, 4:43 PM
The only value_processor right now is nest_object, and you would add a new name: function pair to the same dict: https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/setting_definition.py#L9
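A sketch of what that name: function dict could look like with the new processor added; only nest_object is confirmed by the thread, so the dict name, the signatures, and the nest_object body below are assumptions:

```python
# Hypothetical registry of value processors (name -> function), modeled on the
# existing nest_object entry mentioned above. All details here are illustrative.
def nest_object(flat_object):
    # e.g. {"a.b": 1} -> {"a": {"b": 1}} (illustrative implementation)
    nested = {}
    for key, value in flat_object.items():
        target = nested
        *parents, leaf = key.split(".")
        for part in parents:
            target = target.setdefault(part, {})
        target[leaf] = value
    return nested


def upcase_string(value):
    return value.upper() if isinstance(value, str) else value


VALUE_PROCESSORS = {
    "nest_object": nest_object,
    "upcase_string": upcase_string,  # the new name: function pair
}
```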
douwe_maan  08/28/2020, 4:45 PM
nevin_morgan  08/28/2020, 4:49 PM
config:
value_processor: uppercase
right?
douwe_maan  08/28/2020, 4:49 PM
It goes on the setting's SettingDefinition, so it'd need to go here: https://gitlab.com/meltano/meltano/blob/master/src/meltano/core/bundle/discovery.yml#L701
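To illustrate the distinction between the setting definition and a plugin's config, here is a self-contained sketch of a setting definition that carries a value_processor name and applies it when the value is resolved; it is not Meltano's actual SettingDefinition class:

```python
from dataclasses import dataclass
from typing import Optional


# Stand-in for a setting definition; the real class lives in
# src/meltano/core/setting_definition.py and differs from this sketch.
@dataclass
class SettingDefinitionSketch:
    name: str
    value_processor: Optional[str] = None  # e.g. "upcase_string"

    def process(self, value, processors):
        func = processors.get(self.value_processor)
        return func(value) if func else value


processors = {"upcase_string": lambda v: v.upper() if isinstance(v, str) else v}
schema_setting = SettingDefinitionSketch(name="schema", value_processor="upcase_string")
print(schema_setting.process("tap_github", processors))  # TAP_GITHUB
```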
nevin_morgan  08/28/2020, 4:51 PM
douwe_maan  08/28/2020, 4:51 PM
It's not defined in discovery.yml, but it is here: https://gitlab.com/meltano/meltano/-/blob/master/src/meltano/core/plugin/singer/tap.py#L73
douwe_maan  08/28/2020, 4:51 PM