Alexandre Pelletier
11/01/2024, 7:15 PM
I'm running into a validation issue with:
meltano run tap-mysql target-snowflake
Namely, every time a source table has a column of type JSON, my loader will (correctly) create a table in Snowflake with a column of type VARIANT.
This makes sense, yet when I run the command I get a validation error to the tune of:
'[{"A": 10}, {"B": 12}]' is not of type null, object
Yet when I load the same value into a VARIANT the way you typically would in Snowflake, via
select parse_json('[{"A": 10}, {"B": 12}]');
it works just fine.
-----
This type of error seems consistent across all the JSON columns I'm trying to load from my tap. Any ideas why this is happening or how I can fix it?
As a workaround I tried adding a stream map to cast the values of those columns to strings, but that didn't seem to work (I'm probably misunderstanding how stream maps are supposed to be used), and in any case it isn't ideal long term: JSON types in MySQL should ideally land in Snowflake as a VARIANT just as easily as select parse_json(...) does.
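My best guess, and it's only a guess from the error message rather than from reading the target's code: the tap declares the JSON column in its schema as an object (or null), but emits the value as a serialized JSON string, which is exactly the kind of thing a jsonschema check rejects. A minimal sketch of that theory (schema and value here are just my reconstruction):

from jsonschema import ValidationError, validate

# What the tap's schema appears to declare for the MySQL JSON column...
schema = {"type": ["null", "object"]}

# ...versus what it seems to emit: the JSON document serialized as a string.
value_from_tap = '[{"A": 10}, {"B": 12}]'

try:
    validate(instance=value_from_tap, schema=schema)
except ValidationError as err:
    print(err.message)
    # -> '[{"A": 10}, {"B": 12}]' is not of type 'null', 'object'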
Alexandre Pelletier
11/01/2024, 7:41 PM
Adding validate_records: false to my target-snowflake config has quieted down those errors.
It results in a slightly ugly, escaped JSON string like "{\"A\": \"3838\"}", but that can easily be fixed via parse_json in Snowflake.
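If my earlier theory is right, the escapes also make sense: with validation off, the string value just gets JSON-encoded a second time on its way into the VARIANT column, so Snowflake stores a JSON string instead of an object. Roughly (illustrative only, not the target's actual code path):

import json

# The MySQL JSON value arrives from the tap as a plain string...
value_from_tap = '{"A": "3838"}'

# ...so serializing the record again for loading wraps it in another
# layer of JSON encoding -- hence the escaped quotes in the VARIANT.
print(json.dumps(value_from_tap))  # "{\"A\": \"3838\"}"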
Still curious as to what's going on here, as I doubt I'm the first to experience this with the Meltano Snowflake loader 😅
Edgar Ramírez (Arch.dev)
11/04/2024, 11:45 PM
Alexandre Pelletier
11/05/2024, 3:37 PM
extractors:
- name: tap-mysql
  variant: transferwise
  pip_url: pipelinewise-tap-mysql
  metadata:
    '*':
      replication_method: LOG_BASED
loaders:
- name: target-snowflake
  variant: meltanolabs
  pip_url: meltanolabs-target-snowflake
  config:
    load_method: overwrite
    validate_records: false
Edgar Ramírez (Arch.dev)
11/05/2024, 5:17 PM
Alexandre Pelletier
11/05/2024, 5:39 PM