# plugins-general
mykola_zavada
Hello everyone, I'm using `tap-postgres` by `meltanolabs`, and I have a column in the Postgres table that contains strings like `{text_sent}`. It's automatically converted into the array `['text_sent']`, and when I load it into Snowflake the target column is VARIANT.
When I added this to the tap config:
```yaml
config:
  stream_maps:
    public-table_name:
      actions: str(record['actions'])
```
it delivered the string `['text_sent']` into Snowflake. I can add `.replace("['", "{").replace("']", "}")` to convert it back to `{text_sent}`, but I'm hoping it's possible to disable the initial transformation into an array. I would really appreciate your help disabling it. Thank you!
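For reference, the workaround described above can be checked in plain Python. This is a minimal sketch, not part of the tap; `to_pg_literal` is a hypothetical helper name, and it only round-trips single-element lists of simple strings (multi-element lists would need different handling):

```python
def to_pg_literal(actions):
    """Mimic the stream-map expression: stringify the list, then
    rewrite the Python list literal back into Postgres's {...} form."""
    # str(['text_sent']) -> "['text_sent']"
    # "['" -> "{" and "']" -> "}" -> "{text_sent}"
    return str(actions).replace("['", "{").replace("']", "}")

print(to_pg_literal(["text_sent"]))  # {text_sent}
```

As the thread's resolution shows, this workaround turned out to be unnecessary once the column type was checked.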
edgar_ramirez_mondragon
Hi @mykola_zavada! I can't seem to reproduce this behavior. I create a varchar column with a similar value:
```
postgres@localhost:postgres> select id, a_string from with_strings;
+----+-------------+
| id | a_string    |
|----+-------------|
| 1  | normal      |
| 2  | {text_sent} |
+----+-------------+
SELECT 2
```
And I get the right schema and value when I `meltano invoke tap-postgres`:
```
{"type": "STATE", "value": {}}
{"type": "SCHEMA", "stream": "public-with_strings", "schema": {"properties": {"id": {"type": ["integer"]}, "a_string": {"type": ["string", "null"]}}, "type": "object", "required": ["id"]}, "key_properties": ["id"]}
{"type": "RECORD", "stream": "public-with_strings", "record": {"id": 1, "a_string": "normal"}, "time_extracted": "2023-10-18T16:32:23.351063+00:00"}
{"type": "RECORD", "stream": "public-with_strings", "record": {"id": 2, "a_string": "{text_sent}"}, "time_extracted": "2023-10-18T16:32:23.352601+00:00"}
{"type": "STATE", "value": {"bookmarks": {"public-with_strings": {}}}}
```
Can you confirm the type of the column that results in the `['text_sent']` array value? If it's not an array, can you open an issue at https://github.com/meltanolabs/tap-postgres/issues/new?
mykola_zavada
Hi @edgar_ramirez_mondragon, thank you for your help! That was my own silly mistake. Sorry for bothering you!
edgar_ramirez_mondragon
Ha no worries! Glad you figured it out.