# troubleshooting
g
Hi everyone, I'm using tap-csv & target-postgres. My source CSV contains literal (unescaped) Unicode characters, but when I run tap-csv | target-postgres, the data written to the Postgres table contains escaped Unicode sequences (e.g. \u2013). I've confirmed that my source CSV file contains the literal characters and is encoded in UTF-8, and I've also included
"encoding": "UTF-8"
in the tap-csv config. Any suggestions for what else I can try to stop Unicode characters from being escaped in the Postgres table? Thank you!
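For reference, here is a minimal sketch of a tap-csv config with that setting in place. The key names follow the common tap-csv layout (a `files` array of `entity`/`path`/`keys` objects); the entity name and file path below are placeholders, and the exact schema may vary between tap-csv variants:

```json
{
  "files": [
    {
      "entity": "my_table",
      "path": "/path/to/data.csv",
      "keys": ["id"]
    }
  ],
  "encoding": "UTF-8"
}
```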
j
Heya! Do you have a minimal reproducible example? I can try to reproduce the problem on my end if you can share a CSV file that you know produces the unexpected results.
g
omg, I was using the wrong target-postgres. I switched to this one and it handles it correctly: https://pypi.org/project/pipelinewise-target-postgres/. Originally I was using this one: https://pypi.org/project/target-postgres/. Thank you for asking me for more info; it prompted me to look up versions and realize my issue!
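As a side note on what was likely happening: the `\u2013`-style output described above is characteristic of serializing with Python's `json.dumps` using its default `ensure_ascii=True`, which backslash-escapes every non-ASCII character. Whether the original target actually serialized this way is an assumption, but the symptom matches. A minimal illustration of the difference:

```python
import json

# An en dash (U+2013) between two years, as it might appear in a CSV cell.
s = "2013\u20132014"

# Default behavior: non-ASCII characters are escaped to \uXXXX sequences.
escaped = json.dumps(s)

# With ensure_ascii=False, the literal character is preserved.
literal = json.dumps(s, ensure_ascii=False)

print(escaped)  # "2013\u20132014"
print(literal)  # "2013–2014"
```

So a target that writes JSON with the default settings will store the escaped form, while one that disables ASCII escaping (or writes the string directly) preserves the character.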