# troubleshooting
h
Has anybody come across the error shown below and found a solution for the same? If yes then please share it.
2022-12-09T09:36:35.564419Z [error    ] Loader failed
╭─ Traceback (most recent call last) ─╮

C:\Users\hansa\.local\pipx\venvs\meltano\lib\site-packages\meltano\core\logging\output_logger.py:201 in redirect_logging

    198 │           *ignore_errors,
    199 │       )
    200 │       try:
  ❱ 201 │           yield
    202 │       except ignored_errors:  # noqa: WPS329
    203 │           raise
    204 │       except Exception as err:

  locals:
    err            = RunnerError('Loader failed')
    ignore_errors  = ()
    ignored_errors = (<class 'KeyboardInterrupt'>, <class 'asyncio.exceptions.CancelledError'>)
    logger         = <RootLogger root (INFO)>
    self           = <meltano.core.logging.output_logger.Out object at 0x0000024D88169340>

C:\Users\hansa\.local\pipx\venvs\meltano\lib\site-packages\meltano\core\block\extract_load.py:461 in run

    458 │           # TODO: legacy `meltano elt` style logging should be deprecated
    459 │           legacy_log_handler = self.output_logger.out("meltano", logger)
    460 │           with legacy_log_handler.redirect_logging():
  ❱ 461 │               await self.run_with_job()
    462 │               return
    463 │       else:
    464 │           logger.warning(
  …
c
That big Traceback box is actually not the interesting part. The interesting logs are the lines that occurred before/above that Traceback box.
h
2022-12-09T09:36:30.386342Z [warning  ] Failed to create symlink to 'meltano.exe': administrator privilege required
2022-12-09T09:36:32.154608Z [info     ] Environment 'dev' is active
2022-12-09T09:36:34.142790Z [warning  ] No state was found, complete import.
2022-12-09T09:36:35.229813Z [info     ] INFO:tap-csv:Beginning full_table sync of 'table'... cmd_type=elb consumer=False name=tap-csv producer=True stdio=stderr string_id=tap-csv
2022-12-09T09:36:35.230827Z [info     ] INFO:tap-csv:Tap has custom mapper. Using 1 provided map(s). cmd_type=elb consumer=False name=tap-csv producer=True stdio=stderr string_id=tap-csv
2022-12-09T09:36:35.231811Z [info     ] INFO:tap-csv:Properties () were present in the 'table' stream but not found in catalog schema. Ignoring. cmd_type=elb consumer=False name=tap-csv producer=True stdio=stderr string_id=tap-csv
2022-12-09T09:36:35.232816Z [info     ] INFO:tap-csv:INFO METRIC: {"type": "counter", "metric": "record_count", "value": 3, "tags": {"stream": "table"}} cmd_type=elb consumer=False name=tap-csv producer=True stdio=stderr string_id=tap-csv
2022-12-09T09:36:35.459993Z [info     ] time=2022-12-09 15:06:35 name=target_postgres level=INFO message=Table '"table"' does not exist. Creating... CREATE TABLE IF NOT EXISTS biostats."table" ("     "sex"" character varying, " "age"" character varying, " "height (in)"" character varying, " "weight (lbs)"" character varying, "name" character varying, PRIMARY KEY ("name")) cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.510069Z [info     ] Traceback (most recent call last): cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.511053Z [info     ]   File "c:\users\hansa\appdata\local\programs\python\python38\lib\runpy.py", line 192, in _run_module_as_main cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.512051Z [info     ]     return _run_code(code, main_globals, None, cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.512051Z [info     ]   File "c:\users\hansa\appdata\local\programs\python\python38\lib\runpy.py", line 85, in _run_code cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.513077Z [info     ]     exec(code, run_globals)    cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.514046Z [info     ]   File "C:\Users\hansa\OneDrive\Desktop\Reunion (Internship)\Data Pipeline\data-pipeline\.meltano\loaders\target-postgres\venv\Scripts\target-postgres.exe\__main__.py", line 7, in <module> cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.514046Z [info     ]   File "C:\Users\hansa\OneDrive\Desktop\Reunion (Internship)\Data Pipeline\data-pipeline\.meltano\loaders\target-postgres\venv\lib\site-packages\target_postgres\__init__.py", line 373, in main cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.515020Z [info     ]     persist_lines(config, singer_messages) cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.516056Z [info     ]   File "C:\Users\hansa\OneDrive\Desktop\Reunion (Internship)\Data Pipeline\data-pipeline\.meltano\loaders\target-postgres\venv\lib\site-packages\target_postgres\__init__.py", line 219, in persist_lines cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr string_id=target-postgres
2022-12-09T09:36:35.516056Z [info     ]     stream_to_sync[stream].sync_table() cmd_type=elb consumer=True name=target-postgres producer=False stdio=stderr st…
Any leads?
c
psycopg2.errors.SyntaxError: zero-length delimited identifier at or near """"
I have no idea what that error might be trying to tell you
s
Sounds like the SQL is broken, and that would be quite strange given that it is auto-generated. Maybe some special characters in column names, etc.?
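For context: Postgres wraps identifiers in double quotes, and an empty column name becomes "" — exactly the "zero-length delimited identifier" in your error. A minimal sketch (with a hypothetical quoting helper, not the actual target-postgres code) of how that broken DDL could come about:

def quote_identifier(name: str) -> str:
    # Postgres-style identifier quoting: wrap in double quotes,
    # double any embedded double quotes.
    return '"' + name.replace('"', '""') + '"'

# A header row with an empty cell and stray quotes, like the log suggests:
columns = ['name', '', '"sex"']
ddl = "CREATE TABLE IF NOT EXISTS biostats.{} ({})".format(
    quote_identifier("table"),
    ", ".join(quote_identifier(c) + " character varying" for c in columns),
)
print(ddl)
# CREATE TABLE IF NOT EXISTS biostats."table" ("name" character varying,
#   "" character varying, """sex""" character varying)
# -> "" is a zero-length delimited identifier, which Postgres rejects.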
h
id, name, and age are the column names
s
Actually, do you happen to have a column without a name in your CSV?
You can try running meltano invoke tap-csv first and inspect the output to see what it looks like.
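If it helps, here's a rough sketch (nothing official) you can pipe the invoke output into; it flags odd property names in the SCHEMA messages:

# check_schema.py — rough sketch: read Singer messages on stdin and flag
# suspicious property names (empty strings, embedded quotes) in SCHEMA messages.
import json
import sys

for line in sys.stdin:
    try:
        msg = json.loads(line)
    except json.JSONDecodeError:
        continue  # skip any non-JSON log lines
    if msg.get("type") == "SCHEMA":
        for name in msg.get("schema", {}).get("properties", {}):
            flag = "  <-- suspicious" if not name.strip() or '"' in name else ""
            print(repr(name) + flag)

# Usage: meltano invoke tap-csv | python check_schema.py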
h
No, there are no columns without a name.
Here's the output of meltano invoke tap-csv:
s
Yeah, you see your problem right there. The "sex" column name contains a " character, and there is an empty-string column name.
h
I don't know where it is getting these column names from
There are no columns for Sex, Height, or Weight
INFO:tap-csv:Properties () were present in the 'table' stream but not found in catalog schema. Ignoring.
Does this mean something?
s
No, that's not important. So Meltano is detecting a schema with Name, Sex, Age, Height, ... and then fails to create a table based on it. I can't tell you where that schema comes from, but I would imagine it is somewhere in the CSV?
h
This is the csv file
s
Maybe you defined something in your project file?
Your error message also carries the "biostats" identifier.
h
Interesting
I have another csv file biostats.csv containing every column that it is trying to find
But I have not configured my tap-csv to read from that file
s
Well there you go 😄 How did you configure your tap-csv?
h
[
  {
    "entity": "table",
    "path": "C:/Users/hansa/OneDrive/Desktop/Reunion (Internship)/dummydata.csv",
    "keys": ["id"]
  }
]
Using this as the csv_files_definition.
s
Hm, the tap has the ability to process all files inside a folder; I'm wondering whether something happened there...
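Purely to illustrate what I mean (a hypothetical sketch; the actual discovery behavior depends on the tap variant): if anything globs the folder instead of reading the single path, the sibling file gets swept in too.

from pathlib import Path

# Hypothetical illustration: globbing the folder picks up every CSV in it,
# including biostats.csv sitting next to dummydata.csv.
folder = Path("C:/Users/hansa/OneDrive/Desktop/Reunion (Internship)")
print(sorted(p.name for p in folder.glob("*.csv")))
# -> ['biostats.csv', 'dummydata.csv']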
h
Should I try changing the location of the file?
s
Well, you can try to put it into a folder of its own. That should fix it; I'm not sure how this happens.
h
Still facing the same error
Should I try reinstalling tap-csv?
s
Hm, I'm not sure that fixes your problem. Could you try to move the file with the biostats to a different folder altogether and run it again, just to test it out? (Into a folder that's not in the same hierarchy.)
h
The location of source.json was the issue
Moving it along with dummydata.csv solved the problem.
Also, my test database in Postgres has not been populated with the data, but there are no errors. I have run this command:
meltano run tap-csv target-postgres
s
So meltano invoke tap-csv now spits out your CSV in JSON format, right? And meltano run tap-csv target-postgres prints out that it created a new schema, right?
h
No, it just says that the block run has completed.
With no errors.
But the schema does not show up in the Postgres database that I configured target-postgres with.
Is it the case that I would only be able to see it in the database after I run a transformer?
s
No, after meltano run tap-csv target-postgres, the data should be in your database.
h
Output of meltano run tap-csv target-postgres:
s
Have you checked the schema "tap_csv"? There should be a table called "table".
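If your SQL client is acting up, you can also check straight from Python — a quick sketch, assuming psycopg2 and connection settings like the ones you're using for target-postgres (password omitted):

# Quick sketch: verify the schema/table exist, bypassing the GUI client.
# Add password=... if your server requires one.
import psycopg2

conn = psycopg2.connect(host="localhost", port=5432, user="postgres", dbname="test")
with conn.cursor() as cur:
    cur.execute(
        "SELECT table_schema, table_name FROM information_schema.tables"
        " WHERE table_schema = %s",
        ("tap_csv",),
    )
    print(cur.fetchall())  # expect [('tap_csv', 'table')]
conn.close()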
h
Schema tap_csv should have been created in the database
loaders:
- name: target-postgres
  variant: transferwise
  pip_url: pipelinewise-target-postgres
  config:
    host: localhost
    port: 5432
    user: postgres
    dbname: test
    ssl: false
    default_target_schema: tap_csv
Configuration of target-postgres
s
Yep. I mean, as stupid as it sounds, did you reconnect to your database? Refreshed everything?
Because the configuration and the logs are pretty straightforward and say there should be that schema (tap_csv) and a table called "table" inside it.
I know DBeaver for instance is weird with refreshes and sometimes doesn't display newly created schemas.
h
Yup this solved it lol
s
No worries, happens to me all the time -.-
Happy I could help!
h
Thank you so much
Also, is there a tap for JSON data?
s
At the moment, no, there's no JSON-specific tap. You could write your own tap, or convert your data into a tappable source.
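If you do go the custom route: a Singer tap is, at its core, just SCHEMA and RECORD messages on stdout. A bare-bones sketch for a file holding a flat JSON array (hypothetical, not an existing plugin — a real tap should be built with the Meltano SDK):

# tap_json_sketch.py — bare-bones, hypothetical Singer tap for a file
# containing a flat JSON array of objects. Not an existing plugin.
import json
import sys

def main(path, stream="json_stream"):
    with open(path) as f:
        records = json.load(f)  # expects e.g. [{"id": 1, "name": "a"}, ...]

    # Derive a permissive schema from the first record's keys.
    keys = records[0].keys() if records else []
    schema = {"type": "object", "properties": {k: {} for k in keys}}

    print(json.dumps({"type": "SCHEMA", "stream": stream,
                      "schema": schema, "key_properties": []}))
    for rec in records:
        print(json.dumps({"type": "RECORD", "stream": stream, "record": rec}))

if __name__ == "__main__":
    main(sys.argv[1])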
h
I actually tried it using this documentation (https://docs.meltano.com/tutorials/custom-extractor) to get data from a REST API, but it failed.
Also, there is a weird issue with the library psycopg2 on Python 3.9. You may want to fix this. I had to downgrade to Python 3.8 to use target-postgres, and it was a hassle.
s
Yeah, we're already working on our own target-postgres version to fix a bunch of these issues.
v
You could give the MeltanoLabs target a spin!