# troubleshooting
j
For `target-bigquery` I am using GCS Staging - one of my taps outputs timestamps as '2020-07-22T05:32:22+0000', which throws an error on the Google API:
```
2024-07-27T18:05:54.506201Z [info     ] google.api_core.exceptions.BadRequest: 400 Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details. File: <gs://REDACTED>; reason: invalid, location: <gs://REDACTED>, message: Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details. File: <gs://REDACTED>; reason: invalid, message: Error while reading data, error message: JSON processing encountered too many errors, giving up. Rows: 1; errors: 1; max bad: 0; error percent: 0; reason: invalid, location: <gs://REDACTED>, message: Error while reading data, error message: JSON parsing error in row starting at position 0: Couldn't convert value to timestamp: Could not parse '2020-07-22T05:32:22+0000' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]] or YYYY/MM/DD HH:MM[:SS[.SSSSSS]] Field: creation_time; Value: 2020-07-22T05:32:22+0000 File: <gs://REDACTED>
```
How can I resolve this? I want to use GCS staging, but the issue doesn't exist when using batch load.
In the meantime I am setting this particular tap -> target pairing to batch_load. Wondering whether this warrants raising an issue on GitHub.
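(A minimal sketch of how that per-pipeline switch could look in meltano.yml, assuming the z3z1ma variant's `method` setting accepts `gcs_stage` and `batch_job` as its README suggests; the inherited plugin name is just a placeholder:)
```yaml
plugins:
  loaders:
    - name: target-bigquery
      variant: z3z1ma
      config:
        method: gcs_stage            # default: stage files in GCS before loading
    - name: target-bigquery--batch   # placeholder name for the batch-load override
      inherit_from: target-bigquery
      config:
        method: batch_job            # assumed value for direct batch load jobs
```
The affected pipeline could then be run as `meltano run tap-facebook target-bigquery--batch` while everything else keeps using GCS staging.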
e
Hi @James Stratford! Which tap and target variant is this?
j
Tap is tap-facebook --variant=airbyte, target is target-bigquery --variant=z3z1ma
g
Hello, I'm having the same issue. My job failed because of one error in the timestamp format. Did you resolve your issue?
j
Hi @Geoffrey LOUASSE, for this specific tap I set it to use batch_load instead of GCS staging. I figure that's because in memory the value is a datetime object, so the target can coerce it into the correct format, whereas from a string it has to match a specific format exactly. You could also look into stream maps; they let you apply a small transform to a stream and key.
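(A minimal sketch of what such a stream map could look like, assuming the airbyte-variant tap exposes the SDK's `stream_maps` setting; the stream name `ads_insights` is a placeholder, and `creation_time` is the field from the error above:)
```yaml
plugins:
  extractors:
    - name: tap-facebook
      variant: airbyte
      config:
        stream_maps:
          ads_insights:  # placeholder: use the stream that emits creation_time
            # Rewrite '2020-07-22T05:32:22+0000' into a form BigQuery accepts,
            # e.g. '2020-07-22 05:32:22' (the offset is UTC, so dropping it loses nothing here).
            creation_time: "creation_time.replace('T', ' ').replace('+0000', '') if creation_time else None"
```
If the tap itself doesn't honor `stream_maps`, the same block can be applied via the meltano-map-transform mapper instead (see the sketch further down).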
g
It's not working for me 😕 In my case, Meltano is mapping every DATE field from my source database (MySQL) to TIMESTAMP, and that's why it fails: the values are DATEs but BigQuery expects TIMESTAMPs. I get this for every table in my MySQL database:
```
Could not parse '0000-00-00' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]] or YYYY/MM/DD HH:MM[:SS[.SSSSSS]] Field: date_kbis; Value: 0000-00-0
```
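(One hedged option for the zero-date rows, using the meltano-map-transform mapper so it works regardless of which MySQL tap variant is in use; the stream name `companies` is a placeholder, and `date_kbis` comes from the error above:)
```yaml
plugins:
  mappers:
    - name: meltano-map-transformer
      variant: meltano
      pip_url: meltano-map-transform
      mappings:
        - name: fix-zero-dates
          config:
            stream_maps:
              companies:  # placeholder: the stream that contains date_kbis
                # MySQL zero dates ('0000-00-00') are not valid BigQuery timestamps; null them out.
                date_kbis: "None if date_kbis == '0000-00-00' else date_kbis"
```
It would then run as `meltano run tap-mysql fix-zero-dates target-bigquery` (substitute your actual extractor name). Note this only handles the zero dates, not the DATE-to-TIMESTAMP column mapping itself.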