# troubleshooting
okay now i’m running into a different issue that i can’t seem to figure out with the same fb pages tap and the bigquery target. a number of streams in the tap have timestamp fields, and the jsonschema properties for those fields have `"format": "date-time"`. bigquery can’t load these as timestamp fields because the string is formatted in a way it doesn’t know how to handle, so it throws an error on load. i tried passing a catalog extra json file that is identical to the one produced by the tap except without the `"format": "date-time"` requirement on those fields, hoping to load them to BQ as strings instead. i’m running `meltano elt … --catalog=path/to/catalog.json`, but the bigquery target is still using (and logging) the default json schema. meltano logs right at the start that it found the catalog extra, so i know that bit is working. i’ve also provided catalog overrides for other taps with this target before without issue, so not sure what it could be.
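for reference, the only per-field difference in my override catalog is dropping the `format` key (field name taken from the error further down; the surrounding stream shape is omitted here). the tap’s discovered schema has:

```json
"created_time": { "type": ["null", "string"], "format": "date-time" }
```

and my override just removes `format` so the target should treat it as a plain string:

```json
"created_time": { "type": ["null", "string"] }
```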
i also tried using the target’s `force_fields` config option to ensure those fields are loaded as strings, but it seems to have no effect; the load still fails with:
```
Couldn't convert value to timestamp: Could not parse '2021-12-07T17:08:30+0000' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]] or YYYY/MM/DD HH:MM[:SS[.SSSSSS]] Field: created_time; Value: 2021-12-07T17:08:30+0000
```
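to illustrate the mismatch: the tap emits an ISO-8601-ish value with a `+0000` offset, which python parses fine but bigquery’s load-time timestamp parser rejects. reformatting it into the shape the error message asks for would look like this (just a sketch of the conversion, not something the target does for you):

```python
from datetime import datetime

# value copied from the failing record in the error above
raw = "2021-12-07T17:08:30+0000"

# "%z" accepts the +0000 offset that bigquery's loader chokes on
parsed = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S%z")

# the "YYYY-MM-DD HH:MM[:SS]" shape the error message requires
bq_friendly = parsed.strftime("%Y-%m-%d %H:%M:%S")
print(bq_friendly)  # 2021-12-07 17:08:30
```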
on the other hand, modifying the schema files in the tap itself to just remove the `date-time` format seems to work, but i’d rather not maintain a fork of the tap if there’s a better solution
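in case it’s useful context: rather than hand-editing (or forking), i could regenerate the override catalog mechanically from the discovered one. a rough sketch of that, where `strip_datetime_format` is my own helper and not anything meltano provides:

```python
import json

def strip_datetime_format(node):
    """Recursively drop `"format": "date-time"` from a JSON Schema fragment."""
    if isinstance(node, dict):
        if node.get("format") == "date-time":
            del node["format"]
        for value in node.values():
            strip_datetime_format(value)
    elif isinstance(node, list):
        for item in node:
            strip_datetime_format(item)

# example on a fragment shaped like the tap's schema (field name from the error above)
schema = {"properties": {"created_time": {"type": ["null", "string"], "format": "date-time"}}}
strip_datetime_format(schema)
print(json.dumps(schema))  # {"properties": {"created_time": {"type": ["null", "string"]}}}
```

run against the full discovery output this would produce the same catalog minus every `date-time` annotation, which is exactly the file i’m trying to pass via the catalog extra.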