abhishek_ajmera
11/21/2022, 6:21 PM
We’ve been running tap-mysql to target-bigquery till now, which was working great, but now we’re using DynamoDB as well.
Tried to set up tap-dynamodb and, after a lot of struggle with roles and policy permissions, was finally able to extract data, but then the loader failed with this error:
```
  load_job = client.load_table_from_file(
  File "/home/ubuntu/meltano-projects/erp-to-bigquery-2/.meltano/loaders/target-bigquery/venv/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 2431, in load_table_from_file
    raise exceptions.from_http_response(exc.response)
google.api_core.exceptions.BadRequest: 400 POST https://bigquery.googleapis.com/upload/bigquery/v2/projects/turing-audio-266016/jobs?uploadType=resumable: Empty schema specified for the load job. Please specify a schema that describes the data being loaded.
```
Wasn’t sure how to specify the schema apart from the catalog (?), so I used meltano invoke to dump the catalog into a JSON file and then passed it with the `--catalog` flag, but I guess that doesn’t work either; I got the same error.
So does anyone know how to specify the schema, or any other way to get this data into BigQuery?
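For reference, a rough sketch of the catalog workflow described above using the Meltano CLI; the plugin names and output path are assumed to match the project and may need adjusting:

```
# Dump the catalog Meltano discovered for the DynamoDB extractor
meltano invoke --dump=catalog tap-dynamodb > tap-dynamodb.catalog.json

# Re-run the pipeline, passing that file via --catalog
# (this overrides the catalog the extractor would otherwise discover)
meltano elt tap-dynamodb target-bigquery --catalog tap-dynamodb.catalog.json
```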
aaronsteers
11/21/2022, 8:39 PM

aaronsteers
11/21/2022, 8:40 PM

aaronsteers
11/21/2022, 8:40 PM

aaronsteers
11/21/2022, 8:41 PM

abhishek_ajmera
11/21/2022, 8:43 PM

abhishek_ajmera
11/21/2022, 8:44 PM

abhishek_ajmera
11/21/2022, 8:48 PMtransferwise variant tomorrow since it seems to have data flattening as an inbuilt functionality.aaronsteers
aaronsteers
11/21/2022, 9:11 PM