# troubleshooting
r
Hello, I am trying to load data into BigQuery from MySQL. When I do meltano run, it reports success, and it creates the schema, but it doesn't insert any values. I am new to BigQuery; any help will be appreciated. Here is the extractor and loader configuration.
```yaml
- name: tap-mysql_1_xxxxx
  inherit_from: tap-mysql
  config:
    database: xxxx
    engine: mysql
    filter_dbs: xxxx
    host: xxxxxx
    port: 3306
    ssl: false
    use_gtid: false
    user: xxxxx
  select:
  - xxxx.*
```
```yaml
- name: target-bigquery_1_xxxx
  inherit_from: target-bigquery
  config:
    project: xxxxxxx
    dataset: xxxxx
    credentials_path: /xxx/meltanoxxxxx.json
    location: xxxx
    batch_size: 1000
    method: storage_write_api
    denormalized: true
```
l
which target version are you using?
The reason I'm asking is that this one, https://github.com/z3z1ma/target-bigquery, has a bug where it can fail silently if the load job fails. So you might want to check if you have failed load jobs in your GCP project.
r
@Love Eklund yeah, I am using that one: https://hub.meltano.com/loaders/target-bigquery/
sorry, I didn't get "So you might want to check if you have failed load jobs in your GCP project." Where can I see this?
l
then it's probably the one I linked
here
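Failed load jobs show up in the BigQuery console under the project's job history. For a programmatic check, a minimal sketch using the google-cloud-bigquery client could look like the following; the project ID is a placeholder, and it assumes credentials are already configured (e.g. the same service-account JSON used in the loader config):
```python
# Sketch: list recent BigQuery jobs and print the ones that failed.
# Assumes the google-cloud-bigquery package is installed and credentials are
# available (e.g. GOOGLE_APPLICATION_CREDENTIALS pointing at the service-account JSON).
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")  # placeholder project ID

# Inspect the most recent finished jobs in the project, across all users.
for job in client.list_jobs(max_results=50, all_users=True, state_filter="done"):
    if job.error_result:
        print(job.job_id, job.job_type, job.error_result)
```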
r
@Love Eklund okay found it thanks.
👍 1
l
If you see failed jobs, then it is probably the bug I noticed. I have a fix for it on a fork I've created; I'll try to raise a PR so it can be merged into the main repo.
r
yes, I see it failed
l
you can try changing your target to this
```yaml
- name: target-bigquery
  variant: z3z1ma
  pip_url: git+https://github.com/loveeklund-osttra/target-bigquery.git@b1302d5
```
and reinstalling it, to see if that raises a proper error.
r
yeah sure I will have a look
a
@Love Eklund Yeah, your fork works for the Storage Write API, while the default one shows success but writes nothing.
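A quick way to confirm whether rows actually landed, rather than relying on the target's exit status, is to count rows in the destination table. This is just a sketch; the project, dataset, and table names are placeholders:
```python
# Sketch: count rows in the destination table to see if anything was loaded.
# The project, dataset, and table names are placeholders; adjust them to match
# the dataset configured on the Meltano target.
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")
sql = "SELECT COUNT(*) AS row_count FROM `your-project-id.your_dataset.your_table`"
for row in client.query(sql).result():
    print("rows loaded:", row.row_count)
```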