# troubleshooting
s
Hello, I am running an EL job with tap-amazon-sp and target-bigquery. I keep getting the following error:
```
AttributeError: 'str' object has no attribute 'get'
```
I am attaching the meltano.yml file of my project. P.S. There is a similar question that has already been answered, but the answer is not clear to me. I'd appreciate it if someone can help me modify the .yml file if required.
```yaml
version: 1
default_environment: dev
project_id: a8c7383d-e8b6-465c-9be9-197f40e7955c
environments:
- name: dev
- name: staging
- name: prod
plugins:
  extractors:
  - name: tap-github
    variant: meltanolabs
    pip_url: git+https://github.com/MeltanoLabs/tap-github.git
    config:
      flattening_enabled: true
      flattening_max_depth: 2
      organizations:
      - FornaxVentures
      rate_limit_buffer: 5
      skip_parent_streams: true
      start_date: '2023-07-01'
  - name: tap-amazon-sp
    variant: hotgluexyz
    pip_url: git+https://gitlab.com/hotglue/tap-amazon-seller.git
    config:
      flattening_enabled: true
      flattening_max_depth: 2
      marketplaces:
      - IN
      processing_status:
      - IN_QUEUE
      - IN_PROGRESS
      - DONE
      report_types:
      - GET_FLAT_FILE_ALL_ORDERS_DATA_BY_LAST_UPDATE_GENERAL
      role_arn: xxxxxxxxx
      sandbox: false
      start_date: '2023-07-01T00:00:00.000000Z'
    select:
    - orders.LastUpdateDate
    - orders.SellerOrderId
  loaders:
  - name: target-jsonl
    variant: andyh1203
    pip_url: target-jsonl
  - name: target-csv
    variant: hotgluexyz
    pip_url: git+https://github.com/hotgluexyz/target-csv.git
    config:
      destination_path: ''
      delimiter: ','
      quotechar: '"'
  - name: target-bigquery
    variant: z3z1ma
    pip_url: git+https://github.com/z3z1ma/target-bigquery.git
    config:
      credentials_path: xxxxxxxx
      project: xxxxxx
      dataset: reports
      location: us-east1
      batch_size: 1000
      fail_fast: true
      timeout: 300
      denormalized: false
      method: batch_job
      partition_granularity: month
      cluster_on_key_properties: false
      column_name_transforms:
        lower: true
        quote: false
        add_underscore_when_invalid: true
        snake_case: true
      options:
        storage_write_batch_mode: false
        process_pool: true
        max_workers: 10
      upsert: 'false'
      overwrite: 'false'
      flattening_enabled: true
      flattening_max_depth: 2
```
d
Facing a similar issue
a
Please post more of the error message. By the sound of it, a dict is expected but a string is being received, but without more context it's impossible to say.
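(For reference, a minimal illustration of how this kind of AttributeError arises; this is not target-bigquery's actual code, just a sketch of a string turning up where a dict is expected.)

```python
# Sketch only: a config value that should be a dict/bool arrives as a string,
# and code written for a dict then calls .get on it.
config_value = "false"          # e.g. a setting coerced to a string
config_value.get("some_key")    # AttributeError: 'str' object has no attribute 'get'
```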
s
Could it be that we have not specified a stream map / stream map config? Do you happen to have a small example of the stream_map_config supported by target-bigquery?
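(For context: Singer SDK-based taps and targets generally accept the standard `stream_maps` and `stream_map_config` settings. A minimal sketch in meltano.yml, assuming the z3z1ma variant exposes them; the stream and property names below are purely illustrative.)

```yaml
plugins:
  loaders:
  - name: target-bigquery
    variant: z3z1ma
    config:
      stream_maps:
        orders:
          # derive a new property; values from stream_map_config are
          # available inside expressions via config[...]
          order_id_hash: md5(config['hash_seed'] + SellerOrderId)
      stream_map_config:
        hash_seed: some-seed-value
```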
q
@shubham have you managed to get that issue sorted?
s
@quentin_gaborit hey, yes. My project had become a little messy, so to clean it up I reinstalled all the plugins and reconfigured them. I didn't face this issue after that.
q
After investigating, and with the help of @alexander_butler, it looks like it's caused by the overwrite config param being wrongly coerced to a string in its definition on the hub repo.
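(In the meltano.yml above this shows up as upsert: 'false' and overwrite: 'false'; once the hub definition is corrected, these should be plain YAML booleans, e.g.:)

```yaml
plugins:
  loaders:
  - name: target-bigquery
    variant: z3z1ma
    config:
      upsert: false      # boolean, not the string 'false'
      overwrite: false
```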
s
oh ok. How did you fix it then?
q
Debugging and casting the value back to a boolean. But to fix it once and for all I need to PR the hub. I will submit it this week.
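(A hypothetical sketch of that kind of cast, not the actual patch submitted to the hub:)

```python
def to_bool(value):
    """Coerce 'true'/'false' strings back to a real boolean; pass booleans through."""
    if isinstance(value, str):
        return value.strip().lower() in ("true", "1", "yes")
    return bool(value)

assert to_bool("false") is False
assert to_bool(True) is True
```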
s
oh ok. thanks