# troubleshooting
b
I am trying to perform ELT between the following DBs. Source: MySQL 5.7, Target: Postgres 11. My current meltano.yml is as follows:
```yaml
version: 1
default_environment: dev
project_id: 26360f5d-2706-4532-9028-42a97dfac0b4
send_anonymous_usage_stats: false
plugins:
  extractors:
  - name: acadia_db
    inherit_from: tap-mysql
    variant: transferwise
    pip_url: pipelinewise-tap-mysql
    select:
      - acadia-program.*
    metadata:
      '*':
        replication-method: INCREMENTAL
        replication-key: id
  loaders:
  - name: data-warehouse
    inherit_from: target-snowflake
    variant: transferwise
    pip_url: pipelinewise-target-snowflake
    config:
      parallelism: 4
      add_metadata_columns: true
      hard_delete: true
  - name: test-warehouse
    inherit_from: target-postgres
    variant: transferwise
    pip_url: pipelinewise-target-postgres

    config:
      schema_mapping:
        acadia: 'public'
      default_target_schema: public

environments:
- name: dev
```
I perform ELT against Snowflake via
meltano --log-level=info elt acadia_db data-warehouse
and it works successfully. When I perform ELT against Postgres via
meltano --log-level=info elt acadia_db test-warehouse
then it fails with the following output:
```
2022-09-14T015628.055787Z [info ] Running extract & load... name=meltano run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse
2022-09-14T015628.283515Z [warning ] No state was found, complete import.
2022-09-14T015629.889841Z [info ] time=2022-09-13 215629 name=tap_mysql level=INFO message=Server Parameters: version: 5.7.37, wait_timeout: 28800, innodb_lock_wait_timeout: 3600, max_allowed_packet: 4194304, interactive_timeout: 28800 cmd_type=extractor name=acadia_db run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015629.892532Z [info ] time=2022-09-13 215629 name=tap_mysql level=INFO message=Server SSL Parameters(blank means SSL is not active): [ssl_version: ], [ssl_cipher: ] cmd_type=extractor name=acadia_db run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630.363037Z [info ] time=2022-09-13 215630 name=tap_mysql level=INFO message=Beginning sync for InnoDB table acadia.program cmd_type=extractor name=acadia_db run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630.363256Z [info ] time=2022-09-13 215630 name=tap_mysql level=INFO message=Stream acadia-program is using incremental replication cmd_type=extractor name=acadia_db run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630.363952Z [info ] Traceback (most recent call last): cmd_type=loader name=test-warehouse run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630.364120Z [info ] File "/Users/binoy/workspace/data-engg/data-lake-workflows/src/meltano/.meltano/loaders/target-postgres/venv/bin/target-postgres", line 8, in <module> cmd_type=loader name=test-warehouse run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630.364350Z [info ] sys.exit(main()) cmd_type=loader name=test-warehouse run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630.364511Z [info ] File "/Users/binoy/workspace/data-engg/data-lake-workflows/src/meltano/.meltano/loaders/target-postgres/venv/lib/python3.9/site-packages/target_postgres/__init__.py", line 373, in main cmd_type=loader name=test-warehouse run_id=677274e3-9e86-4a36-8949-d24829afcbb6 state_id=2022-09-14T015625--acadia_db--test-warehouse stdio=stderr
2022-09-14T015630…
```
e
The documentation for schema_mapping is lacking, but it should look like this: https://github.com/transferwise/pipelinewise-target-postgres/blob/5e8891d7a839c35dbc4e3d3e5dd021db483afac0/target_postgres/db_sync.py#L238-L245
Can you try the following structure:
```yaml
  - name: test-warehouse
    inherit_from: target-postgres
    variant: transferwise
    pip_url: pipelinewise-target-postgres
    config:
      schema_mapping:
        acadia:
          target_schema: 'public'
          indices: {}
          target_schema_select_permissions: []
      default_target_schema: public
```
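To spell out what each key does (going by the db_sync.py linked above, so double-check against the version of the variant you actually have installed), every source schema maps to a small settings object rather than a bare schema name, roughly:
```yaml
config:
  schema_mapping:
    acadia:                                  # source (MySQL) schema name
      target_schema: 'public'                # Postgres schema the acadia streams are loaded into
      indices: {}                            # optional indexes to create on the replicated tables; empty = none
      target_schema_select_permissions: []   # roles/groups granted SELECT on the loaded tables; empty = no extra grants
  default_target_schema: public              # fallback schema for streams not covered by schema_mapping
```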
b
Oh, no wonder. I never realized there was a full structure for the mapping. Because of the keyword mapping I assumed it was a dict-like map. Trying it out, thanks Edgar
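In other words, the difference between what I had and what the target actually parses (both shapes copied from the configs above) is just:
```yaml
# what I had written (a flat schema -> schema-name map):
schema_mapping:
  acadia: 'public'

# what pipelinewise-target-postgres actually expects (a nested per-schema object):
schema_mapping:
  acadia:
    target_schema: 'public'
```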
e
yup, it’s rather non-intuitive 😅
b
It worked! Thanks
s
@binoy_shah @edgar_ramirez_mondragon Hi, I just read this thread and I didn't quite understand the solution. I am facing the same issue with target-bigquery. Can you please help me set up the schema mapping?
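If you are on the transferwise (pipelinewise) variant of target-bigquery, its README documents a schema_mapping option with the same general shape as above, where the target-side "schema" is a BigQuery dataset. A rough sketch only, with hypothetical names (my_schema, my_dataset) and key names mirrored from target-postgres, so verify them against your variant's README; Postgres-specific keys like indices would not apply:
```yaml
config:
  schema_mapping:
    my_schema:                        # source schema name (hypothetical)
      target_schema: 'my_dataset'     # BigQuery dataset to load these streams into (hypothetical)
  default_target_schema: my_dataset   # fallback dataset, if your variant supports this setting
```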