# troubleshooting
r
Hey all, I’m not sure this is the correct way of approaching it, but I’m trying to define jobs with the same target but different settings for each job. When running them piecewise with `meltano run tap-pipedrive target-s3-parquet` I can supply the environment variable `TARGET_S3_PARQUET_PATH=<s3://new/custom/path>` and it works as expected. It would be great if I could replicate that functionality in jobs. I’ve tried the following without success:
jobs:
- name: pipedrive-to-s3
  tasks:
  - TARGET_S3_PARQUET_PATH=<s3://new/custom/path> tap-pipedrive target-s3-parquet
and
jobs:
- name: pipedrive-to-s3
  tasks:
  - tap-pipedrive target-s3-parquet
  env:
  - TARGET_S3_PARQUET_PATH: <s3://new/custom/path>
Is it possible that way, or should I try to put `meltano config set ..` in the jobs in some way?
e
`env` is not a supported key in jobs at the moment, see https://github.com/meltano/meltano/issues/6386 (PRs welcome!). One approach I’ve used, if it makes sense, is to leverage pipeline env vars to configure the loader with different values for each extractor; for example, using `default_target_schema: $MELTANO_EXTRACT__LOAD_SCHEMA` will set BigQuery’s schema to `tap_github`, etc.
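For illustration, a minimal meltano.yml sketch of that pattern (the loader, tap, and job names here are placeholders, not from this thread):

```yaml
plugins:
  loaders:
  - name: target-bigquery            # illustrative loader with a schema setting
    config:
      # Meltano derives this pipeline env var from whichever extractor runs
      # in the same pipeline, so a single loader definition writes each tap
      # to its own schema (tap_github, tap_pipedrive, ...).
      default_target_schema: $MELTANO_EXTRACT__LOAD_SCHEMA
jobs:
- name: github-to-bigquery
  tasks:
  - tap-github target-bigquery
- name: pipedrive-to-bigquery
  tasks:
  - tap-pipedrive target-bigquery
```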
r
Ah okay. I don’t think my loader supports schemas (https://hub.meltano.com/loaders/target-s3-parquet--jkausti/) - I’ll just steer away from `jobs` and define each per run instead, thanks!
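For reference, “define each per run” would look like the piecewise command that already worked, repeated once per pipeline with its own env var value (bucket paths and the second tap name below are placeholders):

```sh
# One invocation per pipeline, each with its own loader path.
TARGET_S3_PARQUET_PATH="s3://my-bucket/pipedrive/" meltano run tap-pipedrive target-s3-parquet
TARGET_S3_PARQUET_PATH="s3://my-bucket/other-source/" meltano run tap-other-source target-s3-parquet
```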