rune_andersen
01/27/2023, 12:12 PM
When I run
meltano run tap-pipedrive target-s3-parquet
I can supply the environment variable
TARGET_S3_PARQUET_PATH=<s3://new/custom/path>
and it works as expected. It would be great if I could replicate that functionality in jobs. I’ve tried the following without success:
jobs:
- name: pipedrive-to-s3
  tasks:
  - TARGET_S3_PARQUET_PATH=<s3://new/custom/path> tap-pipedrive target-s3-parquet
and
jobs:
- name: pipedrive-to-s3
  tasks:
  - tap-pipedrive target-s3-parquet
  env:
  - TARGET_S3_PARQUET_PATH: <s3://new/custom/path>
Is it possible that way, or should I try to put meltano config set ..
in the jobs in some way?
edgar_ramirez_mondragon
01/27/2023, 5:49 PM
env is not a supported key in jobs at the moment; see https://github.com/meltano/meltano/issues/6386 (PRs welcome!).
One approach I’ve used, if it makes sense, is to leverage pipeline env vars to configure the loader with different values for each extractor. For example, setting
default_target_schema: $MELTANO_EXTRACT__LOAD_SCHEMA
will set BigQuery’s schema to tap_github when the extractor is tap-github, etc.
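A minimal meltano.yml sketch of the approach Edgar describes, assuming a target-bigquery loader; the extractor name and surrounding layout are illustrative, while default_target_schema and $MELTANO_EXTRACT__LOAD_SCHEMA come from the message above:

plugins:
  extractors:
  - name: tap-github        # illustrative extractor
  loaders:
  - name: target-bigquery   # assumed loader
    config:
      # Meltano expands pipeline env vars at run time: on each run,
      # $MELTANO_EXTRACT__LOAD_SCHEMA resolves to the active extractor's
      # load schema (e.g. tap_github for tap-github), so every extractor
      # lands in its own schema without needing job-level env support.
      default_target_schema: $MELTANO_EXTRACT__LOAD_SCHEMA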
rune_andersen
01/30/2023, 8:09 AM
I’ll skip jobs
and define each per run instead, thanks!