I have defined several schedules in my `meltano.yml` like the following:
```yaml
schedules:
  - name: postgres-to-parquet
    interval: '@daily'
    extractor: tap-postgres-s002
    loader: target-parquet
    transform: skip
    start_date: 2024-02-01 00:00:00
    env:
      OUTPUT_FOLDER: s002
  - name: salesforce-to-parquet
    interval: '@daily'
    extractor: tap-salesforce
    loader: target-parquet
    transform: skip
    start_date: 2024-02-01 00:00:00
    env:
      OUTPUT_FOLDER: salesforce
  - name: coc-to-parquet
    interval: '@daily'
    extractor: tap-postgres-coc
    loader: target-parquet
    transform: skip
    start_date: 2024-02-01 00:00:00
    env:
      OUTPUT_FOLDER: coc
  - name: dh-to-parquet
    interval: '@daily'
    extractor: tap-postgres-dataharbor
    loader: target-parquet
    transform: skip
    start_date: 2024-02-01 00:00:00
    env:
      OUTPUT_FOLDER: dataharbor
```
I invoke these with the command:

```
meltano --environment={staging|production} schedule run <schedule_name>
```
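For example, a concrete invocation of the first schedule in the staging environment looks like this:

```
meltano --environment=staging schedule run postgres-to-parquet
```

For context, the remote state lives in Azure Blob Storage via a `state_backend` block along these lines; the container name and prefix below are placeholders, and the connection string is supplied through an environment variable:

```yaml
state_backend:
  # Hypothetical container/prefix; the real values are elided here.
  uri: azure://meltano-state/state
  azure:
    connection_string: ${AZURE_STORAGE_CONNECTION_STRING}
```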
However, if I check the values that are actually being written to my remote state in Azure Blob Storage, I only see the following: