benjamin_mitzkus
09/20/2022, 1:19 PM
When I run `meltano schedule run extract-load`, everything works as expected. But I found that for elt schedules, Airflow calls `meltano schedule run`, while for job schedules, Airflow calls `meltano run`, ignoring all schedule-level settings. Is there any way to define which dbt models I want to run per schedule? Or am I using it wrong?
```
version: 1
plugins:
  extractors: [...]
  orchestrators: [...]
  transformers: [...]
  files: [...]
schedules:
- name: extract-load
  interval: '0 0 * * *'
  job: EL
  env:
    DBT_MODELS: calendar appdata woocommerce shopify
jobs:
- name: EL
  tasks:
  - tap-calendar target-postgres
  - tap-appdata target-postgres
  - tap-woocommerce target-postgres
  - tap-shopify target-postgres
  - dbt:run
environments: [...]
```
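For context: the schedule-level `env` above is meant to behave like exporting the variable before invoking the job by hand, along the lines of the shell command below (illustrative; it assumes, as the thread does, that the dbt plugin picks `DBT_MODELS` up from the environment).

```
# meltano run inherits the shell environment, which is the effect
# the schedule's env block is supposed to reproduce.
DBT_MODELS="calendar appdata woocommerce shopify" \
  meltano run tap-calendar target-postgres dbt:run
```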
taylor
09/20/2022, 2:49 PM
I don't think `run` should be ignoring a schedule `env` like that, right?
benjamin_mitzkus
09/20/2022, 2:52 PM
I looked at `orchestrate/dags/meltano.py`: for job-type schedules the command is not `meltano schedule run`, which would respect env vars set at that level, but `meltano run <task>`.
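For context: in the generated DAG, the job-schedule branch built its task without the schedule's `env`, roughly as sketched below. This is a paraphrase reconstructed from the fix florian.hines posts later in the thread, not the verbatim generator code.

```
from airflow.operators.bash import BashOperator

# task_id, PROJECT_ROOT, MELTANO_BIN, run_args, and dag are all
# defined earlier in the generated DAG file.
# The schedule's env mapping is never passed here, so DBT_MODELS
# set on the schedule never reaches dbt.
task = BashOperator(
    task_id=task_id,
    bash_command=f"cd {PROJECT_ROOT}; {MELTANO_BIN} run {run_args}",
    dag=dag,
)
```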
florian.hines
09/20/2022, 2:56 PM
Yeah, job schedules skip `meltano schedule run` entirely.
taylor
09/20/2022, 3:00 PM
Yeah, we're ignoring `env` on schedules in an unexpected way. I totally get why we have the current state, though.
benjamin_mitzkus
09/20/2022, 3:03 PM
So patching `orchestrate/dags/meltano.py` so that `meltano schedule run` is invoked should be a workaround, right?
florian.hines
09/20/2022, 3:12 PM
```
task = BashOperator(
    task_id=task_id,
    bash_command=f"cd {PROJECT_ROOT}; {MELTANO_BIN} run {run_args}",
    dag=dag,
    env=schedule.get("env", {}),
    append_env=True,
)
```
09/20/2022, 3:12 PM
With `env` and `append_env` being the two new additions.
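A note on the fix: in Airflow's `BashOperator`, passing `env` alone replaces the subprocess environment entirely, while `append_env=True` (available since Airflow 2.3) merges the given variables into the inherited one. Below is a minimal standalone sketch using the values from this thread; the DAG wiring is illustrative.

```
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="meltano_extract_load",
    start_date=datetime(2022, 9, 20),
    schedule_interval="0 0 * * *",
) as dag:
    # append_env=True keeps PATH, VIRTUAL_ENV, etc. from the worker's
    # environment and layers the schedule-level variables on top.
    el = BashOperator(
        task_id="extract-load",
        bash_command="meltano run tap-calendar target-postgres dbt:run",
        env={"DBT_MODELS": "calendar appdata woocommerce shopify"},
        append_env=True,
    )
```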