fred_reimer
11/30/2021, 2:57 PM

ken_payne
11/30/2021, 3:17 PM

taylor
11/30/2021, 3:22 PM

fred_reimer
11/30/2021, 3:39 PM

fred_reimer
11/30/2021, 11:17 PM
INFO - Running: ['airflow', 'tasks', 'run', 'meltano_my_job_id', 'extract_load', '2021-11-30T22:25:35.373904+00:00', '--job-id', '807', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/meltano.py', '--cfg-path', '/tmp/tmpuriwn0ko', '--error-file', '/tmp/tmphxdcks1b']
That is all that is passed to meltano invoke in the container. I modified it to pull the appropriate repo for each job_id, so jobs stay separate and we don't need to reboot the core infrastructure every time a job is added. Now I just need the watcher to update the dags directory on the Airflow scheduler pod and it should be done.
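For anyone finding this later, here's a rough sketch of the kind of per-job DAG generation described above, not the actual code. The dag_id meltano_my_job_id and task_id extract_load come from the log; the JOB_REPOS mapping, repo URL, checkout path, and tap/target names are all placeholders I made up for illustration.
```
# Sketch only: one DAG per job_id, where the task pulls that job's repo
# before running Meltano. All repo/plugin names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Assumed mapping from job_id to the git repo holding that job's Meltano project.
JOB_REPOS = {
    "my_job_id": "git@github.com:example-org/meltano-my-job.git",  # placeholder
}

for job_id, repo in JOB_REPOS.items():
    dag = DAG(
        dag_id=f"meltano_{job_id}",
        start_date=datetime(2021, 11, 1),
        schedule_interval="@daily",
        catchup=False,
    )

    # Clone the job's repo (or pull if it's already checked out), then run the
    # EL job from inside that checkout. One repo per job is what lets new jobs
    # be added without rebooting the core infrastructure.
    BashOperator(
        task_id="extract_load",
        bash_command=(
            f"git clone --depth 1 {repo} /tmp/{job_id} 2>/dev/null || "
            f"git -C /tmp/{job_id} pull; "
            f"cd /tmp/{job_id} && "
            f"meltano elt tap-example target-example --job_id={job_id}"
        ),
        dag=dag,
    )

    # Expose the DAG at module level so the Airflow scheduler discovers it.
    globals()[f"meltano_{job_id}"] = dag
```
With something like this, the watcher only has to sync the generated file into the scheduler pod's dags directory; the scheduler should re-parse it and pick up new jobs on its own.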