Chad (08/23/2024, 5:44 AM)
[original question not captured in the export]

visch (08/23/2024, 1:08 PM)
aws s3 cp to copy the DAG to the S3 bucket you're talking about
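For concreteness, a minimal sketch of that copy step using boto3 instead of the CLI; the bucket name is hypothetical, and MWAA reads DAG files from the dags/ prefix of whichever bucket the environment is configured with:

```python
# Sketch: push the generated DAG file to the MWAA environment's bucket.
# "my-mwaa-bucket" is a hypothetical name; substitute your own.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="orchestrate/dags/meltano.py",  # the DAG file Meltano generated
    Bucket="my-mwaa-bucket",                 # hypothetical bucket name
    Key="dags/meltano.py",                   # MWAA watches the dags/ prefix
)
```

The CLI equivalent would be aws s3 cp orchestrate/dags/meltano.py s3://my-mwaa-bucket/dags/ with your real bucket substituted.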
visch (08/23/2024, 1:09 PM)
[message not captured in the export]

visch (08/23/2024, 1:10 PM)
There's an initialize command, so meltano run airflow:initialize. It does use the schedules you have defined to make these DAGs; code is here: https://github.com/meltano/airflow-ext/blob/main/files_airflow_ext/orchestrate/meltano.py
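The linked generator follows this general shape: ask Meltano for the schedules defined in meltano.yml and emit one Airflow DAG per schedule. A simplified sketch of that pattern (not the actual file; the JSON shape returned by meltano schedule list and the exact run command vary across Meltano versions):

```python
# Simplified sketch of schedule-driven DAG generation. Assumes
# `meltano schedule list --format=json` returns a flat list of
# {"name": ..., "interval": ...} entries, which varies by version.
import json
import subprocess
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

result = subprocess.run(
    ["meltano", "schedule", "list", "--format=json"],
    capture_output=True, text=True, check=True,
)

for schedule in json.loads(result.stdout):
    dag_id = f"meltano_{schedule['name']}"
    dag = DAG(
        dag_id,
        schedule_interval=schedule["interval"],
        start_date=datetime(2024, 1, 1),  # arbitrary illustrative date
        catchup=False,
    )
    BashOperator(
        task_id="run_schedule",
        bash_command=f"meltano schedule run {schedule['name']}",
        dag=dag,
    )
    globals()[dag_id] = dag  # the scheduler discovers DAGs via module globals
```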
Charles Feduke (08/23/2024, 2:38 PM)
[message not captured in the export]

Charles Feduke (08/23/2024, 2:44 PM)
orchestrate/dags/meltano.py is a Python program which programmatically generates the DAG definitions upon invocation by relying on your complete meltano.yml (I've broken mine up across several files for ease of maintenance). Basically, the Airflow scheduler (I think that is the correct term) periodically executes meltano.py and expects a JSON response that describes the DAGs, which it dutifully generates. I have had to heavily modify my meltano.py to work with the virtualenv I host Meltano in on MWAA.
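A sketch of the two-layer pattern being described, under the assumption that create_dags() returns JSON and a thin module-level loop builds real DAG objects from it at parse time; the spec fields here are illustrative, not the real schema:

```python
# Sketch: create_dags() returns a JSON description of the DAGs, and a
# module-level loop turns each entry into an Airflow DAG when the
# scheduler parses this file. Field names are made up for illustration.
import json
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator


def create_dags() -> str:
    """Return JSON describing the DAGs; in the generated file this is
    derived from meltano.yml, hard-coded here for illustration."""
    return json.dumps([
        {"dag_id": "meltano_daily_sync",
         "interval": "@daily",
         "command": "meltano run daily_sync"},
    ])


for spec in json.loads(create_dags()):
    dag = DAG(
        spec["dag_id"],
        schedule_interval=spec["interval"],
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    BashOperator(task_id="run", bash_command=spec["command"], dag=dag)
    globals()[spec["dag_id"]] = dag
```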
Charles Feduke (08/23/2024, 2:49 PM)
You could run meltano.py's create_dags() and capture its output on your local machine, based on your meltano.yml, and then make a Python program that has create_dags() defined to return the captured output. Then, as long as the meltano.yml on your ECS instances is similar to (hopefully the same as) what you used to generate the JSON output, this Python program can live in your S3 bucket for MWAA's DAGs.
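A sketch of that capture-and-freeze idea, continuing the illustrative spec from the previous snippet; the module and file names (dynamic_meltano_dags, captured_dags.json) are hypothetical:

```python
# Step 1, run locally: freeze the dynamic generator's output.
# "dynamic_meltano_dags" is a hypothetical module name for a local copy
# of the generated meltano.py.
from dynamic_meltano_dags import create_dags

with open("captured_dags.json", "w") as f:
    f.write(create_dags())
```

```python
# Step 2, ship to the MWAA DAGs bucket: a static meltano.py whose
# create_dags() returns the captured JSON instead of consulting
# meltano.yml at parse time. "captured_dags.json" sits next to this
# file in the bucket.
from pathlib import Path

_CAPTURED = Path(__file__).with_name("captured_dags.json").read_text()


def create_dags() -> str:  # same signature the DAG-building loop expects
    return _CAPTURED
```

The trade-off is that the DAGs only change when the capture step is re-run, which is why the meltano.yml on the ECS side needs to stay close to the one used to generate the JSON.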
Charles Feduke (08/23/2024, 2:53 PM, 2:54 PM, 2:57 PM)
[three follow-up messages not captured in the export]

Chad (08/23/2024, 11:27 PM)
[reply not captured in the export]