# infra-deployment
f
We are looking to move our current production deployment, which uses the integrated Airflow orchestration under Kubernetes, to running the jobs as airplane.dev Tasks. Airplane Tasks can run as a Docker image or as a simplified Python task, which also builds a Docker image but is designed to run a Python program as the task workload. We'd prefer the Python task, as we will need to set some things up for the Meltano ELT job run, such as env vars with sensitive information pulled dynamically from our security solution. Has anyone worked out, or are there examples of, starting a Meltano run from a Python program? Can we import the meltano module and call the entry point somehow? I'd rather not even consider shelling out and calling another instance of the Python interpreter if at all possible. If it is better (or required) to create a "full" Docker image with a bash script that calls various utilities to set up the run, we can do that too. Just wondering if anyone is calling Meltano ELT job runs from within a Python program without shelling out...
v
Not sure how deep you want to go here, but yes, every orchestrator has to do this its own way. There are examples here: https://hub.meltano.com/utilities/ (Airflow, Dagster, Cron). Dagster's is probably the closest for you: https://github.com/quantile-development/dagster-ext/blob/master/meltano/edk/process.py. Subprocessing out to Meltano is the way; doing a "full" Docker image works too. It's not that you can't call Meltano modules directly, you could hook into some internal APIs of the meltano package and make it work, it's just not supported.
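A minimal sketch of the subprocess approach, assuming secrets have already been fetched from the security solution as a dict (the function name, the `secrets` parameter, and the `meltano_bin` override are all illustrative, not part of any Meltano API):

```python
import os
import subprocess


def run_meltano_job(job_name: str, secrets: dict, meltano_bin: str = "meltano") -> int:
    """Run a Meltano job as a subprocess, merging secrets into its environment.

    `secrets` would be populated beforehand, e.g. from a vault lookup.
    `meltano_bin` is overridable mainly so the wrapper is easy to test.
    """
    # Merge the current environment with the dynamically fetched secrets,
    # so Meltano's env-var-based config resolution picks them up.
    env = {**os.environ, **secrets}
    result = subprocess.run(
        [meltano_bin, "run", job_name],
        env=env,
        check=False,  # inspect the return code ourselves instead of raising
    )
    return result.returncode
```

The same wrapper works identically inside a "full" Docker image, which is why the two approaches converge: either way the Meltano CLI runs as a child process with the environment prepared first.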
f
Thanks @visch, you are always helpful! I have no desire (or time!) to write a new orchestrator for Meltano, so it looks like I'll just go with the full Docker image and call the normal Meltano executable with the proper arguments for an ELT run.
v
Got it 😄. I tend to write a wrapper Python script to lightly wrap the Meltano calls I make on whatever machine I'm running on, but if you can use Docker, I think that's the right way.