We are looking to move our current production deployment, which uses the integrated Airflow orchestration under Kubernetes, to running the jobs as airplane.dev Tasks. An Airplane Task can run as a Docker image, or as a simplified Python task, which also builds a Docker image but is designed to run a Python program as the task workload. We'd prefer the Python task, since we need to set some things up for the Meltano ELT run, such as env vars with sensitive values pulled dynamically from our security solution.
Has anyone worked out, or are there examples of, starting a Meltano run from a Python program? Can we import the meltano module and call its entry point somehow? I'd rather not even consider shelling out and spawning another instance of the Python interpreter if at all possible.
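To make it concrete, here's a sketch of what we had in mind. This assumes Meltano's CLI is a Click group importable as `meltano.cli.cli`, which is an assumption on our part: Meltano doesn't document a stable Python API, so this could break between versions. The job name and env var names are made up for illustration.

```python
import os
import sys


def prepare_meltano_run(job_name, secret_env):
    """Inject secrets as env vars and build the argv for `meltano run`.

    Meltano reads plugin settings like TAP_FOO_PASSWORD from the
    environment, so this is where we'd wire in values pulled from
    our security solution.
    """
    os.environ.update(secret_env)
    return ["run", job_name]


if __name__ == "__main__":
    # Assumption: the Click group lives at meltano.cli.cli.
    from meltano.cli import cli

    # standalone_mode=False stops Click from calling sys.exit(),
    # so the surrounding task sees real exceptions on failure
    # instead of the interpreter just terminating.
    args = prepare_meltano_run(
        "my-elt-job",  # hypothetical job name
        {"TAP_FOO_PASSWORD": os.environ.get("SECRET_FROM_VAULT", "")},
    )
    sys.exit(cli.main(args=args, standalone_mode=False))
```

No idea yet whether Meltano tolerates being invoked in-process like this (state, logging config, etc.), which is really what we're asking.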
If it is better, or required, to build a "full" Docker image with a bash script that calls various utilities to set up the run, we can do that too. Just wondering if anyone is kicking off Meltano ELT job runs from within a Python program without shelling out...