Hi @michael_lan, @mark_johnston, we run this configuration successfully. In short, we created a Docker image based on the Meltano Docker image and deployed into it our meltano.yml file for the systems we want to connect to, plus any other connectivity requirements (e.g. the Oracle Thick Client). Don't store any credentials or secret values against the meltano.yml settings in this container, just the config.
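As a rough sketch of what that image build looks like (the project layout and the Oracle client step are illustrative, not our exact build):

```dockerfile
# Sketch only: based on the public Meltano image; paths and the Oracle
# client step are hypothetical placeholders.
FROM meltano/meltano:latest

WORKDIR /project

# Copy only the project definition - no credentials are baked in;
# secrets come from the environment at runtime.
COPY meltano.yml ./

# Any extra connectivity requirements (e.g. Oracle Instant/Thick Client)
# would be installed here.

# Install the plugins declared in meltano.yml.
RUN meltano install

ENTRYPOINT ["meltano"]
```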
This Docker image is pushed to ECR and run on ECS. To run Meltano we simply have the Airflow ECSOperator invoke the image. We chose Fargate with an ECS task, so the container only runs for the lifetime of the ingestion. You can pass the appropriate command when calling the ECSOperator to invoke the Meltano CLI with the desired tap / target. To supply settings for your taps and targets, either set them as environment variables when invoking the tap / target or use a utility like chamber to fetch them at runtime from AWS SSM Parameter Store.
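For illustration, here is a minimal sketch of the `overrides` payload that the ECSOperator (or a raw `ecs:RunTask` call) would pass so the already-deployed container runs one specific pipeline. The container name, tap, and target below are hypothetical placeholders, not our actual setup.

```python
# Build the ECS RunTask "overrides" payload that tells the Meltano
# container which tap/target to run. Shape matches what ecs:RunTask
# (and Airflow's ECSOperator `overrides` argument) expects.

def build_ecs_overrides(container_name, tap, target, env=None):
    """Return an overrides dict with the command and environment for
    one Meltano ingestion run."""
    environment = [{"name": k, "value": v} for k, v in (env or {}).items()]
    return {
        "containerOverrides": [
            {
                "name": container_name,
                # Meltano CLI invocation executed inside the container.
                "command": ["meltano", "run", tap, target],
                "environment": environment,
            }
        ]
    }

# Hypothetical example values:
overrides = build_ecs_overrides(
    "meltano",
    "tap-oracle",
    "target-postgres",
    env={"MELTANO_ENVIRONMENT": "prod"},
)
```

The same dict can be handed straight to the operator, so the image itself stays generic and one task definition serves every pipeline.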
You will also need to configure an appropriate Meltano system database to hold your state, for example RDS Postgres or Aurora in AWS.
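Concretely, the runtime wiring might look like the following (hostnames, the chamber namespace, and plugin names are hypothetical): `MELTANO_DATABASE_URI` points Meltano's system database at Postgres, and chamber injects the secrets from SSM Parameter Store before running the pipeline.

```shell
# Point Meltano's system database (which stores state) at RDS Postgres.
# Host, credentials, and database name are placeholders.
export MELTANO_DATABASE_URI="postgresql://meltano:<password>@my-rds-host:5432/meltano"

# chamber injects every secret stored under the "meltano" namespace in
# SSM as environment variables, then execs the Meltano CLI.
chamber exec meltano -- meltano run tap-oracle target-postgres
```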
I hope this gives you an overview of one possible architecture. There are certainly other approaches, such as invoking the Meltano CLI installed on an EC2 instance.