# infra-deployment
Is there a way to auto-scale the infrastructure where Meltano runs based on the ETL pipeline load? Does GCP Composer support deploying Meltano projects in its cluster? As I understand it, the Airflow instance uses the BashOperator to start the Meltano ETL pipeline, which assumes that Meltano is already installed on the same machine... is this possible in GCP Composer? Are there any articles to help us implement auto-scaling, or on how to use Composer with Meltano? Thanks
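For context, the BashOperator pattern described in the question looks roughly like this. This is a minimal sketch, not something from the thread: the project path, DAG id, and tap/target names are placeholder assumptions, and it presumes Meltano is installed in the same environment as the Airflow workers.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="meltano_elt",  # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_elt = BashOperator(
        task_id="run_elt",
        # cd into the Meltano project and invoke the pipeline;
        # the path and tap/target names are placeholders
        bash_command="cd /path/to/meltano-project && meltano elt tap-example target-example",
    )
```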
Yea, I'm using the KubernetesExecutor:
```yaml
plugins:
  orchestrators:
    - name: airflow
      pip_url: psycopg2-binary apache-airflow[kubernetes]==2.1.2 --constraint https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-${MELTANO__PYTHON_VERSION}.txt
      config:
        core:
          executor: KubernetesExecutor
        kubernetes:
          delete_worker_pods: true
          namespace: meltano
          in_cluster: true
          worker_container_repository: custom_registry/meltano-afworker
          worker_container_tag: latest
          pod_template_file: /project/.meltano/run/airflow/pod-template-file.yml
  files:
    - name: airflow
      pip_url: git+https://gitlab.com/meltano/files-airflow.git
```
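For reference, the `pod_template_file` referenced in the config above is a standard Kubernetes Pod manifest. The following is a minimal sketch of what such a file might contain, not the actual file from this setup: it assumes the Airflow 2.x convention that the worker container is named `base`, and it reuses the image and namespace values from the config above.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: airflow-worker  # overridden per task by the executor
  namespace: meltano
spec:
  restartPolicy: Never
  containers:
    - name: base  # Airflow 2.x expects the worker container to be named "base"
      image: custom_registry/meltano-afworker:latest
      env:
        # tasks launched by the KubernetesExecutor run in-process inside the pod
        - name: AIRFLOW__CORE__EXECUTOR
          value: LocalExecutor
```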
I'm working on moving the workers to Fargate pods. Someone here had a repo that I built off of, but mine has diverged pretty far from it now (everything is Terraform).