proud-pillow-55935
03/23/2021, 4:54 PM
We're looking at the DockerOperator that could potentially be used, as well as the KubernetesPodOperator, to actually trigger our Meltano pipeline to run in a container. Based on this use case, where have people found success/where would people recommend we start? We are currently running Airflow in GCP's managed service (aka Composer). Thank you!

great-gold-98639
03/23/2021, 5:23 PM
We use the KubernetesPodOperator to trigger our meltano pipelines and then store the state externally. I would say if you have an existing Composer instance, it makes sense to use it and not have to add additional infra for Meltano.
https://meltano.slack.com/archives/C01QM86B83A/p1614871705229100
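(For reference, a minimal sketch of what such a task could look like from the Airflow side. This assumes Composer on Airflow 1.10.x, where the operator lives under airflow.contrib; the DAG name, image path, tap, and target below are placeholders rather than anything from this thread.)

from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.utils.dates import days_ago

with DAG("meltano_elt", schedule_interval="@daily", start_date=days_ago(1)) as dag:
    run_elt = KubernetesPodOperator(
        task_id="meltano_elt",
        name="meltano-elt",
        namespace="default",
        image="gcr.io/my-project/meltano_image:latest",  # placeholder image path
        arguments=["--tap", "tap-postgres", "--target", "target-bigquery"],  # handed to the container's entrypoint
        get_logs=True,  # surface the pod's stdout in the Airflow task log
    )

With get_logs=True, whatever the container writes to stdout shows up in the Airflow task log, which is what the logging pattern later in the thread takes advantage of.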
proud-pillow-55935
03/23/2021, 8:11 PM
We're now using the KubernetesPodOperator to run our Meltano pipeline. Having trouble connecting to the database currently, but I think that is just a networking issue.

great-gold-98639
03/23/2021, 9:07 PM
import logging
import subprocess

# Stream the pipeline's output line by line so the pod's logs show progress in real time
ps = subprocess.Popen(
    f"meltano elt {tap} {target}...",
    shell=True,  # the command is a single string, so run it through the shell
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT)  # merge stderr into stdout so everything is captured
for line in ps.stdout:
    logging.info(line)
That way it streams the logs back out of the pod as they happen.

great-gold-98639
03/23/2021, 10:10 PM
FROM meltano/meltano:v1.70.0
ENV APP /meltano_image
WORKDIR $APP
# Copy some shell scripts like installing taps/targets
COPY helpers/ helpers/
# Copy our meltano directory (contains our meltano.yml, catalogs, etc)
COPY meltano/ meltano/
# Our entrypoint file that parses arguments from the KubernetesPodOperator, and runs the meltano command
COPY meltano.py meltano.py
RUN exec ./helpers/install_taps.sh
ENTRYPOINT ["python", "meltano.py"]
We then build the image and push it to GCR.
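(The meltano.py entrypoint itself isn't shown in the thread. A minimal sketch of what it might look like, assuming hypothetical --tap/--target flags and reusing the subprocess/logging pattern from earlier in the thread; the real file may differ.)

import argparse
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO)

# Hypothetical flag names; these would need to match the KubernetesPodOperator's arguments.
parser = argparse.ArgumentParser()
parser.add_argument("--tap", required=True)
parser.add_argument("--target", required=True)
args = parser.parse_args()

# Run the pipeline from the project directory copied into the image (assumed to be ./meltano)
# and stream its output so the pod's logs show progress in real time.
ps = subprocess.Popen(
    f"meltano elt {args.tap} {args.target}",
    shell=True,
    cwd="meltano",
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True)
for line in ps.stdout:
    logging.info(line.rstrip())
sys.exit(ps.wait())  # propagate the exit code so a failed pipeline fails the pod and the Airflow task

The arguments list on the KubernetesPodOperator sketch above would map directly onto these flags.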