Emre Üstündağ
01/21/2025, 1:10 PM
If I build a new Docker image, push it to AWS ECR, create a task definition from my Meltano project's image, and then run two tasks (one for the Airflow UI and one for the Airflow scheduler) from that task definition, will everything be fine? You may say "just try it", but I'm not sure this approach is best practice.

Edgar Ramírez (Arch.dev)
01/21/2025, 2:57 PM
That should work. As for what is "best practice": if you're on AWS, you might want to use a native Airflow deployment with MWAA and orchestrate the Meltano task with the ECS operator.
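Something like this, for illustration — a minimal DAG sketch using the Amazon provider's EcsRunTaskOperator. The cluster, task definition, container name, tap name, and subnet below are all placeholders for your own setup:

```python
# Sketch of an MWAA/Airflow DAG that triggers the Meltano ECS task.
# Requires apache-airflow-providers-amazon on the Airflow side.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.ecs import EcsRunTaskOperator

with DAG(
    dag_id="meltano_elt",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
):
    EcsRunTaskOperator(
        task_id="run_meltano",
        cluster="meltano-cluster",          # placeholder
        task_definition="meltano-project",  # placeholder: your registered task def
        launch_type="FARGATE",
        overrides={
            "containerOverrides": [
                {
                    # Container name as defined inside the task definition.
                    "name": "meltano",
                    "command": ["meltano", "run", "tap-mysource", "target-clickhouse"],
                }
            ]
        },
        network_configuration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-00000000"],  # placeholder
                "assignPublicIp": "ENABLED",
            }
        },
    )
```

The upside of this shape is that you don't need two always-on ECS tasks for the Airflow UI and scheduler; MWAA hosts those for you, and ECS only runs the short-lived Meltano job.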
So I need to know how to set up this infrastructure with a proper deployment process. I also don't know how to create a CI/CD workflow.
That depends somewhat on what platform you're developing on and whether you're familiar with IaC tools like Terraform. I'm sure there are guides out there on how to do CI/CD for AWS infra using Terraform.
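If you're on GitHub, for example, the image side could look roughly like the sketch below; the region, repository name, and secret names are all placeholders, and the Terraform plan/apply for the infra itself would be a separate job:

```yaml
# .github/workflows/deploy.yml — minimal sketch, assuming GitHub Actions,
# AWS credentials stored as repo secrets, and an existing ECR repository
# named "meltano-project" (all names are placeholders).
name: build-and-push
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-west-1  # placeholder
      - id: ecr
        uses: aws-actions/amazon-ecr-login@v2
      - name: Build and push the Meltano image
        run: |
          docker build -t "${{ steps.ecr.outputs.registry }}/meltano-project:${{ github.sha }}" .
          docker push "${{ steps.ecr.outputs.registry }}/meltano-project:${{ github.sha }}"
```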
Lastly, I am using target-clickhouse as a loader now. My ClickHouse DB is in an AWS Fargate service. Can I use dbt with ClickHouse?
Yeah, although you'll have to create a custom plugin definition in your meltano.yml with a pip URL that points to both meltano-dbt-ext and dbt-clickhouse.
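For illustration, the meltano.yml entry could look roughly like this, following the meltano-dbt-ext README pattern; the version pins are placeholders to adjust:

```yaml
# meltano.yml — custom utility plugin wiring dbt to ClickHouse (sketch).
# pip_url installs dbt-core, the ClickHouse adapter, and the Meltano
# dbt extension together; the pins below are placeholders.
plugins:
  utilities:
    - name: dbt-clickhouse
      namespace: dbt_ext
      pip_url: dbt-core~=1.7 dbt-clickhouse~=1.7 meltano-dbt-ext
      executable: dbt_invoker
```

You'd then run dbt through Meltano, e.g. `meltano invoke dbt-clickhouse ...`, with the arguments passed through to dbt.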
Should I try to use dbt Cloud to transform data outside the project?
That's honestly another option, if you'd like to have the dbt Cloud UI.
Emre Üstündağ
01/21/2025, 4:14 PM
Most likely I will face different challenges during deployment 🙂

Edgar Ramírez (Arch.dev)
01/21/2025, 5:30 PM
As expected with this sort of thing, it's hard to get exactly right the first time, but feel free to ask in #C0699V48BPF, #C069CSV7NHY, or even #C069CQNHDNF.
Emre Üstündağ
01/21/2025, 6:49 PM

Emre Üstündağ
01/27/2025, 1:16 PM

Emre Üstündağ
01/29/2025, 12:10 PM

Edgar Ramírez (Arch.dev)
01/29/2025, 5:02 PM