Is there a way to mark which schedules need to run...
# getting-started
Is there a way to mark which schedules need to run and which don't? I need this because I'm going to use one image for all clients, and clients usually don't connect all products, so I need to select which taps will run for a given client. Or maybe I can manage the list of schedules that get added to the orchestrator through an env var? That is, something like:

```yaml
schedules:
  - name: daily-gitlab-load
    disable: true
    interval: '@daily'
    job: tap-gitlab-to-target-postgres-with-dbt
```
Often, users will manage enabled/disabled status in the third-party orchestrator, such as in the Airflow UI. We don't currently have a feature in Meltano to disable schedules, but we do have a feature request to tie schedules to environments here.
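For example, assuming Airflow 2's stable REST API and an auth backend are enabled, pausing (disabling) a generated DAG from outside the UI could look roughly like this; the base URL, credentials, and DAG id are placeholders:

```python
# Minimal sketch: pause a DAG via Airflow's stable REST API.
# Assumes the API and basic auth are enabled; all values below are placeholders.
import requests

AIRFLOW_API = "http://localhost:8080/api/v1"
DAG_ID = "meltano_daily-gitlab-load"  # hypothetical generated DAG name

resp = requests.patch(
    f"{AIRFLOW_API}/dags/{DAG_ID}",
    json={"is_paused": True},   # flip to False to re-enable the schedule
    auth=("admin", "admin"),    # placeholder credentials
)
resp.raise_for_status()
print(resp.json())
```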
Does this help?
Then, purely theoretically, I could pass a list of products available for extraction through an env var and, in the interval field, check whether the product is in that list; if it isn't, I'd set the schedule to run on February 30th (i.e., never). Or I could pass a description for the DAG somewhere in the orchestrator properties.
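For what it's worth, here is a minimal sketch of that gating idea as a wrapper script around the Meltano CLI. The wrapper, the ENABLED_PRODUCTS variable, and the product key are hypothetical illustrations, not Meltano features; only the job name comes from this thread:

```python
# Hypothetical wrapper: run a Meltano job only if the client has the product enabled.
import os
import subprocess

ENABLED_PRODUCTS = {p for p in os.environ.get("ENABLED_PRODUCTS", "").split(",") if p}

def maybe_run(product: str, job: str) -> None:
    """Skip jobs for products this client hasn't connected."""
    if product not in ENABLED_PRODUCTS:
        print(f"Skipping {job}: product '{product}' is not enabled for this client")
        return
    subprocess.run(["meltano", "run", job], check=True)

if __name__ == "__main__":
    maybe_run("gitlab", "tap-gitlab-to-target-postgres-with-dbt")
```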
I don't know where to look for information or examples about setting up orchestrators with Meltano. The thing is, I'm going to spin up a new Meltano project for each of our partners and put all of the tap configuration there. But I want to have one orchestrator where I can do something like this:
1. I click on a specific partner.
2. I see a list of all their jobs.
At the same time, we're not satisfied with the ELT operation showing up in the orchestrator as a single block; we want three related tasks: get -> transform -> save. If you can give me some advice or examples, I would be incredibly grateful.
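One commonly used pattern (not necessarily what this project ends up doing) is a hand-written Airflow DAG per partner, with one BashOperator per step calling the meltano CLI. Note that with Singer, the tap streams records directly into the target, so extraction and loading happen in a single step and "get -> transform -> save" usually collapses into two tasks: extract/load, then transform. A rough sketch, with plugin names inferred from the job name in this thread and the project path, DAG id, and schedule as placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="partner_a_gitlab_pipeline",   # one DAG (or DAG prefix) per partner
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="cd /project && meltano run tap-gitlab target-postgres",
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="cd /project && meltano invoke dbt-postgres:run",
    )

    extract_load >> transform
```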
@aaronsteers Perhaps there is a way to skip receiving data from a tap? I could pass a flag in the tap config and then somehow skip the extraction inside the tap itself, something like that. Or maybe it's possible with the select operator?
Circling back on this post-holidays... were you able to get the orchestration-level / environment-level omissions you were looking for?
I also had the long weekend, so I have no results yet. But here's what I came up with: I'll have Dagster, Meltano, and Cloud Functions. One Cloud Function will watch for when the synchronization (Meltano) and processing need to start; that function will kick off execution in Dagster via a GraphQL call, passing the environment configuration. Dagster will call Meltano, and after the run finishes, Dagster will call the next Cloud Function, which will pick up the events from the Meltano target and work with that data in a single format.
This is my plan, but I don't know whether it's viable.
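As a rough sketch of the Dagster side of that plan, assuming each stage simply shells out to the meltano CLI (plugin names and the project path are placeholders): a job like this can then be launched from a Cloud Function through Dagster's GraphQL API.

```python
import subprocess

from dagster import job, op

def _meltano(*args: str) -> None:
    """Run a meltano command inside the project directory (placeholder path)."""
    subprocess.run(["meltano", *args], cwd="/project", check=True)

@op
def extract_load() -> None:
    _meltano("run", "tap-gitlab", "target-postgres")

@op
def transform(start) -> None:
    # 'start' only enforces ordering after extract_load; its value is unused.
    _meltano("invoke", "dbt-postgres:run")

@job
def partner_sync():
    transform(extract_load())
```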