Hey everyone!
Thank you for being such a great community!
I have another question today.
My plan is to run the ETL process on a k8s cluster, where Dagster will execute multiple concurrent Meltano jobs (one job per table, Postgres → Snowflake).
Will the incremental state stored in S3 be updated correctly in this setup? As I understand it, the state ID defaults to the same value for every job, since it's derived from the tap and target names, so concurrent jobs would stomp on each other's state. While typing this I started to think that I should probably use a unique state ID per table, so that Meltano stores each table's state in a separate JSON file...
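For context, here's a rough sketch of what I mean (bucket name, table names, and plugin names are placeholders; I believe `--state-id-suffix` on `meltano run` appends a suffix to the default `{environment}:{tap}-to-{target}` state ID, but someone please correct me if that's wrong):

```yaml
# meltano.yml (fragment) — store pipeline state in S3
state_backend:
  uri: s3://my-bucket/meltano/state

# Then each Dagster op would launch one table's sync with its own state ID, e.g.:
#   meltano run tap-postgres target-snowflake --state-id-suffix=orders
#   meltano run tap-postgres target-snowflake --state-id-suffix=customers
# so each table's incremental bookmark lands in a separate state file.
```

Each op would also need to restrict the tap's stream selection to its one table (e.g. via the tap's `select` setting or an env var override), otherwise every job would still sync everything.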