# troubleshooting
Hello, I'm using Meltano to extract data from CSV files and Postgres and load it into CSV. After that, I extract the CSV and load it into a Postgres DB. How can I do that automatically? Here's my `meltano.yml`:

```yaml
version: 1
default_environment: dev
project_id: 61596f47-a523-4d72-8166-c601203313b7
environments:
- name: dev
- name: staging
- name: prod
plugins:
  extractors:
  - name: tap-postgres
    variant: meltanolabs
    pip_url: git+https://github.com/MeltanoLabs/tap-postgres.git
    config:
      sqlalchemy_url: postgresql://northwind_user:thewindisblowing@localhost:5432/northwind
      filter_schemas: [public]
  - name: tap-csv
    variant: meltanolabs
    pip_url: git+https://github.com/MeltanoLabs/tap-csv.git
    config:
      files:
      - entity: order_details
        path: ../data
        keys:
        - order_id
  loaders:
  - name: target-csv
    variant: meltanolabs
    pip_url: git+https://github.com/MeltanoLabs/target-csv.git
    config:
      validate_records: false
      add_record_metadata: false
      file_naming_scheme: data/postgres/{stream_name}/{datestamp}/dados.csv
      default_target_schema: public
      default_target_table: dados_table
  - name: target-postgres
    variant: meltanolabs
    pip_url: meltanolabs-target-postgres
    config:
      sqlalchemy_url: postgresql://postgres:1234@localhost:5432/banco_destino
      default_target_schema: public
      default_target_table: dados_table
      add_record_metadata: false
      activate_version: false
```
You probably want to orchestrate Meltano with something like Airflow, Dagster, ECS, etc. You could even get away with using GitHub Actions. However you end up doing it, make sure the tasks share the same filesystem, or else back up the intermediate files and restore them before the csv -> postgres pipeline runs.
• https://docs.meltano.com/guide/orchestration/
• https://engineering.widen.com/blog/Dagster-+-Meltano/
• https://dagster.io/blog/dagster-meltano-integration-tutorial
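Before reaching for a full orchestrator, a minimal sketch of the same idea using only the Meltano CLI might look like this. It chains the two extract-load steps from your config into one `meltano run` invocation (steps execute in order, so the CSV written by `target-csv` exists before `tap-csv` reads it), then wraps them in a named job and schedule. The job and schedule names (`pg-to-pg-via-csv`, `daily-sync`) are made up for illustration, and the schedule still needs an orchestrator (e.g. the Airflow utility or cron) to actually fire:

```shell
# Run both pipelines back to back on the same filesystem:
# postgres -> csv, then csv -> postgres.
meltano run tap-postgres target-csv tap-csv target-postgres

# Optionally name the chain as a job (job/schedule names are hypothetical).
meltano job add pg-to-pg-via-csv \
  --tasks "tap-postgres target-csv tap-csv target-postgres"

# Attach a schedule to the job; an orchestrator or cron must invoke it.
meltano schedule add daily-sync --job pg-to-pg-via-csv --interval "@daily"
```

If the steps end up on different machines (e.g. separate ECS tasks or GitHub Actions jobs), the single `meltano run` chain won't help, and you'd need the shared-filesystem or backup/restore approach mentioned above.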