Desiree Cox
04/17/2024, 5:57 PM
itersize to increase the size of the batch tap-postgres reads from the source, and batch_size_rows to increase the number of records loaded to Snowflake at a time. Are there other parameters worth tuning here? Thank you in advance for any advice or feedback!
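
A minimal meltano.yml sketch of the two settings in question, assuming plugin variants that actually expose them (for example, the pipelinewise tap-postgres and target-snowflake variants); the values are illustrative, not recommendations:

plugins:
  extractors:
  - name: tap-postgres
    config:
      itersize: 100000          # rows fetched per server-side cursor batch (illustrative value)
  loaders:
  - name: target-snowflake
    config:
      batch_size_rows: 200000   # rows buffered before each bulk load to Snowflake (illustrative value)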

dylan_just
04/18/2024, 3:44 AM

dylan_just
04/18/2024, 3:45 AM

dylan_just
04/18/2024, 3:46 AM

Desiree Cox
04/18/2024, 3:36 PM
meltano run tap target command, but it looks like the strategy is more like issuing one meltano run tap target --select=tablename per table, defining each table as a pipeline, and scheduling the pipelines. Is that closer to it?
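
One way to express that per-table split in meltano.yml, sketched under the assumption that each table gets an inherited copy of the extractor with its own select rule; the table, job, and schedule names here are hypothetical:

plugins:
  extractors:
  - name: tap-postgres
  - name: tap-postgres--orders       # hypothetical per-table copy of the extractor
    inherit_from: tap-postgres
    select:
    - public-orders.*                # only this table's stream
jobs:
- name: load-orders
  tasks:
  - tap-postgres--orders target-snowflake
schedules:
- name: load-orders-hourly
  job: load-orders
  interval: "@hourly"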

dylan_just
04/18/2024, 9:40 PM

dylan_just
04/18/2024, 9:43 PM

dylan_just
04/18/2024, 9:48 PM

Desiree Cox
04/18/2024, 9:49 PM

dylan_just
04/18/2024, 9:58 PM
mappers:
- name: meltano-map-transformer
  variant: meltano
  pip_url: git+https://github.com/MeltanoLabs/meltano-map-transform.git@v0.7.0
  mappings:
  - name: mapper-mydb
    config:
      stream_maps:
        public-mytable:
          __key_properties__:
          - team_id
          - user_id
          - created
On large tables, we have a different problem: reading too many rows in one go causes problems on the source database, so we use the limit parameter to cap each run at 250k rows.
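
A sketch of where that cap would sit, assuming the tap variant in use exposes a limit setting as described above (it is not a standard setting in every tap-postgres variant); the value mirrors the 250k mentioned:

plugins:
  extractors:
  - name: tap-postgres
    config:
      limit: 250000   # per-run row cap, as described above; availability depends on the tap variant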

dylan_just
04/18/2024, 10:00 PM