# random
Tyler P
I am running a custom tap that extracts from a REST API. All records have a primary key, so incremental syncs work and everything is great until the source/data producer deletes a record, leaving us with orphaned records that skew the data. Previously, we used a custom Python script to compare the source data to the destination table on every upload. Any ideas on how to handle hard deletes from the source within Meltano and dbt? We are using a custom extractor and loading into Postgres.
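For context, a minimal sketch of that kind of reconciliation script, assuming a hypothetical API endpoint, destination table, and `id` primary key column (all names here are made up, not our actual setup):

```python
# Rough sketch: fetch the primary keys still present at the source,
# then delete destination rows whose keys no longer exist there.
import psycopg2
import requests

SOURCE_URL = "https://api.example.com/records"   # hypothetical source endpoint
TABLE = "analytics.raw_records"                  # hypothetical destination table


def fetch_source_ids():
    """Collect all primary keys currently present at the source, following pagination."""
    ids = []
    url = SOURCE_URL
    while url:
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        ids.extend(row["id"] for row in payload["data"])
        url = payload.get("next")                # next page URL, if any
    return ids


def delete_orphans(conn, source_ids):
    """Remove destination rows whose primary key no longer exists at the source."""
    if not source_ids:
        return                                   # guard: an empty list would wipe the table
    with conn.cursor() as cur:
        cur.execute(
            f"DELETE FROM {TABLE} WHERE id != ALL(%s)",
            (source_ids,),
        )
    conn.commit()


if __name__ == "__main__":
    with psycopg2.connect("dbname=warehouse") as conn:
        delete_orphans(conn, fetch_source_ids())
```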
Edgar Ramírez (Arch.dev)
Hi @Tyler P! The usual way this is handled in the Singer ecosystem is with ACTIVATE_VERSION messages. There's a feature request for the Singer SDK to implement it: https://github.com/meltano/sdk/issues/18. Let me know and give the issue a 👍 if you think that'd help your use case.
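For illustration, here is a rough sketch of what the ACTIVATE_VERSION pattern looks like on a tap's stdout. The stream name and schema are made up, and at the time of this thread the SDK did not emit these messages for you, which is what the linked issue tracks:

```python
# Sketch of the ACTIVATE_VERSION pattern: every RECORD is stamped with a version,
# then an ACTIVATE_VERSION message tells the target that the version is complete,
# so rows carrying an older version can be hard- or soft-deleted. That covers
# records that were deleted at the source.
import json
import sys
import time


def emit(message):
    sys.stdout.write(json.dumps(message) + "\n")


def sync_full_table(records):
    version = int(time.time() * 1000)            # monotonically increasing version
    emit({
        "type": "SCHEMA",
        "stream": "records",
        "schema": {"properties": {"id": {"type": "integer"}}},
        "key_properties": ["id"],
    })
    for record in records:
        emit({
            "type": "RECORD",
            "stream": "records",
            "record": record,
            "version": version,
        })
    # Activate the new version: a target that supports this can now remove
    # any rows from earlier versions of the stream.
    emit({"type": "ACTIVATE_VERSION", "stream": "records", "version": version})


if __name__ == "__main__":
    sync_full_table([{"id": 1}, {"id": 2}])
```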
Tyler P
@Edgar Ramírez (Arch.dev) this sounds like it would do the trick.
Edgar Ramírez (Arch.dev)
Cool. I've timeboxed a block next week to dive into it. It's long overdue, but I hope to make good progress on it.
Tyler P
Awesome. Thanks a bunch @Edgar Ramírez (Arch.dev)