# singer-tap-development
We're interested in approaches to backfill large amounts of data from an API that's not very performant. The situation is that if we request all historical records from this API endpoint, it takes ~10 minutes to respond, so we want to grab batches of data (say 1-2 days), paginating through each batch and then grabbing the next. One idea we had was to override `request_records()` to loop through a time window in chunks, but it feels like there might be an easier pattern that we're not seeing.
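As a rough sketch of the chunking idea: the windowing logic can live in a small generator that an overridden `request_records()` (or any fetch loop) iterates over. The helper names `date_windows`, `backfill`, and `fetch_window` below are hypothetical, not part of the Singer SDK; this only illustrates the loop structure, not the SDK's actual request machinery.

```python
from datetime import datetime, timedelta
from typing import Callable, Iterable, Iterator, Tuple

def date_windows(
    start: datetime, end: datetime, window_days: int = 2
) -> Iterator[Tuple[datetime, datetime]]:
    """Yield (window_start, window_end) pairs covering [start, end).

    The final window is clamped to `end` so it never overshoots.
    """
    step = timedelta(days=window_days)
    cursor = start
    while cursor < end:
        yield cursor, min(cursor + step, end)
        cursor += step

def backfill(
    fetch_window: Callable[[datetime, datetime], Iterable[dict]],
    start: datetime,
    end: datetime,
    window_days: int = 2,
) -> Iterator[dict]:
    """Stream records window by window.

    `fetch_window` is a hypothetical callable that handles pagination
    within one window (e.g. the tap's normal request/paginate path with
    the window bounds passed as query params).
    """
    for w_start, w_end in date_windows(start, end, window_days):
        yield from fetch_window(w_start, w_end)
```

Inside a `RESTStream` subclass, the equivalent would be to loop over these windows in `request_records()` and run the normal pagination for each window, advancing the stream's bookmark as each window completes so an interrupted backfill can resume.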