I see. Would the memory consumption be roughly constant even if the volume of data fluctuates?
I would assume that the memory usage would fluctuate:
• with the number of rows transferred (e.g. if on some days the source system sees an abnormal amount of updates, or a full refresh is triggered). I'm basing this assumption on the fact that bulk loading would require rows to be buffered on the target side, and I'm not sure whether the tap will generally fetch each "page" of data, pipe it over, and then free that memory before fetching the next "page".
• as additional fields are added to the stream (since each record now needs more memory)
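For what it's worth, whether memory stays roughly flat hinges on the paging question above. Here is a minimal sketch of the streaming pattern a tap *could* use to keep peak memory at about one page regardless of total row volume. Everything here is hypothetical: `fetch_page` is a stand-in for whatever API or database cursor the real tap reads from, and the page/row sizes are made up for illustration.

```python
from typing import Dict, Iterator, List

# Stand-in for the real source (API call, DB cursor, etc.).
# Pretends the source holds 7 rows, served in pages of `page_size`.
def fetch_page(page_num: int, page_size: int = 3) -> List[Dict]:
    total = 7
    start = page_num * page_size
    return [{"id": i} for i in range(start, min(start + page_size, total))]

def stream_records(page_size: int = 3) -> Iterator[Dict]:
    """Yield records one page at a time.

    Each page is released before the next one is fetched, so peak
    memory is ~one page, independent of how many rows the source
    has on a given day.
    """
    page_num = 0
    while True:
        page = fetch_page(page_num, page_size)
        if not page:
            return
        # After this loop finishes, `page` is rebound on the next
        # fetch and the old page becomes garbage-collectable.
        yield from page
        page_num += 1

records = list(stream_records())
```

If the tap instead accumulates all pages in a list before emitting anything (or the target buffers the full batch for a bulk load), memory would scale with row volume rather than staying constant, which is exactly the fluctuation described above.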