# troubleshooting
s
Is `elt.buffer_size` the size for one row? We are working with nested JSON, so I have a feeling we have one gigantic row that is larger than the buffer size, which is causing failures in our pipelines.
v
yes
It also impacts how much buffer space is available for multiple records, due to some interesting things in asyncio with pipes, but generally yes (that is my understanding at least 😄).
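For reference, a minimal sketch of raising the setting project-wide, assuming a standard Meltano project layout with core settings nested in `meltano.yml`; the 50 MB value (in bytes) is illustrative:

```yaml
# meltano.yml (fragment) -- raise the tap/target pipe buffer project-wide.
# elt.buffer_size is specified in bytes; 52428800 bytes = 50 MB.
elt:
  buffer_size: 52428800
```

The same setting can also be overridden at run time through the matching environment variable (`MELTANO_ELT_BUFFER_SIZE`, following Meltano's usual settings-to-env-var convention), which is handy for trying a value before committing it to the project file.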
s
Okay, cool. I am a little worried that misconfiguring that will cause failures down the road... I was able to load the data set after increasing it to 50 MB, though. I am thinking I can set different buffer sizes, if needed, for the data sets that I know have larger rows.
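One possible way to do that, sketched under the assumption that Meltano environments are in use in the project; the environment name and the 50 MB value are illustrative:

```yaml
# meltano.yml (fragment) -- override the buffer only for pipelines that
# carry unusually wide rows, leaving the project default untouched.
environments:
  - name: big_rows
    env:
      # Assumes Meltano's settings-to-env-var mapping for elt.buffer_size.
      MELTANO_ELT_BUFFER_SIZE: "52428800"  # 50 MB in bytes
```

A pipeline for those data sets would then select the environment explicitly, e.g. `meltano --environment=big_rows run tap-foo target-bar` (plugin names hypothetical), while everything else keeps the default buffer.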