# troubleshooting
**Rutul Saraiya**:
Hi, I am using the mysql extractor and mysql loader. When I try to transfer 1 million records, my server storage fills to 100%. Am I doing anything wrong, or am I missing some configuration? I can see that the 1 million records are loaded, but the logs are not being cleaned up, and my process never completes; it stays stuck in the same state. Do I need a server with more storage and RAM? Can anyone help me out with this issue? Thanks.
**Edgar Ramírez (Arch.dev)**:
Hi @Rutul Saraiya! Are you running Meltano with the `DEBUG` log level?
**Rutul Saraiya**:
Hi @Edgar Ramírez (Arch.dev), I am new to some of these setups. I am running this in a dev environment and I'm not sure where to change the `DEBUG` mode. Thanks.
**Edgar Ramírez (Arch.dev)**:
Ok, can you try running with a higher log level than the default, e.g. `meltano --log-level=error ...`? I need to confirm whether that alone (and not a full-blown logging config file) is enough to reduce the volume of logs saved to disk.
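If the CLI flag alone isn't enough, Meltano also reads a project-level `logging.yaml` file, which follows Python's `logging` dictConfig schema. A minimal sketch that keeps only errors on the console might look like this (the formatter and handler names here are illustrative, not Meltano defaults):

```yaml
# logging.yaml — placed in the Meltano project root
version: 1
disable_existing_loggers: false
formatters:
  default:
    format: "[%(levelname)s] %(name)s: %(message)s"
handlers:
  console:
    class: logging.StreamHandler
    level: ERROR          # suppress everything below ERROR
    formatter: default
    stream: ext://sys.stderr
root:
  level: ERROR
  handlers: [console]
```

With a config like this, nothing below `ERROR` should be emitted, which ought to cut the on-disk log volume substantially.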
**Rutul Saraiya**:
Thanks for the reply, I will check and let you know.
I ran the command `meltano --log-level=error`, but it is still filling up my disk and the process is unable to complete. I haven't tried the configuration you mentioned; I will try it and let you know. My server actually hangs, and there is no other way out than deleting the logs, so I might need to run a small chunk at a time and then try updating that configuration for the etl.log file.
Do you have any suggestions for resolving this? At minimum, how much memory and storage is required? I am using an EC2 t3.large, but it isn't working with that.
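As a stopgap while a long-running job fills the disk, one can watch and prune the log directory from the shell. This is a hedged sketch: the `.meltano/logs` path is Meltano's usual in-project log location, so adjust it if your project is laid out differently.

```shell
# Sketch: keep Meltano's on-disk logs in check.
# Assumes logs live under .meltano/logs in the project directory.
LOGDIR=".meltano/logs"
mkdir -p "$LOGDIR"    # no-op if the project already created it

# See how much disk the logs are consuming.
du -sh "$LOGDIR"

# Delete log files older than one day (only do this for runs
# you no longer need to inspect).
find "$LOGDIR" -type f -name '*.log' -mtime +1 -delete
```

Running the `du` line periodically (e.g. via `watch du -sh .meltano/logs`) makes it easy to tell whether the log level change actually reduced the write volume.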