# getting-started
j
Hi there - I'm just looking into saving state - is there a way I can store state into snowflake? I am happy to scrape and dump a file from the local system into Snowflake. I am on a machine where the file system resets after every run.
a
I like this idea - I had wondered whether it might work. Presently
MELTANO_DATABASE_URI
works with Postgres, but it's just SQLAlchemy under the hood, so a Snowflake-based system db might work. Perhaps the @meltano guys have thought about this, or about a generic SQLAlchemy state backend.
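To make that concrete: the system db URI is just a SQLAlchemy URL, so in principle you could swap the dialect. A hedged sketch follows - the Postgres form is documented, but the Snowflake form is speculative and untested (account, user, and warehouse names are placeholders):

```shell
# Documented: point Meltano's system db (default: SQLite at
# .meltano/meltano.db) at Postgres instead.
export MELTANO_DATABASE_URI="postgresql://meltano:meltano@localhost:5432/meltano"

# Speculative: since it's SQLAlchemy under the hood, a Snowflake URL via the
# snowflake-sqlalchemy dialect *might* work, but Meltano's migrations are not
# guaranteed to run against Snowflake -- treat this as an experiment.
# pip install snowflake-sqlalchemy
export MELTANO_DATABASE_URI="snowflake://MY_USER:MY_PASS@MY_ACCOUNT/MY_DB/PUBLIC?warehouse=MY_WH"
```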
j
Yeah, my original idea was a bit more hacky: after Meltano runs, scrape and dump the local filesystem file to an internal stage in Snowflake, then pull it back out before the next run. Stages in Snowflake are like S3 buckets, to my understanding. If I did that, would the only file I need to back up be the meltano.db file?
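That backup/restore loop could be sketched with SnowSQL's PUT and GET against an internal stage. A rough sketch, assuming the default SQLite system db at .meltano/meltano.db; the stage name here is made up:

```shell
# One-time: create an internal stage to hold the state file (hypothetical name).
snowsql -q "CREATE STAGE IF NOT EXISTS meltano_state"

# After a run: upload the system db, which holds job runs and incremental state.
snowsql -q "PUT file://$(pwd)/.meltano/meltano.db @meltano_state AUTO_COMPRESS=FALSE OVERWRITE=TRUE"

# Before the next run (fresh filesystem): pull it back down.
mkdir -p .meltano
snowsql -q "GET @meltano_state/meltano.db file://$(pwd)/.meltano/"
```

If meltano.yml and plugins are rebuilt from your repo each run, meltano.db should be the main stateful file, since it's where the incremental-state bookmarks live.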
p
Another option is to use cloud storage as the backend, and have the data also be queryable in Snowflake, either by making the storage location an external table or by setting up an EL job from there to Snowflake.
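For the cloud-storage option, Meltano ships state backends for object stores, configured via `state_backend.uri`. A minimal sketch, assuming an S3 backend; the bucket and prefix are placeholders:

```shell
# Point Meltano's state backend at an S3 bucket/prefix (placeholders).
meltano config meltano set state_backend.uri "s3://my-bucket/meltano-state"

# Credentials can also come from env vars or an IAM role.
meltano config meltano set state_backend.s3.aws_access_key_id "AKIA..."
meltano config meltano set state_backend.s3.aws_secret_access_key "..."
```

With state written as objects under that prefix, the same location could then be exposed to Snowflake as an external table or loaded via an EL job, as suggested above.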
e
@aaron_phethean the problem with trying to leverage the system db as a generic sqlalchemy state backend is that you have to make the migrations incredibly generic (that's hard! see the PR that added mssql support). FWIW I have a PoC to add support for arbitrary state backends, packaged as plugins: https://github.com/meltano/meltano/pull/8367
👀 1
Maybe that 👆 could be used to create a generic sqlalchemy state backend, or specific ones for snowflake, etc.