# troubleshooting
Paschalis Dimitriou:
Hey everyone. During local development I want to use a different state_backend than in production.
• When I hardcode the S3 URI (A), it works great.
• When I pass it in via the MELTANO_ENVIRONMENT env variable (B), it doesn't work. In the logs I can see the passed value of MELTANO_ENVIRONMENT, and the process writes to the right S3 path and creates empty tables in my destination (Snowflake), but it doesn't fill them up. It feels like some parts of the Meltano project read the variable and the full S3 path correctly, while other parts can't locate it and therefore fail to insert new entries.
• Is this a known issue?

A)
```yaml
state_backend:
  uri: s3://el-meltano-state-bucket/dev
```
B)
```yaml
state_backend:
  uri: s3://el-meltano-state-bucket/$MELTANO_ENVIRONMENT
```
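For readers following along, here is a minimal sketch of how the environment selection and the interpolated state backend URI might fit together in meltano.yml. Only the bucket URI comes from the thread; the environment names, the default_environment value, and the tap/loader names in the comments are placeholder assumptions.

```yaml
# meltano.yml — illustrative sketch; environment names and the plugins
# referenced in the comments are placeholders, not from the thread.
default_environment: dev

environments:
  - name: dev
  - name: prod

state_backend:
  # $MELTANO_ENVIRONMENT expands to the name of the active environment,
  # so state lands under s3://el-meltano-state-bucket/dev for dev runs.
  uri: s3://el-meltano-state-bucket/$MELTANO_ENVIRONMENT

# The active environment can be selected via the CLI flag or the env var:
#   meltano --environment=dev run tap-example target-snowflake
#   MELTANO_ENVIRONMENT=dev meltano run tap-example target-snowflake
```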
Edgar Ramírez (Arch.dev):
Hi @Paschalis Dimitriou! I can't reproduce your issue. Using `$MELTANO_ENVIRONMENT` in the `state_backend` `uri` works as expected. Do you see a log message like `Writing state to AWS S3`?
Paschalis Dimitriou:
Thank you for your prompt answer @Edgar Ramírez (Arch.dev), I appreciate it. It was a bug on our side.
• For the community's knowledge: we were inheriting the Snowflake loader configuration, and in the inherited target destinations we were re-specifying `variant: meltanolabs` and `pip_url: meltanolabs-target-snowflake`. That re-specification was causing the misbehavior with MELTANO_ENVIRONMENT. (Not sure if this needs to be fixed in the Meltano project, since it is a rare misuse case.) A sketch of the setup is below.
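To illustrate the misconfiguration described above, here is a sketch of a meltano.yml plugins section where an inherited loader re-declares the variant and pip_url that the base plugin already provides. The inherited loader's name is a placeholder; per the message above, the fix was simply removing the re-declared lines.

```yaml
# Illustrative sketch of the setup described in the thread; the
# inherited loader name is a placeholder.
plugins:
  loaders:
    - name: target-snowflake
      variant: meltanolabs
      pip_url: meltanolabs-target-snowflake

    - name: target-snowflake--reports
      inherit_from: target-snowflake
      # Re-declaring these on the inherited plugin is redundant — the
      # thread reports this is what broke the MELTANO_ENVIRONMENT-based
      # state_backend URI; removing them resolved the issue.
      variant: meltanolabs
      pip_url: meltanolabs-target-snowflake
```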