# infra-deployment
h
Hi Folks, I'm facing this challenge for the first time, and not sure how others have tackled it. We dockerize meltano and run the meltano ETL processes through AWS Batch. Secrets are passed as environment variables from AWS Secrets Manager to the container. We have to ingest some information from Google BigQuery, so we're looking at the tap variants. All of them require a _client_secrets.json_ file to be available on the container (e.g. tap-bigquery on Meltano Hub). How do other folks inject sensitive files into the docker container at runtime?
Or, is there a way to specify the secrets themselves as environment variables rather than pointing to a file?
v
I think the tap should support passing those in as config settings. But if you just need to make it work, I'd mount a volume and point the tap at that.
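For the volume-mount route, the tap config just needs to point at wherever the file lands inside the container. A minimal meltano.yml sketch, assuming your tap-bigquery variant exposes a `credentials_path` setting (check the variant's docs on Meltano Hub; the setting name and mount path here are assumptions):

```yaml
# meltano.yml sketch -- setting name and mount path are illustrative
plugins:
  extractors:
    - name: tap-bigquery
      config:
        # path where the secret file is mounted into the container
        credentials_path: /secrets/client_secrets.json
```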
m
We run dockerized meltano in Kubernetes with Argo Workflows. Our secrets are stored in AWS Secrets Manager. We use the External Secrets Operator to expose those as Kubernetes secrets and then mount them as environment variables for meltano to access.
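For reference, the External Secrets Operator side of a setup like this could look roughly like the sketch below; the store, names, and keys are illustrative, not the poster's actual manifest:

```yaml
apiVersion: external-secrets.io/v1beta1
kind: ExternalSecret
metadata:
  name: meltano-secrets
spec:
  refreshInterval: 1h
  secretStoreRef:
    kind: ClusterSecretStore
    name: aws-secretsmanager        # assumes this store is already configured
  target:
    name: meltano-secrets           # Kubernetes Secret the operator creates
  data:
    - secretKey: TAP_BIGQUERY_CREDENTIALS   # key in the Kubernetes Secret
      remoteRef:
        key: prod/meltano/bigquery          # secret name in AWS Secrets Manager
```

The resulting Kubernetes Secret can then be exposed to the meltano pod via `envFrom` or per-key `env` entries.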
h
Thank you for the insight, folks
s
@haleemur_ali for injecting secrets as files, we store them as strings in Secrets Manager and then run a script on CodeBuild to extract them and save them as files in our running container. That way we can mimic the development environment in production 😄
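A minimal Python sketch of that pattern as a container entrypoint, assuming the secret's string value has already been injected as an environment variable (the variable name and target path below are hypothetical):

```python
#!/usr/bin/env python3
"""Entrypoint sketch: write a JSON secret from an env var to the file the
tap expects, then hand off to meltano."""
import os
import sys

SECRET_ENV = "GOOGLE_CREDENTIALS_JSON"    # hypothetical env var name
TARGET_PATH = "/app/client_secrets.json"  # path the tap config points at

def main() -> None:
    payload = os.environ.get(SECRET_ENV)
    if not payload:
        sys.exit(f"{SECRET_ENV} is not set; cannot write {TARGET_PATH}")
    # Create the file owner-readable only, so the key isn't world-readable.
    fd = os.open(TARGET_PATH, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "w") as f:
        f.write(payload)
    # Replace this process with the real workload, forwarding any args.
    os.execvp("meltano", ["meltano", *sys.argv[1:]])

if __name__ == "__main__":
    main()
```

Exec-ing meltano rather than spawning a subprocess keeps signal handling and exit codes intact for the batch scheduler.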