# singer-tap-development
s
Hello again (I'm on 🔥 this week) Has anyone ever used the target-bigquery loader in production? If so, were you able to set credentials as variables instead of a JSON keyfile? If so, how?
a
I just used the keyfile approach. This could be a dead end, but you might look into the underlying GCP library and see if there's an env var way to authenticate. If the file-based auth path is not provided, the target may (?) fall back on the library's native auth protocols, similar to `boto3` for AWS parsing creds from `AWS_ACCESS_KEY_ID`, etc., even if they are not explicitly sent through the config.
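For reference, a minimal sketch of what that fallback could look like on the GCP side, assuming (not confirmed for this target) that it delegates to the standard `google-auth` / `google-cloud-bigquery` libraries, whose Application Default Credentials flow picks up the `GOOGLE_APPLICATION_CREDENTIALS` env var among other sources:
```python
# Sketch only: how Google's client libraries resolve credentials when no
# explicit keyfile path is supplied (Application Default Credentials).
# Assumes google-auth and google-cloud-bigquery are installed.
import google.auth
from google.cloud import bigquery

# google.auth.default() checks, in order: the GOOGLE_APPLICATION_CREDENTIALS
# env var (path to a service-account JSON keyfile), gcloud user credentials,
# and the metadata server when running on GCP infrastructure.
credentials, project = google.auth.default()

# If the target falls back on library defaults, this is roughly what happens
# under the hood when no keyfile is passed through the config.
client = bigquery.Client(project=project, credentials=credentials)
```
So exporting `GOOGLE_APPLICATION_CREDENTIALS` (or running on GCP with an attached service account) may be enough, provided the target doesn't insist on an explicit keyfile setting.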
s
Thanks! That's an awesome idea. I also started using @alexander_butler's target, and he was kind enough to implement this feature for me, so finally I have no work at all to do 😉 🎉
@aaronsteers could you expand on how you implemented the keyfile approach? For now, my best guess is to turn the JSON into a string, store it in Secrets Manager, then read the secret in production, turn it BACK into a file with a Python script, and store it somewhere my application can read it.
a
@Stéphane Burwash - Your path sounds similar to how I would approach it. My own experience with this BigQuery target was still in the POC phase, so I don't have a reusable deploy solution. But again, yes, your proposed path sounds like how I'd approach it.
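A minimal sketch of that flow using `boto3` rather than shelling out to the AWS CLI; the secret name, file path, and the assumption that the keyfile was stored as the secret's plain `SecretString` are all placeholders, not the actual setup:
```python
# Sketch: fetch a stringified service-account keyfile from Secrets Manager
# and write it back to disk so the target can read it as a normal keyfile.
import json
import os

import boto3

SECRET_ID = "my/bigquery/keyfile"           # hypothetical secret name or ARN
KEYFILE_PATH = ".secrets/tmp_keyfile.json"  # where the target will read it

client = boto3.client("secretsmanager")
response = client.get_secret_value(SecretId=SECRET_ID)

# SecretString holds the stringified keyfile; parse and re-serialize it so
# the file on disk is valid JSON.
keyfile = json.loads(response["SecretString"])

os.makedirs(os.path.dirname(KEYFILE_PATH), exist_ok=True)
with open(KEYFILE_PATH, "w") as f:
    json.dump(keyfile, f)
```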
s
Awesome, thanks! I'll try to share my approach once it works.
```python
import json
import os
import subprocess

# (AWS secret key name, resulting keyfile name) pairs -- placeholders
values = [
    ("SECRET_#1_NAME", "RESULTING_FILE_#1_NAME"),
]


def create_keyfile(aws_secret_name, associated_file_name):
    # Fetch the secret with the AWS CLI; SECRET_ARN is the ARN of the single
    # Secrets Manager secret that holds all keyfiles as a JSON map.
    data = subprocess.run(
        ["aws", "secretsmanager", "get-secret-value", "--secret-id", "SECRET_ARN"],
        capture_output=True,
        text=True,
        check=True,
    )
    payload = json.loads(data.stdout)
    # SecretString is itself JSON: a map of secret names to stringified keyfiles.
    secrets = json.loads(payload["SecretString"])
    keyfile = json.loads(secrets[aws_secret_name])
    # Write the keyfile back to disk where the application can read it.
    os.makedirs(".secrets", exist_ok=True)
    with open(f".secrets/tmp_{associated_file_name}.json", "w") as st:
        json.dump(keyfile, st)


for secret_name, file_name in values:
    create_keyfile(secret_name, file_name)
```
Tadahhh, this is going in my .codebuild