sean_glynn
01/04/2022, 4:42 PM
I have a question about referencing a private tap within my meltano.yml pipeline declaration.
As per the docs, we can give pip_url a GH URL in the following format within the extractor configuration:
git+https://github.com/private-org/repo.git
This works well with all public repos; however, the tap code I want to clone is located within a private org repo and I cannot access it without the correct auth.
I have been trying to use a GH API token for auth, which works well when I hardcode the token value within my meltano.yml. Example:
pip_url: git+https://"sample-token-xxx"@github.com/private-org/tap-repo.git
I have tested this and it works as expected, but I now need to find a way to pass a secret/env variable to the meltano install command.
I have tried to export the env value and define my extractor URL like this:
version: 1
send_anonymous_usage_stats: false
project_id: tap_cloudflare_graphql
plugins:
  extractors:
  - name: tap_cloudflare_graphql
    namespace: tap_cloudflare_graphql
    pip_url: git+https://$GITHUB_TOKEN@github.com/private-org/tap-repo.git
- but the secret/env value never gets rendered when I run the meltano install command.
I have already tried exporting this as a USER/SYSTEM environment variable and also adding it as a Meltano configuration:
meltano config meltano set GITHUB_TOKEN ${GITHUB_TOKEN}
- but still this does not seem to render the token value within the meltano.yml file.
I wanted to ask the community: is it possible to reference a GitHub API key within the meltano.yml pipeline file?
OR
Is there another way to add GH auth when referencing a GH repo as an extractor within my pipeline declaration?
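A minimal sketch of the approach being asked about here, assuming the Meltano version in use reads a .env file at the project root and expands ${VAR}-style references in pip_url (which is exactly the open question in this thread); the token value, repo URL, and plugin name are placeholders:

# Sketch only -- token value, repo URL, and plugin name are placeholders.
# Whether ${GITHUB_TOKEN} is actually substituted into pip_url depends on
# the Meltano version in use.
echo 'GITHUB_TOKEN=sample-token-xxx' >> .env

# meltano.yml would then reference the variable with brace syntax, e.g.:
#   pip_url: git+https://${GITHUB_TOKEN}@github.com/private-org/tap-repo.git

meltano install extractor tap_cloudflare_graphql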
visch
01/04/2022, 4:51 PM

sean_glynn
01/04/2022, 4:55 PM
The meltano install command will be run inside a container (apologies, I should have mentioned this).
The container will (roughly sketched below):
• Install the extractors/loaders defined within the pipeline YAML via meltano install
• Run meltano elt
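One way that container flow could look, sketched under the assumption that the token is injected at runtime (e.g. as a mounted secret or env var) and that git's url.insteadOf rewrite is used so the token never appears in meltano.yml; paths, plugin name, and target name are illustrative only:

#!/usr/bin/env bash
# entrypoint sketch -- illustrative only
set -euo pipefail

# Fail fast if the orchestrator did not inject the token.
: "${GITHUB_TOKEN:?GITHUB_TOKEN must be set to install the private tap}"

# Rewrite github.com URLs so pip's underlying git clone authenticates,
# without the token living inside meltano.yml.
git config --global url."https://${GITHUB_TOKEN}@github.com/".insteadOf "https://github.com/"

cd /project                                      # Meltano project root (illustrative path)
meltano install                                  # install extractors/loaders from meltano.yml
meltano elt tap_cloudflare_graphql target-jsonl  # loader name is a placeholder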
visch
01/04/2022, 4:55 PM

sean_glynn
01/04/2022, 4:59 PM

visch
01/04/2022, 4:59 PM

visch
01/04/2022, 4:59 PM

sean_glynn
01/04/2022, 5:00 PM

visch
01/04/2022, 5:05 PM

aaron_phethean
01/05/2022, 11:53 AM

sean_glynn
01/14/2022, 8:18 AM

aaron_phethean
01/14/2022, 9:46 AM

sean_glynn
01/14/2022, 10:37 AM
Our container image is one of the :latest-python3.8 | :latest-python3.9 tags. We mount our secrets and init the container via an /entrypoint script (a common init script that installs our dependencies and initializes the prerequisite steps required for that pipeline) before running the ELT job.
We share this /entrypoint script with our docker-compose setup for local pipeline development (before we run our pipelines in Argo).
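A docker run equivalent of that shared local-dev setup might look roughly like this, assuming a Meltano-based image, the shared /entrypoint script, and a secrets directory mounted read-only; the image tag, paths, and file names are assumptions, not the actual setup:

# Local run mirroring the Argo job: same image, same /entrypoint, secrets mounted as files.
docker run --rm \
  -v "$(pwd)":/project \
  -v "$(pwd)/secrets":/secrets:ro \
  -e GITHUB_TOKEN="$(cat secrets/github_token)" \
  --entrypoint /entrypoint \
  meltano/meltano:latest-python3.9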
I would love to share this and get feedback on the approach once I have the deployment in a better state, as it is still WIP.
Thank you guys for your feedback thus far!

aaron_phethean
01/14/2022, 1:21 PM