# troubleshooting
i
```
datawarehouse-1  |     self.load_and_parse_macros(project_parser_files)
datawarehouse-1  |   File "/project/.meltano/utilities/dagster/venv/lib/python3.10/site-packages/dbt/parser/manifest.py", line 683, in load_and_parse_macros
datawarehouse-1  |     block = FileBlock(self.manifest.files[file_id])
datawarehouse-1  | KeyError: 'dbt_snowflake://macros/metadata.sql'
```
I keep randomly getting errors like this when trying to materialize my dbt models in Dagster (running in a container). It seems to be something with the manifest. My assumption is that the manifest is corrupted or mangled by the time it's loaded in Dagster, but I've successfully gotten this working before and it just sort of randomly started breaking again. Another strange thing is that it only fails in one of my Azure environments at a time: one time it failed in prod but worked in staging, and now it works in prod but not in staging. It will also fail with the same stack trace for `macros/adapters.sql`. Any ideas? I run this as the entrypoint to my Docker container (parse_manifest runs `parse --quiet`):
```
meltano invoke dbt-snowflake:parse_manifest

meltano invoke dagster:start_docker
```
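A quick way to check whether stale or duplicate parse artifacts are involved is to list every target directory and manifest under the project root before Dagster starts. A minimal sketch: the /project root comes from the stack trace above, the rest of the layout is an assumption.

```bash
# Diagnostic sketch: look for stray dbt target directories and parse artifacts.
# /project comes from the stack trace; skipping venv paths is an assumption
# about where the Meltano-managed virtualenvs live.
find /project -type d -name target -not -path '*/venv/*'
find /project \( -name manifest.json -o -name partial_parse.msgpack \) -not -path '*/venv/*'
```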
The dagster utility in my meltano.yml:
```yaml
- name: dagster
  variant: quantile-development
  pip_url: dagster-ext==0.1.3 dbt-core==1.9.3 dbt-snowflake==1.9.3 dagster==1.9.7
```
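Since those pins live in a dedicated utility venv, one optional sanity check is confirming that the installed dbt and Dagster versions actually match them. A sketch, reusing the venv path from the stack trace above:

```bash
# Sanity-check sketch: confirm the pinned versions are what is actually installed
# in the dagster utility venv. The venv path is copied from the stack trace above.
/project/.meltano/utilities/dagster/venv/bin/pip freeze | grep -E 'dbt-core|dbt-snowflake|dagster'
```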
a
Not sure if this is any use, but a Google of the error turned up this thread: https://discourse.getdbt.com/t/keyerror-in-dbt-snowflake-within-docker-container/17744
i
Yeah, I found that issue and tried something similar. I think I may have found the fix: I had a corrupted target directory (in addition to my expected target/ dir) that I removed, and it seems to work now.
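If the stray directory ever reappears (e.g. via a cached image layer or a mounted volume), one option is to clear stale parse artifacts in the entrypoint before regenerating the manifest. A minimal sketch, assuming the dbt project sits at /project/transform; the two meltano commands are the ones from the entrypoint above:

```bash
#!/usr/bin/env bash
# Entrypoint sketch: remove stale parse artifacts, regenerate the manifest,
# then start Dagster. The transform/target path is an assumption; point it at
# wherever the stray directory actually shows up.
set -euo pipefail

rm -rf /project/transform/target
meltano invoke dbt-snowflake:parse_manifest
meltano invoke dagster:start_docker
```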