# plugins-general
**l:**
I’m trying to upgrade `dbt` to the adapter-specific `dbt-postgres`, and I am getting this error when running Airflow or `meltano schedule run <schedule name>`. Has anyone worked around this issue? Does the adapter-specific transformer only work with the `meltano run` command?
**v:**
Could we see your `meltano.yml`? Using `dbt-postgres` should work, and it works on `elt` or `run` from my experience (I think!)
**l:**
```yaml
version: 1
send_anonymous_usage_stats: false
project_id: <project_id>
plugins:
  extractors:
  - name: tap-mysql
    variant: transferwise
    pip_url: git+https://github.com/transferwise/pipelinewise-tap-mysql.git@v1.4.3
    config:
      export_batch_rows: 10000
  - name: tap-core-mysql
    inherit_from: tap-mysql
    metadata:
      '*':
        replication-method: LOG_BASED
  - name: tap-ejabberd-mysql
    inherit_from: tap-mysql
    metadata:
      '*':
        replication-method: LOG_BASED
  loaders:
  - name: target-postgres
    variant: transferwise
    pip_url: pipelinewise-target-postgres
    config:
      add_metadata_columns: true
      hard_delete: true
      batch_size_rows: 10000
      max_batch_rows: 5000
      persist_empty_tables: true
      dbname: warehouse
  - name: target-core-postgres
    inherit_from: target-postgres
    config:
      default_target_schema: core
  - name: target-ejabberd-postgres
    inherit_from: target-postgres
    config:
      default_target_schema: ejabberd
  orchestrators:
  - name: airflow
    variant: apache
    pip_url: stripe==2.58.0 psycopg2==2.8.6 apache-airflow[crypto]==2.1.2 apache-airflow==2.1.2
      --constraint https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-${MELTANO__PYTHON_VERSION}.txt
    config:
      webserver.instance_name: $ENVIRONMENT
      scheduler.catchup_by_default: false
      core.executor: $AIRFLOW__CORE__EXECUTOR
      core.parallelism: 2
      core.max_active_runs_per_dag: 1
      core.dag_concurrency: 1
      logging.logging_level: $AIRFLOW__LOGGING__LOGGING_LEVEL
      core.fernet_key: $AIRFLOW__CORE__FERNET_KEY
  transformers:
  - name: dbt-postgres
    variant: dbt-labs
    pip_url: dbt-core~=1.1.0 dbt-postgres~=1.1.0
  files:
  - name: airflow
    pip_url: git+https://gitlab.com/meltano/files-airflow.git
    update:
      orchestrate/dags/meltano.py: false
  - name: files-airflow
    variant: meltano
    pip_url: git+https://github.com/meltano/files-airflow.git
schedules:
- name: core-mysql-postgres
  interval: 5/10 * * * *
  extractor: tap-core-mysql
  loader: target-core-postgres
  transform: skip
  start_date: 2021-01-07 12:33:54.135169
- name: ejabberd-mysql-postgres
  interval: '*/10 * * * *'
  extractor: tap-ejabberd-mysql
  loader: target-ejabberd-postgres
  transform: skip
  start_date: 2021-01-07 12:34:39.904913
- name: core-mysql-postgres-transform
  interval: 0 */4 * * *
  extractor: tap-core-mysql
  loader: target-core-postgres
  transform: only
  start_date: 2021-02-10 18:48:29.609494
```
This is the `meltano.yml` file. For some reason the `meltano schedule run` command doesn’t work with `dbt-postgres`. If I use the `meltano run` command, it runs no problem.
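
For reference, the working `meltano run` invocation would look something like the following sketch, reusing the plugin names from the `meltano.yml` above (the exact task chain is an assumption):

```sh
# Extract-load with the core tap/target pair, then run dbt as an explicit task.
# `meltano run` executes the tasks left to right as one pipeline.
meltano run tap-core-mysql target-core-postgres dbt-postgres:run
```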
**e:**
@lars You’re using the legacy schedule syntax that worked with `meltano elt`. You’ll have to add a job and then add a new schedule that references that job:

```sh
meltano schedule add <schedule_name> --job my_job --interval "@daily"
```
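
A full sequence might look like this sketch, where the job and schedule names are illustrative and the task list reuses the plugin names from the `meltano.yml` above:

```sh
# 1. Define a job: a named, ordered list of tasks (EL pair, then dbt).
meltano job add core-transform-job \
  --tasks "tap-core-mysql target-core-postgres dbt-postgres:run"

# 2. Schedule that job; --interval accepts cron expressions or presets like @daily.
meltano schedule add core-transform-schedule \
  --job core-transform-job --interval "@daily"

# 3. Run the schedule on demand (or let the Airflow orchestrator pick it up).
meltano schedule run core-transform-schedule
```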