Hi,
The Airflow scheduler is not picking up state from the Postgres DB; instead it uses the local .meltano db. Attaching my meltano.yml file (I have replaced the connection parameters).
After a successful run, data is entered in the run table, but the scheduler is not picking it up to start the job.
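By "local .meltano db" I mean the SQLite file Meltano creates by default at .meltano/meltano.db inside the project root; that is the system database Meltano falls back to when no other URI is applied. A rough way to inspect it (table names may differ by Meltano version):

# list the tables in the local SQLite system database
sqlite3 .meltano/meltano.db ".tables"
# look at recent rows in the run table
sqlite3 .meltano/meltano.db "SELECT * FROM runs LIMIT 5;"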
=======================================
version: 1
project_id: meltano-abc-1101
default_environment: prod
plugins:
  extractors:
  - name: tap-appd_state
    namespace: tap_appd_state
    variant: ""
    executable: tap-appd_state
    pip_url: -e ./tap-appd_state
    capabilities:
    - state
    - catalog
    - discover
    config:
      api_url: https://xyx
      api_key: xyz
  loaders:
  - name: kafka-loader
    namespace: kafka_loader
    pip_url: -e ./loader-kafka
    executable: loader-kafka
    config:
      kafka_brokers: 172.16.24.23:9092
      topic_prefix: wm_appdynamics_7751_7851
  orchestrators:
  - name: airflow
    variant: apache
    pip_url: "apache-airflow==2.1.2 --constraint https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-${MELTANO__PYTHON_VERSION}.txt"
  files:
  - name: files-airflow
    variant: meltano
    pip_url: git+https://github.com/meltano/files-airflow.git
schedules:
- name: meltano-abc-1101
  interval: '*/5 * * * *'
  extractor: tap-appd_state
  loader: kafka-loader
  transform: skip
  start_date: 2022-12-30 05:13:54
environments:
- name: prod
  env:
    MELTANO_DATABASE_URI: postgresql://USER:PASSWORD@IPADDRESS:PORT/DBNAME
===========meltano.yml end================
Command used to start the scheduler:
meltano invoke airflow scheduler
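Since default_environment is prod, I expect the prod env block (and therefore the Postgres MELTANO_DATABASE_URI) to be applied automatically. The more explicit variant below is what I understand should be equivalent (placeholder connection details, same as in the yml above):

# select the environment explicitly and export the system database URI in the shell
export MELTANO_DATABASE_URI="postgresql://USER:PASSWORD@IPADDRESS:PORT/DBNAME"
meltano --environment=prod invoke airflow scheduler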
meltano version 2.7.1
Python 3.9.7