# infra-deployment
I'm having some issues adding an airflow admin user after launching everything in Docker. Attempting to add a user after starting the containers with the following command yields the error below. It's also a little confusing to me why the command keeps trying to connect to the meltano-system-db, since we're adding an airflow user, which should interact with the airflow db?
```
docker run \
  --volume "$(pwd)":/project \
  meltano-poc:v1.2 \
  invoke airflow:create-admin \
    --username admin \
    --firstname Admin \
    --lastname User \
    --role Admin \
    --email admin@blueprintprep.com \
    --password password || true

2025-07-02T12:02:36.439901Z [info     ] Environment 'dev' is active   
2025-07-02T12:03:51.608683Z [info     ] DB connection failed. Will retry after 5s. Attempt 1/3
2025-07-02T12:05:11.661274Z [info     ] DB connection failed. Will retry after 5s. Attempt 2/3
2025-07-02T12:06:31.731806Z [info     ] DB connection failed. Will retry after 5s. Attempt 3/3
2025-07-02T12:07:51.796795Z [error    ] Could not connect to the database after 3 attempts. Max retries exceeded.
Need help fixing this problem? Visit http://melta.no/ for troubleshooting steps, or to
join our friendly Slack community.

(psycopg2.OperationalError) connection to server at "meltano-system-db" (143.244.220.150), port 5432 failed: Connection refused
        Is the server running on that host and accepting TCP/IP connections?

(Background on this error at: https://sqlalche.me/e/20/e3q8)
```
Also tried running the above command with the airflow network specified (roughly as in the sketch just after the env-var snippet below). I've confirmed via DataGrip that the airflow postgres db is online. The environment variables in the docker-compose file for the database URIs are:
```yaml
x-meltano-env: &meltano-env
  MELTANO_DATABASE_URI: postgresql+psycopg2://postgres:postgres@meltano-system-db/meltano

x-airflow-env: &airflow-env
  AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://postgres:postgres@airflow-metadata-db/airflow
```
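For reference, the attempt with the network specified looked something like this, with `<project>` standing in for the Compose project prefix:

```sh
# Same command as above, but attached to the airflow network.
# "<project>" is a placeholder for the Compose project prefix (by default the
# directory name), so the real network name is e.g. "myproject_airflow".
docker run \
  --network "<project>_airflow" \
  --volume "$(pwd)":/project \
  meltano-poc:v1.2 \
  invoke airflow:create-admin \
    --username admin \
    --firstname Admin \
    --lastname User \
    --role Admin \
    --email admin@blueprintprep.com \
    --password password || true
```

(Per the compose file below, meltano-system-db only sits on the meltano network, which may be relevant to the question above about why it's involved at all.)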
Also unsure if there's a way to have Docker create the admin user on startup instead of having to run a command after the fact. My understanding is that this would be somewhat complex, since the ENTRYPOINT in the Dockerfile is already occupied by the meltano command (rough sketch of what I'm imagining below).
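What I'm picturing is a small wrapper entrypoint that creates the admin user and then hands off to meltano; the script name, its location, and the entrypoint override are all made up on my end and untested, so treat this as a sketch rather than something that works:

```sh
#!/bin/sh
# entrypoint-webserver.sh (hypothetical name/path) -- create the Airflow admin
# user, ignoring the failure if it already exists, then exec the original
# meltano command that the service's `command:` provides.
set -e

meltano invoke airflow:create-admin \
  --username admin \
  --firstname Admin \
  --lastname User \
  --role Admin \
  --email admin@blueprintprep.com \
  --password password || true

# "$@" is whatever `command:` passed in, e.g. "invoke airflow webserver"
exec meltano "$@"
```

with the airflow-webserver service pointing at it via something like `entrypoint: ["/project/entrypoint-webserver.sh"]` while keeping `command: invoke airflow webserver`. Not sure whether that's the idiomatic pattern here, hence the question.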
Here's the rest of my docker-compose file for reference:
```yaml
services:
  meltano:
    <<: *meltano-image
    command: dragon
    environment:
      <<: *meltano-env
    volumes:
      - meltano_elt_logs_data:/project/.meltano/logs/elt
    expose:
      - 5000
    ports:
      - 5000:5000
    depends_on:
      - meltano-system-db
    networks:
      - meltano
    restart: on-failure

  meltano-system-db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: postgres # CHANGE ME
      POSTGRES_DB: meltano
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - meltano_postgresql_data:/var/lib/postgresql/data
    expose:
      - 5432
    networks:
      - meltano
    restart: unless-stopped

  airflow-scheduler:
    <<: *meltano-image
    command: invoke airflow scheduler
    environment:
      <<: [*airflow-env, *meltano-env]
    volumes:
      - meltano_elt_logs_data:/project/.meltano/logs/elt
    expose:
      - 8793
    depends_on:
      - meltano-system-db
      - airflow-metadata-db
    networks:
      - meltano
      - airflow
    restart: unless-stopped
  
  airflow-webserver:
    <<: *meltano-image
    command: invoke airflow webserver
    environment:
      <<: [*airflow-env, *meltano-env]
    expose:
      - 8080
    ports:
      - 8080:8080
    depends_on:
      - meltano-system-db
      - airflow-metadata-db
    networks:
      - meltano
      - airflow
    restart: unless-stopped
  
  airflow-metadata-db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: postgres # CHANGE ME
      POSTGRES_DB: airflow
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - airflow_postgresql_data:/var/lib/postgresql/data
    expose:
      - 5432
    ports: # For local access via DataGrip; forwards host port 5434 instead of the default 5432 to avoid a conflict with the meltano db
      - 5434:5432
    networks:
      - airflow
    restart: unless-stopped


networks:
  meltano:
  airflow:

volumes:
  meltano_postgresql_data:
    driver: local
  meltano_elt_logs_data:
    driver: local
  airflow_postgresql_data:
    driver: local
```