# Troubleshooting
**dave_lyons:** Looking at the docs for containerized deployment, I was wondering whether there are any recommended requirements for the external metadata DB. PostgreSQL, any particular minimum version, recommended size, etc.?
**edgar_ramirez_mondragon:** Hi @dave_lyons! The metadata database does very little in most cases; unless you have millions of concurrent pipelines, I'd go with the smallest Postgres instance available in your cloud provider. As for the version, >=9.6 should be safe.
**dave_lyons:** @edgar_ramirez_mondragon thanks! I added `MELTANO_DATABASE_URI` to my `env`, but I've been running with the default local db. How do I get it to restart and use the PostgreSQL settings in `env`? Do I simply run `meltano install`, and it will pick up the env setting?
**edgar_ramirez_mondragon:** @dave_lyons Yup, running `meltano install` with the new URI set should run migrations on that db.
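A minimal sketch of that flow; the credentials, host, and database name below are made up for illustration:

```shell
# Point Meltano at the external Postgres metadata db.
# Username, password, host, and db name here are hypothetical.
export MELTANO_DATABASE_URI="postgresql://meltano_user:s3cret@db.example.com:5432/meltano"

# Re-running install with the URI set applies Meltano's migrations to that db.
meltano install
```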
**dave_lyons:** Would `meltano config meltano set database_uri postgresql://<username>:<password>@<host>:<port>/<database>` also be sufficient, or is the install necessary?
**edgar_ramirez_mondragon:** Yes, `meltano config ...` would also run migrations.
**dave_lyons:** Does the metadata database contain links to logs? Because once I switched, Meltano forgot all my run history (but Meltano Airflow did not?).
**edgar_ramirez_mondragon:** Pipeline (Singer) state is stored in the Meltano db, so if you had schedules or `elt` runs with `--job_id`s, their state would've stayed in the old db.

> Meltano forgot all my run history

Do you mean as it appears in the UI?
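For reference, an `elt` run with an explicit job ID keeps its Singer state under that ID in the metadata db; the tap and target names below are hypothetical examples:

```shell
# State for this run is saved in the metadata db under the ID "gitlab-to-jsonl",
# so later runs with the same --job_id can resume incrementally.
# tap-gitlab and target-jsonl are example plugin names, not from the thread.
meltano elt tap-gitlab target-jsonl --job_id=gitlab-to-jsonl
```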
**dave_lyons:** Yes, in the UI.
**edgar_ramirez_mondragon:** Cool. Yeah, job history would come from the `job` table in the db. @alexmarple can probably confirm.
**alexmarple:** Can confirm that the UI pulls from the `job` table in the db.
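To see what the UI is reading, you could query that table directly. The column names below are an assumption about the `job` table's schema, not something confirmed in this thread:

```shell
# List the ten most recent runs from the metadata db.
# Column names (job_id, state, started_at, ended_at) are assumed, not verified.
psql "$MELTANO_DATABASE_URI" -c \
  "SELECT job_id, state, started_at, ended_at FROM job ORDER BY started_at DESC LIMIT 10;"
```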