# troubleshooting
x
I have a question - is it possible to write all the meltano elt logs to stdout?
t
Meltano currently outputs some things to both stdout and stderr, but we noted here that we should just be putting everything to stdout.
x
How do I set that inside a Docker container?
t
I’m not sure I understand your question @xinge_li. What are you trying to set?
x
So I tried to deploy Meltano on AWS and something went wrong with the Airflow scheduler. I wasn't able to see what exactly was wrong; the task was only marked as failed. I would like to ask whether it is possible to see the ELT logs.
t
Are you able to run the Airflow webserver?
meltano invoke airflow webserver
x
yes
t
You should be able to view the Airflow logs through that. Other logs will be stored in the .meltano/ folder on the filesystem.
x
Is it possible to store the Meltano ELT logs in e.g. CloudWatch?
t
From within Meltano itself, I don't believe so, but if you're running Meltano with Docker you should be able to ship the logs to CloudWatch from Docker (since the Docker logs should just be the Meltano logs too).
I've never done this myself, though.
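A minimal sketch of that Docker-side approach: Docker's built-in awslogs logging driver forwards everything the container writes to stdout/stderr into CloudWatch Logs. The region, log group, stream name, image tag, and tap/target names below are all placeholder assumptions, not anything from this thread.

```shell
# Run Meltano with Docker's awslogs logging driver so container
# stdout/stderr is shipped to CloudWatch Logs.
# Placeholders: region, log group, stream name, image, tap/target names.
# The host needs AWS credentials with permission to create the log
# group/stream and put log events.
docker run \
  --log-driver awslogs \
  --log-opt awslogs-region=us-east-1 \
  --log-opt awslogs-group=/meltano/elt \
  --log-opt awslogs-stream=meltano-elt \
  --log-opt awslogs-create-group=true \
  meltano/meltano:latest \
  elt tap-example target-example
```

Note that this only captures what Meltano actually prints to the console; the per-run logs written under .meltano/ on the container filesystem are not picked up by the logging driver.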
x
I see, I can give it a try, thanks!
k
It would be very useful to be able to configure Meltano to write the ELT logs to STDOUT. I'm running in Heroku and they don't support mounting volumes in Docker containers, so there's simply no way to get to the logs in .meltano.
The separate containers can't see each other's file systems either, so scheduler logs aren't visible in either the Meltano Web UI or Airflow's.
t
@kat_crichton-seager can you make an issue around this? would be good to document the need
k
Thanks @taylor. I've raised this ticket: https://gitlab.com/meltano/meltano/-/issues/2948
d
#thread-necromancy Do we think this could be solved by mounting a Docker volume at the $pwd/.meltano directory? Then we could run copies of the base container for the Meltano UI, Airflow UI, and Airflow scheduler, and ensure they are all reading/writing from the same volume.
I tried something like this with Terraform, but it still doesn't seem to work properly.
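For reference, the shared-volume idea could be sketched with docker-compose roughly like this. The image name, service commands, and volume name are illustrative assumptions, not the actual Terraform/ECS setup discussed here.

```shell
# docker-compose sketch: three copies of the same Meltano image (UI,
# Airflow webserver, Airflow scheduler) all mounting one named volume
# at /project/.meltano, so they read and write the same log files.
cat > docker-compose.yml <<'EOF'
version: "3.8"
services:
  meltano-ui:
    image: meltano/meltano:latest
    command: ui
    volumes:
      - meltano_data:/project/.meltano
  airflow-webserver:
    image: meltano/meltano:latest
    command: invoke airflow webserver
    volumes:
      - meltano_data:/project/.meltano
  airflow-scheduler:
    image: meltano/meltano:latest
    command: invoke airflow scheduler
    volumes:
      - meltano_data:/project/.meltano
volumes:
  meltano_data:
EOF
docker compose up -d
```

On a single host this gives every container the same view of .meltano; across multiple hosts (as in ECS) a named local volume is not enough, which may be part of the problem described below.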
t
I’m not sure - @edgar_ramirez_mondragon any thoughts?
e
That should work afaict. I wonder what's not working? Perhaps the mount sync is not bidirectional?
d
I get this kind of error from within the Airflow webserver:
*** Log file does not exist: /project/.meltano/run/airflow/logs/meltano_airtable-remediate/extract_load/2022-03-01T11:50:00+00:00/1.log
*** Fetching from: <http://ip-10-47-109-250.ap-southeast-2.compute.internal:8793/log/meltano_airtable-remediate/extract_load/2022-03-01T11:50:00+00:00/1.log>
*** Failed to fetch log file from worker. HTTPConnectionPool(host='ip-10-47-109-250.ap-southeast-2.compute.internal', port=8793): Max retries exceeded with url: /log/meltano_airtable-remediate/extract_load/2022-03-01T11:50:00+00:00/1.log (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f7010724f40>, 'Connection to ip-10-47-109-250.ap-southeast-2.compute.internal timed out. (connect timeout=5)'))
Looks like we might need to mount some other path? Either /project/.meltano/run/airflow/logs/meltano_airtable-remediate/extract_load/2022-03-01T11:50:00+00:00/1.log or /log/meltano_airtable-remediate/extract_load/2022-03-01T11:50:00+00:00/1.log. It sort of looks like it's trying to fetch the log via an HTTP request to a /log web endpoint.
e
Hey @david_tout, any luck on this front? We're trying to improve our logging as well. cc @monika_rajput @connor_flynn @jo_pearson
d
Hi @edward_ryan. Not yet, I'm afraid. I believe my problem is somewhat related to my container/task definition in ECS.
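On the ECS side, the CloudWatch piece of a container definition typically looks roughly like the fragment below. The container name, image, log group, region, and stream prefix are placeholder values, not the poster's actual task definition.

```shell
# Fragment of an ECS task definition's containerDefinitions entry that
# enables the awslogs log driver, sending container stdout/stderr to
# CloudWatch Logs. All names, the region, and the log group are
# placeholders; the log group must exist and the task execution role
# needs permission to write log events.
cat > container-logging.json <<'EOF'
{
  "name": "meltano",
  "image": "meltano/meltano:latest",
  "logConfiguration": {
    "logDriver": "awslogs",
    "options": {
      "awslogs-group": "/ecs/meltano",
      "awslogs-region": "ap-southeast-2",
      "awslogs-stream-prefix": "meltano"
    }
  }
}
EOF
```

This only affects console output; it would not by itself fix the Airflow webserver failing to find per-task log files on another container's filesystem.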
e
Ah, got it. How are you currently storing Meltano logs?
Also, unrelated question: what's the best way to downgrade Meltano to a previous version? Would I just install the specific Meltano version using pip?
d
We're not storing them, unfortunately; they're just on the container. We don't have a strong need to go back historically, but if it were easy to fire them at CloudWatch Logs we would definitely do that.
I'm just guessing re: the old version, but I'd imagine that the normal pip syntax would probably work. We've had no issues just using latest whenever we build a new container.
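For what it's worth, that guess matches standard pip usage: pinning to an exact release with the `==` specifier. The version number below is only an illustrative example, not a recommendation.

```shell
# Downgrade/pin Meltano to a specific release using pip's exact-version
# specifier. 1.77.0 is an example version; substitute the one you need.
pip install "meltano==1.77.0"

# Confirm what got installed
meltano --version
```

Pinning the same version in a requirements file (or the Dockerfile) keeps container rebuilds reproducible instead of floating on latest.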
e
Thanks!!