+1 to this, we've been running quite a lot of Meltano jobs in Kubernetes via Argo Workflows and it has been pretty good.
One thing to note is that pretty much any configuration for a tap or target can be overridden with environment variables, so I tend to keep the static config (such as the installed taps) in a meltano.yml that gets baked into the image via the Dockerfile, and override run-specific settings with environment variables (see the sketch below).
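As a minimal sketch of what that looks like in practice: Meltano maps plugin settings to environment variables of the form `TAP_GITHUB_START_DATE` (plugin name plus setting name, uppercased), so an Argo Workflows step can inject run-specific values on top of the baked-in meltano.yml. The plugin names, image, and secret names below are hypothetical placeholders, not from the original setup.

```yaml
# Argo Workflows step (abridged, illustrative) — the image carries the static
# meltano.yml; run-specific settings arrive as env-var overrides.
- name: run-tap-github
  container:
    image: registry.example.com/meltano-project:latest   # hypothetical image
    command: ["meltano", "run", "tap-github", "target-postgres"]
    env:
      - name: TAP_GITHUB_START_DATE          # overrides the extractor's start_date setting
        value: "2024-01-01"
      - name: TAP_GITHUB_AUTH_TOKEN          # secrets injected at run time, never in meltano.yml
        valueFrom:
          secretKeyRef:
            name: github-creds               # hypothetical secret
            key: token
      - name: TARGET_POSTGRES_HOST           # loader settings can be overridden the same way
        value: warehouse-db.internal
```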
I was also pretty quick to move to a Postgres database for storing job state, as I wanted to be able to introspect the state of the jobs (sketch below).
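For that, Meltano's system database (where run/job state lives) can be pointed at Postgres via the `MELTANO_DATABASE_URI` environment variable instead of the default SQLite file. A minimal sketch, again with placeholder secret and host names:

```yaml
# Point Meltano's system database at Postgres so job state survives pod
# restarts and can be queried directly with SQL.
env:
  - name: MELTANO_DATABASE_URI
    valueFrom:
      secretKeyRef:
        name: meltano-state-db     # hypothetical secret
        key: uri                   # e.g. postgresql://meltano:****@state-db.internal:5432/meltano
```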