Regarding state and incremental updates (tap-postgres, target-snowflake): is it okay to pass the same job_id if I'm orchestrating this with something other than Airflow? I.e., we run it every hour.
Yep! The job_id just keeps track of state (i.e., bookmarks and other metadata about the runs). Reusing the same job_id tells Meltano to find the most recent run for that job and pick up from its saved state, so each hourly run only extracts new or changed records.
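For reference, an hourly run outside Airflow can be as simple as a cron entry. This is a minimal sketch: it assumes the `meltano elt` CLI with its `--job_id` flag, and the project path `/project` and job name `postgres-to-snowflake` are placeholders for your own setup.

```shell
# Hourly cron entry (sketch). Reusing the same --job_id on every run
# makes Meltano load the bookmark saved by the previous run, so the
# tap only extracts rows added or changed since then.
0 * * * * cd /project && meltano elt tap-postgres target-snowflake --job_id=postgres-to-snowflake
```

Any scheduler works the same way — the only requirement for incremental behavior is that successive runs share the job_id.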