Steven Searcy
05/02/2025, 3:17 PM

visch
05/02/2025, 4:02 PM

visch
05/02/2025, 4:04 PM

Steven Searcy
05/02/2025, 4:10 PM
version: 1
default_environment: ${MELTANO_ENV}
project_id: 17a04890-5b7e-4473-b2ff-d4346303690a
environments:
- name: dev
- name: staging
- name: prod
plugins:
  extractors:
  - name: tap-csv
    variant: meltanolabs
    pip_url: git+https://github.com/MeltanoLabs/tap-csv.git
    config:
      files:
      - entity: users
        path: ./fixtures/users.csv
        keys: [id]
      stream_maps:
        users:
          id: int(user_id)
          full_name: first_name + " " + last_name
          email: email_address
          signup_source: source
          is_active: bool(active.lower() == "true")
          __else__: __NULL__
  loaders:
  - name: target-postgres
    variant: meltanolabs
    pip_url: meltanolabs-target-postgres
    config:
      host: ${PG_HOST}
      port: ${PG_PORT}
      user: ${PG_USER}
      password: ${PG_PASSWORD}
      database: ${PG_DATABASE}
      default_target_schema: public
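(Editor's note: a plain-Python sketch of what the stream_maps block above would do to a single record. The sample values are hypothetical, and Meltano's SDK evaluates these expressions with its own expression engine rather than raw Python, but the resulting row is the same.)

```python
# Hypothetical sample record as tap-csv would read it (CSV values arrive as strings).
csv_record = {
    "user_id": "42",
    "first_name": "Ada",
    "last_name": "Lovelace",
    "email_address": "ada@example.com",
    "source": "newsletter",
    "active": "TRUE",
}

# Plain-Python equivalent of each stream_maps expression in the config above.
mapped = {
    "id": int(csv_record["user_id"]),
    "full_name": csv_record["first_name"] + " " + csv_record["last_name"],
    "email": csv_record["email_address"],
    "signup_source": csv_record["source"],
    "is_active": csv_record["active"].lower() == "true",
    # __else__: __NULL__ means every property not listed above is dropped.
}

print(mapped)  # {'id': 42, 'full_name': 'Ada Lovelace', ...}
```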
Here I'm taking first_name and last_name from the CSV and concatenating them for the actual Postgres column. Am I reaching here, and is this something that should be done in dbt instead? Or is this indeed a good use case for stream maps?

Steven Searcy
05/02/2025, 4:16 PM

Matt Menzenski
05/06/2025, 3:57 PM
<schema_name>-<table_name> (schema name and table name separated by a hyphen). If your variant of target-postgres supports this (I believe the MeltanoLabs variant does), you might be able to do this entirely with stream maps.

Matt Menzenski
05/06/2025, 3:59 PM

Steven Searcy
05/06/2025, 4:03 PM
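(Editor's note: a minimal sketch of the <schema_name>-<table_name> convention Matt describes, assuming the target simply splits the incoming stream name on the first hyphen; the function name and fallback behavior here are made up for illustration, not taken from target-postgres source.)

```python
def split_stream_name(stream_name: str, default_schema: str = "public") -> tuple[str, str]:
    """Derive (schema, table) from a stream named '<schema_name>-<table_name>'.

    Streams without a hyphen fall back to the default schema, mirroring
    a default_target_schema-style setting.
    """
    if "-" in stream_name:
        schema, table = stream_name.split("-", 1)
        return schema, table
    return default_schema, stream_name

print(split_stream_name("analytics-users"))  # ('analytics', 'users')
print(split_stream_name("users"))            # ('public', 'users')
```

If the target follows this convention, an SDK-style stream map could rename the stream (for example with the `__alias__` key) so it lands in a different schema without a dbt step.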