# random
h
Looking into Meltano again. Is it possible to encrypt data on write to S3 via a KMS key when using Meltano? How would that usually be implemented?
v
I'd guess this isn't done on a target at the moment (I could be wrong). Normally you'd implement this on the target side with a config value. For something like target-s3, the config would be something like encrypt: true, encrypt_kmskey: yadayada
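For illustration, a config along those lines might look like this in meltano.yml. Note the `encrypt` / `encrypt_kmskey` keys are the hypothetical names from the message above, not settings any real target-s3 variant is confirmed to support:

```yaml
# meltano.yml — hypothetical sketch; encrypt / encrypt_kmskey are
# illustrative key names, not confirmed target-s3 settings
plugins:
  loaders:
    - name: target-s3
      config:
        bucket: my-data-bucket
        encrypt: true
        encrypt_kmskey: arn:aws:kms:us-east-1:123456789012:key/yadayada
```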
t
This seems like a good use-case for the stream map part of the SDK @aaronsteers?
v
I don't see https://github.com/transferwise/pipelinewise-target-s3-csv listed on the hub yet, but that would do it too if you're not trying to go to Athena
e
Server-side encryption can easily be set up at the bucket level in AWS so your files are encrypted at rest. For client-side encryption, the encryption SDK could be used in a target, which no target does afaik. Or, less than ideal, you use a target that outputs to a local file and then encrypt the file on copy to S3.
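As a sketch of the server-side option above: with boto3 (an assumption here, the thread doesn't say which client a target would use), asking S3 to apply SSE-KMS on write is just a couple of extra request parameters. The helper name below is made up for illustration:

```python
def sse_kms_extra_args(kms_key_id: str) -> dict:
    """Build the extra upload parameters that ask S3 to encrypt the
    object server-side with a specific KMS key."""
    return {
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key_id,
    }

# Usage with boto3 (needs AWS credentials, so not run here):
# import boto3
# s3 = boto3.client("s3")
# s3.upload_file(
#     "output.csv", "my-bucket", "landing/output.csv",
#     ExtraArgs=sse_kms_extra_args(
#         "arn:aws:kms:us-east-1:123456789012:key/yadayada"),
# )
```

If the bucket already has default SSE-KMS configured, even this is unnecessary; objects land encrypted at rest automatically.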
a
@haleemur_ali - To summarize, and add on to comments from @visch and @edgar_ramirez_mondragon, there are lots of good approaches, and actually most targets that support S3 also support encryption. Do you want to say more about the data's final landing spot - would you be loading into Snowflake or Redshift, for instance, or is S3 the "final" landing zone for you? Generally, targets like Snowflake and Redshift have configurable S3 landing options, including encryption/compression.
h
thanks folks, the data landing in S3 will be used in both Athena & Spark, so S3 / Athena would be the landing zone, and target-athena seems most promising