# dev-metaflow
Hi all, does the Metaflow Python package support S3 buckets (used as METAFLOW_DATASTORE_SYSROOT_S3 and METAFLOW_DATATOOLS_S3ROOT in the Metaflow config) with object-level encryption via a customer managed KMS key? AFAIK when a bucket enforces object-level encryption, you need to provide a couple of extra parameters when uploading files. Example for the CLI:
aws s3 cp hi.txt s3://bucket-name/hi.txt --sse aws:kms --sse-kms-key-id alias/my-key
And an example for boto3 to give you the idea:
s3.Object(BUCKET_NAME, "tmp/hi.txt").put(Body=b"Hi!", ServerSideEncryption="aws:kms", SSEKMSKeyId=KMS_KEY_ID)
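A slightly fuller, self-contained version of that boto3 snippet, with the SSE-KMS parameters pulled into a helper. The bucket name, object key, and KMS key alias are placeholders, not anything Metaflow-specific:

```python
def sse_kms_params(kms_key_id):
    """Extra PutObject parameters a bucket policy enforcing
    SSE-KMS with a customer managed key typically requires."""
    return {
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key_id,
    }

def upload(body, bucket, key, kms_key_id):
    # boto3 imported lazily so the helper above works without AWS access
    import boto3
    s3 = boto3.resource("s3")
    s3.Object(bucket, key).put(Body=body, **sse_kms_params(kms_key_id))

# upload(b"Hi!", "bucket-name", "tmp/hi.txt", "alias/my-key")
```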
My use case: I’m using Metaflow with AWS and my company’s policy requires all S3 buckets to have object-level encryption via a customer managed KMS key.
If it blocks you from trying Metaflow, I can send you a patch for Metaflow that supports this today. We'll probably add that functionality to upstream Metaflow in one of the upcoming versions.
Hi Oleg, that’s great to hear! Do you maybe know if the update will be released by the end of this year?
Possibly. We're kind of conservative with upstream changes to make sure everything is always backwards compatible, but at the same time this particular one is a pretty small change.
👍 1