# ask-metaflow
Hi all! I'm setting up a Metaflow/Argo Workflows stack on our Kubernetes cluster. I'm using Minio for the datatools and datastore, and I'm seeing a credentials error when the workflow is triggered (following the 'Episode 8: Autopilot' tutorial). The error is:
botocore.exceptions.NoCredentialsError: Unable to locate credentials
• I'm running this command to create the workflow:
python 02-statistics/stats.py argo-workflows create --max-workers 2
• I have my ~/.metaflowconfig/config.json set with METAFLOW_DATASTORE_SYSROOT_S3, METAFLOW_DATATOOLS_S3ROOT, and METAFLOW_DEFAULT_DATASTORE (is there a …)
• IMO the Argo Workflow itself is configured correctly, because it can read/write to the Minio bucket.
I'm looking through the Metaflow service code and see the plugins/aws code uses boto, but I don't see any obvious way to pass Minio's aws_access_key_id and aws_secret_access_key. Any guidance is appreciated... just point me in the right direction. I feel like I'm missing something obvious.
āœ… 1
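For reference, a rough sketch of a config.json pointing Metaflow at Minio. The bucket name and endpoint URL are placeholders, and METAFLOW_S3_ENDPOINT_URL is assumed to be the setting for a custom S3 endpoint:
```sh
# Sketch of ~/.metaflowconfig/config.json for a Minio-backed datastore.
# Bucket name and endpoint URL are placeholders; METAFLOW_S3_ENDPOINT_URL is
# assumed to be the knob that points Metaflow's S3 client at Minio.
cat > ~/.metaflowconfig/config.json <<'EOF'
{
  "METAFLOW_DEFAULT_DATASTORE": "s3",
  "METAFLOW_DATASTORE_SYSROOT_S3": "s3://metaflow/datastore",
  "METAFLOW_DATATOOLS_S3ROOT": "s3://metaflow/datatools",
  "METAFLOW_S3_ENDPOINT_URL": "http://minio.minio.svc.cluster.local:9000"
}
EOF
```
Note that this config only tells Metaflow where the datastore lives; it does not carry the access keys themselves, which is where the NoCredentialsError comes from.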
Thank you... this might be the magic env var I've been looking for: METAFLOW_KUBERNETES_SECRETS. Stay tuned!
excited 1
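In case it helps anyone else hitting the same NoCredentialsError: a rough sketch of the approach, assuming a Kubernetes Secret named minio-credentials (hypothetical) that exposes the Minio keys under the standard AWS variable names boto looks for:
```sh
# Put the Minio keys in a Kubernetes Secret (name and namespace are placeholders)
kubectl create secret generic minio-credentials \
  --namespace argo \
  --from-literal=AWS_ACCESS_KEY_ID='<minio-access-key>' \
  --from-literal=AWS_SECRET_ACCESS_KEY='<minio-secret-key>'

# Ask Metaflow to inject that secret into the pods it launches,
# then (re)deploy the flow to Argo Workflows
export METAFLOW_KUBERNETES_SECRETS=minio-credentials
python 02-statistics/stats.py argo-workflows create --max-workers 2
```
With the secret injected, boto inside the task pods should pick the credentials up from the environment.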
šŸš€ That appears to have gotten me past that error! Thank you @square-wire-39606 (name checks out). Now on to the next error! šŸ™‚
šŸ™Œ 1
awesome! always happy to help!