# ask-metaflow
Hi. I'm testing a workflow against a Kubernetes cluster that I deployed with the `outerbounds/metaflow-tools/k8s/helm/metaflow` Helm chart. I have MinIO running in the same cluster, and I modified the `forward_metaflow_ports.py` script to also forward the MinIO API to `localhost:9000`, so I pass `METAFLOW_S3_ENDPOINT_URL=http://localhost:9000` when running a flow. However, that value also gets passed through to the Kubernetes pod, so the job errors out with

`fatal error: Could not connect to the endpoint URL: "http://localhost:9000/minio-metaflow-bucket/metaflow/...`

in the pod logs, since the pod knows nothing about `localhost:9000`. Instead of `localhost:9000`, the pod needs `minio.default.svc.cluster.local:9000` to reach my MinIO inside the cluster. I tried adding `METAFLOW_S3_ENDPOINT_URL=minio.default.svc.cluster.local:9000` as part of a Kubernetes secret (along with the MinIO access key and secret key), referenced via the `METAFLOW_KUBERNETES_SECRETS` variable, but it doesn't seem to get picked up in the pod. Could someone suggest how I might resolve this? Thanks a lot.