great-egg-84692 (08/06/2025, 4:18 PM):
Is there a way to use a different METAFLOW_S3_ENDPOINT_URL when running a flow locally with python flow.py run and when creating an argo workflow using python flow.py argo-workflows create?
ancient-application-36103 (08/06/2025, 4:19 PM):
you can set METAFLOW_S3_ENDPOINT_URL in your env or metaflow config
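Aside: the "metaflow config" here is the JSON file read at startup from METAFLOW_HOME (by default ~/.metaflowconfig/config.json); a sketch of that form, with an assumed local endpoint:

    {"METAFLOW_S3_ENDPOINT_URL": "http://localhost:9000"}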
great-egg-84692 (08/06/2025, 4:19 PM):
The same problem exists for METAFLOW_SERVICE_URL, which is solved with an extra METAFLOW_SERVICE_INTERNAL_URL, but there appears to be no METAFLOW_S3_INTERNAL_ENDPOINT_URL.

ancient-application-36103 (08/06/2025, 4:19 PM):
[message content not captured]
great-egg-84692 (08/06/2025, 4:20 PM):
> you can set METAFLOW_S3_ENDPOINT_URL in your env or metaflow config
But this will be used for both run and argo-workflows create, right? I need it to be different depending on the command. Is there already a way to do so? I feel this could be a common problem.
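Aside: one low-tech split, since an environment variable takes precedence over the config file in Metaflow's config resolution: keep the remote endpoint in the config that argo-workflows create reads, and override it inline for local runs (the localhost endpoint below is an assumed example):

    METAFLOW_S3_ENDPOINT_URL=http://localhost:9000 python flow.py run

Note this splits by where the command is invoked, not by where the task executes, which is why the next exchange matters.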
ancient-application-36103 (08/06/2025, 4:22 PM):
is the goal to use different urls when running the task locally vs running it as a kubernetes pod?

ancient-application-36103 (08/06/2025, 4:22 PM):
python flow.py run can also run the workload on kubernetes

great-egg-84692 (08/06/2025, 4:22 PM):
> is the goal to use different urls when running the task locally vs running it as a kubernetes pod?
correct
ancient-application-36103 (08/06/2025, 4:22 PM):
[message content not captured]

great-egg-84692 (08/06/2025, 4:23 PM):
That is how METAFLOW_SERVICE_URL and METAFLOW_SERVICE_INTERNAL_URL already work, is that right? So that when creating the argo workflow, the env var is set as

    "METAFLOW_SERVICE_URL": SERVICE_INTERNAL_URL,

https://github.com/Netflix/metaflow/blob/27c6aaefb3966fafadda68f2831ab3ca5510c92f/metaflow/plugins/argo/argo_workflows.py#L1677
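Aside: as noted above there appears to be no S3 analogue of that internal-URL swap, but a per-step override can approximate one. A minimal sketch using Metaflow's @environment decorator, which sets environment variables inside the task container; the MinIO URL is an assumed example, and precedence against values baked into a deployed template's environment is worth verifying:

    from metaflow import FlowSpec, environment, kubernetes, step

    # Assumed in-cluster service address; adjust to the actual cluster DNS.
    IN_CLUSTER_S3 = "http://minio.default.svc.cluster.local:9000"

    class EndpointFlow(FlowSpec):

        @environment(vars={"METAFLOW_S3_ENDPOINT_URL": IN_CLUSTER_S3})
        @kubernetes
        @step
        def start(self):
            # Runs in a pod, so Metaflow's S3 client sees the in-cluster endpoint.
            self.next(self.end)

        @step
        def end(self):
            # A step without the override keeps the configured endpoint.
            pass

    if __name__ == "__main__":
        EndpointFlow()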
ancient-application-36103 (08/06/2025, 4:25 PM):
[message content not captured]

great-egg-84692 (08/06/2025, 4:27 PM):
We haven't tried devtools yet. Could you please clarify a bit more how it solves the problem of accessing a different s3 endpoint depending on local vs argo?
    k8s_yaml(encode_yaml({
        'apiVersion': 'v1',
        'kind': 'Secret',
        'metadata': {'name': 'minio-secret'},
        'type': 'Opaque',
        'stringData': {
            'AWS_ACCESS_KEY_ID': 'rootuser',
            'AWS_SECRET_ACCESS_KEY': 'rootpass123',
            'AWS_ENDPOINT_URL_S3': 'http://minio.default.svc.cluster.local:9000',
        }
    }))
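Aside: the fragment above (apparently from the devtools Tiltfile) creates a Kubernetes secret whose keys become environment variables in task pods once Metaflow is told to attach the secret (e.g. via its KUBERNETES_SECRETS setting). Recent botocore releases honor AWS_ENDPOINT_URL_S3, so code inside a pod reaches the in-cluster MinIO while a laptop without that variable falls back to whatever is configured; note that an explicitly set METAFLOW_S3_ENDPOINT_URL would still take precedence for Metaflow's own S3 client. A minimal check, runnable in either environment:

    import boto3

    # With AWS_ENDPOINT_URL_S3 set (as in the secret above), recent botocore
    # versions route S3 calls to that endpoint without any code changes.
    s3 = boto3.client("s3")
    print(s3.meta.endpoint_url)  # the in-cluster MinIO URL inside a pod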
ancient-application-36103 (08/06/2025, 4:28 PM):
[three consecutive messages not captured]
great-egg-84692 (08/06/2025, 4:32 PM):
In our setup, a user develops a flow locally with python flow.py run, and when it feels ready, they create an argo workflow template with python flow.py argo-workflows create. I'm not sure how to set different METAFLOW_S3_ENDPOINT_URL values depending on run or argo-workflows create.
Am I missing something obvious?
(btw, this may not matter much, but we're using azure instead of aws)
ancient-application-36103 (08/06/2025, 4:33 PM):
i am assuming you are also setting up a metaflow config for the user?

ancient-application-36103 (08/06/2025, 4:33 PM):
also, how are you setting up the S3_ENDPOINT_URL today?
great-egg-84692 (08/06/2025, 4:35 PM):
> i am assuming you are also setting up a metaflow config for the user?
we set the configs in METAFLOW_HOME/config_{prod|staging}.json and metaflow_extensions/org/config/mfextinit_org.py
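Aside: that extension module is one natural home for the split, since config modules under metaflow_extensions can override values in metaflow.metaflow_config. A sketch, assuming pods are detectable via the standard KUBERNETES_SERVICE_HOST variable and assuming both URLs; whether this wins over values baked into a deployed template's environment would need checking:

    # metaflow_extensions/org/config/mfextinit_org.py (sketch)
    import os

    # Kubernetes sets KUBERNETES_SERVICE_HOST in every pod, so its presence
    # distinguishes an in-cluster task from a purely local run.
    if "KUBERNETES_SERVICE_HOST" in os.environ:
        S3_ENDPOINT_URL = "http://minio.default.svc.cluster.local:9000"  # in-cluster
    else:
        S3_ENDPOINT_URL = "http://localhost:9000"  # local development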
great-egg-84692 (08/06/2025, 4:36 PM):
> also, how are you setting up the S3_ENDPOINT_URL today?
we're using the same endpoint for both local and remote currently, but this is not ideal.
ancient-application-36103 (08/06/2025, 4:36 PM):
[message content not captured]

great-egg-84692 (08/06/2025, 4:37 PM):
we set METAFLOW_S3_ENDPOINT_URL in the METAFLOW_HOME/config_{prod|staging}.json.
ancient-application-36103 (08/06/2025, 4:37 PM):
[message content not captured]

ancient-application-36103 (08/06/2025, 4:38 PM):
[two consecutive messages not captured]

ancient-application-36103 (08/06/2025, 4:40 PM):
[message content not captured]
great-egg-84692 (08/06/2025, 4:46 PM):
does that work for argo-workflows create too?
ancient-application-36103 (08/06/2025, 5:01 PM):
[two consecutive messages not captured]