adorable-truck-38791
09/01/2025, 1:55 PM
The metaflow-dev up command... it seems to be mostly working, but it keeps asking for my password when it's starting all of the services. The weirder thing is that it keeps saying my password is wrong, so I'm not even sure what password it's trying to ask for (is it something related to the minikube/argo roles or something like that? I have no idea). Any thoughts on what I should try to fix this?

adorable-truck-38791
09/01/2025, 1:58 PM
(stratum) ruben@ruben-A6:~/stratum-bio/deploy$ python test-metaflow.py argo-workflows create
Metaflow 2.17.3 executing ParameterFlow for user:ruben
Validating your flow...
The graph looks good!
Running pylint...
Pylint not found, so extra checks are disabled.
Deploying ParameterFlow to Argo Workflows...
It seems this is the first time you are deploying parameterflow to Argo Workflows.
A new production token generated.
The namespace of this production flow is
production:parameterflow-0-gtqo
To analyze results of this production flow add this line in your notebooks:
namespace("production:parameterflow-0-gtqo")
If you want to authorize other people to deploy new versions of this flow to Argo Workflows, they need to call
argo-workflows create --authorize parameterflow-0-gtqo
when deploying this flow to Argo Workflows for the first time.
See "Organizing Results" at https://docs.metaflow.org/ for more information about production tokens.
Exception ignored in: <gzip on 0x7ece5065a710>
Traceback (most recent call last):
File "/home/ruben/anaconda3/envs/stratum/lib/python3.13/gzip.py", line 359, in close
fileobj.write(self.compress.flush())
ValueError: I/O operation on closed file.
S3 access denied:
s3://metaflow-test/metaflow/ParameterFlow/data/0a/0a2e0ebd6a909c3a1c0c12bb03f2df23bea995ad
It seems like, from the metaflow-dev video/documentation, I shouldn't need to set up AWS S3 storage, but maybe I'm wrong? Any thoughts on this? (I thought the MinIO usage meant that it's getting by with local, isolated artifact storage.)

adorable-truck-38791
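(One way to check whether the shell is actually pointed at MinIO rather than real AWS S3 is to inspect Metaflow's datastore configuration. METAFLOW_S3_ENDPOINT_URL, METAFLOW_DEFAULT_DATASTORE, and METAFLOW_DATASTORE_SYSROOT_S3 are Metaflow's documented config variables; whether metaflow-dev exports them into this particular shell is an assumption — a minimal diagnostic sketch:)

```shell
# Sketch: see which object store this shell is configured to use.
# METAFLOW_S3_ENDPOINT_URL is Metaflow's knob for S3-compatible stores such
# as MinIO; if it is unset, boto3 talks to real AWS S3, which would explain
# an "access denied" on a bucket that only exists in the local MinIO.
env | grep -E 'METAFLOW_(DEFAULT_DATASTORE|DATASTORE_SYSROOT_S3|S3_ENDPOINT_URL)' \
  || echo "no Metaflow datastore config in this shell"

# Pointing the endpoint at MinIO explicitly (the URL below is an assumption
# about where MinIO is reachable from this shell):
export METAFLOW_S3_ENDPOINT_URL="http://localhost:9000"
```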
09/01/2025, 7:26 PM
(stratum) ruben@ruben-A6:~/stratum-bio/deploy$ python test-metaflow.py argo-workflows create
Metaflow 2.17.3 executing ParameterFlow for user:ruben
Validating your flow...
The graph looks good!
Running pylint...
Pylint not found, so extra checks are disabled.
Deploying ParameterFlow to Argo Workflows...
It seems this is the first time you are deploying parameterflow to Argo Workflows.
A new production token generated.
The namespace of this production flow is
production:parameterflow-0-gtqo
To analyze results of this production flow add this line in your notebooks:
namespace("production:parameterflow-0-gtqo")
If you want to authorize other people to deploy new versions of this flow to Argo Workflows, they need to call
argo-workflows create --authorize parameterflow-0-gtqo
when deploying this flow to Argo Workflows for the first time.
See "Organizing Results" at https://docs.metaflow.org/ for more information about production tokens.
Exception ignored in: <gzip on 0x730892d2b8b0>
Traceback (most recent call last):
File "/home/ruben/anaconda3/envs/stratum/lib/python3.13/gzip.py", line 359, in close
fileobj.write(self.compress.flush())
ValueError: I/O operation on closed file.
S3 access failed:
S3 operation failed.
Key requested: s3://metaflow-test/metaflow/ParameterFlow/data/0a/0a2e0ebd6a909c3a1c0c12bb03f2df23bea995ad
Error: Could not connect to the endpoint URL: "http://minio.default.svc.cluster.local:9000/metaflow-test/metaflow/ParameterFlow/data/0a/0a2e0ebd6a909c3a1c0c12bb03f2df23bea995ad"
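(This second error narrows things down: minio.default.svc.cluster.local is a Kubernetes in-cluster DNS name, which only resolves from pods inside the cluster, not from the host where the client is running. One workaround is to forward MinIO's port to the host. The Service name "minio" and namespace "default" below are assumptions inferred from the URL — verify with kubectl get svc -A:)

```shell
# Sketch, assuming the dev stack's MinIO runs as a Service named "minio" in
# the "default" namespace (check with: kubectl get svc -A | grep -i minio).
# minio.default.svc.cluster.local only resolves from *inside* the cluster;
# forward the port so the host can reach it on localhost instead:
command -v kubectl >/dev/null \
  && kubectl port-forward svc/minio 9000:9000 >/dev/null 2>&1 &

# ...and point the host-side Metaflow client at the forwarded port:
export METAFLOW_S3_ENDPOINT_URL="http://localhost:9000"
```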
adorable-truck-38791
09/01/2025, 7:43 PM
http://localhost:9000 instead of that crazy URL)

adorable-truck-38791
09/01/2025, 7:46 PM
... argo-workflows {create,trigger}
)
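(Once the endpoint is pointed at localhost:9000, a quick way to confirm MinIO is actually reachable there before re-running the deploy: MinIO exposes an unauthenticated liveness endpoint at /minio/health/live. The localhost URL is an assumption about this setup:)

```shell
# Quick reachability check for MinIO on the forwarded/local port.
# /minio/health/live is MinIO's unauthenticated liveness endpoint; it
# returns HTTP 200 when the server is up.
curl -sf http://localhost:9000/minio/health/live \
  && echo "MinIO reachable on localhost:9000" \
  || echo "MinIO NOT reachable on localhost:9000"
```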