# ask-metaflow
e
TLDR: how do I switch metaflow to a different METAFLOW_DATASTORE_SYSROOT_S3? Context: I am using the company's notebook server with an image that is supposedly attached to my "stage" role. We run Argo flows and metaflow artifacts are stored in an s3 datalake. We're on our own fork of metaflow, which corresponds (I think) to version 2.1.2850. The problem: when I initialize metaflow in the notebook, it sees the "dev" setup, specifically metaflow flows and runs for the "dev" namespace (this is a k8s namespace; I think it maps to "configs" in metaflow, not a metaflow namespace). When I run
metaflow configure show
it indeed shows that
METAFLOW_DATASTORE_SYSROOT_S3
is set to the dev path on s3. However, when I changed it to the staging path, there was no effect - it still shows dev flows and runs. How do I change that?
d
Configuration values can come from a lot of places. It’s hard for me to say exactly, but generally values are read from the environment, from configs based on your profile, and from extensions. Without extensions, the order is “env var trumps configuration trumps default”. With extensions, it’s largely up to the extension: extension values will override non-extension ones, but how the extension gets its value is up to it.
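The “env var trumps configuration trumps default” order can be sketched as a small resolver. This is a simplified illustration, not Metaflow’s actual implementation; the profile dict and the s3 paths are hypothetical stand-ins for what `metaflow configure show` would report.

```python
import os

def from_conf(name, default=None, config=None):
    # Resolution order, matching the description above:
    # 1. environment variable, 2. profile config, 3. built-in default.
    env_name = "METAFLOW_" + name
    if env_name in os.environ:
        return os.environ[env_name]
    if config and env_name in config:
        return config[env_name]
    return default

# Hypothetical profile config, as written by `metaflow configure`.
profile = {"METAFLOW_DATASTORE_SYSROOT_S3": "s3://datalake/dev/metaflow"}

# No env var set: the profile value wins over the default.
print(from_conf("DATASTORE_SYSROOT_S3", "s3://default", profile))
# → s3://datalake/dev/metaflow

# Env var set: it overrides the profile config.
os.environ["METAFLOW_DATASTORE_SYSROOT_S3"] = "s3://datalake/stage/metaflow"
print(from_conf("DATASTORE_SYSROOT_S3", "s3://default", profile))
# → s3://datalake/stage/metaflow
```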
What do you mean by “change that”?
e
Change that - to be able to "switch" the metaflow module in the notebook to use (look for artifacts from) a different METAFLOW_DATASTORE_SYSROOT_S3
fwiw
METAFLOW_DATASTORE_SYSROOT_S3
env variable is correct; however
Metaflow().flows
shows the 'wrong' list of flows (from the dev DATASTORE)
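One thing worth checking: Metaflow reads its configuration when the module is imported, and the client's flow listing is served by the metadata provider rather than the datastore, so changing only the datastore sysroot after import won't change what `Metaflow().flows` returns. A hedged sketch, assuming a stage profile exists at `~/.metaflowconfig/config_stage.json` and that the s3 path below is a hypothetical stand-in:

```shell
# Set these before Python starts (or restart the notebook kernel),
# since metaflow reads its config at import time.
# METAFLOW_PROFILE selects ~/.metaflowconfig/config_<profile>.json,
# which should also carry the stage metadata-service settings -
# pointing only the datastore at stage is not enough for flow listing.
export METAFLOW_PROFILE=stage
export METAFLOW_DATASTORE_SYSROOT_S3=s3://datalake/stage/metaflow  # hypothetical path

# Verify what the notebook will see:
metaflow configure show
```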