important-bear-42262 (05/22/2025, 7:23 AM):

```python
import io

import joblib
from metaflow import Flow, S3

def load_model(model_name):
    with S3(run=Flow('MyFlow').latest_successful_run) as s3:
        response = s3.get(model_name)
        model_bytes = io.BytesIO(response.blob)
        model = joblib.load(model_bytes)
        return model
```

It tries to store a file `metaflow.s3.ee54mdiw` when I call it. Is it possible to change the path where it is going to store it?
dry-beach-38304 (05/22/2025, 7:36 AM):

`S3` takes a `tmproot` argument. It defaults to the value set in `METAFLOW_TEMPDIR`, which in turn defaults to `.` in most cases.
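For the snippet above, that could look like the following sketch (same assumptions as the original code: a flow named `MyFlow` and a joblib-serialized model; `/tmp` is just an example of a writable path):

```python
import io

import joblib
from metaflow import Flow, S3

def load_model(model_name, tmproot="/tmp"):
    # tmproot controls where the S3 client creates its scratch directory
    # (the metaflow.s3.* directory from the question) instead of the cwd.
    with S3(run=Flow('MyFlow').latest_successful_run, tmproot=tmproot) as s3:
        response = s3.get(model_name)
        return joblib.load(io.BytesIO(response.blob))
```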
important-bear-42262 (05/22/2025, 10:09 AM):

With `Flow('MyFlow').latest_successful_run` it seems like it is trying to write to the current working directory. Can I change it?
05/22/2025, 11:17 AMENV METAFLOW_TEMPDIR=/tmp
and after that WORKDIR /app
, but it still tries to go with /app
dir.
```
>>> import os
>>> os.getenv("METAFLOW_TEMPDIR")
'/tmp'
>>>
>>> from metaflow import Flow
>>> import os
>>> Flow('AdgroupClustering').latest_successful_run
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 2207, in latest_successful_run
    if run.successful:
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 1928, in successful
    return end.successful
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 1267, in successful
    return self["_success"].data
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 920, in data
    obj = filecache.get_artifact(ds_type, location[6:], meta, *components)
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/filecache.py", line 207, in get_artifact
    _, obj = next(
  File "/usr/local/lib/python3.9/site-packages/metaflow/datastore/task_datastore.py", line 384, in load_artifacts
    for (key, blob) in self._ca_store.load_blobs(to_load.keys()):
  File "/usr/local/lib/python3.9/site-packages/metaflow/datastore/content_addressed_store.py", line 135, in load_blobs
    with self._storage_impl.load_bytes([p for _, p in load_paths]) as loaded:
  File "/usr/local/lib/python3.9/site-packages/metaflow/plugins/datastores/s3_storage.py", line 122, in load_bytes
    s3 = S3(
  File "/usr/local/lib/python3.9/site-packages/metaflow/plugins/datatools/s3/s3.py", line 70, in _inner_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/metaflow/plugins/datatools/s3/s3.py", line 573, in __init__
    self._tmpdir = mkdtemp(dir=tmproot, prefix="metaflow.s3.")
  File "/usr/local/lib/python3.9/tempfile.py", line 379, in mkdtemp
    _os.mkdir(file, 0o700)
OSError: [Errno 30] Read-only file system: '/app/metaflow.s3.223906lc'
```
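Worth noting: in this traceback the `S3` object is constructed internally by the datastore (`s3_storage.py`'s `load_bytes`), not by user code, so `METAFLOW_TEMPDIR` may not apply to it. In some Metaflow versions that internal path takes its `tmproot` from the `METAFLOW_ARTIFACT_LOCALROOT` config, which defaults to the current working directory. A hedged workaround sketch, assuming that config knob exists in the installed version:

```python
# Sketch: point the datastore's scratch space at a writable directory
# before importing metaflow, since config values are read at import time.
# METAFLOW_ARTIFACT_LOCALROOT is an assumption here; verify it against
# the Metaflow version installed in the image.
import os

os.environ["METAFLOW_ARTIFACT_LOCALROOT"] = "/tmp"

from metaflow import Flow  # import only after setting the env var

run = Flow('AdgroupClustering').latest_successful_run
```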