# ask-metaflow
i
Hello everyone, I have this function:
```
import io

import joblib
from metaflow import Flow, S3

def load_model(model_name):
    # S3(run=...) scopes gets to that run's S3 storage root
    with S3(run=Flow('MyFlow').latest_successful_run) as s3:
        response = s3.get(model_name)
        model_bytes = io.BytesIO(response.blob)
        model = joblib.load(model_bytes)
    return model
```
When I call it, it tries to store a file named `metaflow.s3.ee54mdiw`. Is it possible to change the path where this file gets stored?
d
Yes: `S3` takes a `tmproot` argument. It defaults to the value set in `METAFLOW_TEMPDIR`, which defaults to `.` (the current working directory) in most cases.
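For example, the function above could pass `tmproot` explicitly. A minimal sketch, assuming `/tmp` is writable in your environment:
```
import io

import joblib
from metaflow import Flow, S3

def load_model(model_name, tmproot="/tmp"):
    # tmproot controls where S3 creates its metaflow.s3.* scratch directory
    with S3(run=Flow('MyFlow').latest_successful_run, tmproot=tmproot) as s3:
        response = s3.get(model_name)
        model = joblib.load(io.BytesIO(response.blob))
    return model
```
Note that this only affects `S3` objects you construct yourself.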
i
Still not working. When I execute `Flow('MyFlow').latest_successful_run`, it seems to be writing to the current working directory. Can I change that? The `Flow` object is doing some downloading and writing of its own, and it is not letting me change the path.
So in my Dockerfile I tried setting `ENV METAFLOW_TEMPDIR=/tmp` followed by `WORKDIR /app`, but it still tries to use the `/app` directory.
```
>>> import os
>>> os.getenv("METAFLOW_TEMPDIR")
'/tmp'
>>> 
>>> from metaflow import Flow
>>> import os
>>> Flow('AdgroupClustering').latest_successful_run
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 2207, in latest_successful_run
    if run.successful:
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 1928, in successful
    return end.successful
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 1267, in successful
    return self["_success"].data
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/core.py", line 920, in data
    obj = filecache.get_artifact(ds_type, location[6:], meta, *components)
  File "/usr/local/lib/python3.9/site-packages/metaflow/client/filecache.py", line 207, in get_artifact
    _, obj = next(
  File "/usr/local/lib/python3.9/site-packages/metaflow/datastore/task_datastore.py", line 384, in load_artifacts
    for (key, blob) in self._ca_store.load_blobs(to_load.keys()):
  File "/usr/local/lib/python3.9/site-packages/metaflow/datastore/content_addressed_store.py", line 135, in load_blobs
    with self._storage_impl.load_bytes([p for _, p in load_paths]) as loaded:
  File "/usr/local/lib/python3.9/site-packages/metaflow/plugins/datastores/s3_storage.py", line 122, in load_bytes
    s3 = S3(
  File "/usr/local/lib/python3.9/site-packages/metaflow/plugins/datatools/s3/s3.py", line 70, in _inner_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/metaflow/plugins/datatools/s3/s3.py", line 573, in __init__
    self._tmpdir = mkdtemp(dir=tmproot, prefix="metaflow.s3.")
  File "/usr/local/lib/python3.9/tempfile.py", line 379, in mkdtemp
    _os.mkdir(file, 0o700)
OSError: [Errno 30] Read-only file system: '/app/metaflow.s3.223906lc'
```