# ask-metaflow
f
Hi, is it possible to use `IncludeFile` with Argo Workflows? I was expecting that if I provide a local file, Metaflow would automatically upload it to some remote artifact storage location like an S3 bucket, but this doesn't seem to be the case.
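For reference, a minimal sketch of the kind of flow in question (flow, parameter, and file names are made up for illustration):
```python
from metaflow import FlowSpec, IncludeFile, step

class IncludeFileFlow(FlowSpec):
    # IncludeFile reads a local file when the flow is run or deployed and
    # stores its contents as a Metaflow artifact in the configured datastore.
    data = IncludeFile("data", help="Local file to include", default="./data.csv")

    @step
    def start(self):
        # The included file's contents are available directly on self.
        print("included %d characters" % len(self.data))
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    IncludeFileFlow()
```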
d
It should. What is happening instead?
f
The flow's parameter won't even show up properly in the UI.
```
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 364, in <module>
    cli(auto_envvar_prefix='MFCACHE')
  File "/opt/latest/lib/python3.11/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/opt/latest/lib/python3.11/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/opt/latest/lib/python3.11/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/latest/lib/python3.11/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 358, in cli
    Scheduler(store, max_actions).loop()
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 326, in loop
    self.cleanup_if_necessary()
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 290, in cleanup_if_necessary
    self.cleanup_workers()
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 298, in cleanup_workers
    self.cleanup_pool()
  File "/root/services/ui_backend_service/data/cache/client/cache_server.py", line 304, in cleanup_pool
    self.pool = multiprocessing.Pool(
  File "/usr/local/lib/python3.11/multiprocessing/context.py", line 119, in Pool
    return Pool(processes, initializer, initargs, maxtasksperchild,
  File "/usr/local/lib/python3.11/multiprocessing/pool.py", line 215, in __init__
    self._repopulate_pool()
  File "/usr/local/lib/python3.11/multiprocessing/pool.py", line 306, in _repopulate_pool
    return self._repopulate_pool_static(self._ctx, self.Process,
  File "/usr/local/lib/python3.11/multiprocessing/pool.py", line 329, in _repopulate_pool_static
    w.start()
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/usr/local/lib/python3.11/multiprocessing/context.py", line 281, in _Popen
    return Popen(process_obj)
  File "/usr/local/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/usr/local/lib/python3.11/multiprocessing/popen_fork.py", line 71, in _launch
    code = process_obj._bootstrap(parent_sentinel=child_r)
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.11/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/root/services/ui_backend_service/data/cache/client/cache_worker.py", line 29, in execute_action
    execute(tempdir, action_cls, request)
  File "/root/services/ui_backend_service/data/cache/client/cache_worker.py", line 51, in execute
    res = action_cls.execute(
  File "/root/services/ui_backend_service/data/cache/get_data_action.py", line 122, in execute
    results[target_key] = cacheable_exception_value(ex)
  File "/root/services/ui_backend_service/data/cache/utils.py", line 104, in cacheable_exception_value
    return json.dumps([False, ex.__class__.__name__, str(ex), get_traceback_str()])
  File "/root/services/ui_backend_service/data/cache/get_data_action.py", line 116, in execute
    result = cls.fetch_data(target, stream_output)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/services/ui_backend_service/data/cache/get_parameters_action.py", line 54, in fetch_data
    for artifact_name, artifact in step.task.artifacts._asdict().items():
                                   ^^^^^^^^^^^^^^^^^^^

AttributeError: 'NoneType' object has no attribute 'artifacts'
```
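For context on the traceback: the failing line in `get_parameters_action.py` dereferences `step.task`, which is None here. A None-safe version of that loop might look like the sketch below, written against the Metaflow client API as a hypothetical illustration, not the actual UI backend patch:
```python
from metaflow import Step

def fetch_parameter_artifacts(step: Step) -> dict:
    # Step.task can be None when the step has no task data,
    # which is what triggers the AttributeError above.
    task = step.task
    if task is None:
        return {}
    return {
        name: artifact.data
        for name, artifact in task.artifacts._asdict().items()
    }
```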
a
@fast-vr-44972 you should be able to use `IncludeFile` with the `argo-workflows trigger` command. We will look into the issue with the Metaflow UI.
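For completeness, the deploy-and-trigger sequence being referred to would look roughly like this (assuming the flow file is named `includefile_flow.py` and has the hypothetical `data` parameter from the sketch above):
```
# Deploy the flow to Argo Workflows
python includefile_flow.py argo-workflows create

# Trigger a run, passing a local path for the IncludeFile parameter;
# the client uploads the file's contents to the configured datastore
# (e.g. S3) so the remote tasks can read it
python includefile_flow.py argo-workflows trigger --data ./data.csv
```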
The Argo UI doesn't support uploading files. We have a different UI for managing deployed flows, but that one is currently not open source.