# ask-metaflow
Hi! Is there a pattern/best practice for using a shared library across multiple flows? We've developed a bunch of utility functions and are consolidating all of our data science work into a single repo. The idea is to have one directory for these shared utility functions, plus a projects directory containing a directory of flows per project. Something like:
```
main dir
|-dist
|-utility dir
+-projects dir
  |-project 1
  | +-flow directory
  +-project 2
    +-flow directory
```
We want the flows to be able to use the library in the utility directory, and are looking at building the utilities as part of the Makefile in each project directory. The root-level Makefile will build a Python package and place it into the `dist` directory so it's locally available. What is the appropriate/correct/accepted method for getting those utility functions into the code that's sent to remote compute? Will the imports inside step functions "just work", or will we need some internal PyPI/conda distribution mechanism?
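For concreteness, here's a runnable toy version of what we're doing locally (`utility dir` and `ds_utils` are stand-ins for our real directory and package names). Locally, putting the utility directory on `sys.path` (or installing the wheel from `dist`) makes the import work; the question is whether the same import resolves when a step runs on remote compute:

```python
# Self-contained sketch: simulate the repo layout in a temp dir and show
# that the shared import works locally once utility_dir is importable.
# All names here (utility_dir, ds_utils, clean) are placeholders.
import sys
import tempfile
from pathlib import Path

repo_root = Path(tempfile.mkdtemp())

# Fake shared package, standing in for what the root Makefile builds.
util_pkg = repo_root / "utility_dir" / "ds_utils"
util_pkg.mkdir(parents=True)
(util_pkg / "__init__.py").write_text(
    "def clean(text):\n"
    "    return text.strip()\n"
)

# Locally this is enough; inside a flow's step on remote compute,
# would the equivalent import still resolve?
sys.path.insert(0, str(repo_root / "utility_dir"))
import ds_utils

print(ds_utils.clean("  hello  "))
```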