# ask-metaflow
Hi everyone! I am new to Metaflow and trying to understand how to deploy a flow that uses remote resources on different compute backends. In this tutorial, there is a flow with steps assigned to run locally, on AWS Batch, or on a remote Kubernetes cluster. How is Metaflow set up in this case? What would the config.json look like to support both AWS Batch and Kubernetes? Also, in cases where this flow is deployed elsewhere (for instance, on Argo), on which compute would each of these steps run? Thank you very much for your help!
Hey! You would typically configure both an AWS Batch environment and a Kubernetes environment, and put the configuration values in config.json (e.g. for Kubernetes there is a set of variables whose names start with `METAFLOW_KUBERNETES_`, like `METAFLOW_KUBERNETES_NAMESPACE`). You can refer to the respective docs (Kubernetes / AWS Batch) to find the specific variables. If you use an external orchestrator like Argo or Step Functions, you'd typically use one or the other type of compute; for example, if you plan to use Argo, you'd typically only use Kubernetes. While you can mix and match as shown there, most users who are a Kubernetes shop just use Argo and `@kubernetes`. If they don't have a Kubernetes-based platform internally, they opt for Batch + Step Functions.
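To make that concrete, here is a minimal sketch of what a config.json supporting both backends might contain. The bucket, ARNs, queue name, namespace, and service account below are all placeholder values you'd replace with your own deployment's settings:

```json
{
  "METAFLOW_DEFAULT_DATASTORE": "s3",
  "METAFLOW_DATASTORE_SYSROOT_S3": "s3://my-metaflow-bucket/metaflow",
  "METAFLOW_BATCH_JOB_QUEUE": "arn:aws:batch:us-east-1:123456789012:job-queue/metaflow-queue",
  "METAFLOW_ECS_S3_ACCESS_IAM_ROLE": "arn:aws:iam::123456789012:role/metaflow-batch-role",
  "METAFLOW_KUBERNETES_NAMESPACE": "metaflow",
  "METAFLOW_KUBERNETES_SERVICE_ACCOUNT": "metaflow-sa"
}
```

With both sets of values present, a local `run` dispatches `@batch`-decorated steps to AWS Batch and `@kubernetes`-decorated steps to your cluster, while undecorated steps execute locally. When you deploy to Argo Workflows, steps are executed as Kubernetes pods, which is why Argo deployments are typically paired with `@kubernetes` rather than `@batch`.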