# ask-metaflow
Hey, we want to evaluate Metaflow on GCP and I am checking out deployment options. I see there is a full example provided, which includes deploying the infrastructure as well. I noticed metaflow-tools also contains a Helm chart, and I was curious whether anyone has tried deploying Metaflow to GKE instead. Some care is needed, given there are hardcoded variables in the template file, e.g.:
```yaml
- name: METAFLOW_DEFAULT_DATASTORE
  value: "s3"
```
Though it seems this could be overridden via `metaflow-ui.metadatadbEnvVars`.
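For what it's worth, here is a rough sketch of what such an override could look like in a custom values file. The exact key paths and whether `metadatadbEnvVars` is a map or a list are assumptions based on the chart structure described above; check the chart's own `values.yaml` before relying on this. The bucket name is a hypothetical placeholder:

```yaml
# values-gcp.yaml -- hedged sketch of overriding the hardcoded S3 defaults
# for a GKE deployment. Key paths are assumptions; verify against the
# metaflow-tools chart's values.yaml.
metaflow-ui:
  metadatadbEnvVars:
    METAFLOW_DEFAULT_DATASTORE: "gs"
    # Hypothetical bucket/prefix for the GCS datastore root:
    METAFLOW_DATASTORE_SYSROOT_GS: "gs://my-metaflow-artifacts/metaflow"
```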
Another question I had: we don't use Argo, but we already have Airflow with the KubernetesExecutor deployed. As far as I understand the full GCP example, this is not really a problem, since the deployment itself doesn't rely on any Argo settings. However, it seems config.json is read from the local machine (where the flow is run from). Is there an example showing which settings are needed to get it working with Airflow, or is no config needed? Based on the docs, I understand a valid CONN_ID env variable suffices. However, we'd still need to come up with an approach for automatically pushing the generated DAGs to a GCS bucket, so I am again curious whether someone has experience with this. Thanks!
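For the DAG-push part, a minimal sketch of what a CI step could look like, using Metaflow's built-in `airflow create` command to compile the flow into a DAG file and then copying it to the bucket Airflow loads DAGs from. The flow filename, bucket name, and folder layout are hypothetical placeholders, and the connection/config env vars would still need to be set separately as discussed above:

```shell
#!/usr/bin/env bash
# Hedged sketch of a CI/CD step: compile a Metaflow flow into an Airflow
# DAG and push it to the GCS bucket your Airflow deployment syncs DAGs
# from. Names below (myflow.py, my-airflow-dags-bucket) are placeholders.
set -euo pipefail

# Compile the flow into an Airflow DAG file; "airflow create" is
# Metaflow's command for generating the DAG definition.
python myflow.py airflow create myflow_dag.py

# Upload to wherever Airflow reads DAGs from (e.g. a Composer dags/
# folder, or the bucket a gcs-sync sidecar points at).
gsutil cp myflow_dag.py gs://my-airflow-dags-bucket/dags/
```

Running this from CI on every merge would keep the deployed DAGs in sync with the flow code without any manual copying.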