early-accountant-74956
10/16/2023, 2:36 PM
- name: METAFLOW_DEFAULT_DATASTORE
value: "s3"
This value appears to be hardcoded, so some care should be given here. Though it seems it could be overridden by metaflow-ui.metadatadbEnvVars.
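For reference, the kind of values override I had in mind looks roughly like this. It is just a sketch: it assumes metaflow-ui is the subchart key in the parent values.yaml and that metadatadbEnvVars entries land after the built-in env vars in the rendered manifest, which is worth verifying against the chart templates.

# Hedged sketch of a parent values.yaml override, not taken from the chart docs.
metaflow-ui:
  metadatadbEnvVars:
    # "gs" is Metaflow's datastore value for Google Cloud Storage.
    - name: METAFLOW_DEFAULT_DATASTORE
      value: "gs"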
Another question I had: we don't use Argo, but we already have Airflow with the KubernetesExecutor deployed. As far as I understand the full GCP example, this is not really a problem, since the deployment itself doesn't rely on any Argo settings. However, it seems that config.json is read from the local machine (i.e. wherever the flow is run from). Is there an example showing which settings are needed to get this working with Airflow, or is no extra config needed? Based on the docs, my understanding is that a valid CONN_ID env variable suffices. We would still need to come up with an approach for automatically pushing the generated DAGs to a GCS bucket, though, so I am curious whether anyone has experience with this.
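To make that last part concrete, here is the kind of CI job I am picturing. This is only a sketch using GitHub Actions syntax; the flow name, bucket, secret, and connection id are all placeholders rather than anything from our setup.

# Hedged sketch: compile the flow into an Airflow DAG on the CI runner, then copy
# it into the GCS bucket the Airflow deployment syncs its dags/ folder from.
name: publish-metaflow-dag
on:
  push:
    branches: [main]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Authenticate to GCP so gsutil can write to the DAGs bucket.
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      - uses: google-github-actions/setup-gcloud@v2
      - name: Compile the flow into an Airflow DAG and upload it
        env:
          # Placeholder: connection id of the Kubernetes cluster registered in Airflow.
          METAFLOW_AIRFLOW_KUBERNETES_CONN_ID: kubernetes_default
        run: |
          pip install metaflow google-cloud-storage
          # Metaflow's config.json / METAFLOW_* env vars are read here, on the CI
          # runner that compiles the DAG -- not on the Airflow workers.
          python myflow.py airflow create myflow_dag.py
          gsutil cp myflow_dag.py gs://my-airflow-dags-bucket/dags/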
Thanks!