# ask-metaflow
b
Hey, we would like to use `trigger_on_finish` for connecting flows together. We are currently on AWS Batch, though, and can't completely switch to a proper Kubernetes setup yet. However, we do have a small k3s cluster. I was wondering if it is possible to use Argo on k3s just for orchestration, but have the pods still submit jobs to Batch. What would happen if I publish an Argo workflow that has `@batch` steps? Will it just submit them and watch for their completion? That would be great, because I could use our small k3s setup as the orchestration layer but still use Batch for compute.
s
Hi! Right now we haven't implemented AWS Batch support for Argo Workflows or Airflow, so for event triggering, Kubernetes will be your best bet.
You could embed the Deployer API in flows deployed on k3s to actively trigger the Step Functions flows (which can run the `@batch` steps). It's a bit clumsy, but it would work.
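A minimal sketch of that workaround, assuming Metaflow 2.12+ (which ships the programmatic `Deployer` API) and a Step Functions deployment already configured for AWS Batch. The flow and file names (`OrchestratorFlow`, `batch_compute_flow.py`) are placeholders, not from the thread, and this requires live AWS credentials to actually run:

```python
# Sketch (untested): a lightweight flow deployed to Argo on k3s that uses
# Metaflow's Deployer API to kick off a separate flow on AWS Step Functions,
# where the heavy @batch compute actually runs.
from metaflow import FlowSpec, step


class OrchestratorFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.kick_off_batch)

    @step
    def kick_off_batch(self):
        # Deployer is Metaflow's programmatic deployment/trigger API.
        # "batch_compute_flow.py" is a hypothetical flow file containing
        # the @batch steps.
        from metaflow import Deployer

        deployed = Deployer("batch_compute_flow.py").step_functions().create()
        deployed.trigger()  # fire off the Step Functions execution
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    OrchestratorFlow()
```

Deploying this orchestrator flow to Argo on k3s keeps the event-triggering (`@trigger_on_finish`) layer on Kubernetes while the compute stays on Batch, at the cost of the indirection through Step Functions.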
b
okay thanks!