# ask-metaflow
q
Hey All, For our workflow orchestrations, we use Metaflow with AWS Batch. When triggering workflows with large input parameters, we receive the following error on AWS Batch jobs: "Container Overrides length must be at most 8192." In AWS, there is no option to increase the "Container Overrides length". Is there an alternative way to pass large input parameters while using Metaflow with AWS Batch? Best regards, Raaj
a
Hi! What version of metaflow are you on?
q
Hello, thanks for the response. Please find the details below.
Metaflow Service - netflixoss/metaflow_metadata_service:v2.4.12
Metaflow SDK - metaflow-2.11.15
s
can you try again with a newer version of metaflow - 2.11.15 is quite old
q
Sure, will try and keep you posted. Thank you!
Hi there, we are facing the same issue with Metaflow 2.13.9 as well. We use AWS Batch for compute and AWS Step Functions for orchestration. When we run the following command: "python3 paramerflow.py step-functions trigger --alpha 'very-large-input'", the Step Functions execution fails with the error "Container Overrides length must be at most 8192."
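For reference, the flow itself is essentially just a plain string parameter, along these lines (a simplified sketch, not the exact contents of paramerflow.py):

```python
# Simplified sketch (assumed, not the actual paramerflow.py): a flow whose only
# input is a plain string Parameter. When triggered with
# `step-functions trigger --alpha ...`, the raw value of `alpha` ends up in the
# Batch container overrides, which is what the error above points at.
from metaflow import FlowSpec, Parameter, step


class ParameterFlow(FlowSpec):
    alpha = Parameter("alpha", help="Large input payload passed on the command line")

    @step
    def start(self):
        print("alpha has %d characters" % len(self.alpha))
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    ParameterFlow()
```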
a
Do you have a lot of parameters etc.? Step Functions has a limit of 8192 characters for container overrides that is hard to work around - this was one of the reasons why we started supporting Argo Workflows.
h
We used to get around this by having all our parameters in a single file using IncludeFile, and a second parameter for all the overrides.
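Roughly, the pattern looks like this (flow and parameter names here are illustrative, not our actual code):

```python
# Sketch of the IncludeFile workaround described above (names are illustrative).
# IncludeFile reads the local file when the run is triggered and stores its
# contents in the datastore, so only a small reference - not the payload
# itself - has to be passed to Step Functions / Batch, keeping the container
# overrides short.
import json

from metaflow import FlowSpec, IncludeFile, Parameter, step


class LargeInputFlow(FlowSpec):
    # All the large inputs live in a local JSON file instead of on the CLI.
    config = IncludeFile("config", help="JSON file containing the large inputs")
    # A small, separate parameter for ad-hoc overrides still fits comfortably.
    overrides = Parameter("overrides", default="{}", help="Small JSON overrides")

    @step
    def start(self):
        params = json.loads(self.config)
        params.update(json.loads(self.overrides))
        self.alpha = params.get("alpha")
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    LargeInputFlow()
```

Triggering is then something like "python3 flow.py step-functions trigger --config params.json" (plus a small --overrides value when needed), and as far as I remember only the datastore reference and the short overrides string end up in the container overrides.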
q
Thank you all for the suggestions! We will use the IncludeFile option here.