# ask-metaflow
a
Hello team, I think there's a bug in `argo-workflows trigger`, where it doesn't interpret a `JSONType` parameter correctly. It turns the parameter into a `String`. For example, I have a parameter like this:
```python
input_s3_batches = Parameter(
    "input_s3_batches", default=config.kuberay.extract_pipeline.input_s3_batches, type=JSONType
)
```
The YAML config looks something like this:
```yaml
input_s3_batches:
    - upstream_batch_tag: "0_128"
      input_s3_path: <S3_PATH>
    - upstream_batch_tag: "128:256"
      input_s3_path: <S3_PATH>
    - upstream_batch_tag: "256:384"
      input_s3_path: <S3_PATH>
    - upstream_batch_tag: "384:434"
      input_s3_path: <S3_PATH>
```
However, when I go to the Argo Workflows UI and submit the workflow manually via the Submit button, it works fine and treats the `input_s3_batches` parameter appropriately. Is there a workaround for this issue?
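One possible workaround, sketched purely as an assumption (the thread never says which fix the reporter actually used): declare the parameter as a plain string instead of `JSONType` and parse it yourself at the start of the flow. The parameter value and S3 path below are made up for illustration:

```python
import json

# Hypothetical workaround sketch, NOT the reporter's confirmed fix:
# declare the parameter as a plain string, e.g.
#   input_s3_batches = Parameter("input_s3_batches", default="[]", type=str)
# so the trigger's string-ification is harmless. Argo then delivers the
# value as a JSON string such as this (illustrative path):
raw = '[{"upstream_batch_tag": "0_128", "input_s3_path": "s3://bucket/part0"}]'

# json.loads recovers the list-of-dicts structure that JSONType
# would otherwise have produced automatically.
batches = json.loads(raw)

for batch in batches:
    print(batch["upstream_batch_tag"], batch["input_s3_path"])
```

The trade-off is that the flow code takes on the parsing (and any validation) that `JSONType` normally handles.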
a
Hi! We are actively triaging it; we received a similar bug report early this morning.
a
Ok thanks. I found a workaround for now.