# ask-metaflow
Hello! I'm using Metaflow locally. I'm running `metaflow-service` using Docker Compose, and I configured `.metaflow/config.json` to use an S3 bucket. I was following Outerbounds' Introduction to Metaflow and everything was working as expected: artifacts were being stored in S3 and I could see them in the Metaflow UI. But in this episode of the tutorial I got a `ValueError: not enough values to unpack (expected 2, got 1)`. I found out that it has something to do with the number of artifacts and S3. If I run the following code it works (two parameters are commented out):
```python
from metaflow import FlowSpec, Parameter, step


class ParallelTreesFlow(FlowSpec):
    max_depth = Parameter("max_depth", default=None)
    random_state = Parameter("seed", default=21)
    n_estimators = Parameter("n-est", default=10)
    min_samples_split = Parameter("min-samples", default=2)
    # eval_metric = Parameter("eval-metric", default='mlogloss')
    # k_fold = Parameter("k", default=5)

    @step
    def start(self):
        print("ok 1")
        self.next(self.end)

    @step
    def end(self):
        print("ok 2")


if __name__ == "__main__":
    ParallelTreesFlow()
```
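For context on what the error itself means (this part is plain Python, not Metaflow-specific): `not enough values to unpack (expected 2, got 1)` is raised when an assignment tries to unpack an iterable into more targets than the iterable yields. A minimal reproduction:

```python
# Unpacking requires the right-hand side to yield exactly as many
# values as there are targets on the left.
pair = ("bucket", "key")
bucket, key = pair          # works: two values, two targets

single = ("bucket-only",)
try:
    bucket, key = single    # fails: one value, two targets
except ValueError as e:
    print(e)                # not enough values to unpack (expected 2, got 1)
```

So somewhere inside the affected Metaflow version, a two-target unpack was receiving a single value when enough parameters/artifacts were involved.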
But if I try to use more than 4 parameters I get the `ValueError`.
Fixed by updating metaflow from `2.12.22` to `2.12.25`.
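For anyone hitting the same issue, a sketch of the upgrade, assuming a pip-managed environment (adjust if you use conda or another package manager):

```shell
# check which version is currently installed
python -c "import metaflow; print(metaflow.__version__)"

# upgrade to the release the thread reports as fixed
pip install --upgrade "metaflow==2.12.25"
```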