# ask-metaflow
Facing an issue with memory specifications not working for a pipeline whose steps run on AWS Batch. I am defining my flow in a class that inherits from FlowSpec, with step definitions like this:
```python
@retry(times=0)
@timeout(minutes=1080)
@batch(
    cpu=1,
    memory=16_000,  # memory request in MB, i.e. 16 GB
    gpu=0,
)
@environment(vars={"DISABLE_PARALLELISM": "0", "AWS_ROLE_SESSION_TIMEOUT": "3600"})
@step
def start(self):
    self.is_train = False
    print(self.configs)
    self.config = merge_configs(self.configs)
    run_steps("start", self)
    self.next(self.load_train_data, self.load_test_data)
```
However, when the flow is deployed to AWS Step Functions, the steps always end up with the default 4 GB of memory, even though I've requested 16 GB as shown above.
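
One way to confirm what actually landed in the deployed state machine is to pull its definition and look at the memory values in each Batch state. A rough sketch using boto3; the ARN is a placeholder for the one printed by `python flow.py step-functions create`, and the exact nesting of the Batch job parameters inside the definition may vary by Metaflow version:

```python
# Sketch: inspect the memory settings baked into the deployed
# Step Functions definition. Assumes boto3 credentials are configured.
import json

import boto3

sfn = boto3.client("stepfunctions")

# Placeholder ARN; substitute the one reported at deploy time.
arn = "arn:aws:states:us-east-1:123456789012:stateMachine:MyFlow"

definition = json.loads(
    sfn.describe_state_machine(stateMachineArn=arn)["definition"]
)
for name, state in definition.get("States", {}).items():
    # Batch-backed states carry the submitted job parameters here;
    # the key layout is an assumption and may differ across versions.
    overrides = state.get("Parameters", {}).get("ContainerOverrides", {})
    print(name, overrides.get("Memory"), overrides.get("ResourceRequirements"))
```

If the definition already shows 4096 there, the 16 GB request is being dropped at deploy time rather than when the Batch job is submitted.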