# ask-metaflow
**Question:**

How do I calculate the maximum memory I can request for a job? Example: we are using `g5.8xlarge`, which according to the docs has 128 GiB (≈137,439 MB) of memory. If I specify `@batch(memory=128 * 1024)`, I can see in the Batch console that the container requests 131,072 MB, but it does not fit on the `g5.8xlarge`, and a larger instance is picked. I have to request `@batch(memory=120 * 1024)` for it to fit. I am a bit confused about the units here. Does Metaflow or Docker add some overhead to the requested memory? How do I precisely calculate my memory allowance based on the instance types I have?
**Answer:**

@brave-fall-84099 - AWS Batch reserves 32 MiB of RAM on each host. You would also have to ensure that no other workloads are running on that instance for your workload to fit.
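Based on that, a minimal sketch of the calculation. The 32 MiB figure is the reservation stated in the answer above; `max_requestable_mib` is a hypothetical helper, not a Metaflow or AWS API. In practice the usable amount is lower still (OS, ECS agent, and Docker overhead all claim memory on the host, which is why `120 * 1024` worked in the question), so treat the result as an upper bound:

```python
def max_requestable_mib(instance_memory_gib: int, reserved_mib: int = 32) -> int:
    """Upper bound on @batch(memory=...) in MiB for a given instance size.

    Subtracts only the per-host reservation mentioned above; anything else
    running on the AMI (OS, agents) reduces the real headroom further.
    """
    total_mib = instance_memory_gib * 1024  # GiB -> MiB
    return total_mib - reserved_mib

# g5.8xlarge (128 GiB): at most 131072 - 32 = 131040 MiB, and in practice less.
print(max_requestable_mib(128))  # 131040
```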