# ask-metaflow
b
Hi all 🙂 TL;DR - the DAG and stdout are not shown on my deployment, even though the bucket is accessible.
• I deployed Metaflow on EKS. I'm using Aurora Postgres and Amazon S3.
• I can see my workflows in the UI; the issue I have is that I cannot see the DAG or stdout.
• I cannot find any helpful error log.
• I can confirm that I can list all resources in the S3 bucket from the metaflow-ui pod (though strangely there was no boto3 in that Python env).
Will appreciate any help!
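Since the UI pod may not ship boto3, one way to sanity-check artifact access from inside the pod is a short script like the sketch below. The bucket/prefix and helper names are my own placeholders, not part of Metaflow:

```python
from urllib.parse import urlparse


def parse_s3_root(s3_root):
    """Split an s3://bucket/prefix URL into (bucket, prefix)."""
    parsed = urlparse(s3_root)
    return parsed.netloc, parsed.path.lstrip("/")


def list_datastore(s3_root, max_keys=10):
    """List a few keys under the datastore root (requires boto3 and
    the pod's IAM credentials)."""
    import boto3  # deferred import: the UI pod may not have boto3 preinstalled

    bucket, prefix = parse_s3_root(s3_root)
    resp = boto3.client("s3").list_objects_v2(
        Bucket=bucket, Prefix=prefix, MaxKeys=max_keys
    )
    return [obj["Key"] for obj in resp.get("Contents", [])]


# run inside the metaflow-ui pod, e.g.:
#   print(list_datastore("s3://my-bucket/metaflow"))
```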
anyone? @dry-beach-38304 @straight-shampoo-11124
d
I don’t have much of an idea on that one. @bulky-afternoon-92433 might, though; he is more in tune with the metadata service. The DAG does come from an artifact (so S3), and so do stdout/stderr. If you have ruled out permission issues, I am not quite sure what else it could be. There should be some debug messages too, or at least an option to turn some on.
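Since the DAG and stdout both come through the datastore, one way to take the UI out of the picture is to fetch a task's stdout directly with the Metaflow client, pointed at the same metadata service and S3 datastore. A minimal sketch; the flow/run/step/task IDs and the `task_pathspec` helper are hypothetical:

```python
def task_pathspec(flow, run_id, step, task_id):
    """Build a Metaflow pathspec like 'MyFlow/123/start/456'."""
    return "/".join(str(p) for p in (flow, run_id, step, task_id))


def fetch_stdout(pathspec):
    """Fetch a task's stdout through the Metaflow client -- the same
    library the UI backend uses under the hood (needs metaflow installed
    and configured against your deployment)."""
    from metaflow import Task  # deferred so the pure helper above stays importable

    return Task(pathspec).stdout


# e.g.:
#   print(fetch_stdout(task_pathspec("MyFlow", "123", "start", "456")))
```

If this call also hangs or errors, the problem is in datastore access itself rather than in the UI.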
t
Sorry it took a while to get to this. For accessing artifacts, the UI uses the Metaflow client library under the hood. A few questions to assist with the debugging:
• What version of the ui-service are you running? Is it the official release image, or did you build your own?
• What version of the metaflow-ui (frontend) are you running?
The number of cancelled DAG requests in one of the screenshots is a bit concerning. For S3-access-related issues, the backend routes that rely on it should respond with a descriptive error message, so if requests keep hanging and no errors are visible, I suspect something else is interfering.
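To answer the version question without digging through image tags, one can probe the backend over HTTP. This is a sketch: the port and the `ping`/`version` routes are assumptions about the deployment, so adjust them to whatever your service actually exposes:

```python
from urllib.parse import urljoin


def service_endpoint(base_url, route):
    """Join a service base URL with a route such as 'ping' or 'version'."""
    return urljoin(base_url.rstrip("/") + "/", route)


def probe(base_url, route):
    """GET the endpoint and return (status, body). Assumes the service
    exposes that route; stdlib only, so no extra packages needed in the pod."""
    from urllib.request import urlopen  # deferred: only needed for the live call

    with urlopen(service_endpoint(base_url, route), timeout=5) as resp:
        return resp.status, resp.read().decode()


# e.g., against a port-forwarded backend (port is an assumption):
#   print(probe("http://localhost:8083", "ping"))
```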