# ask-metaflow
Hello! As part of real-time monitoring of our Metaflow workflows, we would like to call an external API every time a step finishes, a workflow concludes, or an error occurs in a specific task. The logging itself goes through a library we've developed that writes messages to a PostgreSQL database.

We took inspiration from other scheduling tools such as Airflow and Prefect, which support callback functions; the Metaflow equivalent would be a decorator. We have tried writing custom decorators on our end, but the only implementation that worked involves putting the following code in an `__init__.py` file:
```python
from metaflow.decorators import step, _import_plugin_decorators
from metaflow.plugins import STEP_DECORATORS

from .custom_step_decorator import TestDecorator

# Register our decorator alongside Metaflow's built-in step decorators.
STEP_DECORATORS.append(TestDecorator)

# Inject the registered plugin decorators into this package's namespace
# so flows can import them from here.
_import_plugin_decorators(globals())
```
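For context, `TestDecorator` is a `StepDecorator` subclass that hooks into the task lifecycle. Below is a simplified sketch of our `custom_step_decorator.py`; the hook names and signatures follow Metaflow's internal `StepDecorator` API as we understand it, and `log_event` / `our_logging_lib` are stand-ins for our internal PostgreSQL logging call:

```python
from metaflow.decorators import StepDecorator

# Hypothetical import: our internal library that writes events to PostgreSQL.
from our_logging_lib import log_event


class TestDecorator(StepDecorator):
    # Name under which the decorator is applied in flow code, i.e. @test.
    name = "test"

    def task_finished(
        self, step_name, flow, graph, is_task_ok, retry_count, max_user_code_retries
    ):
        # Runs after the task's user code completes, whether or not it succeeded.
        log_event(flow=flow.name, step=step_name, ok=is_task_ok)

    def task_exception(
        self, exception, step_name, flow, graph, retry_count, max_user_code_retries
    ):
        # Runs when the task raises an exception.
        log_event(flow=flow.name, step=step_name, ok=False, error=repr(exception))
```

The `_import_plugin_decorators(globals())` call above is what lets flow code then import the decorator from our package and apply `@test` to individual steps.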
This lets us import the decorator into the workflow code itself and apply it to steps. However, we find this approach complex, and we are not sure how it fits into deployment on AWS with Step Functions. What would be the best way to proceed?

On a related note, the package we use for logging lives in GCP Artifact Registry. Is there a straightforward way to install it with GCP credentials, perhaps via the `@pypi` decorator? And how can we make the package available to the entire workflow without declaring it for each step? Ideally, we imagine something like the sketch below.
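In this sketch the package name and version are placeholders, and we are assuming `@pypi_base` would resolve the dependency from our private index if pip is configured for Artifact Registry (e.g. via the `keyrings.google-artifactregistry-auth` backend or an extra index URL), which is exactly the part we are unsure about:

```python
from metaflow import FlowSpec, pypi_base, step


# Hypothetical package name and version; we would want this resolved from our
# GCP Artifact Registry index rather than public PyPI.
@pypi_base(python="3.10", packages={"our-logging-lib": "1.0.0"})
class MonitoredFlow(FlowSpec):
    @step
    def start(self):
        # The package would be importable in every step, with no per-step
        # @pypi declaration.
        from our_logging_lib import log_event  # hypothetical
        log_event(flow=self.name, step="start", ok=True)
        self.next(self.end)

    @step
    def end(self):
        pass
```

Thank you for your response, and have a great day!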