Dear team! I'm having trouble switching metadata from local to S3. I'm running on SageMaker with AWS Batch and the following configuration:
{
    "METAFLOW_BATCH_JOB_QUEUE": "test-test-test",
    "METAFLOW_DATASTORE_SYSROOT_S3": "s3://path_to_bucket",
    "METAFLOW_DATATOOLS_S3ROOT": "s3://path_to_bucket/metaflow",
    "METAFLOW_DEFAULT_DATASTORE": "s3",
    "METAFLOW_ECS_S3_ACCESS_IAM_ROLE": "aws_batch_service_role",
    "METAFLOW_EVENTS_SFN_ACCESS_IAM_ROLE": "my_eventbridge_role",
    "METAFLOW_SFN_DYNAMO_DB_TABLE": "metaflow",
    "METAFLOW_SFN_IAM_ROLE": "my_sfn_iam_role"
}
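For reference, here is how I've been sanity-checking which settings Metaflow actually resolves at import time (a minimal sketch; I'm assuming metaflow.metaflow_config is the right place to look for the resolved values):

from metaflow.metaflow_config import (
    DEFAULT_DATASTORE,     # I expect this to be "s3" given the config above
    DEFAULT_METADATA,      # the metadata provider; this comes back "local"
    DATASTORE_SYSROOT_S3,  # should match METAFLOW_DATASTORE_SYSROOT_S3
)

# These reflect whatever Metaflow resolved from env vars / config.json.
print("default datastore:", DEFAULT_DATASTORE)
print("default metadata :", DEFAULT_METADATA)
print("datastore sysroot:", DATASTORE_SYSROOT_S3)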
from metaflow import get_metadata
print(get_metadata())
prints local@/metaflow, even though all of the run data is successfully saved to S3. The script runs on Batch. I installed Metaflow via pip in the terminal (Metaflow version 2.10.11).
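I also tried forcing the provider explicitly before inspecting it, based on my (possibly wrong) reading of the docs that metadata() switches the provider and that "service" requires METAFLOW_SERVICE_URL to point at a running metadata service:

from metaflow import metadata, get_metadata

# My assumption: "service" only takes effect if METAFLOW_SERVICE_URL points
# at a running Metaflow metadata service; otherwise the provider stays local.
metadata("service")
print(get_metadata())  # still reports local@/metaflow for me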
Could you please help me figure out what I'm missing? Thank you in advance!