acceptable-coat-97152
11/22/2022, 12:45 PM
Using S3(s3root='s3://...') directly works perfectly, while using S3(run=self) with the config below doesn't.
# This works:
with S3(s3root='s3://<my-bucket-name>/metaflow/') as s3:
    url = s3.put_files([('stockfish', 'stockfish')])
    print("File saved to", url)

# This doesn't work:
with S3(run=self) as s3:
    url = s3.put_files([('stockfish', 'stockfish')])
    print("File saved to", url)
My config:
{
    "METAFLOW_BATCH_JOB_QUEUE": "JobQueue<hash>",
    "METAFLOW_DATASTORE_SYSROOT_S3": "s3://<my-bucket-name>/metaflow",
    "METAFLOW_DATATOOLS_SYSROOT_S3": "s3://<my-bucket-name>/metaflow/data",
    "METAFLOW_DEFAULT_DATASTORE": "s3",
    "METAFLOW_ECS_S3_ACCESS_IAM_ROLE": "<My-Batch-IAM-Role>"
}
Run Command:
AWS_PROFILE=<my-profile> python helloaws.py run
Error Message:
2022-11-22 13:37:16.949 [1669120599412313/start/1 (pid 29059)] File "/Users/basti/opt/anaconda3/envs/deepcheat/lib/python3.9/genericpath.py", line 155, in _check_arg_types
2022-11-22 13:37:16.949 [1669120599412313/start/1 (pid 29059)] raise TypeError("Can't mix strings and bytes in path components") from None
2022-11-22 13:37:16.949 [1669120599412313/start/1 (pid 29059)] TypeError: Can't mix strings and bytes in path components
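For context, the TypeError in the trace is the generic error Python's os.path.join raises whenever str and bytes path components are mixed, which suggests that one of the path pieces S3(run=self) assembles internally ends up as bytes while the configured sysroot is a str. A minimal stdlib-only sketch reproducing the same error (the values here are made up for illustration, not the actual Metaflow internals):

```python
import os

# os.path.join refuses to combine str and bytes components and
# raises the exact TypeError seen in the trace above.
root = 's3://<my-bucket-name>/metaflow/data'  # str, as in the config
key = b'stockfish'                            # bytes, hypothetical culprit

try:
    os.path.join(root, key)
except TypeError as e:
    print(e)  # -> Can't mix strings and bytes in path components
```

So the question is effectively which component (the configured sysroot, the run pathspec, or the key passed to put_files) is arriving as bytes when the config-based root is used.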
Any help or ideas would be highly appreciated 🙂