# dev-metaflow
When you run `python flow.py run --with batch`, how much is Metaflow directly talking to AWS, as opposed to using the metadata service as a proxy? Does your laptop:
1. directly upload/download artifacts to S3, or does it send them to the metadata service?
2. submit jobs to AWS Batch, or does the metadata service do that?
3. access DynamoDB?
4. create/submit Step Functions state machines?

Does an AWS profile need to be configured for each user so their client can do the talking it needs to?
For the last question: I'm blind. Are there docs on the needed permissions? I'd like to create a "metaflow developer" role I can assign to users.
Hey Eric 👋 quick answers to your questions:
1. Yes, the client directly uploads artifacts to S3.
2. Yes, Metaflow directly submits jobs to AWS Batch.
3. Yes, Metaflow tasks executed on Batch via Step Functions talk directly to DynamoDB.
4. Yes, Metaflow directly submits the state machine to Step Functions.

The Metaflow framework talks directly to the AWS services, so you will need the right AWS permissions wherever you are executing the flow / submitting to Step Functions. On the topic of how to set up roles and permissions, @average-beach-28850 and @ancient-application-36103 can give better insight on different styles of provisioning permissions to users.
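Since the client talks to AWS directly, the machine running these commands needs working local credentials. A minimal sketch, assuming a hypothetical profile name `metaflow-dev` (Metaflow's client picks up the standard boto3 credential chain, so any of the usual credential sources works):

```shell
# Hypothetical profile name; any standard AWS credential source works.
export AWS_PROFILE=metaflow-dev

# Sanity-check which identity the client will act as:
aws sts get-caller-identity

# The laptop itself uploads artifacts to S3 and submits the Batch jobs:
python flow.py run --with batch

# The laptop itself creates/updates the Step Functions state machine:
python flow.py step-functions create
```

The identity reported by `sts get-caller-identity` is the one that must hold the S3, Batch, and Step Functions permissions discussed above.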
@little-apartment-49355 thank you! This is exactly what I was wanting to know. Also, I totally just saw your open-source Metaflow slackbot on GitHub. Fun!
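On the "metaflow developer" role question above: this thread doesn't include a policy, but a coarse starting sketch can be derived from the services named in the answer (S3, Batch, Step Functions, DynamoDB). The policy name and the action list here are assumptions to be tightened for your account, not an official Metaflow policy:

```shell
# Assumed policy name and a deliberately coarse action list; in practice,
# scope the Resource entries to your Metaflow datastore bucket, Batch job
# queue/definitions, state machines, and DynamoDB table.
aws iam create-policy \
  --policy-name metaflow-developer \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": [
        "s3:GetObject", "s3:PutObject", "s3:ListBucket",
        "batch:SubmitJob", "batch:DescribeJobs", "batch:TerminateJob",
        "states:CreateStateMachine", "states:UpdateStateMachine",
        "states:StartExecution", "states:DescribeExecution",
        "dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:UpdateItem"
      ],
      "Resource": "*"
    }]
  }'
```

The resulting policy can be attached to each developer with `aws iam attach-user-policy` (or to a shared group/role); check Metaflow's AWS deployment docs for the authoritative permission set before using anything like this in production.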