bitter-planet-74915
09/20/2023, 10:00 AM

from metaflow import FlowSpec, Parameter, step
import os

os.environ['METAFLOW_DEFAULT_DATASTORE'] = 's3'  # or it should be 'local'?
# os.environ['METAFLOW_DATASTORE_SYSROOT_S3'] = 's3'

class ParameterFlow(FlowSpec):
    alpha = Parameter('alpha',
                      help='Learning rate',
                      default=0.01)

    @step
    def start(self):
        print('alpha is %f' % self.alpha)
        self.next(self.end)

    @step
    def end(self):
        print('alpha is still %f' % self.alpha)

if __name__ == '__main__':
    ParameterFlow()
My CLI command is from here (https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-airflow#pushing-a-flow-to-production):
python parameter_flow.py --with retry airflow create parameter_dag.py
I got this error:
Airflow Exception:
Datastore type `local` is not supported with `airflow create`. Please choose from datastore of type `azure` or `s3` or `gs` when calling `airflow create`
I still don't quite understand what I need to change. Sorry guys! And thank you in advance!
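For reference, a minimal sketch of how the datastore settings are usually supplied so that they are visible when Metaflow loads its configuration, assuming the S3 side is already set up (for example via `metaflow configure aws`). The bucket path below is a placeholder, and exporting the variables in the shell rather than inside the flow file is an assumption, not a confirmed fix:

# Sketch only: export the datastore settings in the shell before invoking the flow,
# so they are in place when Metaflow reads its configuration.
# The bucket path is a placeholder, not a real value.
export METAFLOW_DEFAULT_DATASTORE=s3
export METAFLOW_DATASTORE_SYSROOT_S3=s3://my-metaflow-bucket/metaflow
python parameter_flow.py --with retry airflow create parameter_dag.py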