# ask-metaflow
Guys! Hey! I'm trying my best to set up data storage with an on-prem S3 and a local Metaflow service. I used this code:
```
from metaflow import FlowSpec, Parameter, step

import os
os.environ['METAFLOW_DEFAULT_DATASTORE'] = 's3'  # or should it be 'local'?
# os.environ['METAFLOW_DATASTORE_SYSROOT_S3'] = 's3'


class ParameterFlow(FlowSpec):
    alpha = Parameter('alpha',
                      help='Learning rate',
                      default=0.01)

    @step
    def start(self):
        print('alpha is %f' % self.alpha)
        self.next(self.end)

    @step
    def end(self):
        print('alpha is still %f' % self.alpha)

if __name__ == '__main__':
    ParameterFlow()
```
My CLI command from here (https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-airflow#pushing-a-flow-to-production):
```
python parameter_flow.py --with retry airflow create parameter_dag.py
```
I got this error:
```
Airflow Exception:
    Datastore type `local` is not supported with `airflow create`. Please choose from datastore of type `azure` or `s3` or `gs` when calling `airflow create`
```
I still don't quite understand what I need to change. Sorry guys! And thank you in advance!
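Is something like this the right idea? A minimal sketch of what I think is needed: point the datastore at the on-prem S3 by setting the env vars before importing metaflow. The bucket, prefix, and endpoint URL below are just placeholders for my setup:
```
import os

# Assumption: these have to be set before metaflow is imported,
# otherwise the config may not pick them up.
os.environ['METAFLOW_DEFAULT_DATASTORE'] = 's3'
os.environ['METAFLOW_DATASTORE_SYSROOT_S3'] = 's3://my-metaflow-bucket/metaflow'  # placeholder bucket/prefix
os.environ['METAFLOW_S3_ENDPOINT_URL'] = 'https://s3.mycompany.internal'          # placeholder on-prem endpoint

from metaflow import FlowSpec, Parameter, step


class ParameterFlow(FlowSpec):
    alpha = Parameter('alpha',
                      help='Learning rate',
                      default=0.01)

    @step
    def start(self):
        print('alpha is %f' % self.alpha)
        self.next(self.end)

    @step
    def end(self):
        print('alpha is still %f' % self.alpha)


if __name__ == '__main__':
    ParameterFlow()
```
Or would it be enough to just pass `--datastore=s3` on the command line, like `python parameter_flow.py --datastore=s3 --with retry airflow create parameter_dag.py`? Not sure which way the docs mean.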