# ask-metaflow
h
Hey, does anyone have experience sharing a snowflake connector among multiple steps in a flow?
@step
def start(self):
    from snowflake.connector import connect
    # Set up the connector once -> authenticate once
    self.snowflake_conn = connect(...)
    self.next(self.query_data)

@step
def query_data(self):
    ...
When trying something along the lines of what's above, I run into:
Cannot pickle dump artifact named "snowflake_conn"
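Side note on why this fails: Metaflow persists every `self.*` attribute as an artifact via pickle, and a live database connection wraps an OS-level socket, which cannot be serialized. A minimal stdlib illustration, using a raw socket as a stand-in for the DB connection:

```python
import pickle
import socket

# A live connection object wraps an OS socket; pickle cannot serialize it,
# which is exactly why storing a connector on self fails.
sock = socket.socket()
try:
    pickle.dumps(sock)
except TypeError as exc:
    print(f"pickling failed: {exc}")
finally:
    sock.close()
```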
Which makes sense, but makes me wonder whether using a shared connector is even a good idea.
c
I think it is better to reconstruct the connector in each @step that needs access. It’ll be easier to work with, and isolates where you send info needed to authenticate to the database.
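A minimal sketch of that pattern, with a hypothetical `get_conn` helper called inside each step that needs database access; `sqlite3` stands in for the Snowflake connector here so the sketch is self-contained, but the shape is the same:

```python
import sqlite3


def get_conn():
    # Hypothetical helper; sqlite3 stands in for snowflake.connector.connect
    # here -- the point is the pattern, not the driver. Keeping this in one
    # place also isolates where authentication info is supplied.
    return sqlite3.connect(":memory:")


def query_step():
    # Each step opens its own short-lived connection and closes it before
    # the step ends, so no unpicklable object is ever stored on self.
    conn = get_conn()
    try:
        return conn.execute("SELECT 1").fetchone()[0]
    finally:
        conn.close()


print(query_step())  # -> 1
```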
h
Yeah, will keep it like that. What I was trying to avoid was authenticating multiple times via SSO, but I just figured out how to set up caching for the snowflake connector
c
Nice idea! Would you be able to share an example/template for how the flow looks when you use caching?
h
actually nothing changes in the flow, it was just necessary to add an extra dependency (`secure-local-storage`) for the snowflake connector installation:
snowflake-connector-python = {extras = ["pandas", "secure-local-storage"], version = "^3.12.4"}
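For reference, a hedged sketch of what the connect call might look like with browser-based SSO and token caching enabled. The account/user values are placeholders; `client_store_temporary_credential` is the connector option that caches the SSO token locally, which is what the `secure-local-storage` extra is needed for:

```python
# Hypothetical connection parameters; account and user are placeholders.
SNOWFLAKE_KWARGS = dict(
    account="my_account",
    user="my_user",
    authenticator="externalbrowser",         # browser-based SSO login
    client_store_temporary_credential=True,  # cache the SSO token locally
)

# Usage in a step would be unchanged apart from the kwargs:
# from snowflake.connector import connect
# conn = connect(**SNOWFLAKE_KWARGS)
```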
atm we're just executing locally, not sure what this will look like for remote execution, let's see when we get there 🙂