# dev-metaflow
c
Happy festivus metaflow peeps. Thought it was about time I asked again for an update to see if there is any progress on this? We rename our flows and all the dependencies every few months - but it is unbelievably painful, and the names are meaningful, so it hurts a lot to change them.
a
@careful-dress-39510 an easier solution compared to renaming flows would be to alter `METAFLOW_DATASTORE_SYSROOT_S3`'s value in your metaflow config every so often so that the S3 CAS datastore has a different prefix every few months. You can make this value a function of the current calendar quarter so that it auto-updates when a new quarter rolls in. Here is a stack overflow link that discusses how to set up an env var that depends on the current time (in zsh), which can serve as inspiration.
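As a rough sketch (the bucket and prefix below are just placeholders, not a real setup): since Metaflow also picks config values up from the environment, something like this could print an export line for a shell profile or a CI step to eval:
```python
# Rough sketch only: bucket/prefix are placeholders.
# Computes a quarter-scoped datastore root and emits an export line,
# so the env var overrides whatever is in the metaflow config file.
from datetime import date

today = date.today()
quarter = (today.month - 1) // 3 + 1  # 1-4
sysroot = f"s3://my-metaflow-bucket/metaflow/{today.year}/Q{quarter}"
print(f"export METAFLOW_DATASTORE_SYSROOT_S3={sysroot}")
```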
c
So would this mean it uses a different folder (key) in the same S3 bucket, and I can just delete the old one? I assume this would just be changed at build time, so it would require the model to go through a CI build every quarter?
What about a little script like this in my CI?
```python
import json

import pandas as pd


def update():
    # Assumes config.json in the repo holds the *base* sysroot, so each CI run
    # appends exactly one year/quarter suffix (re-running against an already
    # updated file would keep nesting suffixes).
    with open("config.json", "r") as f:
        config = json.load(f)

    sys_root = config["METAFLOW_DATASTORE_SYSROOT_S3"]

    # Suffix the datastore root with the current year and quarter, e.g. .../2023/4
    today = pd.Timestamp("today")
    updated_sys_root = f"{sys_root}/{today.year}/{today.quarter}"

    config["METAFLOW_DATASTORE_SYSROOT_S3"] = updated_sys_root

    with open("config.json", "w") as f:
        json.dump(config, f, indent=2)


if __name__ == "__main__":
    update()
```
Did it in Python because that's the kind of thing where you can waste 6 hours trying to do it properly in bash and then realise you should have just written a Python script 🤷‍♂️
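Presumably the CI job then just needs to drop the updated file wherever Metaflow reads its config from - METAFLOW_HOME, which I believe defaults to `~/.metaflowconfig` (worth double-checking against your setup) - something like:
```python
# Sketch: copy the rewritten config.json into Metaflow's config dir before
# the deploy step runs. The METAFLOW_HOME default here is an assumption.
import os
import shutil
from pathlib import Path

metaflow_home = Path(os.environ.get("METAFLOW_HOME", Path.home() / ".metaflowconfig"))
metaflow_home.mkdir(parents=True, exist_ok=True)
shutil.copy("config.json", metaflow_home / "config.json")
```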
s
Yeah - something of this sort should work just fine