Does anybody have practical experience deploying Kedro pipelines as Argo Workflows? I have a couple of thoughts/questions about the approach currently recommended in https://kedro.readthedocs.io/en/stable/deployment/argo
+ Based on your experience, is the node the right granularity for containerization? Should it be one modular pipeline per step instead, or the whole pipeline in a single step?
+ Did you consider passing data between workflow steps (see https://github.com/argoproj/argo-workflows/blob/master/examples/artifact-passing.yaml)? Would it be an issue if all intermediate data passing happened like this?
+ Did the suggested approach (or whichever approach you took) fall short of certain needs? Could some aspects have been easier?
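For concreteness, here's roughly what I have in mind for the second question: one Argo Workflow step per Kedro node, with the intermediate dataset handed off as an Argo artifact rather than via shared storage. This is just a sketch adapted from the artifact-passing example linked above; the image name, node names, and dataset paths are all placeholders, not from the Kedro docs.

```yaml
# Sketch: one Argo step per Kedro node; intermediate data passed as artifacts.
# Image, node names, and paths are illustrative placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: kedro-pipeline-
spec:
  entrypoint: dag
  templates:
    - name: dag
      dag:
        tasks:
          - name: preprocess
            template: kedro-node
            arguments:
              parameters:
                - name: node
                  value: preprocess_node          # hypothetical node name
          - name: train
            template: kedro-node
            dependencies: [preprocess]
            arguments:
              parameters:
                - name: node
                  value: train_node               # hypothetical node name
              artifacts:
                - name: input-data
                  from: "{{tasks.preprocess.outputs.artifacts.output-data}}"
    - name: kedro-node
      inputs:
        parameters:
          - name: node
        artifacts:
          - name: input-data
            path: /home/kedro/data/intermediate   # where upstream output lands
            optional: true                        # first node has no input artifact
      outputs:
        artifacts:
          - name: output-data
            path: /home/kedro/data/intermediate   # picked up after the node runs
      container:
        image: my-registry/my-kedro-project:latest  # placeholder image
        command: [kedro]
        args: ["run", "--node", "{{inputs.parameters.node}}"]
```

My worry is whether routing every intermediate dataset through artifacts like this scales, versus the docs' approach of a shared persistent data location.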
Please also feel free to share any other information about your experience deploying to Argo Workflows. Thanks!!