1. Sorry, my mistake: my team confused this with the `session.py` that appears on the side. (No one on my team has hands-on experience with Kedro yet, so some of the questions may rest on wrong assumptions.)
2. Yes, I use the kedro-airflow plugin.
3. I am struggling with this part. I have tried following most of what I can find online, but nothing comes close to actually making the Kedro DAG run (mostly generic, directional tips that don't translate into an operational tutorial). I am starting to think we shouldn't run Kedro pipelines in Airflow at all.
4. Yes, this is related to the DAG auto-generated by kedro-airflow.
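For context, the workflow I'm following with kedro-airflow is roughly the one below. The target directory and deployment steps are from my setup, so treat them as assumptions rather than the canonical procedure:

```shell
# Install the plugin alongside Kedro in the project's environment
pip install kedro-airflow

# From the Kedro project root: generate an Airflow DAG file from the pipeline
# (--target-dir controls where the generated DAG file is written)
kedro airflow create --target-dir=dags/

# Package the project as a wheel so the DAG can import the pipeline code
kedro package

# Then: copy the generated DAG into the Airflow dags/ folder, and install the
# packaged wheel into the environment the Airflow workers run in.
```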
Here is the summary of what I am trying to do:
1- I want to use Kedro so that the data science team writes production-ready code from the get-go.
2- As an MLOps engineer, I want to automate the process so that a Kedro pipeline can be turned into a DAG without much friction and then orchestrated.
3- This Kedro DAG should be able to run stand-alone: conf files & data should be read from a bucket (or from the DWH, or Redis), not from local storage. I can't have data and files cluttering the repo, hence leveraging bucket storage.
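To make point 3 concrete, this is the kind of catalog entry I have in mind; the dataset names, bucket, and paths below are placeholders, not my actual config:

```yaml
# conf/base/catalog.yml -- datasets resolved from object storage, not local disk
companies:
  type: pandas.CSVDataSet
  filepath: s3://my-bucket/data/01_raw/companies.csv   # hypothetical bucket/path
  credentials: s3_creds                                # key defined in credentials.yml

model_input:
  type: pandas.ParquetDataSet
  filepath: s3://my-bucket/data/05_model_input/table.parquet
  credentials: s3_creds
```

The `s3_creds` entry would live in `conf/local/credentials.yml` (or come from the environment on the Airflow workers), so nothing sensitive or bulky ends up in the repo.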
What's your take on my approach?