# plugins-integrations
Hello guys! I finally got it to work. The problem was on my end: since I was also tracking the git hash with MLflow, I had to add the git folder to the Docker container, which of course I hadn't done... Anyway, execution works now!

Just one question: when I run the pipeline locally without Airflow, I get the same run ID for each of the versioned datasets. However, when I run it with Airflow, each one gets a different run ID, which makes it difficult to track and reproduce the outputs. Can you please help me get the same run ID with Airflow? You can reproduce this behaviour with this repo: https://github.com/Downfor-u/kedro-airflow-simple-dag

Here are the commands I execute for the Airflow run:

1/
kedro package
2/
docker-compose up postgres
3/ Open another terminal:
docker-compose up init_db
4/ In the new terminal:
docker-compose up scheduler webserver
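To illustrate what I suspect is happening: each Airflow task runs in its own process, so each one generates its own run ID, whereas a local run shares one session. Here's a minimal stdlib-only sketch (not actual Kedro/Airflow API; the `KEDRO_RUN_ID` variable name is made up) of the kind of workaround I'm imagining: generate one ID up front and pass it to every task process through the environment, so all tasks see the same value.

```python
# Sketch of sharing one run ID across separate task processes.
# KEDRO_RUN_ID is a hypothetical variable name, not a real Kedro setting.
import os
import subprocess
import sys
import uuid


def launch_tasks(task_count: int) -> list:
    """Start each 'task' as a separate process (as Airflow would) and
    return the run ID each one observed."""
    # Generate the ID once, before any task starts, and inject it
    # into the environment every child process inherits.
    env = dict(os.environ, KEDRO_RUN_ID=uuid.uuid4().hex)
    child = "import os; print(os.environ['KEDRO_RUN_ID'])"
    return [
        subprocess.run(
            [sys.executable, "-c", child],
            env=env, capture_output=True, text=True,
        ).stdout.strip()
        for _ in range(task_count)
    ]


ids = launch_tasks(3)
assert len(set(ids)) == 1  # every task process saw the same run ID
```

Is there a supported way to achieve something like this with the Kedro Airflow plugin, so the versioned datasets all get stamped with one run ID?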
Thank you in advance!