#plugins-integrations

Flow

05/02/2022, 5:23 PM
Are there any known issues with the "Apache Airflow with Astronomer" guide? I followed the guide with minor adjustments (the new Astro CLI), but when running the DAG I get:
{standard_task_runner.py:92} ERROR - Failed to execute job 9 for task split (maximum recursion depth exceeded while calling a Python object; 337)
5:24 PM
Additionally, when trying it with a "standard" docker-compose Airflow setup, the memory usage of the Celery worker explodes.
8:30 AM
This seems to happen when the KedroSession is created
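For context, a "maximum recursion depth exceeded" error during task execution is a classic symptom of logging re-entry: a handler that itself emits a log record re-triggers the logging machinery in a loop. The sketch below is only a generic illustration of that failure mode, not necessarily what the Kedro/Airflow stack is doing here; the logger name and handler are invented for the demo:

```python
import logging

class ReEmittingHandler(logging.Handler):
    """A handler that (incorrectly) logs from inside emit(),
    re-entering the logging machinery on every record."""
    def emit(self, record):
        # Logging from inside a handler re-triggers this same handler:
        # emit -> logger.error -> emit -> ... until the stack overflows.
        logging.getLogger("demo").error("re-emitting: %s", record.getMessage())

logger = logging.getLogger("demo")  # hypothetical logger for the demo
logger.addHandler(ReEmittingHandler())
logger.setLevel(logging.ERROR)

try:
    logger.error("first message")
except RecursionError as exc:
    print("caught:", type(exc).__name__)  # caught: RecursionError
```

If two frameworks (e.g. Airflow and Kedro) each wire up handlers over the same loggers, a similar re-entrant chain can arise without any single handler being obviously wrong.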

Downforu

05/03/2022, 8:49 AM
Hello, I do not use Airflow with Astronomer, but your memory problem might be solved if you set disable_existing_loggers to True in the logging.yml file:
disable_existing_loggers: True
Check noklam's answer: https://discord.com/channels/778216384475693066/908346260224872480/967074855235244113

noklam

05/03/2022, 8:59 AM
This is interesting; I have never seen this error. Are you running just the example? It would be great if you could make a repository and share what you changed.

Flow

05/03/2022, 9:26 AM
Sure, I'll package it up and send it over in the afternoon
8:52 PM
https://github.com/fdroessler/kedro-astro-bug - starting with the astrocloud CLI
astrocloud dev start
and then running the job in the UI reproduces the error
1:36 PM
@noklam did you get a chance to look at this?

noklam

05/13/2022, 5:49 PM
Sorry, I will try to find some time next week.
10:11 AM
@Flow I tried cloning your repository and it seems to run successfully without any issue. I am running on a Mac machine.
astrocloud dev start
Triggering the new_kedro_project DAG from the UI -> all 3 tasks run successfully.

Flow

05/16/2022, 10:20 AM
Ok, weird. I'll check again