# plugins-integrations
f
Are there any known issues with the "Apache Airflow with Astronomer" guide? I followed the guide with minor adjustments (for the new Astro CLI), but when running the DAG I get:
{standard_task_runner.py:92} ERROR - Failed to execute job 9 for task split (maximum recursion depth exceeded while calling a Python object; 337)
Additionally, when trying it with a "standard" docker-compose Airflow, the memory usage of the Celery worker explodes.
This seems to happen when the KedroSession is created.
d
Hello, I do not use Airflow with Astronomer, but your memory problem might be solved if you set `disable_existing_loggers: True` in the `logging.yml` file. Check noklam's answer: https://discord.com/channels/778216384475693066/908346260224872480/967074855235244113
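For reference, a minimal sketch of what the top of `logging.yml` might look like with that flag set. The file path and the surrounding keys are assumptions based on a default Kedro project layout; only the `disable_existing_loggers` line is the suggested change.

```yaml
# conf/base/logging.yml  (path assumed from a default Kedro project layout)
version: 1

# Suggested fix: stop Python's dictConfig from re-wiring loggers that
# already exist, which is the suspected cause of the recursion/memory issue.
disable_existing_loggers: True
```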
n
This is interesting, I have never seen this error. Are you running just the example? It would be great if you could make a repository and share what you changed.
f
sure I'll package it up and send it over in the afternoon
https://github.com/fdroessler/kedro-astro-bug - starting it with the Astro CLI command
`astrocloud dev start`
and then running the job in the UI reproduces the error.
@noklam did you get a chance to look at this?
n
Sorry, I will try to find some time next week.
@Flow I tried cloning your repository and it seems to run successfully without any issue. I am running on a Mac machine.
`astrocloud dev start`
then triggering the new_kedro_project DAG in the UI -> all 3 tasks ran successfully.
f
Ok weird I’ll check again