# advanced-need-help
Hi, I think you have declared your PipelineML object as in this demo: https://github.com/Galileo-Galilei/kedro-mlflow-tutorial/blob/4c85c357162a85093f0875fe3085fbd9ebe2e4be/src/kedro_mlflow_tutorial/hooks.py#L60-L70

You can see that it is possible to specify the ``conda_env`` here. It accepts either a path or a dictionary. In the tutorial I suggest using ``{your_kedro_package}=={__version__}``, because in an enterprise setup I often deploy my package's code to an internal Nexus/PyPI, so it can be downloaded when needed. This won't work for you if you haven't published your package on PyPI first, because conda tries to pip install it.

4 solutions:
- pass the dictionary of your requirements instead of the default one in the ``pipeline_ml_factory`` call
- pass the path to your requirements.txt or conda.yml file in the ``pipeline_ml_factory`` call
- publish your package on PyPI so conda can download it (not recommended for public projects, but likely the best solution in an enterprise setup)
- create an empty conda env, activate it, install your package manually inside it (``pip install -e /path/to/kedro/package/src``) and call ``mlflow serve`` inside it.
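For the first two solutions, the dictionary you pass follows the usual conda environment format. A minimal sketch (the environment name and the pinned packages below are placeholders, and the exact ``pipeline_ml_factory`` keyword may differ between kedro-mlflow versions, so check the linked hooks.py for your version):

```python
# Hedged sketch: building an explicit conda_env dictionary to pass to
# pipeline_ml_factory instead of the default one. All names and version
# pins here are hypothetical -- copy the real ones from your requirements.txt.
conda_env = {
    "name": "project_env",
    "channels": ["defaults"],
    "dependencies": [
        "python=3.8",
        {
            "pip": [
                "kedro==0.17.0",         # hypothetical pins: replace with
                "scikit-learn==0.24.1",  # your project's actual requirements
            ]
        },
    ],
}

# Then, in your hooks (as in the linked hooks.py, signature may vary):
# training_pipeline_ml = pipeline_ml_factory(
#     training=training_pipeline,
#     inference=inference_pipeline,
#     input_name="instances",
#     conda_env=conda_env,  # or a path, e.g. "src/requirements.txt"
# )
```

Passing a path to your requirements.txt or conda.yml (the second solution) is equivalent; the dictionary form just avoids depending on the file being present at serving time.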