nd0rf1n

04/11/2022, 8:05 PM
Hello, everybody! It's my first day playing with Kedro and I'm getting the following error when I'm running
kedro run
during the "Extend the data processing pipeline" step of the spaceflights tutorial (that's where you add the
pandas.ParquetDataset
to the catalog):
kedro.io.core.DataSetError: Class `pandas.ParquetDataset` not found or one of its dependencies has not been installed.
Any ideas on what the issue is? I've spent quite a few hours playing around with conda environments, tried both on Windows and WSL, but I keep getting the same error. Googling around has not helped either.

datajoely

04/11/2022, 8:11 PM
Have you done these steps of the tutorial?
Kedro ships the smallest possible version of itself
And we need you to install the extra parts as and when they're needed
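
(For context: Kedro's dataset dependencies ship as optional pip extras, so the tutorial's "install the extra parts" step boils down to something roughly like the line below; the exact extra name can vary between Kedro versions.)
pip install "kedro[pandas.ParquetDataSet]"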

nd0rf1n

04/11/2022, 8:13 PM
I've done these. I've also tried
pip install "kedro[pandas]"
pip install "kedro[all]"
pip install "kedro[pandas.ParquetDataset]"
none made any difference

datajoely

04/11/2022, 8:14 PM
Without the quotes?

nd0rf1n

04/11/2022, 8:15 PM
i did try that but let me give it another shot

datajoely

04/11/2022, 8:15 PM
The other option is pip install
kedro["pandas"]

nd0rf1n

04/11/2022, 8:15 PM
nah, all requirements already satisfied

noklam

04/11/2022, 8:16 PM
Can you check if you have pyarrow installed?

datajoely

04/11/2022, 8:16 PM
Pip freeze

nd0rf1n

04/11/2022, 8:16 PM
pyarrow is installed--i think it comes as a pandas dependency?
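
(A quick sanity check, assuming the conda env in question is the active one, is to confirm pyarrow imports from the same interpreter that runs Kedro; the grep filter is just illustrative:)
python -c "import pyarrow; print(pyarrow.__version__)"
pip freeze | grep -i pyarrow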

datajoely

04/11/2022, 8:17 PM
When this comes up it's usually one of two things: 1) dependencies not installed, or 2) they are installed but in the wrong environment

nd0rf1n

04/11/2022, 8:17 PM
i have created a new conda env specifically for playing around with kedro
i'm pretty sure this is not the issue

datajoely

04/11/2022, 8:18 PM
If requirements are installed to the environment, kedro shouldn't have an issue
Can you post the full stack trace?
All we're doing here is a thin wrapper on top of pandas

nd0rf1n

04/11/2022, 8:19 PM
what does "full stack trace" refer to?

datajoely

04/11/2022, 8:19 PM
The full error that you get when you try and run
Also the results of pip freeze would be helpful

nd0rf1n

04/11/2022, 8:22 PM
the output of
kedro run

datajoely

04/11/2022, 8:25 PM
If you type
which kedro
what do you get?

nd0rf1n

04/11/2022, 8:26 PM
it's the path to kedro in the current conda env:
/home/thanos/miniconda3/envs/test-kedro/bin/kedro

datajoely

04/11/2022, 8:27 PM
This is very weird

nd0rf1n

04/11/2022, 8:27 PM
we agree on that!

noklam

04/11/2022, 8:27 PM
I think it was a typo
Try
pandas.ParquetDataSet

datajoely

04/11/2022, 8:27 PM
With the space?

noklam

04/11/2022, 8:27 PM
The error you showed has ParquetDataset instead
Capital "S"

datajoely

04/11/2022, 8:28 PM
Ah gotcha
It's in your catalog entry!
Good spot young @noklam
Yeah, DataSet is title case

nd0rf1n

04/11/2022, 8:29 PM
it should be
pandas.ParquetDataSet
right?

datajoely

04/11/2022, 8:29 PM
Yes
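
(For reference, the relevant catalog entry from the spaceflights tutorial looks roughly like this; the dataset name and filepath below are illustrative:)
model_input_table:
  type: pandas.ParquetDataSet
  filepath: data/03_primary/model_input_table.pq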

nd0rf1n

04/11/2022, 8:30 PM
wow, good catch!
thanks, guys!
i've wasted more than 4 hours on this

datajoely

04/11/2022, 8:30 PM
No worries, shout if you have any other problems

nd0rf1n

04/11/2022, 8:30 PM
πŸ˜‚

datajoely

04/11/2022, 8:30 PM
I'm going to put some thought into how we could give you a better error message
Good luck!

nd0rf1n

04/11/2022, 8:31 PM
actually, it failed again

datajoely

04/11/2022, 8:31 PM
What error?

nd0rf1n

04/11/2022, 8:31 PM
in a different way though

datajoely

04/11/2022, 8:31 PM
Different is good

nd0rf1n

04/11/2022, 8:31 PM
true that
a KeyError
i think i can look into that

datajoely

04/11/2022, 8:32 PM
πŸ‘Œ

nd0rf1n

04/11/2022, 8:32 PM
if i get stuck i'll reach out
@datajoely and @noklam, i really appreciate the help
what a cool community!
wow, it's been a very long time

noklam

04/11/2022, 8:33 PM
Glad it helpsπŸ˜‰

nd0rf1n

04/11/2022, 8:33 PM
haven't had this since I was playing around with Julia and those folks were all over the place helping people
again, thank you both!