FileNotFoundError


andrea...@gmail.com

Oct 8, 2019, 4:50:30 PM
to Hops
When I try to do a test run and set my project path using

from hops import hdfs
project_path = hdfs.project_path() + 'Resources/cats_and_dogs_CNN_datasets'

and then make the function call 

training_set = train_datagen.flow_from_directory(project_path,
                                                     target_size=(64, 64),
                                                     batch_size=32,
                                                     class_mode='binary')

I get the error message

FileNotFoundError: [Errno 2] No such file or directory: 'hdfs://10.0.104.196:8020/Projects/andreas_cnn/Resources/cats_and_dogs_CNN_datasets'

I've double-checked that all directories exist and are spelled correctly. I'm out of clues right now and could use some help.

Best Regards,

Andreas

Jim Dowling

Oct 9, 2019, 3:29:41 AM
to andrea...@gmail.com, Hops
Do you know if train_datagen.flow_from_directory supports HDFS?
It looks like it comes from the Keras image preprocessing library, which may not support HDFS.
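To illustrate the point (a sketch of my own, not from the hops library): flow_from_directory walks the directory with ordinary local-filesystem calls, so an hdfs:// URI is treated as one long local path that doesn't exist, which is exactly the FileNotFoundError above:

```python
import os

# Plain-Python filesystem calls, which Keras uses under the hood, do not
# understand HDFS URIs -- the whole string is treated as a local path.
uri = 'hdfs://10.0.104.196:8020/Projects/andreas_cnn/Resources'

print(os.path.isdir(uri))  # False: no such local directory on any machine
```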



--
________________________________________________
Dr. Jim Dowling
http://www.jimdowling.info
email: dowli...@gmail.com
phone: +46 73 2505883

Jim Dowling

Oct 9, 2019, 3:31:35 AM
to andrea...@gmail.com, Hops
You can copy your files to the local filesystem and then use them from the Keras library:
https://github.com/logicalclocks/hops-examples/blob/master/notebooks/ml/Filesystem/HopsFSOperations.ipynb

Jim Dowling

Oct 9, 2019, 3:38:29 AM
to andrea...@gmail.com, Hops
Try this instead. It will copy the dataset to the local filesystem, then
return the local path.

from hops import hdfs
project_path = hdfs.project_path() + 'Resources/cats_and_dogs_CNN_datasets'


training_set = train_datagen.flow_from_directory(hdfs.copy_to_local(project_path),
                                                 target_size=(64, 64),
                                                 batch_size=32,
                                                 class_mode='binary')

andrea...@gmail.com

Oct 10, 2019, 5:45:17 PM
to Hops
Thank you, Jim, this seems to work. However, is there perhaps a function that supports HDFS directly? I did some googling but didn't find anything about this.

/Andreas

Steffen Grohsschmiedt

Oct 11, 2019, 9:28:22 AM
to andrea...@gmail.com, Hops
Hi Andreas,

There is unfortunately none for Keras.

/Steffen



--
Steffen Grohsschmiedt
------------------------------------
Head of Cloud, Logical Clocks 
W: logicalclocks.com