epoch size

W

Feb 8, 2018, 4:51:29 PM2/8/18
to clojure-cortex
Hi,

I'm new to ML and Clojure/Cortex.

Can someone explain to me what the epoch-size parameter is in the call to cortex.experiment.train/train-n?

Thanks,

W

W

Feb 8, 2018, 6:17:12 PM2/8/18
to clojure-cortex

My mistake: epoch-size is actually used in experiment-util/infinite-class-balanced-dataset, not in train-n.

kiran karkera

Feb 9, 2018, 10:21:37 AM2/9/18
to clojure-cortex
One epoch is a unit of the training cycle where the network 'sees' all the instances in the training dataset exactly once.
Passing the epoch-count parameter tells the network to train for (and stop training after) N epochs.
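For illustration, a call with an epoch budget might look roughly like the following. This is only a sketch: the exact signature and option names (e.g. :batch-size) may differ between Cortex versions, and network, train-dataset, and test-dataset are placeholders.

```clojure
;; Sketch only: check the cortex.experiment.train docstring for the
;; exact options your version supports.
(require '[cortex.experiment.train :as train])

(train/train-n network train-dataset test-dataset
               :batch-size 128    ; instances per gradient step (assumed name)
               :epoch-count 50)   ; stop after 50 passes over the training data
```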

W

Feb 9, 2018, 12:40:54 PM2/9/18
to clojure-cortex
Thank you, Kiran.

The parameter I wanted to ask about is actually epoch-size, not epoch-count. I dug around the code a bit and found that it is used in dataset preprocessing, for example in experiment-util/infinite-class-balanced-dataset, to shuffle the data and generate an infinite dataset.
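The reshuffle-and-repeat idea can be sketched with core Clojure functions alone. This is an illustration of the concept, not Cortex's actual implementation:

```clojure
(defn endless-shuffled
  "Lazily repeat a finite dataset forever, reshuffling on each pass
  so the model never sees the same ordering twice."
  [xs]
  (lazy-seq (concat (shuffle xs) (endless-shuffled xs))))

;; The first 9 elements span three reshuffled passes over [:a :b :c].
(take 9 (endless-shuffled [:a :b :c]))
```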

I would like to know more about how to set this parameter, and when the default value should not be used.

-W

kiran karkera

Feb 10, 2018, 2:09:56 AM2/10/18
to clojure-cortex
Hi,
As the name suggests, infinite-class-balanced-dataset is an abstraction over an infinite dataset.
The epoch-size parameter is used to simulate the notion of an epoch: it partitions the infinite dataset into chunks that each play the role of one epoch (by default, 1024 instances per partition).
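To make the "simulated epoch" concrete, here is a minimal plain-Clojure sketch (not the Cortex source) of carving an infinite sequence into epoch-size chunks:

```clojure
(def epoch-size 1024) ; Cortex's default, per the discussion above

(defn simulated-epochs
  "Carve an infinite dataset into epoch-size chunks; during training,
  each chunk stands in for one epoch."
  [infinite-dataset]
  (partition epoch-size infinite-dataset))

;; e.g. (first (simulated-epochs (cycle training-instances)))
;; is one simulated epoch of 1024 instances.
```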

I think tuning this parameter may be useful depending on the size of your GPU memory: the largest epoch size that fits in GPU memory might train fastest. That is an oversimplification, though, and I don't know whether the Cortex APIs make it possible to accurately measure what epoch size fits in GPU memory.

Regards
Kiran
