Relation between "batch_size", "epoch", and "step" in TensorFlow?


Eric Yue

Nov 27, 2016, 4:18:51 AM
to Discuss
I'm confused about "batch_size", "epoch", and "step" in TensorFlow.

I think an "epoch" is a full loop over the training dataset, a "batch_size" is a small sample of the training dataset, and a "step" is one loop over a batch. Am I right?




Raymond Chua

Nov 27, 2016, 5:37:35 AM
to Eric Yue, Discuss
Hi Eric,
Batch size is the number of samples you put in for each training round.
So for each epoch, you can split your training set into multiple batches.

For example, I have 1000 images. If I set my batch size to 1, then for each epoch (training round), my input into the network will be 1 x 1000 images. If I set my batch size to 2, then it will be 2 x 500 images, meaning that for each epoch I will run two rounds, each round using 500 images.

Step is just the learning rate that you use for your optimizer. Usually, we start with 0.001 or 0.01. I recommend that you watch Andrew Ng's Machine Learning videos on Coursera if you want a good overall understanding of ML.
Best,
Raymond 
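
To make the epoch/batch split concrete, here is a minimal plain-Python sketch of one epoch broken into batches. The names images and train_step are placeholders for illustration, not real TensorFlow calls, and the exact counts for a given batch size are pinned down later in the thread.

# Minimal sketch: one "epoch" is one full pass over the data,
# and each pass is split into batches of `batch_size` samples.
images = list(range(1000))          # pretend these are 1000 training images
batch_size = 2
num_epochs = 5

def train_step(batch):
    pass                            # stand-in for one optimizer update

for epoch in range(num_epochs):
    for start in range(0, len(images), batch_size):
        batch = images[start:start + batch_size]
        train_step(batch)           # one parameter update per batch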


Eric Yue

Nov 27, 2016, 5:53:17 AM
to Raymond Chua, Discuss
Thanks for the explanation :) But the "step" in TensorFlow training is 100, 200, 300, ..., so I think it's the same as the "training round" you mentioned, is it?

Raymond Chua

Nov 27, 2016, 6:32:45 AM
to Eric Yue, Discuss
Hi Eric,
Yes, in that case, it should be the training round per epoch. 

Sent from my iPhone


Efrem Braun

Mar 14, 2018, 9:10:26 PM
to Discuss, ray.r...@gmail.com
I believe that the answer given by Raymond is flipped. Per https://developers.google.com/machine-learning/glossary/#epoch, each epoch consists of N/batch_size training iterations. So if batch_size is 1, then for each epoch I will run 1000 training iterations, each iteration using 1 image. If batch_size is 2, then for each epoch I will run 500 training iterations, each iteration using 2 images.
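
To see those numbers concretely, here is a short plain-Python sketch of the counting (no TensorFlow calls; the print-every-100 interval is just an assumption to match the 100, 200, 300, ... step values Eric mentioned):

num_samples = 1000
batch_size = 2
num_epochs = 3
steps_per_epoch = num_samples // batch_size   # 500 training iterations per epoch

global_step = 0
for epoch in range(num_epochs):
    for _ in range(steps_per_epoch):          # one step = one update on one batch
        global_step += 1
        if global_step % 100 == 0:            # log every 100 steps (an assumed interval)
            print("step", global_step)        # prints 100, 200, 300, ... across epochs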